"
167 | ],
168 | "text/plain": [
169 | " u.name\n",
170 | "0 SQLSVC@TOKYO.JAPAN.LOCAL\n",
171 | "1 SCANSERVICE@TOKYO.JAPAN.LOCAL\n",
172 | "2 KRBTGT@JAPAN.LOCAL\n",
173 | "3 BACKUPLDAP@TOKYO.JAPAN.LOCAL\n",
174 | "4 KRBTGT@TOKYO.JAPAN.LOCAL\n",
175 | "5 KRBTGT@SINGAPORE.LOCAL"
176 | ]
177 | },
178 | "execution_count": 5,
179 | "metadata": {},
180 | "output_type": "execute_result"
181 | }
182 | ],
183 | "source": [
184 | "g.run(\"\"\"\n",
185 | "MATCH (u:User {hasspn:true})\n",
186 | "RETURN u.name\n",
187 | "\"\"\").to_data_frame()"
188 | ]
189 | },
190 | {
191 | "cell_type": "markdown",
192 | "metadata": {},
193 | "source": [
194 | "## Retrieve Kerberoastable Users with Path to DA "
195 | ]
196 | },
197 | {
198 | "cell_type": "markdown",
199 | "metadata": {},
200 | "source": [
201 | "We can limit our results and return only Kereberoastable users with paths to DA. We can find Kerberoastable users with a path to DA and also see the length of the path to see which one is the closest."
202 | ]
203 | },
204 | {
205 | "cell_type": "code",
206 | "execution_count": 6,
207 | "metadata": {},
208 | "outputs": [],
209 | "source": [
210 | "krb_users_path_to_DA = g.run(\"\"\"\n",
211 | "MATCH (u:User {hasspn:true})\n",
212 | "MATCH (g:Group {name:'DOMAIN ADMINS@JAPAN.LOCAL'})\n",
213 | "MATCH p = shortestPath(\n",
214 | " (u)-[*1..]->(g)\n",
215 | ")\n",
216 | "RETURN u.name,LENGTH(p)\n",
217 | "ORDER BY LENGTH(p) ASC\n",
218 | "\"\"\").to_data_frame()"
219 | ]
220 | },
221 | {
222 | "cell_type": "code",
223 | "execution_count": 7,
224 | "metadata": {},
225 | "outputs": [
226 | {
227 | "data": {
228 | "text/html": [
229 | "
\n",
230 | "\n",
243 | "
\n",
244 | " \n",
245 | "
\n",
246 | "
\n",
247 | "
u.name
\n",
248 | "
LENGTH(p)
\n",
249 | "
\n",
250 | " \n",
251 | " \n",
252 | "
\n",
253 | "
0
\n",
254 | "
SQLSVC@TOKYO.JAPAN.LOCAL
\n",
255 | "
3
\n",
256 | "
\n",
257 | "
\n",
258 | "
1
\n",
259 | "
BACKUPLDAP@TOKYO.JAPAN.LOCAL
\n",
260 | "
5
\n",
261 | "
\n",
262 | " \n",
263 | "
\n",
264 | "
"
265 | ],
266 | "text/plain": [
267 | " u.name LENGTH(p)\n",
268 | "0 SQLSVC@TOKYO.JAPAN.LOCAL 3\n",
269 | "1 BACKUPLDAP@TOKYO.JAPAN.LOCAL 5"
270 | ]
271 | },
272 | "execution_count": 7,
273 | "metadata": {},
274 | "output_type": "execute_result"
275 | }
276 | ],
277 | "source": [
278 | "krb_users_path_to_DA"
279 | ]
280 | },
281 | {
282 | "cell_type": "markdown",
283 | "metadata": {},
284 | "source": [
285 | "## Return Most Privileged Kerberoastable users"
286 | ]
287 | },
288 | {
289 | "cell_type": "markdown",
290 | "metadata": {},
291 | "source": [
292 | "What if we do not have kerberoastable users with a path to DA? We can still look for most privileged Kerberoastable users based on how many computers they have local admins rights on. "
293 | ]
294 | },
295 | {
296 | "cell_type": "code",
297 | "execution_count": 8,
298 | "metadata": {},
299 | "outputs": [],
300 | "source": [
301 | "privileged_kerberoastable_users = g.run(\"\"\"\n",
302 | "MATCH (u:User {hasspn:true})\n",
303 | "OPTIONAL MATCH (u)-[:AdminTo]->(c1:Computer)\n",
304 | "OPTIONAL MATCH (u)-[:MemberOf*1..]->(:Group)-[:AdminTo]->(c2:Computer)\n",
305 | "WITH u,COLLECT(c1) + COLLECT(c2) AS tempVar\n",
306 | "UNWIND tempVar AS comps\n",
307 | "RETURN u.name,COUNT(DISTINCT(comps))\n",
308 | "ORDER BY COUNT(DISTINCT(comps)) DESC\n",
309 | "\"\"\").to_data_frame()"
310 | ]
311 | },
312 | {
313 | "cell_type": "code",
314 | "execution_count": 9,
315 | "metadata": {},
316 | "outputs": [
317 | {
318 | "data": {
319 | "text/html": [
320 | "
\n",
321 | "\n",
334 | "
\n",
335 | " \n",
336 | "
\n",
337 | "
\n",
338 | "
u.name
\n",
339 | "
COUNT(DISTINCT(comps))
\n",
340 | "
\n",
341 | " \n",
342 | " \n",
343 | "
\n",
344 | "
0
\n",
345 | "
SQLSVC@TOKYO.JAPAN.LOCAL
\n",
346 | "
1
\n",
347 | "
\n",
348 | " \n",
349 | "
\n",
350 | "
"
351 | ],
352 | "text/plain": [
353 | " u.name COUNT(DISTINCT(comps))\n",
354 | "0 SQLSVC@TOKYO.JAPAN.LOCAL 1"
355 | ]
356 | },
357 | "execution_count": 9,
358 | "metadata": {},
359 | "output_type": "execute_result"
360 | }
361 | ],
362 | "source": [
363 | "privileged_kerberoastable_users"
364 | ]
365 | },
366 | {
367 | "cell_type": "code",
368 | "execution_count": null,
369 | "metadata": {},
370 | "outputs": [],
371 | "source": []
372 | }
373 | ],
374 | "metadata": {
375 | "kernelspec": {
376 | "display_name": "Python 3",
377 | "language": "python",
378 | "name": "python3"
379 | },
380 | "language_info": {
381 | "codemirror_mode": {
382 | "name": "ipython",
383 | "version": 3
384 | },
385 | "file_extension": ".py",
386 | "mimetype": "text/x-python",
387 | "name": "python",
388 | "nbconvert_exporter": "python",
389 | "pygments_lexer": "ipython3",
390 | "version": "3.7.3"
391 | }
392 | },
393 | "nbformat": 4,
394 | "nbformat_minor": 4
395 | }
396 |
--------------------------------------------------------------------------------
/docker/jupyter-graphframes/Dockerfile:
--------------------------------------------------------------------------------
1 | # Author: Roberto Rodriguez (@Cyb3rWard0g)
2 | # License: GPL-3.0
3 |
4 | FROM cyb3rward0g/jupyter-pyspark:0.0.6
5 | LABEL maintainer="Roberto Rodriguez @Cyb3rWard0g"
6 | LABEL description="Notebooks Forge Jupyter Project"
7 |
8 | ENV DEBIAN_FRONTEND noninteractive
9 |
10 | # *********** Setting Environment Variables ***************
11 | ENV GRAPHFRAMES_VERSION=0.8.0
12 | ENV SPARK_GF_VERSION=3.0
13 | ENV SCALA_GF_VERSION=2.12
14 |
15 | USER $USER
16 | # *********** Download Graphframes Jar ***************
17 | RUN wget http://dl.bintray.com/spark-packages/maven/graphframes/graphframes/${GRAPHFRAMES_VERSION}-spark${SPARK_GF_VERSION}-s_${SCALA_GF_VERSION}/graphframes-${GRAPHFRAMES_VERSION}-spark${SPARK_GF_VERSION}-s_${SCALA_GF_VERSION}.jar -P ${SPARK_HOME}/jars/ \
18 | && cp ${SPARK_HOME}/jars/graphframes* ${SPARK_HOME}/graphframes.zip
--------------------------------------------------------------------------------
/docker/jupyter-hunt/Dockerfile:
--------------------------------------------------------------------------------
1 | # Notebooks Forge script: Jupyter Hunt Environment Dockerfile
2 | # Author: Roberto Rodriguez (@Cyb3rWard0g)
3 | # License: GPL-3.0
4 |
5 | FROM cyb3rward0g/jupyter-pyspark:0.0.6
6 | LABEL maintainer="Roberto Rodriguez @Cyb3rWard0g"
7 | LABEL description="Notebooks Forge Jupyter Project."
8 |
9 | ENV DEBIAN_FRONTEND noninteractive
10 |
11 | # *********** Setting Environment Variables ***************
12 | ENV GRAPHFRAMES_VERSION=0.7.0
13 | ENV KAFKA_VERSION=2.4.0
14 | ENV SCALA_VERSION=2.11
15 | ENV SLF4J_API_VERSION=1.7.29
16 | ENV LZ4_JAVA=1.6.0
17 | ENV SNAPPY_JAVA=1.1.7.3
18 | ENV ESHADOOP_VERSION=7.5.2
19 | ENV ESHADOOP_DIR=${JUPYTER_DIR}/es-hadoop
20 |
21 | USER $USER
22 | # **** Current Channels ***********
23 | #- https://repo.anaconda.com/pkgs/main/linux-64
24 | #- https://repo.anaconda.com/pkgs/main/noarch
25 | #- https://repo.anaconda.com/pkgs/free/linux-64
26 | #- https://repo.anaconda.com/pkgs/free/noarch
27 | #- https://repo.anaconda.com/pkgs/r/linux-64
28 | #- https://repo.anaconda.com/pkgs/r/noarch
29 | RUN mkdir -v ${ESHADOOP_DIR} \
30 | # *********** Install Jupyter Notebook & Extra Packages with Conda *************
31 | && conda install --quiet --yes \
32 | 'altair=4.1.0' \
33 | 's3fs=0.4.2' \
34 | 'elasticsearch-dsl=7.0.0' \
35 | 'matplotlib=3.2.1' \
36 | 'networkx=2.4' \
37 | 'nxviz=0.6.2' \
38 | && conda update --all --quiet --yes \
39 | # *********** Clean *****************
40 | && conda clean -tipy \
41 | && conda build purge-all \
42 | && rm -rf /home/$USER/.cache/yarn \
43 | # *********** Install Pip packages not available via conda ************
44 | && python3 -m pip install ksql==0.5.1.1 confluent-kafka==1.4.1 splunk-sdk==1.6.12 Kqlmagic==0.1.111.post15 neo4j==1.7.6 openhunt==1.6.5 pyarrow==0.17.0 msticpy==0.4.0 \
45 | # *********** Download ES-Hadoop ***************
46 | && wget https://artifacts.elastic.co/downloads/elasticsearch-hadoop/elasticsearch-hadoop-${ESHADOOP_VERSION}.zip -P ${ESHADOOP_DIR}/ \
47 | && unzip -j ${ESHADOOP_DIR}/*.zip -d ${ESHADOOP_DIR}/ \
48 | && rm ${ESHADOOP_DIR}/*.zip \
49 | # *********** Download Graphframes Jar ***************
50 | && wget http://dl.bintray.com/spark-packages/maven/graphframes/graphframes/${GRAPHFRAMES_VERSION}-spark2.4-s_2.11/graphframes-${GRAPHFRAMES_VERSION}-spark2.4-s_2.11.jar -P ${SPARK_HOME}/jars/ \
51 | && cp ${SPARK_HOME}/jars/graphframes* ${SPARK_HOME}/graphframes.zip \
52 | # *********** Download Extra Jars ***************
53 | && wget https://repo1.maven.org/maven2/org/apache/spark/spark-sql-kafka-0-10_${SCALA_VERSION}/${SPARK_VERSION}/spark-sql-kafka-0-10_${SCALA_VERSION}-${SPARK_VERSION}.jar -P ${SPARK_HOME}/jars/ \
54 | && wget https://repo1.maven.org/maven2/org/apache/kafka/kafka-clients/${KAFKA_VERSION}/kafka-clients-${KAFKA_VERSION}.jar -P ${SPARK_HOME}/jars/ \
55 | && wget https://repo1.maven.org/maven2/org/slf4j/slf4j-api/${SLF4J_API_VERSION}/slf4j-api-${SLF4J_API_VERSION}.jar -P ${SPARK_HOME}/jars/ \
56 | && wget https://repo1.maven.org/maven2/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar -P ${SPARK_HOME}/jars/ \
57 | && wget https://repo1.maven.org/maven2/org/lz4/lz4-java/${LZ4_JAVA}/lz4-java-${LZ4_JAVA}.jar -P ${SPARK_HOME}/jars \
58 | && wget https://repo1.maven.org/maven2/org/xerial/snappy/snappy-java/${SNAPPY_JAVA}/snappy-java-${SNAPPY_JAVA}.jar -P ${SPARK_HOME}/jars/
59 |
60 | # *********** Adding HELK scripts and files to Container ***************
61 | COPY spark/* ${SPARK_HOME}/conf/
62 |
63 | USER root
64 |
65 | RUN chown -R ${USER} ${JUPYTER_DIR} ${HOME} ${SPARK_HOME}
66 |
67 | USER ${USER}
--------------------------------------------------------------------------------
/docker/jupyter-hunt/docker-compose.yml:
--------------------------------------------------------------------------------
1 | version: '3.5'
2 |
3 | services:
4 | jupyter-hunt:
5 | #image: cyb3rward0g/jupyter-hunt:0.0.7
6 | build: ./
7 | container_name: jupyter-hunt
8 | volumes:
9 | - notebooks:/opt/jupyter/notebooks
10 | environment:
11 | JUPYTER_TYPE: lab
12 | JUPYTER_BASE_URL: /jupyter
13 | ports:
14 | - "8888:8888"
15 | restart: always
16 | networks:
17 | hunting:
18 |
19 | networks:
20 | hunting:
21 | driver: bridge
22 |
23 | volumes:
24 | notebooks:
25 | driver: local
26 |
--------------------------------------------------------------------------------
/docker/jupyter-hunt/spark/spark-defaults.conf:
--------------------------------------------------------------------------------
1 | # HELK build Stage: Alpha
2 | # Author: Roberto Rodriguez (@Cyb3rWard0g)
3 | # License: GPL-3.0
4 |
5 | # HELK References:
6 | # https://spark.apache.org/docs/latest/configuration.html
7 | # https://graphframes.github.io/quick-start.html
8 | # https://spark-packages.org/package/graphframes/graphframes
9 | # https://spark.apache.org/docs/latest/sql-programming-guide.html#pyspark-usage-guide-for-pandas-with-apache-arrow
10 |
11 | # ************ Application Properties ****************
12 | # Logs the effective SparkConf as INFO when a SparkContext is started. Default: false
13 | spark.logConf true
14 | # The cluster manager to connect to.
15 | # spark.master spark://helk-spark-master:7077
16 | # Restarts the driver automatically if it fails with a non-zero exit status
17 | spark.driver.supervise true
18 |
19 | # ************ Runtime Environment ****************
20 | # Sets the number of latest rolling log files that are going to be retained by the system. Older log files will be deleted.
21 | spark.executor.logs.rolling.maxRetainedFiles 20
22 | # Set the strategy of rolling of executor logs.
23 | spark.executor.logs.rolling.strategy spark.executor.logs.rolling.time.interval
24 | # Comma-separated list of jars to include on the driver and executor classpaths. Globs are allowed.
25 | spark.jars /opt/jupyter/es-hadoop/elasticsearch-hadoop-7.5.2.jar
26 | # Comma-separated list of Maven coordinates of jars to include on the driver and executor classpaths.
27 | # The coordinates should be groupId:artifactId:version.
28 | #spark.jars.packages graphframes:graphframes:0.7.0-spark2.4-s_2.11,org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.0
29 | #spark.jars.packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.1,databricks:spark-sklearn:0.2.3
30 |
31 | # ************ Spark UI ****************
32 | # Base directory in which Spark events are logged
33 | spark.eventLog.dir /opt/jupyter/spark/logs
34 | # Whether to log Spark events, useful for reconstructing the Web UI after the application has finished.
35 | spark.eventLog.enabled true
36 | # Enable running Spark Master as reverse proxy for worker and application UIs.
37 | # In this mode, Spark master will reverse proxy the worker and application UIs to enable access without requiring direct access to their hosts.
38 | spark.ui.reverseProxy true
39 |
40 | spark.sql.execution.arrow.enabled true
41 |
42 | # Enables the external shuffle service. This service preserves the shuffle files written by executors so the executors can be safely removed
43 | spark.shuffle.service.enabled true
44 |
45 | # ************ Dynamic Allocation **************
46 | # Whether to use dynamic resource allocation, which scales the number of executors registered with this application up and down based on the workload
47 | spark.dynamicAllocation.enabled true
48 | # If dynamic allocation is enabled and an executor has been idle for more than this duration, the executor will be removed
49 | spark.dynamicAllocation.executorIdleTimeout 15s
50 |
51 | # Amount of memory to use per executor process, in MiB unless otherwise specified. (e.g. 2g, 8g).
52 | spark.executor.memory 1g
--------------------------------------------------------------------------------
/docker/jupyter-pwsh/Dockerfile:
--------------------------------------------------------------------------------
1 | # Notebooks Forge script: Jupyter Powershell Dockerfile
2 | # Notebooks Forge Stage: Alpha
3 | # Author: Roberto Rodriguez (@Cyb3rWard0g)
4 | # License: GPL-3.0
5 |
6 | FROM cyb3rward0g/jupyter-base:0.0.5
7 | LABEL maintainer="Roberto Rodriguez @Cyb3rWard0g"
8 | LABEL description="Notebooks Forge Jupyter Project."
9 |
10 | ENV DEBIAN_FRONTEND noninteractive
11 |
12 | USER root
13 |
14 | RUN curl -sLO https://packages.microsoft.com/config/ubuntu/18.04/packages-microsoft-prod.deb \
15 | && dpkg -i packages-microsoft-prod.deb \
16 | && apt-get update --fix-missing \
17 | && apt-get install -y --no-install-recommends powershell \
18 | # ********** Clean APT **********
19 | && apt-get -qy clean autoremove \
20 | && rm -rf /var/lib/apt/lists/*
21 |
22 | USER ${USER}
23 |
24 | RUN python3 -m pip install --upgrade pip \
25 | # *********** Install powershell kernel ************
26 | && python3 -m pip install powershell-kernel==0.1.2 \
27 | && python3 -m powershell_kernel.install --powershell-command pwsh
--------------------------------------------------------------------------------
/docker/jupyter-pwsh/docker-compose.yml:
--------------------------------------------------------------------------------
1 | version: '3.5'
2 |
3 | services:
4 | jupyter-pwsh:
5 | build: ./
6 | container_name: jupyter-pwsh
7 | volumes:
8 | - notebooks:/opt/helk/jupyter/notebooks
9 | environment:
10 | JUPYTER_TYPE: lab
11 | JUPYTER_BASE_URL: /jupyter
12 | ports:
13 | - "8888:8888"
14 | restart: always
15 | networks:
16 | hunting:
17 |
18 | networks:
19 | hunting:
20 | driver: bridge
21 |
22 | volumes:
23 | notebooks:
24 | driver: local
--------------------------------------------------------------------------------
/docker/jupyter-pyspark/Dockerfile:
--------------------------------------------------------------------------------
1 | # Notebooks Forge script: Jupyter PySpark Environment Dockerfile
2 | # Author: Roberto Rodriguez (@Cyb3rWard0g)
3 | # License: GPL-3.0
4 |
5 | FROM cyb3rward0g/jupyter-base:0.0.7
6 | LABEL maintainer="Roberto Rodriguez @Cyb3rWard0g"
7 | LABEL description="Notebooks Forge Jupyter Project."
8 |
9 | ENV DEBIAN_FRONTEND noninteractive
10 |
11 | USER root
12 |
13 | # *********** Spark Env Variables ***************
14 | ENV SPARK_VERSION=3.0.1
15 | ENV APACHE_HADOOP_VERSION=2.7
16 | ENV SPARK_HOME=/opt/jupyter/spark
17 |
18 | # *********** Installing Prerequisites ***************
19 | # -qq : No output except for errors
20 | RUN apt-get update -qq \
21 | && apt-get install -qqy openjdk-8-jre-headless ca-certificates-java \
22 | && apt-get -qy clean autoremove \
23 | && rm -rf /var/lib/apt/lists/* \
24 | # *********** Installing Spark ***************
25 | && bash -c 'mkdir -pv /opt/jupyter/spark/logs' \
26 | && wget -c http://apache.claz.org/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop${APACHE_HADOOP_VERSION}.tgz -O - | tar xvz -C ${SPARK_HOME} --strip-components=1 \
27 | && chown -R ${USER} ${JUPYTER_DIR} ${HOME}
28 |
29 | # *********** Adding scripts and files to Container ***************
30 | COPY spark/* ${SPARK_HOME}/conf/
31 | COPY kernels/pyspark_kernel.json /usr/local/share/jupyter/kernels/pyspark3/kernel.json
32 |
33 | RUN chown -R ${USER} ${JUPYTER_DIR} ${HOME} ${SPARK_HOME} \
34 | && chown ${USER} /usr/local/share/jupyter/kernels/pyspark3/kernel.json
35 |
36 | EXPOSE 8000
37 |
38 | USER ${USER}
--------------------------------------------------------------------------------
/docker/jupyter-pyspark/docker-compose.yml:
--------------------------------------------------------------------------------
1 | version: '3.5'
2 |
3 | services:
4 | jupyter-pyspark:
5 | image: cyb3rward0g/jupyter-pyspark:0.0.4
6 | container_name: jupyter-pyspark
7 | environment:
8 | JUPYTER_TYPE: notebook
9 | JUPYTER_BASE_URL: /jupyter
10 | ports:
11 | - "8888:8888"
12 | restart: always
13 | networks:
14 | hunting:
15 |
16 | networks:
17 | hunting:
18 | driver: bridge
19 |
20 | volumes:
21 | notebooks:
22 | driver: local
--------------------------------------------------------------------------------
/docker/jupyter-pyspark/kernels/pyspark_kernel.json:
--------------------------------------------------------------------------------
1 | {
2 | "display_name": "PySpark_Python3",
3 | "language": "python",
4 | "argv": [
5 | "/opt/conda/bin/python3",
6 | "-m",
7 | "ipykernel_launcher",
8 | "-f",
9 | "{connection_file}"
10 | ],
11 | "env": {
12 | "SPARK_HOME": "/opt/jupyter/spark/",
13 | "PYTHONPATH": "/opt/jupyter/spark/python/:/opt/jupyter/spark/python/lib/py4j-0.10.9-src.zip:/opt/jupyter/spark/graphframes.zip",
14 | "PYSPARK_PYTHON": "/opt/conda/bin/python3"
15 | }
16 | }
--------------------------------------------------------------------------------
/docker/jupyter-pyspark/spark/log4j.properties:
--------------------------------------------------------------------------------
1 | #
2 | # Licensed to the Apache Software Foundation (ASF) under one or more
3 | # contributor license agreements. See the NOTICE file distributed with
4 | # this work for additional information regarding copyright ownership.
5 | # The ASF licenses this file to You under the Apache License, Version 2.0
6 | # (the "License"); you may not use this file except in compliance with
7 | # the License. You may obtain a copy of the License at
8 | #
9 | # http://www.apache.org/licenses/LICENSE-2.0
10 | #
11 | # Unless required by applicable law or agreed to in writing, software
12 | # distributed under the License is distributed on an "AS IS" BASIS,
13 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
14 | # See the License for the specific language governing permissions and
15 | # limitations under the License.
16 | #
17 |
18 | # Set everything to be logged to the console
19 | log4j.rootCategory=WARN, console
20 | log4j.appender.console=org.apache.log4j.ConsoleAppender
21 | log4j.appender.console.target=System.err
22 | log4j.appender.console.layout=org.apache.log4j.PatternLayout
23 | log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
24 |
25 | # Set the default spark-shell log level to WARN. When running the spark-shell, the
26 | # log level for this class is used to overwrite the root logger's log level, so that
27 | # the user can have different defaults for the shell and regular Spark apps.
28 | log4j.logger.org.apache.spark.repl.Main=WARN
29 |
30 | # Settings to quiet third party logs that are too verbose
31 | log4j.logger.org.spark_project.jetty=WARN
32 | log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
33 | log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
34 | log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
35 | log4j.logger.org.apache.parquet=ERROR
36 | log4j.logger.parquet=ERROR
37 |
38 | # SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
39 | log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
40 | log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
41 |
--------------------------------------------------------------------------------
/docker/jupyter-pyspark/spark/spark-defaults.conf:
--------------------------------------------------------------------------------
1 | # HELK build Stage: Alpha
2 | # Author: Roberto Rodriguez (@Cyb3rWard0g)
3 | # License: GPL-3.0
4 |
5 | # HELK References:
6 | # https://spark.apache.org/docs/latest/configuration.html
7 | # https://graphframes.github.io/quick-start.html
8 | # https://spark-packages.org/package/graphframes/graphframes
9 | # https://spark.apache.org/docs/latest/sql-programming-guide.html#pyspark-usage-guide-for-pandas-with-apache-arrow
10 |
11 | # ************ Application Properties ****************
12 | # Logs the effective SparkConf as INFO when a SparkContext is started. Default: false
13 | spark.logConf true
14 | # The cluster manager to connect to.
15 | # spark.master spark://helk-spark-master:7077
16 | # Restarts the driver automatically if it fails with a non-zero exit status
17 | spark.driver.supervise true
18 |
19 | # ************ Runtime Environment ****************
20 | # Sets the number of latest rolling log files that are going to be retained by the system. Older log files will be deleted.
21 | spark.executor.logs.rolling.maxRetainedFiles 20
22 | # Set the strategy of rolling of executor logs.
23 | spark.executor.logs.rolling.strategy spark.executor.logs.rolling.time.interval
24 | # Comma-separated list of jars to include on the driver and executor classpaths. Globs are allowed.
25 |
26 | # Comma-separated list of Maven coordinates of jars to include on the driver and executor classpaths.
27 | # The coordinates should be groupId:artifactId:version.
28 | #spark.jars.packages graphframes:graphframes:0.7.0-spark2.4-s_2.11,org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.0
29 | #spark.jars.packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.1,databricks:spark-sklearn:0.2.3
30 |
31 | # ************ Spark UI ****************
32 | # Base directory in which Spark events are logged
33 | spark.eventLog.dir /opt/jupyter/spark/logs
34 | # Whether to log Spark events, useful for reconstructing the Web UI after the application has finished.
35 | spark.eventLog.enabled true
36 | # Enable running Spark Master as reverse proxy for worker and application UIs.
37 | # In this mode, Spark master will reverse proxy the worker and application UIs to enable access without requiring direct access to their hosts.
38 | #spark.ui.reverseProxy true
39 |
40 | #spark.sql.execution.arrow.enabled true
41 |
42 | # Enables the external shuffle service. This service preserves the shuffle files written by executors so the executors can be safely removed
43 | #spark.shuffle.service.enabled true
44 |
45 | # ************ Dynamic Allocation **************
46 | # Whether to use dynamic resource allocation, which scales the number of executors registered with this application up and down based on the workload
47 | #spark.dynamicAllocation.enabled true
48 | # If dynamic allocation is enabled and an executor has been idle for more than this duration, the executor will be removed
49 | #spark.dynamicAllocation.executorIdleTimeout 15s
50 |
51 | # Amount of memory to use per executor process, in MiB unless otherwise specified. (e.g. 2g, 8g).
52 | #spark.executor.memory 1g
--------------------------------------------------------------------------------
/docker/jupyter-rto/Dockerfile:
--------------------------------------------------------------------------------
1 | # Notebooks Forge script: Jupyter RTO Dockerfile
2 | # Notebooks Forge Stage: Alpha
3 | # Author: Roberto Rodriguez (@Cyb3rWard0g)
4 | # License: GPL-3.0
5 |
6 | FROM cyb3rward0g/jupyter-base:0.0.5
7 | LABEL maintainer="Roberto Rodriguez @Cyb3rWard0g"
8 | LABEL description="Notebooks Forge Jupyter Project."
9 |
10 | ENV DEBIAN_FRONTEND noninteractive
11 |
12 | USER ${USER}
13 |
14 | RUN python3 -m pip install --upgrade pip \
15 | # *********** Install neo4j ************
16 | && python3 -m pip install neo4j==1.7.6 \
17 | # *********** Download pycobalt ***************
18 | && mkdir -p /opt/pycobalt \
19 | && git clone https://github.com/dcsync/pycobalt.git /opt/pycobalt \
20 | && cd /opt/pycobalt \
21 | && python3 setup.py install \
22 | # *********** Download Faction C2 Client ***************
23 | && mkdir -pv /opt/faction/cli \
24 | && git clone --single-branch --branch=master https://github.com/FactionC2/CLI /opt/faction/cli \
25 | && cd /opt/faction/cli \
26 | && python3 -m pip install pipenv \
27 | && python3 -m pipenv install --system \
28 | && python3 setup.py install
--------------------------------------------------------------------------------
/docker/jupyter-rto/docker-compose.yml:
--------------------------------------------------------------------------------
1 | version: '3.5'
2 |
3 | services:
4 | jupyter-rto:
5 | image: cyb3rward0g/jupyter-rto:0.0.2
6 | container_name: jupyter-rto
7 | volumes:
8 | - notebooks:/opt/helk/jupyter/notebooks
9 | environment:
10 | JUPYTER_TYPE: lab
11 | JUPYTER_BASE_URL: /jupyter
12 | ports:
13 | - "8888:8888"
14 | restart: always
15 | networks:
16 | hunting:
17 |
18 | networks:
19 | hunting:
20 | driver: bridge
21 |
22 | volumes:
23 | notebooks:
24 | driver: local
--------------------------------------------------------------------------------
/docs/Makefile:
--------------------------------------------------------------------------------
1 | # Minimal makefile for Sphinx documentation
2 | #
3 |
4 | # You can set these variables from the command line.
5 | SPHINXOPTS =
6 | SPHINXBUILD = sphinx-build
7 | SOURCEDIR = source
8 | BUILDDIR = build
9 |
10 | # Put it first so that "make" without argument is like "make help".
11 | help:
12 | @$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
13 |
14 | .PHONY: help Makefile
15 |
16 | # Catch-all target: route all unknown targets to Sphinx using the new
17 | # "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
18 | %: Makefile
19 | @$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
--------------------------------------------------------------------------------
/docs/build/doctrees/docker.doctree:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/doctrees/docker.doctree
--------------------------------------------------------------------------------
/docs/build/doctrees/environment.pickle:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/doctrees/environment.pickle
--------------------------------------------------------------------------------
/docs/build/doctrees/index.doctree:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/doctrees/index.doctree
--------------------------------------------------------------------------------
/docs/build/doctrees/jupyter.doctree:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/doctrees/jupyter.doctree
--------------------------------------------------------------------------------
/docs/build/doctrees/jupyter_hunt.doctree:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/doctrees/jupyter_hunt.doctree
--------------------------------------------------------------------------------
/docs/build/doctrees/jupyter_rto.doctree:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/doctrees/jupyter_rto.doctree
--------------------------------------------------------------------------------
/docs/build/doctrees/jupyter_spark.doctree:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/doctrees/jupyter_spark.doctree
--------------------------------------------------------------------------------
/docs/build/doctrees/license.doctree:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/doctrees/license.doctree
--------------------------------------------------------------------------------
/docs/build/doctrees/zeppelin.doctree:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/doctrees/zeppelin.doctree
--------------------------------------------------------------------------------
/docs/build/html/.buildinfo:
--------------------------------------------------------------------------------
1 | # Sphinx build info version 1
2 | # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
3 | config: 1d004c778a20e39a161d51665d26d32c
4 | tags: 645f666f9bcd5a90fca523b33c5a78b7
5 |
--------------------------------------------------------------------------------
/docs/build/html/.nojekyll:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/html/.nojekyll
--------------------------------------------------------------------------------
/docs/build/html/_images/docker-containers.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/html/_images/docker-containers.png
--------------------------------------------------------------------------------
/docs/build/html/_images/jupyter-design.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/html/_images/jupyter-design.png
--------------------------------------------------------------------------------
/docs/build/html/_images/jupyter-evolution.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/html/_images/jupyter-evolution.png
--------------------------------------------------------------------------------
/docs/build/html/_images/jupyter-login.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/html/_images/jupyter-login.png
--------------------------------------------------------------------------------
/docs/build/html/_images/jupyter-main.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/html/_images/jupyter-main.png
--------------------------------------------------------------------------------
/docs/build/html/_images/jupyter-samples.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/html/_images/jupyter-samples.png
--------------------------------------------------------------------------------
/docs/build/html/_sources/docker.rst.txt:
--------------------------------------------------------------------------------
1 | Docker Notebook Deployments
2 | ===========================
3 |
4 | Docker technology allows the project to package notebook applications with all their libraries and dependencies in "containers" and make them portable across operating systems.
5 | This allows security analysts to deploy the notebook servers on any system they use daily for hunting research.
6 |
7 | What are Docker Containers?
8 | ###########################
9 |
10 | According to `Docker docs `_, a container is a standard unit of software that packages up code and all its dependencies so the application runs quickly and reliably from one computing environment to another.
11 | A Docker container image is a lightweight, standalone, executable package of software that includes everything needed to run an application: code, runtime, system tools, system libraries and settings.
12 |
13 | .. image:: _static/docker-containers.png
14 | :alt: Docker Containers
15 | :scale: 50%
16 |
17 | There are two notebook environments being supported by the project.
18 |
19 | Jupyter Notebooks Install
20 | #########################
21 |
22 | Requirements
23 | ************
24 |
25 | * `Git `_ : Git is a free and open source distributed version control system designed to handle everything from small to very large projects with speed and efficiency.
26 | * `Docker CE `_ : Docker Community Edition (CE) is ideal for developers and small teams looking to get started with Docker and experimenting with container-based apps.
27 | * `Docker Compose `_ : a tool for defining and running multi-container Docker applications.
28 |
29 | Steps
30 | *****
31 |
32 | Git clone the `Notebooks Forge project `_ and change your current directory to the project's directory.
33 |
34 | .. code-block:: console
35 |
36 | $ git clone https://github.com/Cyb3rWard0g/notebooks-forge.git
37 | $ cd notebooks-forge/
38 |
39 | Change your current directory to the specific notebook server you want to work with (``jupyter-hunt`` or ``jupyter-rto``).
40 |
41 | .. code-block:: console
42 |
43 | $ cd jupyter-hunt/
44 |
45 | Run docker-compose pointing to the default compose file available in the folder.
46 |
47 | .. code-block:: console
48 |
49 | $ sudo docker-compose -f docker-compose.yml up --build -d
50 |
51 | Once your container has been downloaded and started, you can check that it is running with the following command:
52 |
53 | .. code-block:: console
54 |
55 | $ sudo docker ps
56 |
57 | Before accessing the Jupyter notebook server via your favorite web browser, you will have to get the access token the application was initialized with.
58 | You can get it with the following command:
59 |
60 | .. code-block:: console
61 |
62 | $ sudo docker exec -ti jupyter-hunt jupyter notebook list | grep "token" | sed 's/.*token=\([^ ]*\).*/\1/'
63 |
64 | Open your favorite browser at ``http://<docker host IP>:8888``. You will then be prompted with a login box to enter the token.
65 |
66 | .. image:: _static/jupyter-login.png
67 | :alt: Jupyter Login
68 | :scale: 50%
69 |
70 | That's it! You are now ready to use your Jupyter Notebook server.
71 |
72 | .. image:: _static/jupyter-main.png
73 | :alt: Jupyter Main
74 | :scale: 40%
75 |
76 | Zeppelin Notebooks Install
77 | ##########################
78 |
79 | Requirements
80 | ************
81 |
82 | * `Git `_ : Git is a free and open source distributed version control system designed to handle everything from small to very large projects with speed and efficiency.
83 | * `Docker CE `_ : Docker Community Edition (CE) is ideal for developers and small teams looking to get started with Docker and experimenting with container-based apps.
84 | * `Docker Compose `_ : a tool for defining and running multi-container Docker applications.
85 |
86 | Steps
87 | *****
88 |
89 | Coming soon...
--------------------------------------------------------------------------------
/docs/build/html/_sources/index.rst.txt:
--------------------------------------------------------------------------------
1 | .. Notebooks Forge documentation master file, created by
2 | sphinx-quickstart on Wed Apr 17 11:44:45 2019.
3 | You can adapt this file completely to your liking, but it should at least
4 | contain the root `toctree` directive.
5 |
6 | Notebooks Forge
7 | ===============
8 |
9 | A project dedicated to building and providing ``Notebooks`` servers for ``Defensive`` and ``Offensive`` operators to:
10 |
11 | * Design playbooks
12 | * Demonstrate how techniques can be used
13 | * Showcase when and why an operator would want to use a technique
14 | * Document engagements procedures
15 | * Prototype new ways to analyze data extracted from endpoints in a more dynamic, flexible and language-agnostic way.
16 |
17 | This project supports two notebook server types: `Jupyter `_ and `Zeppelin `_.
18 |
19 |
20 | What is a Notebook?
21 | *******************
22 |
23 | Think of a notebook as a document that you can access via a web interface and that allows you to save the input (i.e. live code) and output (i.e. code execution results / evaluated code output) of interactive sessions, as well as important notes needed to explain the methodology and steps taken to perform specific tasks (i.e. data analysis).
24 |
25 | .. toctree::
26 | :maxdepth: 2
27 | :caption: Notebook Environments:
28 |
29 | Jupyter Notebook
30 | Zeppelin Notebook
31 |
32 | .. toctree::
33 | :maxdepth: 2
34 | :caption: Notebook Deployments:
35 |
36 | Docker
37 |
38 | .. toctree::
39 | :maxdepth: 2
40 | :caption: Licenses:
41 |
42 | GNU General Public License V3
--------------------------------------------------------------------------------
/docs/build/html/_sources/jupyter.rst.txt:
--------------------------------------------------------------------------------
1 | Jupyter Notebook
2 | ================
3 |
4 | The Jupyter Notebook is an open-source web application that allows you to create and share documents that contain live code, equations, visualizations and narrative text.
5 | Uses include: data cleaning and transformation, numerical simulation, statistical modeling, data visualization, machine learning, and much more.
6 |
7 | The Jupyter Notebook project is the evolution of the IPython Notebook library, which was developed primarily to enhance the default Python interactive console by enabling scientific operations and advanced data analytics capabilities via sharable web documents.
8 |
9 | .. image:: _static/jupyter-evolution.png
10 | :alt: Jupyter Evolution
11 | :scale: 50%
12 |
13 | Nowadays, the Jupyter Notebook project supports not only Python but also over 40 programming languages such as R, Julia, Scala and PySpark.
14 | In fact, its name was originally derived from three programming languages: Julia, Python and R. This made it one of the first language-agnostic notebook applications, and it is now considered one of the preferred environments for data scientists and engineers in the community to explore and analyze data.
15 |
16 | .. image:: _static/jupyter-samples.png
17 | :alt: Jupyter Sample
18 | :scale: 50%
19 |
20 | How do Jupyter Notebooks Work?
21 | ##############################
22 |
23 | Jupyter Notebooks work with what is called a two-process model based on a kernel-client infrastructure.
24 | This model applies a similar concept to the `Read-Evaluate-Print Loop (REPL) `_ programming environment that takes a single user's inputs, evaluates them, and returns the result to the user.
25 |
26 | .. image:: _static/jupyter-design.png
27 | :alt: Jupyter Design
28 | :scale: 70%
29 |
30 | Based on the two-process model concept, we can explain the main components of Jupyter the following way:
31 |
32 | Jupyter Client
33 | **************
34 |
35 | * It allows a user to send code to the kernel, either in the form of a `Qt Console `_ or a browser via notebook documents.
36 | * From a REPL perspective, the client does the read and print operations.
37 | * Notebooks are hosted by the Jupyter web server, which uses Tornado to serve HTTP requests.
38 |
39 | Jupyter Kernel
40 | **************
41 |
42 | * It receives the code sent by the client, executes it, and returns the results back to the client for display. A kernel process can have multiple clients communicating with it, which is why this model is also referred to as the decoupled two-process model.
43 | * From a REPL perspective, the kernel does the evaluate operation.
44 | * Kernels and clients communicate via an interactive computing protocol based on an asynchronous messaging library named `ZeroMQ `_ (low-level transport layer) and WebSockets (TCP-based).
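
As a minimal sketch of this decoupled model, the ``jupyter_client`` package (an assumption here; it ships with standard Jupyter installs) can start a kernel and evaluate code over those channels:

.. code-block:: python

    # Sketch: drive a kernel programmatically with jupyter_client.
    # Assumes a local 'python3' kernelspec is installed.
    from jupyter_client import KernelManager

    km = KernelManager(kernel_name='python3')
    km.start_kernel()                   # the kernel process (evaluate)
    kc = km.client()                    # the client process (read/print)
    kc.start_channels()

    kc.execute('1 + 1')                 # code travels over ZeroMQ
    reply = kc.get_shell_msg(timeout=5)
    print(reply['content']['status'])   # 'ok' on successful evaluation

    kc.stop_channels()
    km.shutdown_kernel()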
45 |
46 | Jupyter Notebook Document
47 | *************************
48 |
49 | * Notebooks are automatically saved and stored on disk in the open standard JavaScript Object Notation (JSON) format with a .ipynb extension.
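
As a rough illustration, the ``nbformat`` package (which Jupyter itself uses to read and write these documents) can build and save a minimal notebook:

.. code-block:: python

    # Sketch: create a minimal .ipynb document with nbformat.
    import nbformat
    from nbformat.v4 import new_notebook, new_code_cell

    nb = new_notebook()
    nb.cells.append(new_code_cell("print('hello')"))
    nbformat.write(nb, 'example.ipynb')  # JSON on disk, .ipynb extension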
50 |
51 | Jupyter Notebooks Servers
52 | #########################
53 |
54 | .. toctree::
55 | :maxdepth: 2
56 |
57 | Jupyter Spark
58 | Jupyter Hunt
59 | Jupyter RTO
--------------------------------------------------------------------------------
/docs/build/html/_sources/jupyter_hunt.rst.txt:
--------------------------------------------------------------------------------
1 | Jupyter Hunt Server
2 | ===================
3 |
4 | A notebook server built for defensive operators with several tools to connect to known SIEMs and analyze data to find potential adversaries in the network.
5 | This server is built on top of the `Jupyter Spark` server available in this repo in order to provide advanced analytics capabilities via Apache Spark.
6 |
7 | Jupyter Python Libraries
8 | ########################
9 |
10 | Pandas
11 | ******
12 |
13 | `Pandas `_ is an open source, BSD-licensed library providing high-performance, easy-to-use data structures and data analysis tools for the Python programming language.
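
For example, a few events can be loaded into a ``DataFrame`` and filtered (a minimal sketch with made-up data):

.. code-block:: python

    # Sketch: load events into a DataFrame and filter them (made-up data).
    import pandas as pd

    df = pd.DataFrame([
        {'host': 'WS01', 'event_id': 4624},
        {'host': 'WS02', 'event_id': 4688},
    ])
    print(df[df['event_id'] == 4688])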
14 |
15 | Altair
16 | ******
17 |
18 | `Altair `_ is a declarative statistical visualization library for Python.
19 | With Altair, you can spend more time understanding your data and its meaning.
20 | Altair's API is simple, friendly and consistent and built on top of the powerful `Vega-Lite `_ JSON specification.
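
A minimal sketch, charting made-up event counts per host:

.. code-block:: python

    # Sketch: a declarative bar chart of event counts per host.
    import altair as alt
    import pandas as pd

    df = pd.DataFrame({'host': ['WS01', 'WS02'], 'count': [10, 3]})
    alt.Chart(df).mark_bar().encode(x='host', y='count')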
21 |
22 | S3Fs
23 | ****
24 |
25 | `S3Fs `_ is a Pythonic file interface to S3. It builds on top of `boto3 `_.
26 | The top-level class S3FileSystem holds connection information and allows typical file-system style operations like cp, mv, ls, du, glob, etc., as well as put/get of local files to/from S3.
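
A minimal sketch (the bucket and key names are hypothetical):

.. code-block:: python

    # Sketch: list and read objects from a hypothetical S3 bucket.
    import s3fs

    fs = s3fs.S3FileSystem(anon=True)
    print(fs.ls('my-security-datasets'))  # hypothetical bucket name
    with fs.open('my-security-datasets/logs.json') as f:
        data = f.read()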
27 |
28 | Elasticsearch-DSL
29 | *****************
30 |
31 | `Elasticsearch DSL `_ is a high-level library whose aim is to help with writing and running queries against Elasticsearch.
32 | It is built on top of the official low-level client (`elasticsearch-py `_).
33 | It provides a more convenient and idiomatic way to write and manipulate queries.
34 | It stays close to the Elasticsearch JSON DSL, mirroring its terminology and structure.
35 | It exposes the whole range of the DSL from Python, either directly using defined classes or via queryset-like expressions.
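
A minimal sketch (the index pattern and field values are assumptions):

.. code-block:: python

    # Sketch: query Elasticsearch for matching events.
    from elasticsearch import Elasticsearch
    from elasticsearch_dsl import Search

    client = Elasticsearch(['localhost:9200'])
    s = Search(using=client, index='logs-endpoint-winevent-*') \
        .query('match', event_id=1)
    for hit in s.execute():
        print(hit.to_dict())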
36 |
37 | Matplotlib
38 | **********
39 |
40 | `Matplotlib `_ is a Python 2D plotting library which produces publication-quality figures in a variety of hardcopy formats and interactive environments across platforms.
41 | Matplotlib can be used in Python scripts, the Python and IPython shell (à la MATLAB or Mathematica), web application servers, and various graphical user interface toolkits.
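
A minimal sketch with made-up data:

.. code-block:: python

    # Sketch: plot logon counts per hour of day (made-up data).
    import matplotlib.pyplot as plt

    hours = list(range(24))
    logons = [5] * 8 + [50] * 10 + [5] * 6
    plt.plot(hours, logons)
    plt.xlabel('hour of day')
    plt.ylabel('logon count')
    plt.show()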
42 |
43 | Scikit-learn
44 | ************
45 |
46 | `Scikit-learn `_ is a Python module for machine learning built on top of SciPy and distributed under the 3-Clause BSD license.
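
A minimal sketch, clustering hosts by made-up numeric features:

.. code-block:: python

    # Sketch: group hosts into clusters with KMeans (made-up features).
    from sklearn.cluster import KMeans

    features = [[2, 100], [3, 110], [50, 900]]  # e.g. processes, connections
    labels = KMeans(n_clusters=2, random_state=0).fit_predict(features)
    print(labels)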
47 |
48 | KSQL-Python
49 | ***********
50 |
51 | `KSQL-Python `_ is a Python wrapper for the KSQL REST API that makes it easy to interact with KSQL from Python.
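
A minimal sketch, assuming a KSQL server listening on its default REST port:

.. code-block:: python

    # Sketch: talk to a KSQL server over REST (URL is an assumption).
    from ksql import KSQLAPI

    client = KSQLAPI('http://localhost:8088')
    print(client.ksql('SHOW STREAMS;'))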
52 |
53 | Confluent-Kafka-Python
54 | **********************
55 |
56 | `Confluent-kafka-python `_ is Confluent's Python client for `Apache Kafka `_ and the `Confluent Platform `_.
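
A minimal sketch (broker address and topic name are assumptions):

.. code-block:: python

    # Sketch: consume one record from a Kafka topic.
    from confluent_kafka import Consumer

    c = Consumer({'bootstrap.servers': 'localhost:9092',
                  'group.id': 'hunters',
                  'auto.offset.reset': 'earliest'})
    c.subscribe(['winlogbeat'])          # hypothetical topic name
    msg = c.poll(timeout=5.0)
    if msg is not None and msg.error() is None:
        print(msg.value())
    c.close()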
57 |
58 | Splunk-SDK
59 | **********
60 |
61 | The `Splunk Software Development Kit (SDK) `_ for Python contains library code and examples designed to enable developers to build applications using Splunk.
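
A minimal sketch (connection details are assumptions):

.. code-block:: python

    # Sketch: run a one-shot search against a Splunk instance.
    import splunklib.client as client
    import splunklib.results as results

    service = client.connect(host='localhost', port=8089,
                             username='admin', password='changeme')
    job = service.jobs.oneshot('search index=main | head 5')
    for event in results.ResultsReader(job):
        print(event)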
62 |
63 | Kqlmagic
64 | ********
65 |
66 | The `Kqlmagic `_ magic extension enables a notebook experience for exploring Microsoft Azure Monitor data: Azure Data Explorer (Kusto), Application Insights, and Log Analytics data, from a Jupyter notebook (Python3 kernel), using KQL (Kusto Query Language).
67 |
68 | Neo4j
69 | *****
70 |
71 | The official `Neo4j driver for Python `_ supports Neo4j 3.0 and above and Python versions 2.7, 3.4, 3.5, 3.6, and 3.7.
72 | It connects to the database using the binary protocol. It aims to be minimal, while being idiomatic to Python.
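
A minimal sketch, querying a BloodHound-style database (URI and credentials are assumptions):

.. code-block:: python

    # Sketch: run a Cypher query over the Bolt protocol.
    from neo4j import GraphDatabase

    driver = GraphDatabase.driver('bolt://localhost:7687',
                                  auth=('neo4j', 'neo4j'))
    with driver.session() as session:
        result = session.run('MATCH (u:User {hasspn:true}) RETURN u.name')
        for record in result:
            print(record['u.name'])
    driver.close()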
73 |
74 | Networkx
75 | ********
76 |
77 | `NetworkX `_ is a Python package for the creation, manipulation, and study of the structure, dynamics, and functions of complex networks.
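
A minimal sketch, modeling a lateral movement path as a directed graph:

.. code-block:: python

    # Sketch: shortest path through a small directed graph.
    import networkx as nx

    G = nx.DiGraph()
    G.add_edge('userA', 'WS01')
    G.add_edge('WS01', 'DC01')
    print(nx.shortest_path(G, 'userA', 'DC01'))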
78 |
79 | Nxviz
80 | *****
81 |
82 | `Nxviz `_ is a graph visualization package for NetworkX. With nxviz, you can create beautiful graph visualizations with a declarative API.
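
A minimal sketch (API as of the nxviz 0.6 series pinned by this image):

.. code-block:: python

    # Sketch: render a random graph with a Circos layout.
    import matplotlib.pyplot as plt
    import networkx as nx
    from nxviz import CircosPlot

    G = nx.erdos_renyi_graph(n=10, p=0.3)
    c = CircosPlot(G)
    c.draw()
    plt.show()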
83 |
84 | Jupyter Kernels Available
85 | #########################
86 |
87 | IPython Kernel (Python)
88 | *************************
89 |
90 | The Jupyter team maintains the `IPython kernel `_ since the Jupyter notebook server depends on the IPython kernel functionality.
91 | Many other languages, in addition to Python, may be used in the notebook.
92 |
93 | PySpark Kernel (Python)
94 | ************************
95 |
96 | A Python kernel to enable `Apache Spark for Python `_.
97 | Writing PySpark applications is really no different from writing normal Python applications or packages.
98 | In particular, it is quite similar to writing command-line applications.
99 | Spark doesn’t have a build concept, just Python scripts, so to run an application, you simply execute the script against the cluster.
100 |
101 | Spylon Kernel (Scala/Python)
102 | *****************************
103 |
104 | A Scala kernel for Apache Spark that uses `metakernel `_ in combination with `py4j `_.
105 |
106 | R Kernel (R)
107 | ************
108 |
109 | An R kernel for `Apache SparkR `_.
110 | SparkR is an R package that provides a light-weight frontend to use Apache Spark from R.
111 | In Spark 2.4.1, SparkR provides a distributed data frame implementation that supports operations like selection, filtering, aggregation etc. (similar to R data frames, dplyr) but on large datasets.
112 | SparkR also supports distributed machine learning using MLlib.
--------------------------------------------------------------------------------
/docs/build/html/_sources/jupyter_rto.rst.txt:
--------------------------------------------------------------------------------
1 | Jupyter Red Team Operations (RTO) Server
2 | ========================================
3 |
4 | A notebook server built for offensive operators with a few libraries to connect to known tools such as BloodHound and Cobalt Strike.
5 |
6 | Jupyter Python Libraries
7 | ########################
8 |
9 | Neo4j Python Driver
10 | *******************
11 |
12 | `Neo4j Bolt driver `_ for Python
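
A minimal sketch, pulling BloodHound data over Bolt (URI and credentials are assumptions):

.. code-block:: python

    # Sketch: query BloodHound's Neo4j database with the Bolt driver.
    from neo4j import GraphDatabase

    driver = GraphDatabase.driver('bolt://localhost:7687',
                                  auth=('neo4j', 'BloodHound'))
    with driver.session() as session:
        for record in session.run('MATCH (g:Group) RETURN g.name LIMIT 5'):
            print(record['g.name'])
    driver.close()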
13 |
14 | PyCobalt
15 | ********
16 |
17 | `PyCobalt `_ is a Python API for Cobalt Strike.
18 |
19 | Jupyter Kernels Available
20 | #########################
21 |
22 | IPython Kernel (Python)
23 | *************************
24 |
25 | The Jupyter team maintains the `IPython kernel `_ since the Jupyter notebook server depends on the IPython kernel functionality.
26 | Many other languages, in addition to Python, may be used in the notebook.
--------------------------------------------------------------------------------
/docs/build/html/_sources/jupyter_spark.rst.txt:
--------------------------------------------------------------------------------
1 | Jupyter Spark Server
2 | ====================
3 |
4 | A notebook server built for any operator looking to leverage advanced analytics provided by Apache Spark.
5 |
6 | Jupyter Python Libraries
7 | ########################
8 |
9 | Pandas
10 | ******
11 |
12 | `Pandas `_ is an open source, BSD-licensed library providing high-performance, easy-to-use data structures and data analysis tools for the Python programming language.
13 |
14 | Jupyter Kernels Available
15 | #########################
16 |
17 | IPython Kernel (Python)
18 | *************************
19 |
20 | The Jupyter team maintains the `IPython kernel `_ since the Jupyter notebook server depends on the IPython kernel functionality.
21 | Many other languages, in addition to Python, may be used in the notebook.
22 |
23 | PySpark Kernel (Python)
24 | ************************
25 |
26 | A Python kernel to enable `Apache Spark for Python `_.
27 | Writing PySpark applications is really no different from writing normal Python applications or packages.
28 | In particular, it is quite similar to writing command-line applications.
29 | Spark doesn’t have a build concept, just Python scripts, so to run an application, you simply execute the script against the cluster.
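
A minimal sketch of that entry point inside a notebook:

.. code-block:: python

    # Sketch: obtain a SparkSession and run a trivial DataFrame job.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder \
        .appName('notebook-example') \
        .getOrCreate()
    df = spark.createDataFrame([(1, 'a'), (2, 'b')], ['id', 'value'])
    df.show()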
30 |
31 | Spylon Kernel (Scala/Python)
32 | *****************************
33 |
34 | A Scala kernel for Apache Spark that uses `metakernel `_ in combination with `py4j `_.
35 |
36 | R Kernel (R)
37 | ************
38 |
39 | An R kernel for `Apache SparkR `_.
40 | SparkR is an R package that provides a light-weight frontend to use Apache Spark from R.
41 | In Spark 2.4.1, SparkR provides a distributed data frame implementation that supports operations like selection, filtering, aggregation etc. (similar to R data frames, dplyr) but on large datasets.
42 | SparkR also supports distributed machine learning using MLlib.
--------------------------------------------------------------------------------
/docs/build/html/_sources/zeppelin.rst.txt:
--------------------------------------------------------------------------------
1 | Zeppelin Notebook
2 | =================
3 |
4 | Coming soon...
--------------------------------------------------------------------------------
/docs/build/html/_static/ajax-loader.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/html/_static/ajax-loader.gif
--------------------------------------------------------------------------------
/docs/build/html/_static/comment-bright.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/html/_static/comment-bright.png
--------------------------------------------------------------------------------
/docs/build/html/_static/comment-close.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/html/_static/comment-close.png
--------------------------------------------------------------------------------
/docs/build/html/_static/comment.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/html/_static/comment.png
--------------------------------------------------------------------------------
/docs/build/html/_static/css/badge_only.css:
--------------------------------------------------------------------------------
1 | .fa:before{-webkit-font-smoothing:antialiased}.clearfix{*zoom:1}.clearfix:before,.clearfix:after{display:table;content:""}.clearfix:after{clear:both}@font-face{font-family:FontAwesome;font-weight:normal;font-style:normal;src:url("../fonts/fontawesome-webfont.eot");src:url("../fonts/fontawesome-webfont.eot?#iefix") format("embedded-opentype"),url("../fonts/fontawesome-webfont.woff") format("woff"),url("../fonts/fontawesome-webfont.ttf") format("truetype"),url("../fonts/fontawesome-webfont.svg#FontAwesome") format("svg")}.fa:before{display:inline-block;font-family:FontAwesome;font-style:normal;font-weight:normal;line-height:1;text-decoration:inherit}a .fa{display:inline-block;text-decoration:inherit}li .fa{display:inline-block}li .fa-large:before,li .fa-large:before{width:1.875em}ul.fas{list-style-type:none;margin-left:2em;text-indent:-0.8em}ul.fas li .fa{width:.8em}ul.fas li .fa-large:before,ul.fas li .fa-large:before{vertical-align:baseline}.fa-book:before{content:""}.icon-book:before{content:""}.fa-caret-down:before{content:""}.icon-caret-down:before{content:""}.fa-caret-up:before{content:""}.icon-caret-up:before{content:""}.fa-caret-left:before{content:""}.icon-caret-left:before{content:""}.fa-caret-right:before{content:""}.icon-caret-right:before{content:""}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:"Lato","proxima-nova","Helvetica Neue",Arial,sans-serif;z-index:400}.rst-versions a{color:#2980B9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27AE60;*zoom:1}.rst-versions .rst-current-version:before,.rst-versions .rst-current-version:after{display:table;content:""}.rst-versions .rst-current-version:after{clear:both}.rst-versions .rst-current-version .fa{color:#fcfcfc}.rst-versions .rst-current-version .fa-book{float:left}.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#E74C3C;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#F1C40F;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:gray;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:solid 1px #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .icon-book{float:none}.rst-versions.rst-badge .fa-book{float:none}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book{float:left}.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge .rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and (max-width: 768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}}
2 |
--------------------------------------------------------------------------------
/docs/build/html/_static/docker-containers.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/build/html/_static/docker-containers.png
--------------------------------------------------------------------------------
/docs/build/html/_static/doctools.js:
--------------------------------------------------------------------------------
1 | /*
2 | * doctools.js
3 | * ~~~~~~~~~~~
4 | *
5 | * Sphinx JavaScript utilities for all documentation.
6 | *
7 | * :copyright: Copyright 2007-2019 by the Sphinx team, see AUTHORS.
8 | * :license: BSD, see LICENSE for details.
9 | *
10 | */
11 |
12 | /**
13 | * select a different prefix for underscore
14 | */
15 | $u = _.noConflict();
16 |
17 | /**
18 | * make the code below compatible with browsers without
19 | * an installed firebug like debugger
20 | if (!window.console || !console.firebug) {
21 | var names = ["log", "debug", "info", "warn", "error", "assert", "dir",
22 | "dirxml", "group", "groupEnd", "time", "timeEnd", "count", "trace",
23 | "profile", "profileEnd"];
24 | window.console = {};
25 | for (var i = 0; i < names.length; ++i)
26 | window.console[names[i]] = function() {};
27 | }
28 | */
29 |
30 | /**
31 | * small helper function to urldecode strings
32 | */
33 | jQuery.urldecode = function(x) {
34 | return decodeURIComponent(x).replace(/\+/g, ' ');
35 | };
36 |
37 | /**
38 | * small helper function to urlencode strings
39 | */
40 | jQuery.urlencode = encodeURIComponent;
41 |
42 | /**
43 | * This function returns the parsed url parameters of the
44 | * current request. Multiple values per key are supported,
45 | * it will always return arrays of strings for the value parts.
46 | */
47 | jQuery.getQueryParameters = function(s) {
48 | if (typeof s === 'undefined')
49 | s = document.location.search;
50 | var parts = s.substr(s.indexOf('?') + 1).split('&');
51 | var result = {};
52 | for (var i = 0; i < parts.length; i++) {
53 | var tmp = parts[i].split('=', 2);
54 | var key = jQuery.urldecode(tmp[0]);
55 | var value = jQuery.urldecode(tmp[1]);
56 | if (key in result)
57 | result[key].push(value);
58 | else
59 | result[key] = [value];
60 | }
61 | return result;
62 | };
63 |
64 | /**
65 | * highlight a given string on a jquery object by wrapping it in
66 | * span elements with the given class name.
67 | */
68 | jQuery.fn.highlightText = function(text, className) {
69 | function highlight(node, addItems) {
70 | if (node.nodeType === 3) {
71 | var val = node.nodeValue;
72 | var pos = val.toLowerCase().indexOf(text);
73 | if (pos >= 0 &&
74 | !jQuery(node.parentNode).hasClass(className) &&
75 | !jQuery(node.parentNode).hasClass("nohighlight")) {
76 | var span;
77 | var isInSVG = jQuery(node).closest("body, svg, foreignObject").is("svg");
78 | if (isInSVG) {
79 | span = document.createElementNS("http://www.w3.org/2000/svg", "tspan");
80 | } else {
81 | span = document.createElement("span");
82 | span.className = className;
83 | }
84 | span.appendChild(document.createTextNode(val.substr(pos, text.length)));
85 | node.parentNode.insertBefore(span, node.parentNode.insertBefore(
86 | document.createTextNode(val.substr(pos + text.length)),
87 | node.nextSibling));
88 | node.nodeValue = val.substr(0, pos);
89 | if (isInSVG) {
90 | var bbox = span.getBBox();
91 | var rect = document.createElementNS("http://www.w3.org/2000/svg", "rect");
92 | rect.x.baseVal.value = bbox.x;
93 | rect.y.baseVal.value = bbox.y;
94 | rect.width.baseVal.value = bbox.width;
95 | rect.height.baseVal.value = bbox.height;
96 | rect.setAttribute('class', className);
97 | var parentOfText = node.parentNode.parentNode;
98 | addItems.push({
99 | "parent": node.parentNode,
100 | "target": rect});
101 | }
102 | }
103 | }
104 | else if (!jQuery(node).is("button, select, textarea")) {
105 | jQuery.each(node.childNodes, function() {
106 | highlight(this, addItems);
107 | });
108 | }
109 | }
110 | var addItems = [];
111 | var result = this.each(function() {
112 | highlight(this, addItems);
113 | });
114 | for (var i = 0; i < addItems.length; ++i) {
115 | jQuery(addItems[i].parent).before(addItems[i].target);
116 | }
117 | return result;
118 | };
119 |
120 | /*
121 | * backward compatibility for jQuery.browser
122 | * This will be supported until firefox bug is fixed.
123 | */
124 | if (!jQuery.browser) {
125 | jQuery.uaMatch = function(ua) {
126 | ua = ua.toLowerCase();
127 |
128 | var match = /(chrome)[ \/]([\w.]+)/.exec(ua) ||
129 | /(webkit)[ \/]([\w.]+)/.exec(ua) ||
130 | /(opera)(?:.*version|)[ \/]([\w.]+)/.exec(ua) ||
131 | /(msie) ([\w.]+)/.exec(ua) ||
132 | ua.indexOf("compatible") < 0 && /(mozilla)(?:.*? rv:([\w.]+)|)/.exec(ua) ||
133 | [];
134 |
135 | return {
136 | browser: match[ 1 ] || "",
137 | version: match[ 2 ] || "0"
138 | };
139 | };
140 | jQuery.browser = {};
141 | jQuery.browser[jQuery.uaMatch(navigator.userAgent).browser] = true;
142 | }
143 |
144 | /**
145 | * Small JavaScript module for the documentation.
146 | */
147 | var Documentation = {
148 |
149 | init : function() {
150 | this.fixFirefoxAnchorBug();
151 | this.highlightSearchWords();
152 | this.initIndexTable();
153 | if (DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) {
154 | this.initOnKeyListeners();
155 | }
156 | },
157 |
158 | /**
159 | * i18n support
160 | */
161 | TRANSLATIONS : {},
162 | PLURAL_EXPR : function(n) { return n === 1 ? 0 : 1; },
163 | LOCALE : 'unknown',
164 |
165 | // gettext and ngettext don't access this so that the functions
166 | // can safely bound to a different name (_ = Documentation.gettext)
167 | gettext : function(string) {
168 | var translated = Documentation.TRANSLATIONS[string];
169 | if (typeof translated === 'undefined')
170 | return string;
171 | return (typeof translated === 'string') ? translated : translated[0];
172 | },
173 |
174 | ngettext : function(singular, plural, n) {
175 | var translated = Documentation.TRANSLATIONS[singular];
176 | if (typeof translated === 'undefined')
177 | return (n == 1) ? singular : plural;
178 | return translated[Documentation.PLURALEXPR(n)];
179 | },
180 |
181 | addTranslations : function(catalog) {
182 | for (var key in catalog.messages)
183 | this.TRANSLATIONS[key] = catalog.messages[key];
184 | this.PLURAL_EXPR = new Function('n', 'return +(' + catalog.plural_expr + ')');
185 | this.LOCALE = catalog.locale;
186 | },
187 |
188 | /**
189 | * add context elements like header anchor links
190 | */
191 | addContextElements : function() {
192 | $('div[id] > :header:first').each(function() {
193 |       $('<a class="headerlink">\u00B6</a>').
194 | attr('href', '#' + this.id).
195 | attr('title', _('Permalink to this headline')).
196 | appendTo(this);
197 | });
198 | $('dt[id]').each(function() {
199 |       $('<a class="headerlink">\u00B6</a>').
200 | attr('href', '#' + this.id).
201 | attr('title', _('Permalink to this definition')).
202 | appendTo(this);
203 | });
204 | },
205 |
206 | /**
207 | * workaround a firefox stupidity
208 | * see: https://bugzilla.mozilla.org/show_bug.cgi?id=645075
209 | */
210 | fixFirefoxAnchorBug : function() {
211 | if (document.location.hash && $.browser.mozilla)
212 | window.setTimeout(function() {
213 | document.location.href += '';
214 | }, 10);
215 | },
216 |
217 | /**
218 | * highlight the search words provided in the url in the text
219 | */
220 | highlightSearchWords : function() {
221 | var params = $.getQueryParameters();
222 | var terms = (params.highlight) ? params.highlight[0].split(/\s+/) : [];
223 | if (terms.length) {
224 | var body = $('div.body');
225 | if (!body.length) {
226 | body = $('body');
227 | }
228 | window.setTimeout(function() {
229 | $.each(terms, function() {
230 | body.highlightText(this.toLowerCase(), 'highlighted');
231 | });
232 | }, 10);
233 |       $('<p class="highlight-link"><a href="javascript:Documentation.' +
234 |         'hideSearchWords()">' + _('Hide Search Matches') + '</a></p>')
235 |           .appendTo($('#searchbox'));
236 |     }
237 |   },
238 | 
239 |   /**
240 |    * init the domain index toggle buttons
241 |    */
242 |   initIndexTable : function() {
243 |     var togglers = $('img.toggler').click(function() {
244 |       var src = $(this).attr('src');
245 |       var idnum = $(this).attr('id').substr(7);
246 |       $('tr.cg-' + idnum).toggle();
247 |       if (src.substr(-9) === 'minus.png')
248 |         $(this).attr('src', src.substr(0, src.length-9) + 'plus.png');
249 |       else
250 |         $(this).attr('src', src.substr(0, src.length-9) + 'minus.png');
251 |     }).css('display', '');
252 |     if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) {
253 |       togglers.click();
254 |     }
255 |   },
256 | 
257 |   /**
258 |    * helper function to hide the search marks again
259 |    */
260 |   hideSearchWords : function() {
261 |     $('#searchbox .highlight-link').fadeOut(300);
262 |     $('span.highlighted').removeClass('highlighted');
263 |   },
264 | 
265 |   /**
266 |    * make the url absolute
267 |    */
268 |   makeURL : function(relativeURL) {
269 |     return DOCUMENTATION_OPTIONS.URL_ROOT + '/' + relativeURL;
270 |   },
271 | 
272 |   /**
273 |    * get the current relative url
274 |    */
275 |   getCurrentURL : function() {
276 |     var path = document.location.pathname;
277 |     var parts = path.split(/\//);
278 |     $.each(DOCUMENTATION_OPTIONS.URL_ROOT.split(/\.\.\//), function() {
279 |       if (this === '..')
280 |         parts.pop();
281 |     });
282 |     var url = parts.join('/');
283 |     return path.substring(url.lastIndexOf('/') + 1, path.length - 1);
284 |   },
285 | 
286 |   initOnKeyListeners: function() {
287 |     $(document).keyup(function(event) {
288 |       var activeElementType = document.activeElement.tagName;
289 |       // don't navigate when in search box or textarea
290 |       if (activeElementType !== 'TEXTAREA' && activeElementType !== 'INPUT' && activeElementType !== 'SELECT') {
291 |         switch (event.keyCode) {
292 |           case 37: // left
293 |             var prevHref = $('link[rel="prev"]').prop('href');
294 |             if (prevHref) {
295 |               window.location.href = prevHref;
296 |               break;
297 |             }
298 |           case 39: // right
299 |             var nextHref = $('link[rel="next"]').prop('href');
300 |             if (nextHref) {
301 |               window.location.href = nextHref;
302 |               break;
303 |             }
304 |         }
305 |       }
306 |     });
307 |   }
308 | };
309 | 
310 | // quick alias for translations
311 | _ = Documentation.gettext;
312 | 
313 | $(document).ready(function() {
314 |   Documentation.init();
315 | });
--------------------------------------------------------------------------------
/docs/make.bat:
--------------------------------------------------------------------------------
1 | @ECHO OFF
2 |
3 | pushd %~dp0
4 |
5 | REM Command file for Sphinx documentation
6 |
7 | if "%SPHINXBUILD%" == "" (
8 | set SPHINXBUILD=sphinx-build
9 | )
10 | set SOURCEDIR=source
11 | set BUILDDIR=build
12 |
13 | if "%1" == "" goto help
14 |
15 | %SPHINXBUILD% >NUL 2>NUL
16 | if errorlevel 9009 (
17 | echo.
18 | echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
19 | echo.installed, then set the SPHINXBUILD environment variable to point
20 | echo.to the full path of the 'sphinx-build' executable. Alternatively you
21 | echo.may add the Sphinx directory to PATH.
22 | echo.
23 | echo.If you don't have Sphinx installed, grab it from
24 | echo.http://sphinx-doc.org/
25 | exit /b 1
26 | )
27 |
28 | %SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
29 | goto end
30 |
31 | :help
32 | %SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
33 |
34 | :end
35 | popd
36 |
--------------------------------------------------------------------------------
/docs/source/_static/docker-containers.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/source/_static/docker-containers.png
--------------------------------------------------------------------------------
/docs/source/_static/jupyter-design.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/source/_static/jupyter-design.png
--------------------------------------------------------------------------------
/docs/source/_static/jupyter-evolution.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/source/_static/jupyter-evolution.png
--------------------------------------------------------------------------------
/docs/source/_static/jupyter-installed-token.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/source/_static/jupyter-installed-token.png
--------------------------------------------------------------------------------
/docs/source/_static/jupyter-login.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/source/_static/jupyter-login.png
--------------------------------------------------------------------------------
/docs/source/_static/jupyter-main.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/source/_static/jupyter-main.png
--------------------------------------------------------------------------------
/docs/source/_static/jupyter-samples.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/OTRF/notebooks-forge/e964a2fc636de2f24bd418e51b80bfe7b04549fe/docs/source/_static/jupyter-samples.png
--------------------------------------------------------------------------------
/docs/source/conf.py:
--------------------------------------------------------------------------------
1 | # -*- coding: utf-8 -*-
2 | #
3 | # Configuration file for the Sphinx documentation builder.
4 | #
5 | # This file does only contain a selection of the most common options. For a
6 | # full list see the documentation:
7 | # http://www.sphinx-doc.org/en/master/config
8 |
9 | # -- Path setup --------------------------------------------------------------
10 |
11 | # If extensions (or modules to document with autodoc) are in another directory,
12 | # add these directories to sys.path here. If the directory is relative to the
13 | # documentation root, use os.path.abspath to make it absolute, like shown here.
14 | #
15 | # import os
16 | # import sys
17 | # sys.path.insert(0, os.path.abspath('.'))
18 |
19 |
20 | # -- Project information -----------------------------------------------------
21 |
22 | project = 'Notebooks Forge'
23 | copyright = '2019, Roberto Rodriguez, Jose Luis Rodriguez'
24 | author = 'Roberto Rodriguez, Jose Luis Rodriguez'
25 |
26 | # The short X.Y version
27 | version = ''
28 | # The full version, including alpha/beta/rc tags
29 | release = '0.0.1'
30 |
31 |
32 | # -- General configuration ---------------------------------------------------
33 |
34 | # If your documentation needs a minimal Sphinx version, state it here.
35 | #
36 | # needs_sphinx = '1.0'
37 |
38 | # Add any Sphinx extension module names here, as strings. They can be
39 | # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
40 | # ones.
41 | extensions = [
42 | 'sphinx.ext.autodoc',
43 | 'sphinx.ext.viewcode',
44 | 'sphinx.ext.githubpages',
45 | ]
46 |
47 | # Add any paths that contain templates here, relative to this directory.
48 | templates_path = ['_templates']
49 |
50 | # The suffix(es) of source filenames.
51 | # You can specify multiple suffix as a list of string:
52 | #
53 | # source_suffix = ['.rst', '.md']
54 | source_suffix = '.rst'
55 |
56 | # The master toctree document.
57 | master_doc = 'index'
58 |
59 | # The language for content autogenerated by Sphinx. Refer to documentation
60 | # for a list of supported languages.
61 | #
62 | # This is also used if you do content translation via gettext catalogs.
63 | # Usually you set "language" from the command line for these cases.
64 | language = None
65 |
66 | # List of patterns, relative to source directory, that match files and
67 | # directories to ignore when looking for source files.
68 | # This pattern also affects html_static_path and html_extra_path.
69 | exclude_patterns = []
70 |
71 | # The name of the Pygments (syntax highlighting) style to use.
72 | pygments_style = None
73 |
74 |
75 | # -- Options for HTML output -------------------------------------------------
76 |
77 | # The theme to use for HTML and HTML Help pages. See the documentation for
78 | # a list of builtin themes.
79 | #
80 | html_theme = 'sphinx_rtd_theme'
81 |
82 | # Theme options are theme-specific and customize the look and feel of a theme
83 | # further. For a list of options available for each theme, see the
84 | # documentation.
85 | #
86 | # html_theme_options = {}
87 |
88 | # Add any paths that contain custom static files (such as style sheets) here,
89 | # relative to this directory. They are copied after the builtin static files,
90 | # so a file named "default.css" will overwrite the builtin "default.css".
91 | html_static_path = ['_static']
92 |
93 | # Custom sidebar templates, must be a dictionary that maps document names
94 | # to template names.
95 | #
96 | # The default sidebars (for documents that don't match any pattern) are
97 | # defined by theme itself. Builtin themes are using these templates by
98 | # default: ``['localtoc.html', 'relations.html', 'sourcelink.html',
99 | # 'searchbox.html']``.
100 | #
101 | # html_sidebars = {}
102 |
103 |
104 | # -- Options for HTMLHelp output ---------------------------------------------
105 |
106 | # Output file base name for HTML help builder.
107 | htmlhelp_basename = 'notebooksforgedoc'
108 |
109 |
110 | # -- Options for LaTeX output ------------------------------------------------
111 |
112 | latex_elements = {
113 | # The paper size ('letterpaper' or 'a4paper').
114 | #
115 | # 'papersize': 'letterpaper',
116 |
117 | # The font size ('10pt', '11pt' or '12pt').
118 | #
119 | # 'pointsize': '10pt',
120 |
121 | # Additional stuff for the LaTeX preamble.
122 | #
123 | # 'preamble': '',
124 |
125 | # Latex figure (float) alignment
126 | #
127 | # 'figure_align': 'htbp',
128 | }
129 |
130 | # Grouping the document tree into LaTeX files. List of tuples
131 | # (source start file, target name, title,
132 | # author, documentclass [howto, manual, or own class]).
133 | latex_documents = [
134 | (master_doc, 'NotebooksForge.tex', 'Notebooks Forge Documentation',
135 | 'Roberto Rodriguez, Jose Luis Rodriguez', 'manual'),
136 | ]
137 |
138 |
139 | # -- Options for manual page output ------------------------------------------
140 |
141 | # One entry per manual page. List of tuples
142 | # (source start file, name, description, authors, manual section).
143 | man_pages = [
144 | (master_doc, 'NotebooksForge', 'Notebooks Forge Documentation',
145 | [author], 1)
146 | ]
147 |
148 |
149 | # -- Options for Texinfo output ----------------------------------------------
150 |
151 | # Grouping the document tree into Texinfo files. List of tuples
152 | # (source start file, target name, title, author,
153 | # dir menu entry, description, category)
154 | texinfo_documents = [
155 | (master_doc, 'NotebooksForge', 'Notebooks Forge Documentation',
156 | author, 'NotebooksForge', 'One line description of project.',
157 | 'Miscellaneous'),
158 | ]
159 |
160 |
161 | # -- Options for Epub output -------------------------------------------------
162 |
163 | # Bibliographic Dublin Core info.
164 | epub_title = project
165 |
166 | # The unique identifier of the text. This can be a ISBN number
167 | # or the project homepage.
168 | #
169 | # epub_identifier = ''
170 |
171 | # A unique identification for the text.
172 | #
173 | # epub_uid = ''
174 |
175 | # A list of files that should not be packed into the epub file.
176 | epub_exclude_files = ['search.html']
177 |
178 |
179 | # -- Extension configuration -------------------------------------------------
180 |
--------------------------------------------------------------------------------
/docs/source/docker.rst:
--------------------------------------------------------------------------------
1 | Docker Notebook Deployments
2 | ===========================
3 |
4 | Docker technology allows the project to package notebook applications with all of their libraries and dependencies in "containers" and make them portable across operating systems.
5 | This allows security analysts to deploy the notebook servers on any system they use daily for hunting research.
6 |
7 | What are Docker Containers?
8 | ###########################
9 |
10 | According to `Docker docs `_, a container is a standard unit of software that packages up code and all its dependencies so the application runs quickly and reliably from one computing environment to another.
11 | A Docker container image is a lightweight, standalone, executable package of software that includes everything needed to run an application: code, runtime, system tools, system libraries and settings.
12 |
13 | .. image:: _static/docker-containers.png
14 | :alt: Docker Containers
15 | :scale: 50%
16 |
17 | The project currently supports two notebook environments.
18 |
19 | Jupyter Notebooks Install
20 | #########################
21 |
22 | Requirements
23 | ************
24 |
25 | * `Git `_ : Git is a free and open source distributed version control system designed to handle everything from small to very large projects with speed and efficiency.
26 | * `Docker CE `_ : Docker Community Edition (CE) is ideal for developers and small teams looking to get started with Docker and experimenting with container-based apps.
27 | * `Docker Compose `_ : a tool for defining and running multi-container Docker applications.
28 |
29 | Steps
30 | *****
31 |
32 | Git clone the `Notebooks Forge project `_ and change your current directory to the project's directory.
33 |
34 | .. code-block:: console
35 |
36 | $ git clone https://github.com/Cyb3rWard0g/notebooks-forge.git
37 | $ cd notebooks-forge/
38 |
39 | Change your current directory to the specific notebook you want to work with (``jupyter-hunt`` or ``jupyter-rto``).
40 |
41 | .. code-block:: console
42 |
43 | $ cd jupyter-hunt/
44 |
45 | Run docker-compose pointing to the default compose file available in the folder.
46 |
47 | .. code-block:: console
48 |
49 | $ sudo docker-compose -f docker-compose.yml up --build -d
50 |
51 | Once your container is downloaded and running, you can verify its status with the following command:
52 |
53 | .. code-block:: console
54 |
55 | $ sudo docker ps
56 |
57 | Before accessing the Jupyter notebook server via your favorite web browser, you will have to get the access token that the application was initialized with.
58 | You can get it with the following command:
59 |
60 | .. code-block:: console
61 |
62 | $ sudo docker exec -ti jupyter-hunt jupyter notebook list | grep "token" | sed 's/.*token=\([^ ]*\).*/\1/'
63 |
64 | Open your favorite browser at ``http://<docker-host-ip>:8888``. You will then be prompted with a login box to enter the token.
65 |
66 | .. image:: _static/jupyter-login.png
67 | :alt: Jupyter Login
68 | :scale: 50%
69 |
70 | That's it! You are now ready to use your Jupyter Notebook server.
71 |
72 | .. image:: _static/jupyter-main.png
73 | :alt: Jupyter Main
74 | :scale: 40%
75 |
76 | Zeppelin Notebooks Install
77 | ##########################
78 |
79 | Requirements
80 | ************
81 |
82 | * `Git `_ : Git is a free and open source distributed version control system designed to handle everything from small to very large projects with speed and efficiency.
83 | * `Docker CE `_ : Docker Community Edition (CE) is ideal for developers and small teams looking to get started with Docker and experimenting with container-based apps.
84 | * `Docker Compose `_ : a tool for defining and running multi-container Docker applications.
85 |
86 | Steps
87 | *****
88 |
89 | Coming soon..
--------------------------------------------------------------------------------
/docs/source/index.rst:
--------------------------------------------------------------------------------
1 | .. Notebooks Forge documentation master file, created by
2 | sphinx-quickstart on Wed Apr 17 11:44:45 2019.
3 | You can adapt this file completely to your liking, but it should at least
4 | contain the root `toctree` directive.
5 |
6 | Notebooks Forge
7 | ===============
8 |
9 | A project dedicated to build and provide ``Notebooks`` servers for ``Defensive`` and ``Offensive`` operators to:
10 |
11 | * Design playbooks
12 | * Demonstrate how techniques can be used
13 | * Showcase when and why an operator would want to use a technique
14 | * Document engagements procedures
15 | * Prototype new ways to analyze data extracted from endpoints in a more dynamic, flexible and language-agnostic way.
16 |
17 | This project supports two notebook server types: `Jupyter `_ and `Zeppelin `_.
18 |
19 |
20 | What is a Notebook?
21 | *******************
22 |
23 | Think of a notebook as a document that you can access via a web interface and that allows you to save input (i.e. live code) and output (i.e. code execution results / evaluated code output) of interactive sessions, as well as the notes needed to explain the methodology and steps taken to perform specific tasks (i.e. data analysis).
24 |
25 | .. toctree::
26 | :maxdepth: 2
27 | :caption: Notebook Environments:
28 |
29 | Jupyter Notebook
30 | Zeppelin Notebook
31 |
32 | .. toctree::
33 | :maxdepth: 2
34 | :caption: Notebook Deployments:
35 |
36 | Docker
37 |
38 | .. toctree::
39 | :maxdepth: 2
40 | :caption: Licenses:
41 |
42 | GNU General Public License V3
--------------------------------------------------------------------------------
/docs/source/jupyter.rst:
--------------------------------------------------------------------------------
1 | Jupyter Notebook
2 | ================
3 |
4 | The Jupyter Notebook is an open-source web application that allows you to create and share documents that contain live code, equations, visualizations and narrative text.
5 | Uses include: data cleaning and transformation, numerical simulation, statistical modeling, data visualization, machine learning, and much more.
6 |
7 | The Jupyter Notebook project is the evolution of the IPython Notebook library which was developed primarily to enhance the default python interactive console by enabling scientific operations and advanced data analytics capabilities via sharable web documents.
8 |
9 | .. image:: _static/jupyter-evolution.png
10 | :alt: Jupyter Evolution
11 | :scale: 50%
12 |
13 | Nowadays, the Jupyter Notebook project supports not only Python but over 40 programming languages, including R, Julia, Scala and PySpark.
14 | In fact, its name was originally derived from three programming languages: Julia, Python and R. This made it one of the first language-agnostic notebook applications, and it is now considered one of the preferred environments for data scientists and engineers in the community to explore and analyze data.
15 |
16 | .. image:: _static/jupyter-samples.png
17 | :alt: Jupyter Sample
18 | :scale: 50%
19 |
20 | How do Jupyter Notebooks Work?
21 | ##############################
22 |
23 | Jupyter Notebooks work with what is called a two-process model based on a kernel-client infrastructure.
24 | This model applies a similar concept to the `Read-Evaluate-Print Loop (REPL) `_ programming environment that takes a single user's inputs, evaluates them, and returns the result to the user.
25 |
26 | .. image:: _static/jupyter-design.png
27 | :alt: Jupyter Design
28 | :scale: 70%
29 |
30 | Based on the two-process model concept, we can explain the main components of Jupyter the following way:
31 |
32 | Jupyter Client
33 | **************
34 |
35 | * It allows a user to send code to the kernel, whether in the form of a `Qt Console `_ or a browser via notebook documents.
36 | * From a REPL perspective, the client does the read and print operations.
37 | * Notebooks are hosted by the Jupyter web server which uses Tornado to serve HTTP requests.
38 |
39 | Jupyter Kernel
40 | **************
41 |
42 | * It receives the code sent by the client, executes it, and returns the results back to the client for display. A kernel process can have multiple clients communicating with it, which is why this model is also referred to as the decoupled two-process model.
43 | * From a REPL perspective, the kernel does the evaluate operation.
44 | * Kernel and clients communicate via an interactive computing protocol based on an asynchronous messaging library named `ZeroMQ `_ (low-level transport layer) and WebSockets (TCP-based), as sketched below.
45 |
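To make the model concrete, here is a minimal sketch, assuming the ``jupyter_client`` package is available, of a client driving a kernel process directly; this round trip is what a notebook performs for every cell:

.. code-block:: python

    from jupyter_client.manager import start_new_kernel

    # Start a kernel process plus a client connected to it over ZeroMQ.
    km, kc = start_new_kernel(kernel_name='python3')
    try:
        # The client does the "read": it sends code to the kernel.
        kc.execute('1 + 1')
        # The kernel does the "evaluate" and replies on the shell channel.
        reply = kc.get_shell_msg(timeout=10)
        print(reply['content']['status'])  # 'ok' on success
    finally:
        kc.stop_channels()
        km.shutdown_kernel()
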
46 | Jupyter Notebook Document
47 | *************************
48 |
49 | * Notebooks are automatically saved and stored on disk in the open JavaScript Object Notation (JSON) format with a .ipynb extension.
50 |
51 | Jupyter Notebook Servers
52 | #########################
53 |
54 | .. toctree::
55 | :maxdepth: 2
56 |
57 | Jupyter Spark
58 | Jupyter Hunt
59 | Jupyter RTO
--------------------------------------------------------------------------------
/docs/source/jupyter_hunt.rst:
--------------------------------------------------------------------------------
1 | Jupyter Hunt Server
2 | ===================
3 |
4 | A notebook server built for defensive operators, with several tools to connect to known SIEMs and analyze data to find potential adversaries in the network.
5 | This server is built on top of the `Jupyter Spark` server available in this repo in order to provide advanced analytics capabilities via Apache Spark.
6 |
7 | Jupyter Python Libraries
8 | ########################
9 |
10 | Pandas
11 | ******
12 |
13 | `Pandas `_ is an open source, BSD-licensed library providing high-performance, easy-to-use data structures and data analysis tools for the Python programming language.
14 |
15 | Altair
16 | ******
17 |
18 | `Altair `_ is a declarative statistical visualization library for Python.
19 | With Altair, you can spend more time understanding your data and its meaning.
20 | Altair's API is simple, friendly and consistent and built on top of the powerful `Vega-Lite `_ JSON specification.
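
As a minimal sketch of the declarative API (the data frame below is illustrative only):

.. code-block:: python

    import altair as alt
    import pandas as pd

    # Hypothetical process counts, purely for illustration.
    df = pd.DataFrame({'process': ['powershell.exe', 'cmd.exe', 'wmic.exe'],
                       'count': [42, 7, 3]})
    # Declare the chart by mapping columns to visual channels.
    chart = alt.Chart(df).mark_bar().encode(x='process', y='count')
    chart.save('chart.html')  # in a notebook, the chart renders inline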
21 |
22 | S3Fs
23 | ****
24 |
25 | `S3Fs `_ is a Pythonic file interface to S3. It builds on top of `boto3 `_.
26 | The top-level class S3FileSystem holds connection information and allows typical file-system style operations like cp, mv, ls, du, glob, etc., as well as put/get of local files to/from S3.
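
A minimal sketch of that interface; the bucket and object names are illustrative assumptions:

.. code-block:: python

    import s3fs

    # Credentials are resolved by boto3 (environment variables, ~/.aws, etc.).
    fs = s3fs.S3FileSystem(anon=False)
    print(fs.ls('my-hunting-bucket'))  # list objects, like 'ls'
    with fs.open('my-hunting-bucket/events.json', 'rb') as f:
        head = f.read(1024)            # read S3 objects like local files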
27 |
28 | Elasticsearch-DSL
29 | *****************
30 |
31 | `Elasticsearch DSL `_ is a high-level library whose aim is to help with writing and running queries against Elasticsearch.
32 | It is built on top of the official low-level client (`elasticsearch-py `_).
33 | It provides a more convenient and idiomatic way to write and manipulate queries.
34 | It stays close to the Elasticsearch JSON DSL, mirroring its terminology and structure.
35 | It exposes the whole range of the DSL from Python, either directly using defined classes or queryset-like expressions.
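
For example, a hunt query could be expressed as in the sketch below; the endpoint, index pattern and field name are assumptions that will differ per environment:

.. code-block:: python

    from elasticsearch import Elasticsearch
    from elasticsearch_dsl import Search

    # Hypothetical Elasticsearch endpoint and Sysmon index pattern.
    es = Elasticsearch(['http://localhost:9200'])
    s = Search(using=es, index='logs-endpoint-winevent-sysmon-*') \
        .query('match', event_id=1)  # process creation events
    for hit in s.execute():
        print(hit.to_dict())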
36 |
37 | Matplotlib
38 | **********
39 |
40 | `Matplotlib `_ is a Python 2D plotting library which produces publication-quality figures in a variety of hardcopy formats and interactive environments across platforms.
41 | Matplotlib can be used in Python scripts, the Python and IPython shell (à la MATLAB or Mathematica), web application servers, and various graphical user interface toolkits.
42 |
43 | Scikit-learn
44 | ************
45 |
46 | `Scikit-learn `_ is a Python module for machine learning built on top of SciPy and distributed under the 3-Clause BSD license.
47 |
48 | KSQL-Python
49 | ***********
50 |
51 | `KSQL-Python `_ is a Python wrapper for the KSQL REST API, making it easy to interact with KSQL from Python.
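
A minimal sketch, assuming a KSQL server is reachable at the URL below:

.. code-block:: python

    from ksql import KSQLAPI

    # Hypothetical KSQL REST endpoint.
    client = KSQLAPI('http://localhost:8088')
    print(client.ksql('SHOW STREAMS;'))  # run a KSQL statement over REST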
52 |
53 | Confluent-Kafka-Python
54 | **********************
55 |
56 | `Confluent-kafka-python `_ is Confluent's Python client for `Apache Kafka `_ and the `Confluent Platform `_.
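
A short consumer sketch; the broker address, group id and topic name are illustrative assumptions:

.. code-block:: python

    from confluent_kafka import Consumer

    c = Consumer({'bootstrap.servers': 'localhost:9092',
                  'group.id': 'hunters',
                  'auto.offset.reset': 'earliest'})
    c.subscribe(['winlogbeat'])  # hypothetical event topic
    msg = c.poll(5.0)            # wait up to 5 seconds for one message
    if msg is not None and msg.error() is None:
        print(msg.value().decode('utf-8'))
    c.close()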
57 |
58 | Splunk-SDK
59 | **********
60 |
61 | The `Splunk Software Development Kit (SDK) `_ for Python contains library code and examples designed to enable developers to build applications using Splunk.
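
A minimal one-shot search sketch; the host and credentials are assumptions:

.. code-block:: python

    import splunklib.client as client
    import splunklib.results as results

    # Hypothetical Splunk management endpoint and credentials.
    service = client.connect(host='localhost', port=8089,
                             username='admin', password='changeme')
    rr = service.jobs.oneshot('search index=main | head 5')
    for event in results.ResultsReader(rr):
        print(event)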
62 |
63 | Kqlmagic
64 | ********
65 |
66 | The `Kqlmagic `_ magic extension enables a notebook experience for exploring Microsoft Azure Monitor data: Azure Data Explorer (Kusto), Application Insights, and Log Analytics, from a Jupyter notebook (Python 3 kernel) using KQL (Kusto Query Language).
67 |
68 | Neo4j
69 | *****
70 |
71 | The official `Neo4j driver for Python `_ supports Neo4j 3.0 and above and Python versions 2.7, 3.4, 3.5, 3.6, and 3.7.
72 | It connects to the database using the binary protocol. It aims to be minimal, while being idiomatic to Python.
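
A minimal sketch of querying a BloodHound database with this driver; the URI, credentials and Cypher query are illustrative assumptions:

.. code-block:: python

    from neo4j import GraphDatabase

    # Hypothetical BloodHound Neo4j endpoint and credentials.
    driver = GraphDatabase.driver('bolt://localhost:7687',
                                  auth=('neo4j', 'neo4jpass'))
    with driver.session() as session:
        # Example: return Kerberoastable users (accounts with an SPN set).
        result = session.run('MATCH (u:User {hasspn:true}) RETURN u.name AS name')
        for record in result:
            print(record['name'])
    driver.close()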
73 |
74 | Networkx
75 | ********
76 |
77 | `NetworkX `_ is a Python package for the creation, manipulation, and study of the structure, dynamics, and functions of complex networks.
78 |
79 | Nxviz
80 | *****
81 |
82 | `Nxviz `_ is a graph visualization package for NetworkX. With nxviz, you can create beautiful graph visualizations with a declarative API.
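
A small sketch combining both libraries; the graph edges are toy data:

.. code-block:: python

    import networkx as nx
    import matplotlib.pyplot as plt
    from nxviz import CircosPlot

    # Toy "who is admin where" graph, purely illustrative.
    G = nx.DiGraph()
    G.add_edges_from([('user1', 'groupA'), ('groupA', 'server1'),
                      ('user2', 'server1')])
    c = CircosPlot(G)  # declare the layout instead of drawing manually
    c.draw()
    plt.show()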
83 |
84 | Jupyter Kernels Available
85 | #########################
86 |
87 | IPython Kernel (Python)
88 | *************************
89 |
90 | The Jupyter team maintains the `IPython kernel `_ since the Jupyter notebook server depends on the IPython kernel functionality.
91 | Many other languages, in addition to Python, may be used in the notebook.
92 |
93 | PySpark Kernel (Python)
94 | ************************
95 |
96 | A Python kernel to enable `Apache Spark for python `_.
97 | Writing PySpark Applications is really no different than writing normal Python applications or packages.
98 | It’s quite similar to writing command-line applications in particular.
99 | Spark doesn’t have a build concept, just Python scripts, so to run an application, you simply execute the script against the cluster.
100 |
101 | Spylon Kernel (Scala/Python)
102 | *****************************
103 |
104 | A Scala kernel for Apache Spark that uses `metakernel `_ in combination with `py4j `_.
105 |
106 | R Kernel (R)
107 | ************
108 |
109 | An R kernel for `Apache SparkR `_.
110 | SparkR is an R package that provides a light-weight frontend to use Apache Spark from R.
111 | In Spark 2.4.1, SparkR provides a distributed data frame implementation that supports operations like selection, filtering, aggregation etc. (similar to R data frames, dplyr) but on large datasets.
112 | SparkR also supports distributed machine learning using MLlib.
--------------------------------------------------------------------------------
/docs/source/jupyter_rto.rst:
--------------------------------------------------------------------------------
1 | Jupyter Red Team Operations (RTO) Server
2 | ========================================
3 |
4 | A notebook server built for offensive operators, with a few libraries to connect to known tools such as BloodHound and Cobalt Strike.
5 |
6 | Jupyter Python Libraries
7 | ########################
8 |
9 | Neo4j Python Driver
10 | *******************
11 |
12 | `Neo4j Bolt driver `_ for Python
13 |
14 | PyCobalt
15 | ********
16 |
17 | `PyCobalt `_ is a Python API for Cobalt Strike
18 |
19 | Jupyter Kernels Available
20 | #########################
21 |
22 | IPython Kernel (Python)
23 | *************************
24 |
25 | The Jupyter team maintains the `IPython kernel `_ since the Jupyter notebook server depends on the IPython kernel functionality.
26 | Many other languages, in addition to Python, may be used in the notebook.
--------------------------------------------------------------------------------
/docs/source/jupyter_spark.rst:
--------------------------------------------------------------------------------
1 | Jupyter Spark Server
2 | ====================
3 |
4 | A notebook server built for any operator looking to leverage advanced analytics provided by Apache Spark.
5 |
6 | Jupyter Python Libraries
7 | ########################
8 |
9 | Pandas
10 | ******
11 |
12 | `Pandas `_ is an open source, BSD-licensed library providing high-performance, easy-to-use data structures and data analysis tools for the Python programming language.
13 |
14 | Jupyter Kernels Available
15 | #########################
16 |
17 | IPython Kernel (Python)
18 | *************************
19 |
20 | The Jupyter team maintains the `IPython kernel `_ since the Jupyter notebook server depends on the IPython kernel functionality.
21 | Many other languages, in addition to Python, may be used in the notebook.
22 |
23 | PySpark Kernel (Python)
24 | ************************
25 |
26 | A Python kernel to enable `Apache Spark for python `_.
27 | Writing PySpark Applications is really no different than writing normal Python applications or packages.
28 | It’s quite similar to writing command-line applications in particular.
29 | Spark doesn’t have a build concept, just Python scripts, so to run an application, you simply execute the script against the cluster.
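
A minimal sketch of such a script, runnable with ``spark-submit`` or from the PySpark kernel (the data below is illustrative):

.. code-block:: python

    from pyspark.sql import SparkSession

    # A PySpark "application" is just a script that grabs a session.
    spark = SparkSession.builder.appName('example').getOrCreate()
    df = spark.createDataFrame([(1, 'alice'), (2, 'bob')], ['id', 'name'])
    df.filter(df.id > 1).show()
    spark.stop()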
30 |
31 | Spylon Kernel (Scala/Python)
32 | *****************************
33 |
34 | A Scala kernel for Apache Spark that uses `metakernel `_ in combination with `py4j `_.
35 |
36 | R Kernel (R)
37 | ************
38 |
39 | An R kernel for `Apache SparkR `_.
40 | SparkR is an R package that provides a light-weight frontend to use Apache Spark from R.
41 | In Spark 2.4.1, SparkR provides a distributed data frame implementation that supports operations like selection, filtering, aggregation etc. (similar to R data frames, dplyr) but on large datasets.
42 | SparkR also supports distributed machine learning using MLlib.
--------------------------------------------------------------------------------
/docs/source/zeppelin.rst:
--------------------------------------------------------------------------------
1 | Zeppelin Notebook
2 | =================
3 |
4 | Coming soon..
--------------------------------------------------------------------------------
/scripts/docker_install.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 |
3 | # HuntingGrounds script: docker_install.sh
4 | # HuntingGrounds description: Install docker and docker-compose
5 | # Author: Roberto Rodriguez (@Cyb3rWard0g)
6 | # License: GPL-3.0
7 |
8 | DOCKER_INFO_TAG="[DOCKER-INSTALLATION-INFO]"
9 | DOCKER_ERROR_TAG="[DOCKER-INSTALLATION-ERROR]"
10 |
11 | # *********** Check if user is root ***************
12 | if [[ $EUID -ne 0 ]]; then
13 | echo "$DOCKER_INFO_TAG YOU MUST BE ROOT TO RUN THIS SCRIPT!!!"
14 | exit 1
15 | fi
16 |
17 | # *********** Set Log File ***************
18 | LOGFILE="/var/log/helk-install.log"
19 | echoerror() {
20 | printf "$DOCKER_ERROR_TAG ${RC} * ERROR${EC}: $@\n" 1>&2;
21 | }
22 |
23 | # ********* Globals **********************
24 | SYSTEM_KERNEL="$(uname -s)"
25 |
26 | echo "$DOCKER_INFO_TAG Checking distribution list and product version"
27 | if [ "$SYSTEM_KERNEL" == "Linux" ]; then
28 | # *********** Check distribution list ***************
29 | LSB_DIST="$(. /etc/os-release && echo "$ID")"
30 | LSB_DIST="$(echo "$LSB_DIST" | tr '[:upper:]' '[:lower:]')"
31 | # *********** Check distribution version ***************
32 | case "$LSB_DIST" in
33 | ubuntu)
34 | if [ -x "$(command -v lsb_release)" ]; then
35 | DIST_VERSION="$(lsb_release --codename | cut -f2)"
36 | fi
37 | if [ -z "$DIST_VERSION" ] && [ -r /etc/lsb-release ]; then
38 | DIST_VERSION="$(. /etc/lsb-release && echo "$DISTRIB_CODENAME")"
39 | fi
40 | # ********* Commenting Out CDROM **********************
41 | sed -i "s/\(^deb cdrom.*$\)/\#/g" /etc/apt/sources.list
42 | ;;
43 | debian|raspbian)
44 | DIST_VERSION="$(sed 's/\/.*//' /etc/debian_version | sed 's/\..*//')"
45 | case "$DIST_VERSION" in
46 | 9) DIST_VERSION="stretch";;
47 | 8) DIST_VERSION="jessie";;
48 | 7) DIST_VERSION="wheezy";;
49 | esac
50 | # ********* Commenting Out CDROM **********************
51 | sed -i "s/\(^deb cdrom.*$\)/\#/g" /etc/apt/sources.list
52 | ;;
53 | centos)
54 | if [ -z "$DIST_VERSION" ] && [ -r /etc/os-release ]; then
55 | DIST_VERSION="$(. /etc/os-release && echo "$VERSION_ID")"
56 | fi
57 | ;;
58 | rhel|ol|sles)
59 | echoerror "$LSB_DIST is not supported by this script. Please install Docker manually."
60 | exit 1
61 | ;;
62 | *)
63 | if [ -x "$(command -v lsb_release)" ]; then
64 | DIST_VERSION="$(lsb_release --release | cut -f2)"
65 | fi
66 | if [ -z "$DIST_VERSION" ] && [ -r /etc/os-release ]; then
67 | DIST_VERSION="$(. /etc/os-release && echo "$VERSION_ID")"
68 | fi
69 | ;;
70 | esac
71 | ERROR=$?
72 | if [ $ERROR -ne 0 ]; then
73 | echoerror "Could not verify distribution or version of the OS (Error Code: $ERROR)."
74 | fi
75 | echo "$DOCKER_INFO_TAG You're using $LSB_DIST version $DIST_VERSION"
76 | elif [ "$SYSTEM_KERNEL" == "Darwin" ]; then
77 | PRODUCT_NAME="$(sw_vers -productName)"
78 | PRODUCT_VERSION="$(sw_vers -productVersion)"
79 | BUILD_VERSION="$(sw_vers -buildVersion)"
80 | echo "$DOCKER_INFO_TAG You're using $PRODUCT_NAME version $PRODUCT_VERSION"
81 | else
82 | echo "$DOCKER_INFO_TAG We cannot figure out the SYSTEM_KERNEL, distribution or version of the OS"
83 | fi
84 |
85 |
86 | # ********** Install Curl ********************
87 | install_curl(){
88 | echo "$DOCKER_INFO_TAG Installing curl before installing docker.."
89 | case "$LSB_DIST" in
90 | ubuntu|debian|raspbian)
91 | apt-get install -y curl >> $LOGFILE 2>&1
92 | ;;
93 | centos|rhel)
94 | yum install curl >> $LOGFILE 2>&1
95 | ;;
96 | *)
97 | echo "$DOCKER_INFO_TAG Please install curl for $LSB_DIST $DIST_VERSION .."
98 | exit 1
99 | ;;
100 | esac
101 | ERROR=$?
102 | if [ $ERROR -ne 0 ]; then
103 | echoerror "Could not install curl for $lsb_dist $dist_version (Error Code: $ERROR)."
104 | exit 1
105 | fi
106 | }
107 |
108 | # ****** Installing docker via convenience script ***********
109 | install_docker(){
110 | echo "$DOCKER_INFO_TAG Installing docker via convenience script.."
111 | curl -fsSL get.docker.com -o get-docker.sh >> $LOGFILE 2>&1
112 | chmod +x get-docker.sh >> $LOGFILE 2>&1
113 | ./get-docker.sh >> $LOGFILE 2>&1
114 | ERROR=$?
115 | if [ $ERROR -ne 0 ]; then
116 | echoerror "Could not install docker via convenience script (Error Code: $ERROR)."
117 | if [ -x "$(command -v snap)" ]; then
118 | SNAP_VERSION=$(snap version | grep -w 'snap' | awk '{print $2}')
119 | echo "DOCKER_INFO_TAG Snap v$SNAP_VERSION is available. Trying to install docker via snap.."
120 | snap install docker >> $LOGFILE 2>&1
121 | ERROR=$?
122 | if [ $ERROR -ne 0 ]; then
123 | echoerror "Could not install docker via snap (Error Code: $ERROR)."
124 | exit 1
125 | fi
126 | echo "$DOCKER_INFO_TAG Docker successfully installed via snap."
127 | else
128 | echo "$DOCKER_INFO_TAG Docker could not be installed. Check /var/log/helk-install.log for details."
129 | exit 1
130 | fi
131 | fi
132 | }
133 |
134 | # ****** Installing docker compose from github.com/docker/compose ***********
135 | install_docker_compose(){
136 | echo "$DOCKER_INFO_TAG Installing docker-compose.."
137 | curl -L https://github.com/docker/compose/releases/download/1.23.2/docker-compose-`uname -s`-`uname -m` -o /usr/local/bin/docker-compose >> $LOGFILE 2>&1
138 | chmod +x /usr/local/bin/docker-compose >> $LOGFILE 2>&1
139 | ERROR=$?
140 | if [ $ERROR -ne 0 ]; then
141 | echoerror "Could not install docker-compose (Error Code: $ERROR)."
142 | exit 1
143 | fi
144 | }
145 |
146 | # *********** Main steps *********************
147 | if [ "$SYSTEM_KERNEL" == "Linux" ]; then
148 | # *********** Check if curl is installed ***************
149 | if [ -x "$(command -v curl)" ]; then
150 | echo "$DOCKER_INFO_TAG curl is already installed"
151 | else
152 | echo "$DOCKER_INFO_TAG curl is not installed"
153 | install_curl
154 | fi
155 |
156 | # *********** Check if docker is installed ***************
157 | if [ -x "$(command -v docker)" ]; then
158 | echo "$DOCKER_INFO_TAG Docker already installed"
159 | else
160 | echo "$DOCKER_INFO_TAG Docker is not installed"
161 | install_docker
162 | fi
163 | # ********** Check if docker-compose is installed *******
164 | if [ -x "$(command -v docker-compose)" ]; then
165 | echo "$DOCKER_INFO_TAG Docker-compose already installed"
166 | else
167 | echo "$DOCKER_INFO_TAG Docker-compose is not installed"
168 | install_docker_compose
169 | fi
170 | else
171 | # *********** Check if docker is installed ***************
172 | if [ -x "$(command -v docker)" ] && [ -x "$(command -v docker-compose)" ]; then
173 | echo "$DOCKER_INFO_TAG Docker & Docker-compose already installed"
174 | else
175 | echo "$DOCKER_INFO_TAG Please install Docker & Docker-compose for $SYSTEM_KERNEL"
176 | exit 1
177 | fi
178 | fi
--------------------------------------------------------------------------------