19 |
20 |
21 | ## Table of Contents
22 | 1. [Features](#features)
23 | 2. [Overview](#overview)
24 | 3. [Documentation and Demo](#documentation-and-demo)
25 |     1. [Two ways to generate sample Kafka data](#documentation-and-demo)
26 |         1. [Manual data entry](#running-the-demo-kafka-cluster-and-manually-entering-data)
27 |         2. [Streaming API](#running-the-demo-kafka-cluster-and-using-the-api-for-constant-data-generation)
28 |     2. [Running the Dashboard Application](#running-the-dashboard)
29 | 4. [Setup](#connecting-to-an-existing-instance-of-kafka)
30 |     1. [Connecting to an existing instance of Kafka](#connecting-to-an-existing-instance-of-kafka)
31 |     2. [Updating User Database](#connecting-the-user-database)
32 | 5. [FAQ](#faq)
33 | 6. [License](#license)
34 | 7. [Authors](#authors)
35 |
36 | ## Features
37 |
38 | - Cross-platform desktop application for real-time Kafka monitoring
39 | - Monitored metrics are chosen based on feedback from real-life Kafka deployment failures and best practices
40 | - Full-stack integration, with user authentication to ensure only authorized members can view the dashboard
41 |
42 | ## Overview
43 |
44 | Kafkare is a cross-platform Kafka monitoring dashboard application used to oversee the health of a Kafka cluster.
45 | Several key metrics are displayed, including consumer lag, number of topics, and system metrics such as CPU usage and available memory.
46 | Users can register for an account and log in to access the dashboard. Passwords are hashed with Bcrypt and stored in an external database.
47 |
48 | ## Documentation and Demo
49 |
50 |
51 | ### Demo Setup
52 |
53 | **Quick Start**
54 |
55 |
56 | There are two ways to generate your Kafka data:
57 | 1. Manually - for a controlled amount of data
58 | 2. Using an API - for a constant stream of data
59 |
60 |
61 | #### Running the demo Kafka cluster and manually entering data
62 |
63 | From the root directory (Kafkare), go into the kafka-playground folder:
64 |
65 | In the terminal:
66 |
67 | Install all dependencies
68 |
69 | ```sh
70 | npm install
71 | ```
72 |
73 | Set up the Docker containers; a prebuilt Kafka cluster is provided for the demo:
74 |
75 |
76 | ```sh
77 | export HOST_IP=$(ifconfig | grep -E "([0-9]{1,3}\.){3}[0-9]{1,3}" | grep -v 127.0.0.1 | awk '{ print $2 }' | cut -f2 -d: | head -n1)
78 | docker-compose up
79 | ```
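
The `HOST_IP` line can look opaque; it simply pulls the first non-loopback IPv4 address out of `ifconfig`'s output so the containers can reach Kafka on your machine. A minimal illustration of the same pipeline on canned text (the addresses below are made up):

```sh
# Two fake ifconfig-style lines: one loopback, one LAN address.
sample='inet 127.0.0.1 netmask 0xff000000
inet 192.168.1.23 netmask 0xffffff00'

# Keep lines containing an IPv4 address, drop loopback, take field 2, first match.
host_ip=$(echo "$sample" | grep -E "([0-9]{1,3}\.){3}[0-9]{1,3}" \
  | grep -v 127.0.0.1 | awk '{ print $2 }' | head -n1)
echo "$host_ip"
```

On the canned input this prints `192.168.1.23`; run against real `ifconfig` output it yields your machine's LAN address.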
80 |
81 |
82 |
83 | Run the data generator application to create topics and consumers:
84 |
85 | ```sh
86 | npm run build-testbed
87 | npm run testbed
88 | ```
89 | To see the data generator, open your browser and enter in the address bar:
90 |
91 | ```sh
92 | localhost:8181
93 | ```
96 |
97 | In the data generator, enter a new topic for the broker and submit it.
98 | Then enter the number of messages you want to produce and submit. This will send that many messages to the Kafka cluster.
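
Under the hood, submitting N messages amounts to building a batch and handing it to a Kafka producer. A rough sketch of the idea (the batch-building part is plain JavaScript; the producer call is shown as a comment because the testbed's actual client and topic names may differ):

```javascript
// Build a batch of `count` JSON messages for `topic`.
function buildBatch(topic, count) {
  const messages = [];
  for (let i = 0; i < count; i += 1) {
    messages.push({ key: `key-${i}`, value: JSON.stringify({ seq: i }) });
  }
  return { topic, messages };
}

// With a kafkajs-style producer, the batch would then be sent as:
//   await producer.send(buildBatch('demo-topic', 100));
```

Each submitted message lands in the topic's partitions, which is what drives the consumer-lag and throughput metrics on the dashboard.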
99 |
100 |
101 | #### Running the demo Kafka cluster and using the API for constant data generation
102 |
103 | From the root directory (Kafkare), go into the kafka-playground/streaming_data folder:
104 |
105 | In the terminal:
106 |
107 | Install all dependencies
108 |
109 |
110 | ```sh
111 | npm install
112 | ```
113 |
114 | Go back to the kafka-playground folder and install the dependencies there as well:
118 |
119 | ```sh
120 | npm install
121 | ```
122 |
123 |
124 | Set up the Docker containers; a prebuilt Kafka cluster is provided for the demo:
125 |
126 |
127 | ```sh
128 | export HOST_IP=$(ifconfig | grep -E "([0-9]{1,3}\.){3}[0-9]{1,3}" | grep -v 127.0.0.1 | awk '{ print $2 }' | cut -f2 -d: | head -n1)
129 | docker-compose up
130 | ```
131 |
132 |
133 | Run the data streaming application to create topics and consumers:
134 |
135 |
136 | Start your consumer by opening a new terminal in the Kafkare/kafka-playground directory and running:
137 |
138 | ```sh
139 | npm run consumer
140 | ```
141 |
142 | Open a new terminal window. From the root directory (Kafkare) go into the kafka-playground directory.
143 | Now create the topic in Kafka by running:
144 |
145 | ```sh
146 | npm run topic
147 | ```
148 |
149 | Finally, to generate the stream, run in the terminal:
150 |
151 | ```sh
152 | npm run producer
153 | ```
154 |
155 | #### Running the dashboard
156 |
157 | Open a new terminal.
158 | In the root directory (Kafkare), run in the terminal:
159 |
160 |
161 | ```sh
162 | npm install
163 | npm run build
164 | npm start
165 | ```
166 |
167 | You will see a login page where you can either log in with an existing account or create a new account.
168 |
169 | To create a new account, click the register button on the login page. After registering, you will be prompted to log in with the new account.
170 |
171 | After successfully logging in, a desktop application with the Kafka monitoring dashboards will load and you can start monitoring the running Kafka cluster.
172 |
173 | ## Connecting to an existing instance of Kafka
174 | In the kafka-playground/ directory, edit the docker-compose.yml file. Add the following environment variables, with the relevant values, to the kafka_exporter service:
175 | Environment Variable | Description
176 | ---------------------|------------
177 | KAFKA_SERVER | Addresses (host:port) of Kafka server.
178 | SASL_USERNAME | SASL user name.
179 | SASL_PASSWORD | SASL user password.
180 |
181 | ```yml
182 | kafka_exporter:
183 | image: danielqsj/kafka-exporter
184 | ports:
185 | - '9308:9308'
186 | environment:
187 | KAFKA_SERVER:
188 | SASL_USERNAME:
189 | SASL_PASSWORD:
190 | ```
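
Filled in, the exporter entry might look like this (the broker address and credentials below are placeholders, not real values):

```yml
kafka_exporter:
  image: danielqsj/kafka-exporter
  ports:
    - '9308:9308'
  environment:
    KAFKA_SERVER: 'my-broker.example.com:9092'
    SASL_USERNAME: 'my-sasl-user'
    SASL_PASSWORD: 'my-sasl-password'
```

If your cluster does not use SASL, the two SASL variables can be left unset.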
191 |
192 | ## Connecting the user database
193 | In the server/db/ directory, edit the db.js file. Within the file, change the value of the PG_URI variable to the postgres database you are using.
194 |
195 | ```javascript
196 | const PG_URI =
197 | 'postgres://:@.db.elephantsql.com:5432/';
198 | ```
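
The connection string follows the usual `postgres://USER:PASSWORD@HOST:PORT/DATABASE` shape; the empty segments in the snippet above are where your own values go. A purely hypothetical filled-in example:

```javascript
// Hypothetical credentials; substitute your own ElephantSQL values.
const PG_URI =
  'postgres://myuser:mypassword@myinstance.db.elephantsql.com:5432/myuser';
```

On ElephantSQL's free tier the database name is typically the same as the user name, which is why the last segment repeats it here.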
199 |
200 | ## FAQ
201 | #### Docker Compose Error
202 | **Q1.** I'm getting this error when I use docker-compose up:
203 |
204 | ```sh
205 | During handling of the above exception, another exception occurred:
206 |
207 | Traceback (most recent call last):
208 | File "docker-compose", line 3, in
209 | File "compose\cli\main.py", line 67, in main
210 | File "compose\cli\main.py", line 123, in perform_command
211 | File "compose\cli\command.py", line 69, in project_from_options
212 | File "compose\cli\command.py", line 132, in get_project
213 | File "compose\cli\docker_client.py", line 43, in get_client
214 | File "compose\cli\docker_client.py", line 170, in docker_client
215 | File "site-packages\docker\api\client.py", line 188, in __init__
216 | File "site-packages\docker\api\client.py", line 213, in _retrieve_server_version
217 | docker.errors.DockerException: Error while fetching server API version: (2, 'CreateFile', 'The system cannot find the file specified.')
218 | ```
219 |
220 | **A1.** Make sure Docker Desktop is up and running.
221 |
222 |
223 |
224 | **Q2:** Why doesn't Kafka start when I use docker-compose?
225 |
226 | **A2:** Make sure your HOST_IP is defined.
227 | On macOS or Linux use:
228 | ```sh
229 | export HOST_IP=$(ifconfig | grep -E "([0-9]{1,3}\.){3}[0-9]{1,3}" | grep -v 127.0.0.1 | awk '{ print $2 }' | cut -f2 -d: | head -n1)
230 | ```
231 |
232 | For Windows users (e.g. in Git Bash) use:
233 | ```sh
234 | export HOST_IP=$(ipconfig | grep -E "([0-9]{1,3}\.){3}[0-9]{1,3}" | grep -v 127.0.0.1 | awk '{ print $14 }' | cut -f2 -d: | head -n1)
235 | ```
236 |
237 | ## License
238 |
239 | Kafkare is released under the
240 | [MIT license](https://opensource.org/licenses/MIT).
241 |
242 | ## Authors
243 |
244 | - [Jenniel Figuereo](https://github.com/jfiguereo89)
245 | - [Jiaxin Li](https://github.com/lijiaxingogo)
246 | - [Joel Beger](https://github.com/jtbeger)
247 | - [Wai Fai Lau](https://github.com/wlau8088/)
--------------------------------------------------------------------------------
/__tests__/user.test.js:
--------------------------------------------------------------------------------
1 | const supertest = require('supertest');
2 |
3 | const { app, server } = require('../server/server');
4 | const request = supertest(app);
5 |
6 | describe('User Endpoint Test Suite', () => {
7 | const userData = {};
8 |
9 | // test creation of user
10 | it('create a user', async (done) => {
11 | const payload = {
12 | name: 'TestName',
13 | email: 'testEmail@gmail.com',
14 | password: 'testPass',
15 | };
16 |
17 | const response = await request.post('/user/signup').send(payload);
18 |
19 | const { id, name, email, success } = response.body;
20 | userData.testId = id;
21 | expect(response.status).toBe(200);
22 | expect(typeof response.body).toBe('object');
23 |
24 | expect(typeof id).toBe('number');
25 | expect(name).toBe('TestName');
26 | expect(email).toBe('testEmail@gmail.com');
27 | expect(success).toBe(true);
28 | done();
29 | });
30 |
31 | // test user login
32 | it('get newly created user', async (done) => {
33 | const payload = {
34 | email: 'testEmail@gmail.com',
35 | password: 'testPass',
36 | };
37 | const response = await request.post('/user/login').send(payload);
38 | const { loginSuccess, userId } = response.body;
39 |
40 | expect(response.status).toBe(200);
41 | expect(typeof userId).toBe('number');
42 | expect(loginSuccess).toBe(true);
43 | done();
44 | });
45 |
46 | // test deletion of user
47 | it('delete the user', async (done) => {
48 | const response = await request.delete(`/user/${userData.testId}`);
49 | expect(response.body).toBe(1);
50 | done();
51 | });
52 | });
53 |
54 | afterAll(() => server.close());
55 |
--------------------------------------------------------------------------------
/dist/bundle.js.LICENSE.txt:
--------------------------------------------------------------------------------
1 | /*
2 | object-assign
3 | (c) Sindre Sorhus
4 | @license MIT
5 | */
6 |
7 | /*!
8 | Copyright (c) 2017 Jed Watson.
9 | Licensed under the MIT License (MIT), see
10 | http://jedwatson.github.io/classnames
11 | */
12 |
13 | /** @license React v0.20.1
14 | * scheduler.production.min.js
15 | *
16 | * Copyright (c) Facebook, Inc. and its affiliates.
17 | *
18 | * This source code is licensed under the MIT license found in the
19 | * LICENSE file in the root directory of this source tree.
20 | */
21 |
22 | /** @license React v16.13.1
23 | * react-is.production.min.js
24 | *
25 | * Copyright (c) Facebook, Inc. and its affiliates.
26 | *
27 | * This source code is licensed under the MIT license found in the
28 | * LICENSE file in the root directory of this source tree.
29 | */
30 |
31 | /** @license React v17.0.1
32 | * react-dom.production.min.js
33 | *
34 | * Copyright (c) Facebook, Inc. and its affiliates.
35 | *
36 | * This source code is licensed under the MIT license found in the
37 | * LICENSE file in the root directory of this source tree.
38 | */
39 |
40 | /** @license React v17.0.1
41 | * react.production.min.js
42 | *
43 | * Copyright (c) Facebook, Inc. and its affiliates.
44 | *
45 | * This source code is licensed under the MIT license found in the
46 | * LICENSE file in the root directory of this source tree.
47 | */
48 |
--------------------------------------------------------------------------------
/electron.js:
--------------------------------------------------------------------------------
1 | const { app, BrowserWindow } = require('electron');
2 | const path = require('path');
3 |
4 | function createWindow() {
5 | const win = new BrowserWindow({
6 | width: 1740,
7 | height: 984,
8 | icon: path.join(__dirname, 'assets/icons/png/64x64.png'),
9 | webPreferences: {
10 | nodeIntegration: true,
11 | },
12 | });
13 |
14 | win.loadFile('./src/index.html');
15 | }
16 |
17 | app.whenReady().then(createWindow);
18 |
19 | app.on('window-all-closed', () => {
20 | if (process.platform !== 'darwin') {
21 | app.quit();
22 | }
23 | });
24 |
25 | app.on('activate', () => {
26 | if (BrowserWindow.getAllWindows().length === 0) {
27 | createWindow();
28 | }
29 | });
30 |
--------------------------------------------------------------------------------
/kafka-playground/.DS_Store:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/oslabs-beta/KafKare/994d7b6fd5a6af46de9e588c5cdbd0fea08d1210/kafka-playground/.DS_Store
--------------------------------------------------------------------------------
/kafka-playground/.gitignore:
--------------------------------------------------------------------------------
1 | node_modules
2 | package-lock.json
--------------------------------------------------------------------------------
/kafka-playground/Dockerfile:
--------------------------------------------------------------------------------
1 | FROM node:latest
2 |
3 | COPY . /image
4 |
5 | WORKDIR /image
6 |
7 | RUN npm i
8 |
9 | EXPOSE 5000
10 |
11 | CMD ["npm", "run", "testbed"]
--------------------------------------------------------------------------------
/kafka-playground/FETCH_HEAD:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/oslabs-beta/KafKare/994d7b6fd5a6af46de9e588c5cdbd0fea08d1210/kafka-playground/FETCH_HEAD
--------------------------------------------------------------------------------
/kafka-playground/Utils/.DS_Store:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/oslabs-beta/KafKare/994d7b6fd5a6af46de9e588c5cdbd0fea08d1210/kafka-playground/Utils/.DS_Store
--------------------------------------------------------------------------------
/kafka-playground/Utils/scripts/.DS_Store:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/oslabs-beta/KafKare/994d7b6fd5a6af46de9e588c5cdbd0fea08d1210/kafka-playground/Utils/scripts/.DS_Store
--------------------------------------------------------------------------------
/kafka-playground/Utils/scripts/prometheus.yml:
--------------------------------------------------------------------------------
1 | # my global config
2 | global:
3 |   scrape_interval: 5s # Scrape targets every 5 seconds (global default is 15s).
4 |   evaluation_interval: 5s # Evaluate rules every 5 seconds (global default is 15s).
5 | # scrape_timeout is set to the global default (10s).
6 |
7 | scrape_configs:
8 | - job_name: 'kafka_exporter'
9 | scrape_interval: 5s
10 | static_configs:
11 | - targets: ['kafka_exporter:9308']
12 | labels:
13 | group: 'kafka_exporter'
14 |
15 | - job_name: 'prometheus'
16 | static_configs:
17 | - targets: ['localhost:9090', 'node-exporter:9100']
18 |
19 | # - job_name: 'node-ex'
20 | # static_configs:
21 | # - targets: ['node-exporter:9100']
22 |
--------------------------------------------------------------------------------
/kafka-playground/babel.config.js:
--------------------------------------------------------------------------------
1 | module.exports = {
2 | presets: [
3 | ['@babel/preset-env', { targets: { node: 'current' } }],
4 | '@babel/preset-typescript',
5 | ],
6 | };
7 |
--------------------------------------------------------------------------------
/kafka-playground/data-generator/dg.css:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/oslabs-beta/KafKare/994d7b6fd5a6af46de9e588c5cdbd0fea08d1210/kafka-playground/data-generator/dg.css
--------------------------------------------------------------------------------
/kafka-playground/data-generator/index.html:
--------------------------------------------------------------------------------
1 |
2 |
3 |
4 |
5 |
6 | Data Generator
7 |
8 |
9 |