├── .env.sample
├── .github
│   ├── ISSUE_TEMPLATE
│   │   ├── 01_question.md
│   │   ├── 02_bug.md
│   │   ├── 03_feature.md
│   │   └── config.yml
│   └── PULL_REQUEST_TEMPLATE.md
├── .gitignore
├── 00 - Aiven Setup.ipynb
├── 01 - Producer.ipynb
├── 02 - Consumer.ipynb
├── 03 - 00 - Partition Producer.ipynb
├── 03 - 01 - Consumer - Partition 0.ipynb
├── 03 - 02 - Consumer - Partition 1.ipynb
├── 04 - New Consumer Group.ipynb
├── 05 - Kafka Connect.ipynb
├── 0N - Aiven - Delete Services.ipynb
├── CODE_OF_CONDUCT.md
├── CONTRIBUTING.md
├── LICENSE
├── README.md
└── images
    ├── connect_pg.png
    ├── consumer groups.png
    ├── consumer.png
    ├── consumer_groups.png
    ├── move-consumer.gif
    ├── overall.png
    ├── partitions.png
    └── producing.png
/.env.sample: -------------------------------------------------------------------------------- 1 | TOKEN=YOUR_AIVEN_TOKEN 2 | PROJECT_NAME=YOUR_PROJECT_NAME -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/01_question.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: ❓ Ask a question 3 | about: Got stuck or missing something from the docs? Ask away! 4 | --- 5 | 6 | # What can we help you with? 7 | 8 | 9 | 10 | # Where would you expect to find this information? 11 | 12 | 13 | 14 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/02_bug.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: 🐜 Report a bug 3 | about: Spotted a problem? Let us know 4 | --- 5 | 6 | # What happened? 7 | 8 | 9 | 10 | # What did you expect to happen? 11 | 12 | 13 | 14 | # What else do we need to know? 15 | 16 | 17 | 18 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/03_feature.md: -------------------------------------------------------------------------------- 1 | --- 2 | name: 💡 Feature suggestion 3 | about: What would make this even better? 4 | --- 5 | 6 | # What is currently missing? 7 | 8 | 9 | 10 | # How could this be improved? 11 | 12 | 13 | 14 | # Is this a feature you would work on yourself? 15 | 16 | * [ ] I plan to open a pull request for this feature 17 | -------------------------------------------------------------------------------- /.github/ISSUE_TEMPLATE/config.yml: -------------------------------------------------------------------------------- 1 | blank_issues_enabled: true 2 | contact_links: 3 | - name: Aiven Security Bug Bounty 4 | url: https://hackerone.com/aiven_ltd 5 | about: Our bug bounty program. 
6 | -------------------------------------------------------------------------------- /.github/PULL_REQUEST_TEMPLATE.md: -------------------------------------------------------------------------------- 1 | 2 | # About this change - What it does 3 | 4 | 5 | 6 | 7 | Resolves: #xxxxx 8 | 9 | # Why this way 10 | 11 | 12 | -------------------------------------------------------------------------------- /.gitignore: -------------------------------------------------------------------------------- 1 | .ipynb_checkpoints 2 | config 3 | kafkacerts 4 | .env -------------------------------------------------------------------------------- /00 - Aiven Setup.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# 🦀 Aiven Setup\n", 8 | "\n", 9 | "\n", 10 | "\n", 11 | "The setup starts an **Apache Kafka®** service and a **PostgreSQL®** database used in the other notebooks.\n", 12 | "To execute all the steps on top of Aiven.io instances, please register on the [console](https://go.aiven.io/francesco-signup).\n", 13 | "\n", 14 | "---\n", 15 | "\n", 16 | "## Set the Environment \n", 17 | "\n", 18 | "### Set Variables\n", 19 | "\n", 20 | "The following global variables define the Kafka instance; change them with care" 21 | ] 22 | }, 23 | { 24 | "cell_type": "code", 25 | "execution_count": null, 26 | "id": "67e54f0c", 27 | "metadata": {}, 28 | "outputs": [], 29 | "source": [ 30 | "%%bash\n", 31 | "mkdir -p config\n", 32 | "echo \"\"\"\n", 33 | "FOLDER_NAME=kafkacerts\n", 34 | "CLOUD=google-europe-west3\n", 35 | "KAFKA_NAME=kafka-webinar\n", 36 | "POSTGRES_NAME=pg-webinar\n", 37 | "AIVEN_PLAN_NAME=business-4\n", 38 | "TOPIC_NAME=pizzas\n", 39 | "PG_USER=new_pg_user\n", 40 | "PG_PWD=NewPWD123\n", 41 | "KAFKA_TIMEOUT=5000\n", 42 | "\"\"\" > config/profile_info.sh\n", 43 | "\n", 44 | "echo \"\" > config/__init__.py\n" 45 | ] 46 | }, 47 | { 48 | "cell_type": "markdown", 49 | "id": "27f46848", 50 | "metadata": {}, 51 | "source": [ 52 | "### Create an Aiven Kafka® service\n", 53 | "\n", 54 | "For this demo we'll create an Apache Kafka® service in [Aiven](https://go.aiven.io/francesco-signup). In order to follow these steps you need a valid login to the Aiven console and a [token already generated](https://docs.aiven.io/docs/platform/howto/create_authentication_token).\n", 55 | "\n", 56 | "Copy the file `.env.sample` to `.env` and substitute the `TOKEN` and `PROJECT_NAME` placeholders to match yours. Then let's load the environment and install the Aiven client" 57 | ] 58 | }, 59 | { 60 | "cell_type": "code", 61 | "execution_count": null, 62 | "id": "2b6320fd", 63 | "metadata": {}, 64 | "outputs": [], 65 | "source": [ 66 | "%%bash \n", 67 | ". 
.env\n", 68 | "pip install aiven-client" 69 | ] 70 | }, 71 | { 72 | "cell_type": "markdown", 73 | "id": "1941786b", 74 | "metadata": {}, 75 | "source": [ 76 | "After the login we can create our kafka instance" 77 | ] 78 | }, 79 | { 80 | "cell_type": "code", 81 | "execution_count": null, 82 | "id": "f2739d90", 83 | "metadata": {}, 84 | "outputs": [], 85 | "source": [ 86 | "%%bash\n", 87 | "source .env\n", 88 | "source config/profile_info.sh\n", 89 | "\n", 90 | "echo $TOKEN\n", 91 | "\n", 92 | "avn --auth-token $TOKEN \\\n", 93 | " service create -p $AIVEN_PLAN_NAME \\\n", 94 | " -t kafka $KAFKA_NAME \\\n", 95 | " --cloud $CLOUD \\\n", 96 | " --project $PROJECT_NAME \\\n", 97 | " -c kafka_rest=true \\\n", 98 | " -c kafka.auto_create_topics_enable=true \\\n", 99 | " -c schema_registry=true \\\n", 100 | " -c kafka_connect=true\n", 101 | "\n", 102 | "avn --auth-token $TOKEN service create $POSTGRES_NAME -t pg -p startup-4 --cloud $CLOUD --project $PROJECT_NAME\n", 103 | "\n", 104 | "avn --auth-token $TOKEN service wait $KAFKA_NAME --project $PROJECT_NAME" 105 | ] 106 | }, 107 | { 108 | "cell_type": "markdown", 109 | "id": "660b8a04", 110 | "metadata": {}, 111 | "source": [ 112 | "Get Kafka certificates and service URI. We'll use the certificates to connect" 113 | ] 114 | }, 115 | { 116 | "cell_type": "code", 117 | "execution_count": null, 118 | "id": "e14ffcf8", 119 | "metadata": {}, 120 | "outputs": [], 121 | "source": [ 122 | "%%bash\n", 123 | "# Download all certificates\n", 124 | "source .env\n", 125 | "source config/profile_info.sh\n", 126 | "\n", 127 | "mkdir -p kafkacerts\n", 128 | "avn --auth-token $TOKEN service user-creds-download $KAFKA_NAME --project $PROJECT_NAME -d $FOLDER_NAME --username avnadmin\n", 129 | "\n", 130 | "# get KAFKA URL and store the `HOSTNAME` and `PORT`\n", 131 | "HOSTNAME=$(avn --auth-token $TOKEN service get $KAFKA_NAME --project=$PROJECT_NAME --json | jq -r '.components[] | select (.component==\"kafka\") | .host')\n", 132 | "PORT=$(avn --auth-token $TOKEN service get $KAFKA_NAME --project=$PROJECT_NAME --json | jq -r '.components[] | select (.component==\"kafka\") | .port')\n", 133 | "\n", 134 | "echo \"hostname = \\\"$HOSTNAME\\\"\" > config/kafka_config.py\n", 135 | "echo \"port = $PORT\" >> config/kafka_config.py\n", 136 | "echo \"cert_folder = \\\"$FOLDER_NAME\\\"\" >> config/kafka_config.py\n", 137 | "echo \"topic_name = \\\"$TOPIC_NAME\\\"\" >> config/kafka_config.py\n", 138 | "echo \"pg_user = \\\"$PG_USER\\\"\" >> config/kafka_config.py\n", 139 | "echo \"pg_pwd = \\\"$PG_PWD\\\"\" >> config/kafka_config.py\n", 140 | "echo \"timeout_ms = $KAFKA_TIMEOUT\" >> config/kafka_config.py" 141 | ] 142 | }, 143 | { 144 | "cell_type": "markdown", 145 | "id": "06264121", 146 | "metadata": {}, 147 | "source": [ 148 | "Create **PostgreSQL User**" 149 | ] 150 | }, 151 | { 152 | "cell_type": "code", 153 | "execution_count": null, 154 | "id": "fed47d15", 155 | "metadata": {}, 156 | "outputs": [], 157 | "source": [ 158 | "%%bash\n", 159 | "source .env\n", 160 | "source config/profile_info.sh\n", 161 | "#avn service create $POSTGRES_NAME -t pg -p startup-4 --cloud $CLOUD --project $PROJECT_NAME\n", 162 | "\n", 163 | "avn --auth-token $TOKEN service wait $POSTGRES_NAME --project $PROJECT_NAME\n", 164 | "avn --auth-token $TOKEN service get $POSTGRES_NAME --project=$PROJECT_NAME --format '{service_uri_params}' > config/pg_config.json\n", 165 | "avn --auth-token $TOKEN service user-create $POSTGRES_NAME --project $PROJECT_NAME --username $PG_USER\n", 166 | "avn --auth-token 
$TOKEN service user-password-reset $POSTGRES_NAME --project $PROJECT_NAME --username $PG_USER --new-password $PG_PWD" 167 | ] 168 | }, 169 | { 170 | "cell_type": "markdown", 171 | "id": "d1d9144a", 172 | "metadata": {}, 173 | "source": [ 174 | "Now let's parse the fields and create the Kafka Connect Connector config file" 175 | ] 176 | }, 177 | { 178 | "cell_type": "code", 179 | "execution_count": null, 180 | "id": "7533687e", 181 | "metadata": {}, 182 | "outputs": [], 183 | "source": [ 184 | "import json\n", 185 | "from config.kafka_config import *\n", 186 | "with open('config/pg_config.json') as json_file:\n", 187 | " data = json.loads(json_file.read().replace(\"'\",'\"'))\n", 188 | " pg_dbname = data['dbname']\n", 189 | " pg_host = data['host']\n", 190 | " pg_port = data['port']\n", 191 | " pg_super_user = data['user']\n", 192 | " pg_super_pwd = data['password']\n", 193 | "\n", 194 | "connect_setup = {\n", 195 | " \"name\": \"sink_kafka_pg\",\n", 196 | " \"connector.class\": \"io.aiven.connect.jdbc.JdbcSinkConnector\",\n", 197 | " \"value.converter\": \"org.apache.kafka.connect.json.JsonConverter\",\n", 198 | " \"topics.regex\": topic_name+\"_schema\",\n", 199 | " \"connection.url\": \"jdbc:postgresql://\"+pg_host+\":\"+pg_port+\"/\"+pg_dbname+\"?sslmode=require\",\n", 200 | " \"connection.user\": pg_user,\n", 201 | " \"connection.password\": pg_pwd,\n", 202 | " \"auto.create\": \"true\"\n", 203 | "}\n", 204 | "\n", 205 | "f = open(\"config/kafka_connect_setup.json\", \"w\")\n", 206 | "f.write(json.dumps(connect_setup, indent=4, sort_keys=True))\n", 207 | "f.close()" 208 | ] 209 | }, 210 | { 211 | "cell_type": "markdown", 212 | "id": "c679bdf0", 213 | "metadata": {}, 214 | "source": [ 215 | "Connect to PostgreSQL and grant access to the newly created user" 216 | ] 217 | }, 218 | { 219 | "cell_type": "code", 220 | "execution_count": null, 221 | "id": "3afd3cff", 222 | "metadata": {}, 223 | "outputs": [], 224 | "source": [ 225 | "%%bash \n", 226 | "\n", 227 | "pip install psycopg2-binary" 228 | ] 229 | }, 230 | { 231 | "cell_type": "code", 232 | "execution_count": null, 233 | "id": "b26a7400", 234 | "metadata": {}, 235 | "outputs": [], 236 | "source": [ 237 | "import psycopg2\n", 238 | "conn = psycopg2.connect(dbname=pg_dbname,\n", 239 | " user=pg_super_user,\n", 240 | " host=pg_host,\n", 241 | " port=pg_port,\n", 242 | " password=pg_super_pwd,\n", 243 | " sslmode='require')\n", 244 | "cur = conn.cursor()\n", 245 | "cur.execute(\"GRANT CONNECT ON DATABASE defaultdb TO \"+pg_user+\";\")\n", 246 | "cur.execute(\"GRANT USAGE ON SCHEMA public TO \"+pg_user+\";\")\n# commit, otherwise the grants are rolled back when the connection closes\nconn.commit()\n" 247 | ] 248 | }, 249 | { 250 | "cell_type": "code", 251 | "execution_count": null, 252 | "metadata": {}, 253 | "outputs": [], 254 | "source": [] 255 | } 256 | ], 257 | "metadata": { 258 | "kernelspec": { 259 | "display_name": "Python 3.8.5 64-bit ('3.8.5')", 260 | "language": "python", 261 | "name": "python3" 262 | }, 263 | "language_info": { 264 | "codemirror_mode": { 265 | "name": "ipython", 266 | "version": 3 267 | }, 268 | "file_extension": ".py", 269 | "mimetype": "text/x-python", 270 | "name": "python", 271 | "nbconvert_exporter": "python", 272 | "pygments_lexer": "ipython3", 273 | "version": "3.9.15" 274 | }, 275 | "vscode": { 276 | "interpreter": { 277 | "hash": "6064663bdb0cf7ad80e39fa9924d85b32044da2a4abedcbe30d3eba51421769c" 278 | } 279 | } 280 | }, 281 | "nbformat": 4, 282 | "nbformat_minor": 5 283 | } 284 | -------------------------------------------------------------------------------- /01 - Producer.ipynb: 
-------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# 🦀 Time to play with Apache Kafka®" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "The first step is to install the required library, in our case `kafka-python`" 15 | ] 16 | }, 17 | { 18 | "cell_type": "code", 19 | "execution_count": null, 20 | "metadata": {}, 21 | "outputs": [], 22 | "source": [ 23 | "%%bash\n", 24 | "pip install kafka-python\n" 25 | ] 26 | }, 27 | { 28 | "cell_type": "markdown", 29 | "metadata": {}, 30 | "source": [ 31 | "---\n", 32 | "\n", 33 | "## Kafka Producer\n", 34 | "\n", 35 | "Now we can create a Kafka Producer\n", 36 | "\n", 37 | "![Kafka Producer](images/producing.png)\n", 38 | "\n", 39 | "We need an Apache Kafka instance already available; if you don't have one, you can easily create it on [Aiven.io](https://aiven.io/)" 40 | ] 41 | }, 42 | { 43 | "cell_type": "code", 44 | "execution_count": null, 45 | "metadata": {}, 46 | "outputs": [], 47 | "source": [ 48 | "from kafka import KafkaProducer\n", 49 | "import json\n", 50 | "from config.kafka_config import *\n", 51 | "\n", 52 | "producer = KafkaProducer(\n", 53 | " bootstrap_servers=hostname+\":\"+str(port),\n", 54 | " security_protocol=\"SSL\",\n", 55 | " ssl_cafile=cert_folder+\"/ca.pem\",\n", 56 | " ssl_certfile=cert_folder+\"/service.cert\",\n", 57 | " ssl_keyfile=cert_folder+\"/service.key\",\n", 58 | " value_serializer=lambda v: json.dumps(v).encode('ascii'),\n", 59 | " key_serializer=lambda v: json.dumps(v).encode('ascii')\n", 60 | ")" 61 | ] 62 | }, 63 | { 64 | "cell_type": "markdown", 65 | "metadata": {}, 66 | "source": [ 67 | "Once the Producer is created, we can produce our first event" 68 | ] 69 | }, 70 | { 71 | "cell_type": "code", 72 | "execution_count": null, 73 | "metadata": {}, 74 | "outputs": [], 75 | "source": [ 76 | "producer.send(\n", 77 | " topic_name,\n", 78 | " key={\"id\":1},\n", 79 | " value={\"name\":\"👨 Francesco\", \"pizza\":\"Margherita 🍕\"}\n", 80 | ")\n", 81 | "\n", 82 | "producer.flush()" 83 | ] 84 | }, 85 | { 86 | "cell_type": "markdown", 87 | "metadata": {}, 88 | "source": [ 89 | "Now let's produce a couple more events" 90 | ] 91 | }, 92 | { 93 | "cell_type": "code", 94 | "execution_count": null, 95 | "metadata": {}, 96 | "outputs": [], 97 | "source": [ 98 | "producer.send(\n", 99 | " topic_name,\n", 100 | " key={\"id\":2},\n", 101 | " value={\"name\":\"👩 Adele\", \"pizza\":\"Hawaii 🍕+🍍+🥓\"}\n", 102 | ")\n", 103 | "\n", 104 | "producer.send(\n", 105 | " topic_name,\n", 106 | " key={\"id\":3},\n", 107 | " value={\"name\":\"👦 Mark\", \"pizza\":\"Chocolate 🍕+🍫\"}\n", 108 | ")\n", 109 | "\n", 110 | "producer.flush()" 111 | ] 112 | }, 113 | { 114 | "cell_type": "markdown", 115 | "metadata": {}, 116 | "source": [ 117 | "Now let's produce another event, which we'll read later with a new consumer group" 118 | ] 119 | }, 120 | { 121 | "cell_type": "code", 122 | "execution_count": null, 123 | "metadata": {}, 124 | "outputs": [], 125 | "source": [ 126 | "producer.send(\n", 127 | " topic_name,\n", 128 | " key={\"id\":4},\n", 129 | " value={\"name\":\"👨 Dan\", \"pizza\":\"Fries 🍕+🍟\"}\n", 130 | ")\n", 131 | "producer.flush()" 132 | ] 133 | }, 134 | { 135 | "cell_type": "code", 136 | "execution_count": null, 137 | "metadata": {}, 138 | "outputs": [], 139 | "source": [] 140 | } 141 | ], 142 | "metadata": { 143 | "kernelspec": { 144 | "display_name": "Python 3", 145 | 
"language": "python", 146 | "name": "python3" 147 | }, 148 | "language_info": { 149 | "codemirror_mode": { 150 | "name": "ipython", 151 | "version": 3 152 | }, 153 | "file_extension": ".py", 154 | "mimetype": "text/x-python", 155 | "name": "python", 156 | "nbconvert_exporter": "python", 157 | "pygments_lexer": "ipython3", 158 | "version": "3.9.15" 159 | } 160 | }, 161 | "nbformat": 4, 162 | "nbformat_minor": 5 163 | } 164 | -------------------------------------------------------------------------------- /02 - Consumer.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# 🦀 Kafka Consumer\n", 8 | "---\n" 9 | ] 10 | }, 11 | { 12 | "cell_type": "markdown", 13 | "metadata": {}, 14 | "source": [ 15 | "The first step is to install the required libraries, in our case `kafka-python` (and `pandas` to support visualization)" 16 | ] 17 | }, 18 | { 19 | "cell_type": "markdown", 20 | "metadata": {}, 21 | "source": [ 22 | "Now let's consume the messages, by creating a **Kafka Consumer**\n", 23 | "\n", 24 | "![Kafka Consumer](images/consumer.png)\n" 25 | ] 26 | }, 27 | { 28 | "cell_type": "code", 29 | "execution_count": null, 30 | "metadata": {}, 31 | "outputs": [], 32 | "source": [ 33 | "from kafka import KafkaConsumer\n", 34 | "from config.kafka_config import *\n", 35 | "import json\n", 36 | "group_id = \"my_pizza_group\"\n", 37 | "\n", 38 | "consumer = KafkaConsumer(\n", 39 | " client_id = \"client1\",\n", 40 | " group_id = group_id,\n", 41 | " bootstrap_servers = hostname+\":\"+str(port),\n", 42 | " security_protocol = \"SSL\",\n", 43 | " ssl_cafile = cert_folder+\"/ca.pem\",\n", 44 | " ssl_certfile = cert_folder+\"/service.cert\",\n", 45 | " ssl_keyfile = cert_folder+\"/service.key\",\n", 46 | " value_deserializer = lambda v: json.loads(v.decode('ascii')),\n", 47 | " key_deserializer = lambda v: json.loads(v.decode('ascii')),\n", 48 | " max_poll_records = 10\n", 49 | ")" 50 | ] 51 | }, 52 | { 53 | "cell_type": "markdown", 54 | "metadata": {}, 55 | "source": [ 56 | "Check out topics" 57 | ] 58 | }, 59 | { 60 | "cell_type": "code", 61 | "execution_count": null, 62 | "metadata": {}, 63 | "outputs": [], 64 | "source": [ 65 | "consumer.topics()" 66 | ] 67 | }, 68 | { 69 | "cell_type": "markdown", 70 | "metadata": {}, 71 | "source": [ 72 | "and subscribe to the topic" 73 | ] 74 | }, 75 | { 76 | "cell_type": "code", 77 | "execution_count": null, 78 | "metadata": {}, 79 | "outputs": [], 80 | "source": [ 81 | "consumer.subscribe(topics=[topic_name])\n", 82 | "consumer.subscription()" 83 | ] 84 | }, 85 | { 86 | "cell_type": "markdown", 87 | "metadata": {}, 88 | "source": [ 89 | "Now we start reading" 90 | ] 91 | }, 92 | { 93 | "cell_type": "code", 94 | "execution_count": null, 95 | "metadata": {}, 96 | "outputs": [], 97 | "source": [ 98 | "for message in consumer:\n", 99 | " print (\"%d:%d: k=%s v=%s\" % (message.partition,\n", 100 | " message.offset,\n", 101 | " message.key,\n", 102 | " message.value))" 103 | ] 104 | }, 105 | { 106 | "cell_type": "code", 107 | "execution_count": null, 108 | "metadata": {}, 109 | "outputs": [], 110 | "source": [] 111 | } 112 | ], 113 | "metadata": { 114 | "kernelspec": { 115 | "display_name": "Python 3", 116 | "language": "python", 117 | "name": "python3" 118 | }, 119 | "language_info": { 120 | "codemirror_mode": { 121 | "name": "ipython", 122 | "version": 3 123 | }, 124 | "file_extension": ".py", 125 | "mimetype": "text/x-python", 126 | "name": 
"python", 127 | "nbconvert_exporter": "python", 128 | "pygments_lexer": "ipython3", 129 | "version": "3.9.15" 130 | } 131 | }, 132 | "nbformat": 4, 133 | "nbformat_minor": 5 134 | } 135 | -------------------------------------------------------------------------------- /03 - 00 - Partition Producer.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "## 🦀 Topic Partitions\n", 8 | "\n", 9 | "![Partitions](images/partitions.png)\n", 10 | "\n", 11 | "Partitions are a way to distribute our dataset, Apache Kafka guarantees the correct ordering only in events registered in the same partitions. \n", 12 | "\n", 13 | "Let's create a new producer" 14 | ] 15 | }, 16 | { 17 | "cell_type": "code", 18 | "execution_count": null, 19 | "metadata": {}, 20 | "outputs": [], 21 | "source": [ 22 | "from kafka import KafkaProducer\n", 23 | "import json\n", 24 | "from config.kafka_config import *\n", 25 | "\n", 26 | "producer = KafkaProducer(\n", 27 | " bootstrap_servers=hostname+\":\"+str(port),\n", 28 | " security_protocol=\"SSL\",\n", 29 | " ssl_cafile=cert_folder+\"/ca.pem\",\n", 30 | " ssl_certfile=cert_folder+\"/service.cert\",\n", 31 | " ssl_keyfile=cert_folder+\"/service.key\",\n", 32 | " value_serializer=lambda v: json.dumps(v).encode('ascii'),\n", 33 | " key_serializer=lambda v: json.dumps(v).encode('ascii') \n", 34 | " )" 35 | ] 36 | }, 37 | { 38 | "cell_type": "markdown", 39 | "metadata": {}, 40 | "source": [ 41 | "Create a topic with **two partitions**, check `num_partitions=2`" 42 | ] 43 | }, 44 | { 45 | "cell_type": "code", 46 | "execution_count": null, 47 | "metadata": {}, 48 | "outputs": [], 49 | "source": [ 50 | "from kafka.admin import KafkaAdminClient, NewTopic\n", 51 | "\n", 52 | "\n", 53 | "admin = KafkaAdminClient(\n", 54 | " client_id ='admin',\n", 55 | " bootstrap_servers=hostname+\":\"+str(port),\n", 56 | " security_protocol=\"SSL\",\n", 57 | " ssl_cafile=cert_folder+\"/ca.pem\",\n", 58 | " ssl_certfile=cert_folder+\"/service.cert\",\n", 59 | " ssl_keyfile=cert_folder+\"/service.key\",\n", 60 | " )\n", 61 | "\n", 62 | "topic_name_partitioned=topic_name+\"_partitioned\"\n", 63 | "\n", 64 | "topic=NewTopic(name=topic_name_partitioned, num_partitions=2, replication_factor=1)\n", 65 | "\n", 66 | "admin.create_topics([topic], timeout_ms=int(timeout_ms))" 67 | ] 68 | }, 69 | { 70 | "cell_type": "markdown", 71 | "metadata": {}, 72 | "source": [ 73 | "And now push data to the two partitions" 74 | ] 75 | }, 76 | { 77 | "cell_type": "code", 78 | "execution_count": null, 79 | "metadata": {}, 80 | "outputs": [], 81 | "source": [ 82 | "producer.send(topic_name_partitioned,\n", 83 | " key={\"id\":1},\n", 84 | " value={\"name\":\"👨 Frank\", \"pizza\":\"Margherita 🍕\"},\n", 85 | " partition=0\n", 86 | " )\n", 87 | "producer.send(topic_name_partitioned,\n", 88 | " key={\"id\":2},\n", 89 | " value={\"name\":\"👩 Adele\", \"pizza\":\"Hawaii 🍕+🍍+🥓\"},\n", 90 | " partition=1\n", 91 | " )\n", 92 | "producer.flush()" 93 | ] 94 | }, 95 | { 96 | "cell_type": "markdown", 97 | "metadata": {}, 98 | "source": [ 99 | "Insert more data" 100 | ] 101 | }, 102 | { 103 | "cell_type": "code", 104 | "execution_count": null, 105 | "metadata": {}, 106 | "outputs": [], 107 | "source": [ 108 | "producer.send(topic_name_partitioned,\n", 109 | " key={\"id\":1},\n", 110 | " value={\"name\":\"🙎 Mark\", \"pizza\":\"Banana 🍕+🍌\"},\n", 111 | " partition=0\n", 112 | " )\n", 113 | 
"producer.send(topic_name_partitioned,\n", 114 | " key={\"id\":2},\n", 115 | " value={\"name\":\"👨 Jan\", \"pizza\":\"Mushrooms 🍕+🍄\"},\n", 116 | " partition=1\n", 117 | " )\n", 118 | "producer.flush()" 119 | ] 120 | }, 121 | { 122 | "cell_type": "code", 123 | "execution_count": null, 124 | "metadata": {}, 125 | "outputs": [], 126 | "source": [] 127 | } 128 | ], 129 | "metadata": { 130 | "kernelspec": { 131 | "display_name": "Python 3", 132 | "language": "python", 133 | "name": "python3" 134 | }, 135 | "language_info": { 136 | "codemirror_mode": { 137 | "name": "ipython", 138 | "version": 3 139 | }, 140 | "file_extension": ".py", 141 | "mimetype": "text/x-python", 142 | "name": "python", 143 | "nbconvert_exporter": "python", 144 | "pygments_lexer": "ipython3", 145 | "version": "3.6.7" 146 | } 147 | }, 148 | "nbformat": 4, 149 | "nbformat_minor": 5 150 | } 151 | -------------------------------------------------------------------------------- /03 - 01 - Consumer - Partition 0.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# 🦀 Kafka Consumer - Partition 0\n", 8 | "---\n" 9 | ] 10 | }, 11 | { 12 | "cell_type": "markdown", 13 | "metadata": {}, 14 | "source": [ 15 | "Let's create a consumer" 16 | ] 17 | }, 18 | { 19 | "cell_type": "code", 20 | "execution_count": null, 21 | "metadata": {}, 22 | "outputs": [], 23 | "source": [ 24 | "from kafka import TopicPartition\n", 25 | "from kafka import KafkaConsumer\n", 26 | "import json\n", 27 | "\n", 28 | "from config.kafka_config import *\n", 29 | "\n", 30 | "group_id = \"my_pizza_group\"\n", 31 | "\n", 32 | "topic_name_partitioned = topic_name +\"_partitioned\"\n", 33 | "\n", 34 | "consumer_partition_0 = KafkaConsumer(\n", 35 | " group_id=group_id+\"_partitioned\",\n", 36 | " bootstrap_servers=hostname+\":\"+str(port),\n", 37 | " security_protocol=\"SSL\",\n", 38 | " ssl_cafile=cert_folder+\"/ca.pem\",\n", 39 | " ssl_certfile=cert_folder+\"/service.cert\",\n", 40 | " ssl_keyfile=cert_folder+\"/service.key\",\n", 41 | " value_deserializer = lambda v: json.loads(v.decode('ascii')),\n", 42 | " key_deserializer = lambda v: json.loads(v.decode('ascii')),\n", 43 | " auto_offset_reset='earliest',\n", 44 | " max_poll_records = 10\n", 45 | " )\n", 46 | "\n" 47 | ] 48 | }, 49 | { 50 | "cell_type": "markdown", 51 | "metadata": {}, 52 | "source": [ 53 | "Read from **partition 0**" 54 | ] 55 | }, 56 | { 57 | "cell_type": "code", 58 | "execution_count": null, 59 | "metadata": {}, 60 | "outputs": [], 61 | "source": [ 62 | "consumer_partition_0.assign([TopicPartition(topic_name_partitioned, 0)])\n", 63 | "consumer_partition_0.subscription()\n", 64 | "for message in consumer_partition_0:\n", 65 | " print (\"p=%d o=%d value=%s\" % (message.partition,\n", 66 | " message.offset,\n", 67 | " message.value))" 68 | ] 69 | }, 70 | { 71 | "cell_type": "code", 72 | "execution_count": null, 73 | "metadata": {}, 74 | "outputs": [], 75 | "source": [] 76 | } 77 | ], 78 | "metadata": { 79 | "kernelspec": { 80 | "display_name": "Python 3", 81 | "language": "python", 82 | "name": "python3" 83 | }, 84 | "language_info": { 85 | "codemirror_mode": { 86 | "name": "ipython", 87 | "version": 3 88 | }, 89 | "file_extension": ".py", 90 | "mimetype": "text/x-python", 91 | "name": "python", 92 | "nbconvert_exporter": "python", 93 | "pygments_lexer": "ipython3", 94 | "version": "3.6.7" 95 | } 96 | }, 97 | "nbformat": 4, 98 | "nbformat_minor": 5 99 | } 100 | 
-------------------------------------------------------------------------------- /03 - 02 - Consumer - Partition 1.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# 🦀 Kafka Consumer - Partition 1\n", 8 | "---" 9 | ] 10 | }, 11 | { 12 | "cell_type": "markdown", 13 | "metadata": {}, 14 | "source": [ 15 | "Let's create a consumer" 16 | ] 17 | }, 18 | { 19 | "cell_type": "code", 20 | "execution_count": null, 21 | "metadata": {}, 22 | "outputs": [], 23 | "source": [ 24 | "from kafka import TopicPartition\n", 25 | "from kafka import KafkaConsumer\n", 26 | "import json\n", 27 | "\n", 28 | "from config.kafka_config import *\n", 29 | "\n", 30 | "group_id = \"my_pizza_group\"\n", 31 | "timeout_ms=5000\n", 32 | "\n", 33 | "topic_name_partitioned = topic_name +\"_partitioned\"\n", 34 | "\n", 35 | "consumer_partition_1 = KafkaConsumer(\n", 36 | " group_id=group_id+\"_partitioned\",\n", 37 | " bootstrap_servers=hostname+\":\"+str(port),\n", 38 | " security_protocol=\"SSL\",\n", 39 | " ssl_cafile=cert_folder+\"/ca.pem\",\n", 40 | " ssl_certfile=cert_folder+\"/service.cert\",\n", 41 | " ssl_keyfile=cert_folder+\"/service.key\",\n", 42 | " value_deserializer = lambda v: json.loads(v.decode('ascii')),\n", 43 | " key_deserializer = lambda v: json.loads(v.decode('ascii')),\n", 44 | " auto_offset_reset='earliest',\n", 45 | " max_poll_records = 10\n", 46 | " )\n", 47 | "\n" 48 | ] 49 | }, 50 | { 51 | "cell_type": "markdown", 52 | "metadata": {}, 53 | "source": [ 54 | "Read from **partition 1**" 55 | ] 56 | }, 57 | { 58 | "cell_type": "code", 59 | "execution_count": null, 60 | "metadata": {}, 61 | "outputs": [], 62 | "source": [ 63 | "consumer_partition_1.assign([TopicPartition(topic_name_partitioned, 1)])\n", 64 | "consumer_partition_1.assignment()\n", 65 | "for message in consumer_partition_1:\n", 66 | " print (\"p=%d o=%d value=%s\" % (message.partition,\n", 67 | " message.offset,\n", 68 | " message.value))" 69 | ] 70 | }, 71 | { 72 | "cell_type": "code", 73 | "execution_count": null, 74 | "metadata": {}, 75 | "outputs": [], 76 | "source": [] 77 | } 78 | ], 79 | "metadata": { 80 | "kernelspec": { 81 | "display_name": "Python 3", 82 | "language": "python", 83 | "name": "python3" 84 | }, 85 | "language_info": { 86 | "codemirror_mode": { 87 | "name": "ipython", 88 | "version": 3 89 | }, 90 | "file_extension": ".py", 91 | "mimetype": "text/x-python", 92 | "name": "python", 93 | "nbconvert_exporter": "python", 94 | "pygments_lexer": "ipython3", 95 | "version": "3.6.7" 96 | } 97 | }, 98 | "nbformat": 4, 99 | "nbformat_minor": 5 100 | } 101 | -------------------------------------------------------------------------------- /04 - New Consumer Group.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "## 🦀 New Consumer Group" 8 | ] 9 | }, 10 | { 11 | "cell_type": "markdown", 12 | "metadata": {}, 13 | "source": [ 14 | "---\n", 15 | "\n", 16 | "\n", 17 | "Let's create a consumer with a different `group_id` (`my_NEW_pizza_group`) and `auto_offset_reset='earliest'`" 18 | ] 19 | }, 20 | { 21 | "cell_type": "code", 22 | "execution_count": null, 23 | "metadata": {}, 24 | "outputs": [], 25 | "source": [ 26 | "from kafka import KafkaConsumer\n", 27 | "import json\n", 28 | "from config.kafka_config import *\n", 29 | "\n", 30 | "consumer_new_group = 
KafkaConsumer(\n", 31 | " ## NEW group_id #########\n", 32 | " group_id='my_NEW_pizza_group',\n", 33 | " #########################\n", 34 | " bootstrap_servers=hostname+\":\"+str(port),\n", 35 | " security_protocol=\"SSL\",\n", 36 | " ssl_cafile=cert_folder+\"/ca.pem\",\n", 37 | " ssl_certfile=cert_folder+\"/service.cert\",\n", 38 | " ssl_keyfile=cert_folder+\"/service.key\",\n", 39 | " value_deserializer = lambda v: json.loads(v.decode('ascii')),\n", 40 | " key_deserializer = lambda v: json.loads(v.decode('ascii')),\n", 41 | " auto_offset_reset='earliest',\n", 42 | " max_poll_records = 10\n", 43 | " )\n", 44 | "\n", 45 | "consumer_new_group.subscribe(topics=[topic_name])\n", 46 | "consumer_new_group.subscription()\n", 47 | "for message in consumer_new_group:\n", 48 | " print (\"%d:%d: key=%s value=%s\" % (message.partition,\n", 49 | " message.offset,\n", 50 | " message.key,\n", 51 | " message.value))" 52 | ] 53 | }, 54 | { 55 | "cell_type": "code", 56 | "execution_count": null, 57 | "metadata": {}, 58 | "outputs": [], 59 | "source": [] 60 | }, 61 | { 62 | "cell_type": "code", 63 | "execution_count": null, 64 | "metadata": {}, 65 | "outputs": [], 66 | "source": [] 67 | } 68 | ], 69 | "metadata": { 70 | "kernelspec": { 71 | "display_name": "Python 3", 72 | "language": "python", 73 | "name": "python3" 74 | }, 75 | "language_info": { 76 | "codemirror_mode": { 77 | "name": "ipython", 78 | "version": 3 79 | }, 80 | "file_extension": ".py", 81 | "mimetype": "text/x-python", 82 | "name": "python", 83 | "nbconvert_exporter": "python", 84 | "pygments_lexer": "ipython3", 85 | "version": "3.6.7" 86 | } 87 | }, 88 | "nbformat": 4, 89 | "nbformat_minor": 5 90 | } 91 | -------------------------------------------------------------------------------- /05 - Kafka Connect.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "# 🦀 Kafka Connect\n", 8 | "\n", 9 | "![Kafka Connect PG](images/connect_pg.png)\n", 10 | "\n", 11 | "Kafka connect are prebuilt connectors that can be used to integrate Kafka with other sources or targets (souces or sinks in Kafka terms). Let's create a postgreSQL one. " 12 | ] 13 | }, 14 | { 15 | "cell_type": "code", 16 | "execution_count": null, 17 | "metadata": {}, 18 | "outputs": [], 19 | "source": [ 20 | "from kafka import KafkaProducer\n", 21 | "import json\n", 22 | "from config.kafka_config import *\n", 23 | "\n", 24 | "producer = KafkaProducer(\n", 25 | " bootstrap_servers=hostname+\":\"+str(port),\n", 26 | " security_protocol=\"SSL\",\n", 27 | " ssl_cafile=cert_folder+\"/ca.pem\",\n", 28 | " ssl_certfile=cert_folder+\"/service.cert\",\n", 29 | " ssl_keyfile=cert_folder+\"/service.key\",\n", 30 | " value_serializer=lambda v: json.dumps(v).encode('ascii'),\n", 31 | " key_serializer=lambda v: json.dumps(v).encode('ascii') \n", 32 | " )" 33 | ] 34 | }, 35 | { 36 | "cell_type": "markdown", 37 | "metadata": {}, 38 | "source": [ 39 | "---\n", 40 | "\n", 41 | "Let's create a new stream, adding the **schema** to it. \n", 42 | "\n", 43 | "Kafka Connect JDBC Sink requires a schema to be attached to the stream defining the its fields in detail. We have two choices:\n", 44 | "* Attaching the schema to each JSON message\n", 45 | "* Use schema registry with AVRO format\n", 46 | "\n", 47 | "For the sake of this example we'll include the schema definition to the JSON message. 
Let's define the schema" 48 | ] 49 | }, 50 | { 51 | "cell_type": "code", 52 | "execution_count": null, 53 | "metadata": {}, 54 | "outputs": [], 55 | "source": [ 56 | "key_schema = {\n", 57 | " \"type\": \"struct\",\n", 58 | " \"fields\": [\n", 59 | " {\n", 60 | " \"type\": \"int32\",\n", 61 | " \"optional\": False,\n", 62 | " \"field\": \"id\"\n", 63 | " }\n", 64 | " ]\n", 65 | "}\n", 66 | "\n", 67 | "value_schema = {\n", 68 | " \"type\": \"struct\",\n", 69 | " \"fields\": [\n", 70 | " {\n", 71 | " \"type\": \"string\",\n", 72 | " \"optional\": False,\n", 73 | " \"field\": \"name\"\n", 74 | " },\n", 75 | " {\n", 76 | " \"type\": \"string\",\n", 77 | " \"optional\": False,\n", 78 | " \"field\": \"pizza\"}]\n", 79 | "}" 80 | ] 81 | }, 82 | { 83 | "cell_type": "markdown", 84 | "metadata": {}, 85 | "source": [ 86 | "And send some data" 87 | ] 88 | }, 89 | { 90 | "cell_type": "code", 91 | "execution_count": null, 92 | "metadata": {}, 93 | "outputs": [], 94 | "source": [ 95 | "producer.send(\n", 96 | " topic_name+\"_schema\", \n", 97 | " key={\"schema\": key_schema, \"payload\": {\"id\":1}},\n", 98 | " value={\"schema\": value_schema, \n", 99 | " \"payload\": {\"name\":\"👨 Frank\", \"pizza\":\"Margherita 🍕\"}}\n", 100 | ")\n", 101 | "\n", 102 | "producer.send(\n", 103 | " topic_name+\"_schema\",\n", 104 | " key={\"schema\": key_schema, \"payload\": {\"id\":2}},\n", 105 | " value={\"schema\": value_schema, \n", 106 | " \"payload\": {\"name\":\"👨 Dan\", \"pizza\":\"Fries 🍕+🍟\"}}\n", 107 | ")\n", 108 | "\n", 109 | "\n", 110 | "producer.send(\n", 111 | " topic_name+\"_schema\",\n", 112 | " key={\"schema\": key_schema, \"payload\": {\"id\":3}},\n", 113 | " value={\"schema\": value_schema,\n", 114 | " \"payload\": {\"name\":\"👨 Jan\", \"pizza\":\"Mushrooms 🍕+🍄\"}}\n", 115 | ")\n", 116 | "\n", 117 | "producer.flush()" 118 | ] 119 | }, 120 | { 121 | "cell_type": "markdown", 122 | "metadata": {}, 123 | "source": [ 124 | "Let's start the **Kafka Connect JDBC Connector**" 125 | ] 126 | }, 127 | { 128 | "cell_type": "code", 129 | "execution_count": null, 130 | "metadata": {}, 131 | "outputs": [], 132 | "source": [ 133 | "%%bash\n", 134 | "\n", 135 | "source .env\nsource config/profile_info.sh\n", 136 | "\n", 137 | "avn --auth-token $TOKEN service connector create $KAFKA_NAME @config/kafka_connect_setup.json --project $PROJECT_NAME" 138 | ] 139 | }, 140 | { 141 | "cell_type": "markdown", 142 | "metadata": {}, 143 | "source": [ 144 | "Verify the **Connector** status" 145 | ] 146 | }, 147 | { 148 | "cell_type": "code", 149 | "execution_count": null, 150 | "metadata": {}, 151 | "outputs": [], 152 | "source": [ 153 | "%%bash\n", 154 | "\n", 155 | "source .env\nsource config/profile_info.sh\n", 156 | "\n", 157 | "avn --auth-token $TOKEN service connector status $KAFKA_NAME sink_kafka_pg --project $PROJECT_NAME" 158 | ] 159 | }, 160 | { 161 | "cell_type": "markdown", 162 | "metadata": {}, 163 | "source": [ 164 | "Let's add another **event**\n", 165 | "with our **Python Producer**" 166 | ] 167 | }, 168 | { 169 | "cell_type": "code", 170 | "execution_count": null, 171 | "metadata": {}, 172 | "outputs": [], 173 | "source": [ 174 | "producer.send(\n", 175 | " topic_name+\"_schema\",\n", 176 | " key={\n", 177 | " \"schema\": key_schema,\n", 178 | " \"payload\": {\"id\":4}\n", 179 | " },\n", 180 | " value={\n", 181 | " \"schema\": value_schema,\n", 182 | " \"payload\": {\"name\":\"👨 Giuseppe\", \"pizza\":\"Hawaii 🍕+🍍+🥓\"}\n", 183 | " }\n", 184 | ")\n", 185 | "\n", 186 | "\n", 187 | "producer.flush()" 188 | ] 189 | }, 190 | { 191 | "cell_type": "code", 192 | "execution_count": null, 
193 | "metadata": {}, 194 | "outputs": [], 195 | "source": [] 196 | } 197 | ], 198 | "metadata": { 199 | "kernelspec": { 200 | "display_name": "Python 3", 201 | "language": "python", 202 | "name": "python3" 203 | }, 204 | "language_info": { 205 | "codemirror_mode": { 206 | "name": "ipython", 207 | "version": 3 208 | }, 209 | "file_extension": ".py", 210 | "mimetype": "text/x-python", 211 | "name": "python", 212 | "nbconvert_exporter": "python", 213 | "pygments_lexer": "ipython3", 214 | "version": "3.6.7" 215 | } 216 | }, 217 | "nbformat": 4, 218 | "nbformat_minor": 5 219 | } 220 | -------------------------------------------------------------------------------- /0N - Aiven - Delete Services.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": {}, 6 | "source": [ 7 | "## 🦀 Delete all services\n", 8 | "\n", 9 | "---\n", 10 | "\n", 11 | "Finally Let's delete the topics created" 12 | ] 13 | }, 14 | { 15 | "cell_type": "code", 16 | "execution_count": 1, 17 | "metadata": {}, 18 | "outputs": [ 19 | { 20 | "ename": "ModuleNotFoundError", 21 | "evalue": "No module named 'config.kafka_config'", 22 | "output_type": "error", 23 | "traceback": [ 24 | "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", 25 | "\u001b[0;31mModuleNotFoundError\u001b[0m Traceback (most recent call last)", 26 | "Cell \u001b[0;32mIn [1], line 1\u001b[0m\n\u001b[0;32m----> 1\u001b[0m \u001b[39mfrom\u001b[39;00m \u001b[39mconfig\u001b[39;00m\u001b[39m.\u001b[39;00m\u001b[39mkafka_config\u001b[39;00m \u001b[39mimport\u001b[39;00m \u001b[39m*\u001b[39m\n\u001b[1;32m 3\u001b[0m \u001b[39mprint\u001b[39m(hostname)\n\u001b[1;32m 4\u001b[0m \u001b[39mprint\u001b[39m(port)\n", 27 | "\u001b[0;31mModuleNotFoundError\u001b[0m: No module named 'config.kafka_config'" 28 | ] 29 | } 30 | ], 31 | "source": [ 32 | "from config.kafka_config import *\n", 33 | "\n", 34 | "print(hostname)\n", 35 | "print(port)\n", 36 | "print(cert_folder)\n", 37 | "print(topic_name)\n", 38 | "\n", 39 | "timeout_ms=5000\n" 40 | ] 41 | }, 42 | { 43 | "cell_type": "code", 44 | "execution_count": 2, 45 | "metadata": {}, 46 | "outputs": [ 47 | { 48 | "ename": "NameError", 49 | "evalue": "name 'topic_name' is not defined", 50 | "output_type": "error", 51 | "traceback": [ 52 | "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", 53 | "\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)", 54 | "Cell \u001b[0;32mIn [2], line 3\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[39mfrom\u001b[39;00m \u001b[39mkafka\u001b[39;00m\u001b[39m.\u001b[39;00m\u001b[39madmin\u001b[39;00m \u001b[39mimport\u001b[39;00m KafkaAdminClient, NewTopic\n\u001b[0;32m----> 3\u001b[0m topic_name_partitioned\u001b[39m=\u001b[39mtopic_name\u001b[39m+\u001b[39m\u001b[39m\"\u001b[39m\u001b[39m_partitioned\u001b[39m\u001b[39m\"\u001b[39m\n\u001b[1;32m 4\u001b[0m admin \u001b[39m=\u001b[39m KafkaAdminClient(\n\u001b[1;32m 5\u001b[0m client_id \u001b[39m=\u001b[39m\u001b[39m'\u001b[39m\u001b[39madmin\u001b[39m\u001b[39m'\u001b[39m,\n\u001b[1;32m 6\u001b[0m bootstrap_servers\u001b[39m=\u001b[39mhostname\u001b[39m+\u001b[39m\u001b[39m\"\u001b[39m\u001b[39m:\u001b[39m\u001b[39m\"\u001b[39m\u001b[39m+\u001b[39m\u001b[39mstr\u001b[39m(port),\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 10\u001b[0m 
ssl_keyfile\u001b[39m=\u001b[39mcert_folder\u001b[39m+\u001b[39m\u001b[39m\"\u001b[39m\u001b[39m/service.key\u001b[39m\u001b[39m\"\u001b[39m,\n\u001b[1;32m 11\u001b[0m )\n\u001b[1;32m 13\u001b[0m admin\u001b[39m.\u001b[39mdelete_topics(topics\u001b[39m=\u001b[39m[topic_name, topic_name_partitioned, topic_name\u001b[39m+\u001b[39m\u001b[39m\"\u001b[39m\u001b[39m_schema\u001b[39m\u001b[39m\"\u001b[39m], timeout_ms\u001b[39m=\u001b[39mtimeout_ms)\n", 55 | "\u001b[0;31mNameError\u001b[0m: name 'topic_name' is not defined" 56 | ] 57 | } 58 | ], 59 | "source": [ 60 | "from kafka.admin import KafkaAdminClient, NewTopic\n", 61 | "\n", 62 | "topic_name_partitioned=topic_name+\"_partitioned\"\n", 63 | "admin = KafkaAdminClient(\n", 64 | " client_id ='admin',\n", 65 | " bootstrap_servers=hostname+\":\"+str(port),\n", 66 | " security_protocol=\"SSL\",\n", 67 | " ssl_cafile=cert_folder+\"/ca.pem\",\n", 68 | " ssl_certfile=cert_folder+\"/service.cert\",\n", 69 | " ssl_keyfile=cert_folder+\"/service.key\",\n", 70 | " )\n", 71 | "\n", 72 | "admin.delete_topics(topics=[topic_name, topic_name_partitioned, topic_name+\"_schema\"], timeout_ms=timeout_ms)" 73 | ] 74 | }, 75 | { 76 | "cell_type": "markdown", 77 | "metadata": {}, 78 | "source": [ 79 | "Now let's terminate the Kafka and PostgreSQL instances" 80 | ] 81 | }, 82 | { 83 | "cell_type": "code", 84 | "execution_count": 5, 85 | "metadata": {}, 86 | "outputs": [ 87 | { 88 | "name": "stderr", 89 | "output_type": "stream", 90 | "text": [ 91 | "bash: line 2: config/profile_info.sh: No such file or directory\n", 92 | "usage: avn [-h] [--config CONFIG] [--version] [--auth-ca FILE]\n", 93 | " [--auth-token AUTH_TOKEN] [--show-http] [--url URL]\n", 94 | " ...\n", 95 | "avn: error: argument : invalid choice: 'terminate' (choose from 'account', 'billing-group', 'card', 'cloud', 'crab', 'credits', 'events', 'help', 'mirrormaker', 'project', 'service', 'ticket', 'user', 'vpc')\n", 96 | "usage: avn [-h] [--config CONFIG] [--version] [--auth-ca FILE]\n", 97 | " [--auth-token AUTH_TOKEN] [--show-http] [--url URL]\n", 98 | " ...\n", 99 | "avn: error: argument : invalid choice: 'terminate' (choose from 'account', 'billing-group', 'card', 'cloud', 'crab', 'credits', 'events', 'help', 'mirrormaker', 'project', 'service', 'ticket', 'user', 'vpc')\n" 100 | ] 101 | } 102 | ], 103 | "source": [ 104 | "%%bash\n", 105 | "\n", 106 | "source config/profile_info.sh\n", 107 | "avn --auth-token $TOKEN service terminate $KAFKA_NAME --project $PROJECT_NAME --force\n", 108 | "avn --auth-token $TOKEN service terminate $POSTGRES_NAME --project $PROJECT_NAME --force\n", 109 | "rm -rf config\n", 110 | "rm -rf kafkacerts\n" 111 | ] 112 | }, 113 | { 114 | "cell_type": "code", 115 | "execution_count": null, 116 | "metadata": {}, 117 | "outputs": [], 118 | "source": [] 119 | } 120 | ], 121 | "metadata": { 122 | "kernelspec": { 123 | "display_name": "Python 3.8.5 64-bit ('3.8.5')", 124 | "language": "python", 125 | "name": "python3" 126 | }, 127 | "language_info": { 128 | "codemirror_mode": { 129 | "name": "ipython", 130 | "version": 3 131 | }, 132 | "file_extension": ".py", 133 | "mimetype": "text/x-python", 134 | "name": "python", 135 | "nbconvert_exporter": "python", 136 | "pygments_lexer": "ipython3", 137 | "version": "3.8.5" 138 | }, 139 | "vscode": { 140 | "interpreter": { 141 | "hash": "6064663bdb0cf7ad80e39fa9924d85b32044da2a4abedcbe30d3eba51421769c" 142 | } 143 | } 144 | }, 145 | "nbformat": 4, 146 | "nbformat_minor": 5 147 | } 148 | 
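A small verification step that could sit between deleting the topics and terminating the services above: list what is left on the cluster with the same `topics()` call used in the consumer notebook. A minimal sketch, assuming it runs before the final cleanup cell removes the `config` and `kafkacerts` folders:

```
from kafka import KafkaConsumer

from config.kafka_config import *  # hostname, port, cert_folder, topic_name

# a bare consumer is enough to inspect cluster metadata; no group_id needed
consumer = KafkaConsumer(
    bootstrap_servers=hostname + ":" + str(port),
    security_protocol="SSL",
    ssl_cafile=cert_folder + "/ca.pem",
    ssl_certfile=cert_folder + "/service.cert",
    ssl_keyfile=cert_folder + "/service.key"
)

remaining = consumer.topics()
for name in (topic_name, topic_name + "_partitioned", topic_name + "_schema"):
    print(name, "->", "deleted" if name not in remaining else "still present")
```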
-------------------------------------------------------------------------------- /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | # Contributor Covenant Code of Conduct 2 | 3 | ## Our Pledge 4 | 5 | We as members, contributors, and leaders pledge to make participation in our 6 | community a harassment-free experience for everyone, regardless of age, body 7 | size, visible or invisible disability, ethnicity, sex characteristics, gender 8 | identity and expression, level of experience, education, socio-economic status, 9 | nationality, personal appearance, race, religion, or sexual identity 10 | and orientation. 11 | 12 | We pledge to act and interact in ways that contribute to an open, welcoming, 13 | diverse, inclusive, and healthy community. 14 | 15 | ## Our Standards 16 | 17 | Examples of behavior that contributes to a positive environment for our 18 | community include: 19 | 20 | * Demonstrating empathy and kindness toward other people 21 | * Being respectful of differing opinions, viewpoints, and experiences 22 | * Giving and gracefully accepting constructive feedback 23 | * Accepting responsibility and apologizing to those affected by our mistakes, 24 | and learning from the experience 25 | * Focusing on what is best not just for us as individuals, but for the 26 | overall community 27 | 28 | Examples of unacceptable behavior include: 29 | 30 | * The use of sexualized language or imagery, and sexual attention or 31 | advances of any kind 32 | * Trolling, insulting or derogatory comments, and personal or political attacks 33 | * Public or private harassment 34 | * Publishing others' private information, such as a physical or email 35 | address, without their explicit permission 36 | * Other conduct which could reasonably be considered inappropriate in a 37 | professional setting 38 | 39 | ## Enforcement Responsibilities 40 | 41 | Community leaders are responsible for clarifying and enforcing our standards of 42 | acceptable behavior and will take appropriate and fair corrective action in 43 | response to any behavior that they deem inappropriate, threatening, offensive, 44 | or harmful. 45 | 46 | Community leaders have the right and responsibility to remove, edit, or reject 47 | comments, commits, code, wiki edits, issues, and other contributions that are 48 | not aligned to this Code of Conduct, and will communicate reasons for moderation 49 | decisions when appropriate. 50 | 51 | ## Scope 52 | 53 | This Code of Conduct applies within all community spaces, and also applies when 54 | an individual is officially representing the community in public spaces. 55 | Examples of representing our community include using an official e-mail address, 56 | posting via an official social media account, or acting as an appointed 57 | representative at an online or offline event. 58 | 59 | ## Enforcement 60 | 61 | Instances of abusive, harassing, or otherwise unacceptable behavior may be 62 | reported to the community leaders responsible for enforcement at 63 | opensource@aiven.io. 64 | All complaints will be reviewed and investigated promptly and fairly. 65 | 66 | All community leaders are obligated to respect the privacy and security of the 67 | reporter of any incident. 68 | 69 | ## Enforcement Guidelines 70 | 71 | Community leaders will follow these Community Impact Guidelines in determining 72 | the consequences for any action they deem in violation of this Code of Conduct: 73 | 74 | ### 1. 
Correction 75 | 76 | **Community Impact**: Use of inappropriate language or other behavior deemed 77 | unprofessional or unwelcome in the community. 78 | 79 | **Consequence**: A private, written warning from community leaders, providing 80 | clarity around the nature of the violation and an explanation of why the 81 | behavior was inappropriate. A public apology may be requested. 82 | 83 | ### 2. Warning 84 | 85 | **Community Impact**: A violation through a single incident or series 86 | of actions. 87 | 88 | **Consequence**: A warning with consequences for continued behavior. No 89 | interaction with the people involved, including unsolicited interaction with 90 | those enforcing the Code of Conduct, for a specified period of time. This 91 | includes avoiding interactions in community spaces as well as external channels 92 | like social media. Violating these terms may lead to a temporary or 93 | permanent ban. 94 | 95 | ### 3. Temporary Ban 96 | 97 | **Community Impact**: A serious violation of community standards, including 98 | sustained inappropriate behavior. 99 | 100 | **Consequence**: A temporary ban from any sort of interaction or public 101 | communication with the community for a specified period of time. No public or 102 | private interaction with the people involved, including unsolicited interaction 103 | with those enforcing the Code of Conduct, is allowed during this period. 104 | Violating these terms may lead to a permanent ban. 105 | 106 | ### 4. Permanent Ban 107 | 108 | **Community Impact**: Demonstrating a pattern of violation of community 109 | standards, including sustained inappropriate behavior, harassment of an 110 | individual, or aggression toward or disparagement of classes of individuals. 111 | 112 | **Consequence**: A permanent ban from any sort of public interaction within 113 | the community. 114 | 115 | ## Attribution 116 | 117 | This Code of Conduct is adapted from the [Contributor Covenant][homepage], 118 | version 2.0, available at 119 | https://www.contributor-covenant.org/version/2/0/code_of_conduct.html. 120 | 121 | Community Impact Guidelines were inspired by [Mozilla's code of conduct 122 | enforcement ladder](https://github.com/mozilla/diversity). 123 | 124 | [homepage]: https://www.contributor-covenant.org 125 | 126 | For answers to common questions about this code of conduct, see the FAQ at 127 | https://www.contributor-covenant.org/faq. Translations are available at 128 | https://www.contributor-covenant.org/translations. 129 | -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # Welcome! 2 | 3 | Contributions to Python Jupyter Notebooks for Apache Kafka® are very welcome. When contributing, please keep this in mind: 4 | 5 | - Open an issue to discuss bigger new features. 6 | - Write code consistent with the project style and make sure the tests are passing. 7 | - Stay in touch with us if we have follow-up questions or requests for further changes. 8 | 9 | # Opening a PR 10 | 11 | - Commit messages should describe the changes, not the filenames. Win our admiration by following 12 | the [excellent advice from Chris Beams](https://chris.beams.io/posts/git-commit/) when composing 13 | commit messages. 14 | - Choose a meaningful title for your pull request. 15 | - The pull request description should focus on what changed and why. 
16 | - Check that the tests pass (and add test coverage for your changes if appropriate). 17 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Apache License 2 | Version 2.0, January 2004 3 | http://www.apache.org/licenses/ 4 | 5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 6 | 7 | 1. Definitions. 8 | 9 | "License" shall mean the terms and conditions for use, reproduction, 10 | and distribution as defined by Sections 1 through 9 of this document. 11 | 12 | "Licensor" shall mean the copyright owner or entity authorized by 13 | the copyright owner that is granting the License. 14 | 15 | "Legal Entity" shall mean the union of the acting entity and all 16 | other entities that control, are controlled by, or are under common 17 | control with that entity. For the purposes of this definition, 18 | "control" means (i) the power, direct or indirect, to cause the 19 | direction or management of such entity, whether by contract or 20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 21 | outstanding shares, or (iii) beneficial ownership of such entity. 22 | 23 | "You" (or "Your") shall mean an individual or Legal Entity 24 | exercising permissions granted by this License. 25 | 26 | "Source" form shall mean the preferred form for making modifications, 27 | including but not limited to software source code, documentation 28 | source, and configuration files. 29 | 30 | "Object" form shall mean any form resulting from mechanical 31 | transformation or translation of a Source form, including but 32 | not limited to compiled object code, generated documentation, 33 | and conversions to other media types. 34 | 35 | "Work" shall mean the work of authorship, whether in Source or 36 | Object form, made available under the License, as indicated by a 37 | copyright notice that is included in or attached to the work 38 | (an example is provided in the Appendix below). 39 | 40 | "Derivative Works" shall mean any work, whether in Source or Object 41 | form, that is based on (or derived from) the Work and for which the 42 | editorial revisions, annotations, elaborations, or other modifications 43 | represent, as a whole, an original work of authorship. For the purposes 44 | of this License, Derivative Works shall not include works that remain 45 | separable from, or merely link (or bind by name) to the interfaces of, 46 | the Work and Derivative Works thereof. 47 | 48 | "Contribution" shall mean any work of authorship, including 49 | the original version of the Work and any modifications or additions 50 | to that Work or Derivative Works thereof, that is intentionally 51 | submitted to Licensor for inclusion in the Work by the copyright owner 52 | or by an individual or Legal Entity authorized to submit on behalf of 53 | the copyright owner. For the purposes of this definition, "submitted" 54 | means any form of electronic, verbal, or written communication sent 55 | to the Licensor or its representatives, including but not limited to 56 | communication on electronic mailing lists, source code control systems, 57 | and issue tracking systems that are managed by, or on behalf of, the 58 | Licensor for the purpose of discussing and improving the Work, but 59 | excluding communication that is conspicuously marked or otherwise 60 | designated in writing by the copyright owner as "Not a Contribution." 
61 | 62 | "Contributor" shall mean Licensor and any individual or Legal Entity 63 | on behalf of whom a Contribution has been received by Licensor and 64 | subsequently incorporated within the Work. 65 | 66 | 2. Grant of Copyright License. Subject to the terms and conditions of 67 | this License, each Contributor hereby grants to You a perpetual, 68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 69 | copyright license to reproduce, prepare Derivative Works of, 70 | publicly display, publicly perform, sublicense, and distribute the 71 | Work and such Derivative Works in Source or Object form. 72 | 73 | 3. Grant of Patent License. Subject to the terms and conditions of 74 | this License, each Contributor hereby grants to You a perpetual, 75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 76 | (except as stated in this section) patent license to make, have made, 77 | use, offer to sell, sell, import, and otherwise transfer the Work, 78 | where such license applies only to those patent claims licensable 79 | by such Contributor that are necessarily infringed by their 80 | Contribution(s) alone or by combination of their Contribution(s) 81 | with the Work to which such Contribution(s) was submitted. If You 82 | institute patent litigation against any entity (including a 83 | cross-claim or counterclaim in a lawsuit) alleging that the Work 84 | or a Contribution incorporated within the Work constitutes direct 85 | or contributory patent infringement, then any patent licenses 86 | granted to You under this License for that Work shall terminate 87 | as of the date such litigation is filed. 88 | 89 | 4. Redistribution. You may reproduce and distribute copies of the 90 | Work or Derivative Works thereof in any medium, with or without 91 | modifications, and in Source or Object form, provided that You 92 | meet the following conditions: 93 | 94 | (a) You must give any other recipients of the Work or 95 | Derivative Works a copy of this License; and 96 | 97 | (b) You must cause any modified files to carry prominent notices 98 | stating that You changed the files; and 99 | 100 | (c) You must retain, in the Source form of any Derivative Works 101 | that You distribute, all copyright, patent, trademark, and 102 | attribution notices from the Source form of the Work, 103 | excluding those notices that do not pertain to any part of 104 | the Derivative Works; and 105 | 106 | (d) If the Work includes a "NOTICE" text file as part of its 107 | distribution, then any Derivative Works that You distribute must 108 | include a readable copy of the attribution notices contained 109 | within such NOTICE file, excluding those notices that do not 110 | pertain to any part of the Derivative Works, in at least one 111 | of the following places: within a NOTICE text file distributed 112 | as part of the Derivative Works; within the Source form or 113 | documentation, if provided along with the Derivative Works; or, 114 | within a display generated by the Derivative Works, if and 115 | wherever such third-party notices normally appear. The contents 116 | of the NOTICE file are for informational purposes only and 117 | do not modify the License. You may add Your own attribution 118 | notices within Derivative Works that You distribute, alongside 119 | or as an addendum to the NOTICE text from the Work, provided 120 | that such additional attribution notices cannot be construed 121 | as modifying the License. 
122 | 123 | You may add Your own copyright statement to Your modifications and 124 | may provide additional or different license terms and conditions 125 | for use, reproduction, or distribution of Your modifications, or 126 | for any such Derivative Works as a whole, provided Your use, 127 | reproduction, and distribution of the Work otherwise complies with 128 | the conditions stated in this License. 129 | 130 | 5. Submission of Contributions. Unless You explicitly state otherwise, 131 | any Contribution intentionally submitted for inclusion in the Work 132 | by You to the Licensor shall be under the terms and conditions of 133 | this License, without any additional terms or conditions. 134 | Notwithstanding the above, nothing herein shall supersede or modify 135 | the terms of any separate license agreement you may have executed 136 | with Licensor regarding such Contributions. 137 | 138 | 6. Trademarks. This License does not grant permission to use the trade 139 | names, trademarks, service marks, or product names of the Licensor, 140 | except as required for reasonable and customary use in describing the 141 | origin of the Work and reproducing the content of the NOTICE file. 142 | 143 | 7. Disclaimer of Warranty. Unless required by applicable law or 144 | agreed to in writing, Licensor provides the Work (and each 145 | Contributor provides its Contributions) on an "AS IS" BASIS, 146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 147 | implied, including, without limitation, any warranties or conditions 148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 149 | PARTICULAR PURPOSE. You are solely responsible for determining the 150 | appropriateness of using or redistributing the Work and assume any 151 | risks associated with Your exercise of permissions under this License. 152 | 153 | 8. Limitation of Liability. In no event and under no legal theory, 154 | whether in tort (including negligence), contract, or otherwise, 155 | unless required by applicable law (such as deliberate and grossly 156 | negligent acts) or agreed to in writing, shall any Contributor be 157 | liable to You for damages, including any direct, indirect, special, 158 | incidental, or consequential damages of any character arising as a 159 | result of this License or out of the use or inability to use the 160 | Work (including but not limited to damages for loss of goodwill, 161 | work stoppage, computer failure or malfunction, or any and all 162 | other commercial damages or losses), even if such Contributor 163 | has been advised of the possibility of such damages. 164 | 165 | 9. Accepting Warranty or Additional Liability. While redistributing 166 | the Work or Derivative Works thereof, You may choose to offer, 167 | and charge a fee for, acceptance of support, warranty, indemnity, 168 | or other liability obligations and/or rights consistent with this 169 | License. However, in accepting such obligations, You may act only 170 | on Your own behalf and on Your sole responsibility, not on behalf 171 | of any other Contributor, and only if You agree to indemnify, 172 | defend, and hold each Contributor harmless for any liability 173 | incurred by, or claims asserted against, such Contributor by reason 174 | of your accepting any such warranty or additional liability. 175 | 176 | END OF TERMS AND CONDITIONS 177 | 178 | APPENDIX: How to apply the Apache License to your work. 
179 | 
180 |       To apply the Apache License to your work, attach the following
181 |       boilerplate notice, with the fields enclosed by brackets "[]"
182 |       replaced with your own identifying information. (Don't include
183 |       the brackets!)  The text should be enclosed in the appropriate
184 |       comment syntax for the file format. We also recommend that a
185 |       file or class name and description of purpose be included on the
186 |       same "printed page" as the copyright notice for easier
187 |       identification within third-party archives.
188 | 
189 |    Copyright [yyyy] [name of copyright owner]
190 | 
191 |    Licensed under the Apache License, Version 2.0 (the "License");
192 |    you may not use this file except in compliance with the License.
193 |    You may obtain a copy of the License at
194 | 
195 |        http://www.apache.org/licenses/LICENSE-2.0
196 | 
197 |    Unless required by applicable law or agreed to in writing, software
198 |    distributed under the License is distributed on an "AS IS" BASIS,
199 |    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
200 |    See the License for the specific language governing permissions and
201 |    limitations under the License.
202 | 
-------------------------------------------------------------------------------- /README.md: --------------------------------------------------------------------------------
1 | # Python Jupyter Notebooks for Apache Kafka®
2 | 
3 | This is a series of Jupyter Notebooks on how to get started with Apache Kafka® and Python.
4 | You can work through these notebooks to learn the basic concepts of Apache Kafka in an environment that combines Markdown text, media and executable code on the same page.
5 | 
6 | The notebooks are based on a managed Apache Kafka instance created on [Aiven's website](https://aiven.io/kafka?utm_source=github&utm_medium=organic&utm_campaign=blog_art&utm_content=repo), but can also be customised to any Apache Kafka instance running locally with SSL authentication. Aiven offers $300 of free credit that you can redeem by creating your account on [Aiven's website](https://console.aiven.io/signup?utm_source=github&utm_medium=organic&utm_campaign=blog_art&utm_content=repo).
7 | 
8 | If you have any questions or suggestions for improvement regarding the notebooks, please open an issue. Contributions are welcome!
9 | 
10 | 
11 | ## Start JupyterLab on Docker
12 | 
13 | You can access the notebooks via JupyterLab; this example is based on Docker.
14 | 
15 | 1. Clone the repository
16 | 2. Open a terminal
17 | 3. Go to the folder where the repository has been cloned
18 | 4. Run the following:
19 | 
20 | ```
21 | docker run --rm -p 8888:8888 \
22 |     -e JUPYTER_ENABLE_LAB=yes \
23 |     -v "$PWD":/home/jovyan/work \
24 |     jupyter/datascience-notebook
25 | ```
26 | 
27 | You'll see a folder named `work` in the top left; under it you'll find the list of notebooks.
28 | 
29 | ## Notebook Overview
30 | 
31 | This repository contains the following notebooks.
32 | 
33 | * [00 - Aiven Setup.ipynb](00%20-%20Aiven%20Setup.ipynb): Creates the required Apache Kafka and PostgreSQL instances on the Aiven website
34 | * [01 - Producer.ipynb](01%20-%20Producer.ipynb): Creates a Python Producer and produces the first messages
35 | * [02 - Consumer.ipynb](02%20-%20Consumer.ipynb): Reads the messages produced by [01 - Producer.ipynb](01%20-%20Producer.ipynb)
36 | * [03 - 00 - Partition Producer.ipynb](03%20-%2000%20-%20Partition%20Producer.ipynb): Creates a topic with two partitions and pushes records into each partition
37 | * [03 - 01 - Consumer - Partition 0.ipynb](03%20-%2001%20-%20Consumer%20-%20Partition%200.ipynb): Reads messages pushed by [03 - 00 - Partition Producer.ipynb](03%20-%2000%20-%20Partition%20Producer.ipynb) on partition 0
38 | * [03 - 02 - Consumer - Partition 1.ipynb](03%20-%2002%20-%20Consumer%20-%20Partition%201.ipynb): Reads messages pushed by [03 - 00 - Partition Producer.ipynb](03%20-%2000%20-%20Partition%20Producer.ipynb) on partition 1
39 | * [04 - New Consumer Group.ipynb](04%20-%20New%20Consumer%20Group.ipynb): Creates a new consumer, part of a new consumer group, and reads messages produced by [01 - Producer.ipynb](01%20-%20Producer.ipynb)
40 | * [05 - Kafka Connect.ipynb](05%20-%20Kafka%20Connect.ipynb): Creates a new topic containing JSON messages with schema and payload, then creates a Kafka Connect JDBC Connector to PostgreSQL
41 | * [0N - Aiven - Delete Services.ipynb](0N%20-%20Aiven%20-%20Delete%20Services.ipynb): Deletes all the instances created by [00 - Aiven Setup.ipynb](00%20-%20Aiven%20Setup.ipynb)
42 | 
43 | ## Notebook Details
44 | 
45 | The notebooks are organised by Apache Kafka functionality.
46 | 
47 | ### Create Managed Apache Kafka and PostgreSQL instances with [Aiven.io](https://console.aiven.io/signup?utm_source=github&utm_medium=organic&utm_campaign=blog_art&utm_content=repo)
48 | 
49 | ![Create services](images/overall.png)
50 | 
51 | The **00 - Aiven Setup.ipynb** notebook downloads [Aiven's command line interface](https://aiven.io/blog/command-line-magic-with-the-aiven-cli?utm_source=github&utm_medium=organic&utm_campaign=blog_art&utm_content=repo) and creates an Apache Kafka and a PostgreSQL instance.
52 | 
53 | Please change `<EMAIL>` and `<TOKEN>` to a valid email address and a token created on [Aiven's website](https://console.aiven.io/signup?utm_source=github&utm_medium=organic&utm_campaign=blog_art&utm_content=repo). The notebook creates the instances and also stores all the required connection credentials locally.
54 | 
55 | ### Produce and Read Messages with Apache Kafka
56 | 
57 | ![Producer](images/producing.png)
58 | 
59 | **01 - Producer.ipynb** creates a Python Apache Kafka Producer and produces the first messages. After the first message is produced, open the **02 - Consumer.ipynb** notebook and place it alongside the Producer.
60 | 
61 | ![Place consumer alongside the producer](images/move-consumer.gif)
62 | 
63 | **02 - Consumer.ipynb** reads from the topic where **01 - Producer** wrote, but only from the point in time at which it attaches to Apache Kafka; it does not go back over the topic's history.
64 | 
65 | ![Consumer](images/consumer.png)
66 | 
67 | If you want to read messages created with **01 - Producer**, you need to run **02 - Consumer.ipynb**'s last code block before producing any messages in **01 - Producer**. This is Apache Kafka's default behaviour; it can be changed by setting the consumer property `auto.offset.reset` to `earliest` (with `kafka-python`, pass `auto_offset_reset='earliest'`), as in the sketch below.
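For reference, here is a minimal sketch of such a consumer. It assumes the `kafka-python` client and the SSL certificates downloaded by the setup notebook; the service host, port, topic and consumer group names are placeholders to adapt to your own instance.

```
# Minimal consumer sketch: host/port, certificate paths, topic and group
# names below are placeholders, adapt them to your own Apache Kafka service.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "my-topic",
    group_id="my-consumer-group",
    bootstrap_servers="kafka-host:12345",
    security_protocol="SSL",
    ssl_cafile="kafkacerts/ca.pem",
    ssl_certfile="kafkacerts/service.cert",
    ssl_keyfile="kafkacerts/service.key",
    # Start from the earliest offset when the group has no committed position
    auto_offset_reset="earliest",
    value_deserializer=lambda v: v.decode("utf-8"),
)

# Print partition, offset and payload of every message received
for message in consumer:
    print(message.partition, message.offset, message.value)
```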
68 | 
69 | ### Understanding Apache Kafka Partitions
70 | 
71 | ![Partitions](images/partitions.png)
72 | 
73 | Partitions in Apache Kafka are a way to divide messages belonging to the same topic into sub-logs.
74 | **03 - 00 - Partition Producer.ipynb** creates a topic with two partitions using `KafkaAdminClient` and sends a message to each partition (a minimal code sketch of this flow is included in the appendix at the end of this README).
75 | We can then open both **03 - 01 - Consumer - Partition 0.ipynb** and **03 - 02 - Consumer - Partition 1.ipynb**, which read messages from `Partition 0` and `Partition 1` respectively.
76 | 
77 | ### New Consumer Group
78 | 
79 | ![Consumer groups](images/consumer_groups.png)
80 | 
81 | Messages in Apache Kafka are not deleted once read by a consumer; this keeps them available for other consumers to read. **04 - New Consumer Group.ipynb** creates a new consumer belonging to a new Consumer Group and reads from the topic where **01 - Producer** wrote. We can now verify, by sending a message from the **01 - Producer** notebook, that it is received both in **02 - Consumer.ipynb** and **04 - New Consumer Group**.
82 | 
83 | ### Kafka Connect
84 | 
85 | ![Kafka Connect](images/connect_pg.png)
86 | 
87 | Apache Kafka Connect® is a prebuilt framework enabling easy integration of Apache Kafka with existing data sources and sinks. Aiven provides [Kafka Connect as a managed service](https://aiven.io/kafka-connect?utm_source=github&utm_medium=organic&utm_campaign=blog_art&utm_content=repo), making the integration a matter of a single config file. **05 - Kafka Connect.ipynb** creates a new Kafka topic containing messages with both schema and payload, and then pushes them to a PostgreSQL database via Apache Kafka Connect.
88 | 
89 | ### Delete Aiven Services
90 | 
91 | ![Delete services](images/overall.png)
92 | 
93 | Once you're done, you can delete all the services created on [Aiven's website](https://console.aiven.io/signup?utm_source=github&utm_medium=organic&utm_campaign=blog_art&utm_content=repo) by executing the code in **0N - Aiven - Delete Services.ipynb**.
94 | 
95 | # Keep Reading
96 | 
97 | We maintain some other resources that you may also find useful:
98 | 
99 | * [Command Line Magic with avn](https://aiven.io/blog/command-line-magic-with-the-aiven-cli?utm_source=github&utm_medium=organic&utm_campaign=blog_art&utm_content=repo)
100 | * [Teach Yourself Apache Kafka with Jupyter Notebooks](#)
101 | 
102 | # License
103 | This project is licensed under the [Apache License, Version 2.0](https://github.com/aiven/aiven-kafka-connect-s3/blob/master/LICENSE).
104 | 
105 | Apache Kafka is either a registered trademark or trademark of the Apache Software Foundation in the United States and/or other countries. Aiven has no affiliation with and is not endorsed by The Apache Software Foundation.
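# Appendix: Partition Producer Sketch

As a companion to the partitions section above, here is a minimal sketch (not the notebooks' exact code) of creating a topic with two partitions and sending one record to each of them with `kafka-python`. The connection settings and topic name are placeholders to adapt to your own instance.

```
# Partition producer sketch: host/port, certificate paths and topic name
# are placeholders, adapt them to your own Apache Kafka service.
from kafka import KafkaProducer
from kafka.admin import KafkaAdminClient, NewTopic

connection = dict(
    bootstrap_servers="kafka-host:12345",
    security_protocol="SSL",
    ssl_cafile="kafkacerts/ca.pem",
    ssl_certfile="kafkacerts/service.cert",
    ssl_keyfile="kafkacerts/service.key",
)

# Create a topic with two partitions
admin = KafkaAdminClient(**connection)
admin.create_topics([
    NewTopic(name="my-partitioned-topic", num_partitions=2, replication_factor=2)
])

# Push one record into each partition explicitly
producer = KafkaProducer(**connection)
producer.send("my-partitioned-topic", value=b'{"pizza": "margherita"}', partition=0)
producer.send("my-partitioned-topic", value=b'{"pizza": "hawaii"}', partition=1)
producer.flush()
```

Omitting the `partition` argument lets Apache Kafka pick the partition itself, either by hashing the message key or, for keyless messages, by distributing them across partitions.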
106 | -------------------------------------------------------------------------------- /images/connect_pg.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Aiven-Labs/python-notebooks-for-apache-kafka/8b9d339c8c7b38b01ba08a8b42a016e475521f7a/images/connect_pg.png -------------------------------------------------------------------------------- /images/consumer groups.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Aiven-Labs/python-notebooks-for-apache-kafka/8b9d339c8c7b38b01ba08a8b42a016e475521f7a/images/consumer groups.png -------------------------------------------------------------------------------- /images/consumer.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Aiven-Labs/python-notebooks-for-apache-kafka/8b9d339c8c7b38b01ba08a8b42a016e475521f7a/images/consumer.png -------------------------------------------------------------------------------- /images/consumer_groups.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Aiven-Labs/python-notebooks-for-apache-kafka/8b9d339c8c7b38b01ba08a8b42a016e475521f7a/images/consumer_groups.png -------------------------------------------------------------------------------- /images/move-consumer.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Aiven-Labs/python-notebooks-for-apache-kafka/8b9d339c8c7b38b01ba08a8b42a016e475521f7a/images/move-consumer.gif -------------------------------------------------------------------------------- /images/overall.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Aiven-Labs/python-notebooks-for-apache-kafka/8b9d339c8c7b38b01ba08a8b42a016e475521f7a/images/overall.png -------------------------------------------------------------------------------- /images/partitions.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Aiven-Labs/python-notebooks-for-apache-kafka/8b9d339c8c7b38b01ba08a8b42a016e475521f7a/images/partitions.png -------------------------------------------------------------------------------- /images/producing.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/Aiven-Labs/python-notebooks-for-apache-kafka/8b9d339c8c7b38b01ba08a8b42a016e475521f7a/images/producing.png --------------------------------------------------------------------------------