├── ARM-Chipset ├── MAADS-HPDE v8.8.54 Linux ARM.zip ├── MAADS-VIPER v5.6.23 Linux ARM.zip ├── MAADS-VIPERviz v2.3.61 Linux ARM.zip └── readme ├── ARM64-Chipset ├── MAADS-HPDE v8.8.54 Linux ARM64.zip ├── MAADS-VIPER v5.6.23 Linux ARM64.zip ├── MAADS-VIPERviz v2.3.61 Linux ARM64.zip └── readme ├── MAADS-HPDE v.8.8.54 Windows AMD64.zip ├── MAADS-HPDE v8.8.54 Linux AMD64.zip ├── MAADS-VIPER v5.6.23 Linux AMD64.zip ├── MAADS-VIPER v5.6.23 Windows AMD64.zip ├── MAADS-VIPERviz v2.3.61 Linux AMD64.zip ├── MAADS-VIPERviz v2.3.61 Windows AMD64.zip ├── README.md ├── anomalydetectionimage.jpg ├── checkopenports ├── cluster.jpg ├── genericimage.jpg ├── logstreamingimage.jpg ├── mirrorbrokers ├── docker-compose.yml ├── mirrorbroker.pdf ├── mirrorbrokersvol-persistentvolumeclaim.yaml ├── otics-mirrorbrokers-deployment.yaml ├── otics-mirrorbrokers-service.yaml ├── readme ├── viper-generic-env-configmap.yaml └── viper-generic.env ├── optimizationimage.jpg ├── predictionimage.jpg └── preprocess.jpg /ARM-Chipset/MAADS-HPDE v8.8.54 Linux ARM.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/ARM-Chipset/MAADS-HPDE v8.8.54 Linux ARM.zip -------------------------------------------------------------------------------- /ARM-Chipset/MAADS-VIPER v5.6.23 Linux ARM.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/ARM-Chipset/MAADS-VIPER v5.6.23 Linux ARM.zip -------------------------------------------------------------------------------- /ARM-Chipset/MAADS-VIPERviz v2.3.61 Linux ARM.zip: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/ARM-Chipset/MAADS-VIPERviz v2.3.61 Linux ARM.zip -------------------------------------------------------------------------------- /ARM-Chipset/readme: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /ARM64-Chipset/MAADS-HPDE v8.8.54 Linux ARM64.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/ARM64-Chipset/MAADS-HPDE v8.8.54 Linux ARM64.zip -------------------------------------------------------------------------------- /ARM64-Chipset/MAADS-VIPER v5.6.23 Linux ARM64.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/ARM64-Chipset/MAADS-VIPER v5.6.23 Linux ARM64.zip -------------------------------------------------------------------------------- /ARM64-Chipset/MAADS-VIPERviz v2.3.61 Linux ARM64.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/ARM64-Chipset/MAADS-VIPERviz v2.3.61 Linux ARM64.zip -------------------------------------------------------------------------------- /ARM64-Chipset/readme: -------------------------------------------------------------------------------- 1 | 2 | -------------------------------------------------------------------------------- /MAADS-HPDE v.8.8.54 Windows AMD64.zip: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/MAADS-HPDE v.8.8.54 Windows AMD64.zip -------------------------------------------------------------------------------- /MAADS-HPDE v8.8.54 Linux AMD64.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/MAADS-HPDE v8.8.54 Linux AMD64.zip -------------------------------------------------------------------------------- /MAADS-VIPER v5.6.23 Linux AMD64.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/MAADS-VIPER v5.6.23 Linux AMD64.zip -------------------------------------------------------------------------------- /MAADS-VIPER v5.6.23 Windows AMD64.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/MAADS-VIPER v5.6.23 Windows AMD64.zip -------------------------------------------------------------------------------- /MAADS-VIPERviz v2.3.61 Linux AMD64.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/MAADS-VIPERviz v2.3.61 Linux AMD64.zip -------------------------------------------------------------------------------- /MAADS-VIPERviz v2.3.61 Windows AMD64.zip: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/MAADS-VIPERviz v2.3.61 Windows AMD64.zip 
-------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | Comprehensive TML Details on [Readthedocs](https://tml.readthedocs.io/en/latest/index.html) 2 | 3 | # Transactional Machine Learning : *The Machine Learning Platform for Data Streams* 4 | *Transactional Machine Learning (TML) using Data Streams and AutoML is a platform for building and streaming cloud-native solutions, using Apache Kafka or Redpanda as the data backbone and Kubernetes and Docker as core infrastructure components, running on Confluent, AWS, GCP, or Azure,* for advanced machine learning solutions that learn from transactional data and provide insights quickly and continuously to any number of devices and humans, in any format! 5 | 6 | **TML Is Based On the Belief that "_Fast data requires fast machine learning for fast decision-making_"**. TML gives rise in the industry to a **_Data Stream Scientist_**, versus a **_Data Scientist_** in conventional machine learning (CML). 7 | 8 | **TML Book Details Found on** [Publisher's site](https://www.apress.com/us/book/9781484270226) 9 | 10 | **TML Video:** [Youtube](https://www.youtube.com/watch?v=kXmWE27Q_3o&t=4s) 11 | 12 | Apply data preprocessing and auto machine learning to data streams and create transactional machine learning (TML) solutions that are: 13 | 14 | **1. frictionless**: require minimal to no human intervention 15 | 16 | **2. elastic**: machine learning solutions that can scale up or down using Kubernetes to control or enhance the number of data streams, algorithms (or machine learning models) and predictions instantly and continuously. 
17 | 18 | TML is ideal when data are highly erratic (nonlinear) and you want the machine to learn from the **latest** dataset by creating sliding windows of training datasets and auto-creating **micro-machine learning models** quickly; these models can be easily scaled and managed, and their insights used immediately from any device! **There are many TML use cases, such as:** 19 | 20 | **1. IoT**: Capture fast, real-time data and build custom **micro-machine learning models** for every IoT device, specific to the environment the device operates in, and predict failures, optimize device settings, and more. 21 | 22 | **2. IoT Edge**: TML is ideal for edge devices with **No Internet** connections. Simply use the On-Prem version of TML software with On-Prem Kafka, and create large and powerful, real-time edge solutions. 23 | 24 | **3. HealthCare**: TML is ideal for health data processing for patients, payers, and providers. Open access to health data has been mandated by [CMS](https://www.cms.gov/Regulations-and-Guidance/Guidance/Interoperability/index#:~:text=or%20fact%20sheet.-,CMS%20Interoperability%20and%20Patient%20Access%20Final%20Rule,they%20can%20best%20use%20it), which opens up enormous opportunities for TML. 25 | 26 | **4. Banking/Finance Fraud Detect**: Detect fraud using unsupervised learning on data streams and process millions of transactions for fraud - see the [LinkedIn blog](https://www.linkedin.com/pulse/bank-fraud-detection-data-streams-kafka-google-cloud-maurice-ph-d-/?trackingId=HEyDMUFIS0C4smYhwuUBfg%3D%3D) 27 | 28 | **5. Financial Trading**: Use TML to analyse stock prices and make **sub-second** price predictions! 29 | 30 | **6. Pricing**: Use TML to build optimal pricing of products at scale. 31 | 32 | **7. Oil/Gas**: Use TML to optimize oil drilling operations **sub-second** and drill oil wells faster and cheaper with minimal downtime. 33 | 34 | **8. 
SO MUCH MORE...** 35 | 36 | **The above use cases are not possible with conventional machine learning methods, which require frequent human intervention, create lots of friction, and are not very elastic.** 37 | 38 | **By using Apache Kafka On-Premise, many advanced and large TML use cases are 80-90% cheaper than cloud-native use cases, mainly because storage, compute, Egress/Ingress and Kafka partitions are localized. Given that compute and storage are extremely low-cost On-Premise, TML solutions On-Premise are on the rise.** TML On-Prem is ideal for small companies or startups that do not want to incur large cloud costs but still want to provide TML solutions to their customers. 39 | 40 | Strengthen your knowledge of the inner workings of TML solutions using data streams with auto machine learning integrated with Apache Kafka. **You will be at the forefront of an exciting area of machine learning that is focused on speed of data and algorithm creation, scale, and automation.** 41 | 42 | --- 43 | 44 | **Create Your First TML Solution with Kafka by Downloading the Technologies Below** 45 | 46 | [WATCH The TML Instructional Video: Setup, Configuration, Execution, Visualization](https://youtu.be/b1fuIeC7d-8) 47 | 48 | 1) **_MAADS-VIPER:_** https://www.confluent.io/hub/oticsinc/maads-viper (Official Kafka connector for TML - Linux version; the **latest** Linux/Windows/MAC versions can be found above). More information [here](https://github.com/smaurice101/maads-viper/tree/master). 
49 | 2) **_MAADS-VIPERviz:_** [Streaming Visualization for Windows/Linux/MAC versions](https://github.com/smaurice101/MAADS-VIPERviz/tree/main) 50 | 3) **_MAADS-HPDE:_** [AutoML for Windows/Linux/MAC versions available](https://github.com/smaurice101/MAADS-HPDE/tree/main) 51 | 4) **_MAADS-Python Library:_** https://pypi.org/project/maadstml/ (NOTE: You need a Python IDE installed; tested with Python up to v.3.8) 52 | 5) **_Create a Kafka Cluster at Confluent Cloud:_** https://www.confluent.io/confluent-cloud 53 | 6) Users can also use MAADS-VIPER and Kafka services **directly** on Amazon AWS, Microsoft Azure, and Google (GCP) 54 | 55 | * VIPER, VIPERviz, and HPDE fall under the Software Licence from OTICS Advanced Analytics Inc.: http://www.otics.ca/maadsweb/maads-viper/license/Software-License-VIPER-HPDE.pdf 56 | * Advanced users who want to integrate TML with MAADS DEEP LEARNING technology should email info@otics.ca 57 | 58 | **Contact**: 59 | 60 | For any help and additional information, or if your token has expired, you can e-mail info@otics.ca or go to http://www.otics.ca, and we would be happy to help you! OTICS will provide a free one-hour developer session on TML if needed. 
61 | 62 | *** 63 | [**Watch University of Calgary Lecture on TML to Software Engineering Graduate Students**](https://lnkd.in/gxMw6Da) 64 | 65 | [**Read Confluent blog**](https://www.confluent.io/blog/transactional-machine-learning-with-maads-viper-and-apache-kafka/) 66 | 67 | [**Read Medium blog**](https://sebastian-maurice.medium.com/transactional-machine-learning-with-data-streams-for-real-time-predictions-and-optimization-using-eb12c4df597c) 68 | 69 | [**Watch ApacheCon Presentation on TML - Apache Software Foundation**](https://www.youtube.com/watch?app=desktop&v=RFj4XdSXCdI) 70 | 71 | [**Read Fast Big Data Visualization blog**](https://www.linkedin.com/pulse/fast-visualization-big-data-streams-using-sliding-maurice-ph-d-/?trackingId=OFDLTUUUSGqsgYTC0srzdw%3D%3D) 72 | 73 | *** 74 | IF CREATING PRODUCTION TML SOLUTIONS: [SEE HERE FOR CODE EXAMPLES](https://github.com/smaurice101/TML_Production_Code_Examples) 75 | *** 76 | **EXAMPLE TML PYTHON CODE**: You can build extremely powerful, distributed, and scalable cloud-based machine learning solutions with the code below for a business use case of any size, with low code and at low cost! 77 | 78 | **CODE SET 1:** This set of programs walks through an example of predicting and optimizing Foot Traffic at ~11,000 Walmart Stores. 
79 | 80 | **Step 1:** 81 | 82 | [**Produce Walmart Data to Kafka Cluster** (Let this run for 5 minutes or so, THEN run the Machine Learning code next)](https://github.com/smaurice101/produce_data_to_kafka) 83 | 84 | **Step 2:** 85 | 86 | [**Walmart Foot Traffic TML** (Let this run for 5 minutes or so, THEN run the Prediction/Optimization code next)](https://github.com/smaurice101/Walmart-Foot-Traffic-Transactional-Machine-Learning) 87 | 88 | **Step 3:** 89 | 90 | [**Perform Walmart Foot Traffic Prediction and Optimization**](https://github.com/smaurice101/Walmart-Predict-and-Optimize-Foot-Traffic) 91 | 92 | **Step 4:** 93 | 94 | To visualize the results in Step 3, run MAADS-VIPER Visualization (VIPERviz) and then enter the following URLs: 95 | 96 | **_Visualize Predictions:_** 97 | https://127.0.0.1:8003/prediction.html?topic=otics-tmlbook-walmartretail-foottrafic-prediction-results-output&offset=-1&groupid=&rollbackoffset=10&topictype=prediction&append=0&secure=1&consumerid=[Enter Consumer ID for Topic=otics-tmlbook-walmartretail-foottrafic-prediction-results-output]&vipertoken=hivmg1TMR1zS1ZHVqF4s83Zq1rDtsZKh9pEULHnLR0BXPlaPEMZBEAyC7TY0 98 | 99 | **_Visualize Optimization:_** 100 | https://127.0.0.1:8003/optimization.html?topic=otics-tmlbook-walmartretail-foottrafic-optimization-results-output&offset=-1&groupid=&rollbackoffset=10&topictype=optimization&secure=1&append=0&consumerid=[Enter Consumer ID for Topic=otics-tmlbook-walmartretail-foottrafic-optimization-results-output]&vipertoken=hivmg1TMR1zS1ZHVqF4s83Zq1rDtsZKh9pEULHnLR0BXPlaPEMZBEAyC7TY0 101 | 102 | 103 | The Above Assumes: 104 | 1) You have created a Kafka cluster in Confluent Cloud (or AWS, Microsoft, or Google Cloud) 105 | 2) You have MAADSViz running on IP: 127.0.0.1 and listening on Port: 8003 106 | 3) You downloaded the views zip and extracted its contents to the **viperviz/views folder** 107 | 4) You added the consumer id for Topic=otics-tmlbook-walmartretail-foottrafic-prediction-results-output and 
Topic=otics-tmlbook-walmartretail-foottrafic-optimization-results-output 108 | 5) These Consumer IDs are printed out for you in the Python Program in Step 1 c) 109 | 110 | **CODE SET 2:** This set of programs will perform Bank Fraud detection on 50 Bank accounts, with 5 fields in each transaction. It will detect fraud in real time. 111 | 112 | **Step 1:** 113 | 114 | [**Produce Bank Account Data to Kafka Cluster** (Let this run for 5 minutes or so, THEN run the Anomaly Detection code next)](https://github.com/smaurice101/Produce-Bank-Fraud-Data-to-Kafka) 115 | 116 | **Step 2:** 117 | 118 | [**Perform Transactional Bank Fraud Detection on Streaming Data** (uses multi-threading in Python)](https://github.com/smaurice101/Predict-Bank-Fraud) 119 | 120 | **Step 3:** 121 | 122 | **_Visualize Anomalies:_** 123 | 124 | https://127.0.0.1:8003/anomaly.html?topic=otics-tmlbook-anomalydataresults&offset=-1&rollbackoffset=20&append=0&topictype=anomaly&secure=1&groupid=&consumerid=[Enter your Consumer ID for otics-tmlbook-anomalydataresults]&vipertoken=hivmg1TMR1zS1ZHVqF4s83Zq1rDtsZKh9pEULHnLR0BXPlaPEMZBEAyC7TY0 125 | 126 | The Above Assumes: 127 | 1) You have created a Kafka cluster in Confluent Cloud (or AWS, Microsoft, or Google Cloud) 128 | 2) You have MAADSViz running on IP: 127.0.0.1 and listening on Port: 8003 129 | 3) You downloaded the views zip and extracted its contents to the **viperviz/views folder** 130 | 4) You added the consumer id for Topic=otics-tmlbook-anomalydataresults 131 | 5) These Consumer IDs are printed out for you in the Python Program in Step 2 b) 132 | 133 | **_NOTE:_** Please monitor your Cloud Billing/Payments - DELETE YOUR CLUSTER WHEN YOU ARE DONE. DO NOT LET YOUR CLUSTER RUN IF YOU ARE NOT USING IT. The above programs will auto-create all data very quickly, so you can DELETE your cluster immediately. Confluent will give you $200 of free cloud credits, and the above programs will consume only a small fraction of them. 
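All of the VIPERviz visualization URLs above share the same query-string pattern. As a minimal sketch, the helper below (hypothetical, not part of the TML distribution; only the parameter names visible in the URLs above are assumed) assembles such a URL and helps avoid copy/paste errors when swapping in your own topic, consumer ID, and VIPER token:

```python
# Hypothetical helper for building VIPERviz URLs (prediction.html,
# optimization.html, anomaly.html) from their query-string parameters.
from urllib.parse import urlencode

def viperviz_url(host, port, page, topic, consumerid, vipertoken,
                 topictype, rollbackoffset=10, offset=-1, append=0,
                 secure=1, groupid=""):
    """Assemble a URL like https://127.0.0.1:8003/prediction.html?topic=..."""
    params = {
        "topic": topic,
        "offset": offset,
        "groupid": groupid,
        "rollbackoffset": rollbackoffset,
        "topictype": topictype,
        "append": append,
        "secure": secure,
        "consumerid": consumerid,
        "vipertoken": vipertoken,
    }
    return f"https://{host}:{port}/{page}?{urlencode(params)}"

# Example: the prediction URL from CODE SET 1, with placeholder credentials.
url = viperviz_url("127.0.0.1", 8003, "prediction.html",
                   "otics-tmlbook-walmartretail-foottrafic-prediction-results-output",
                   "YOUR-CONSUMER-ID", "YOUR-VIPERTOKEN", "prediction")
print(url)
```

The defaults mirror the values used in the prediction URL; pass `rollbackoffset=20` and `topictype="anomaly"` to reproduce the anomaly URL shape instead.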
134 | *** 135 | -------------------------------------------------------------------------------- /anomalydetectionimage.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/anomalydetectionimage.jpg -------------------------------------------------------------------------------- /checkopenports: -------------------------------------------------------------------------------- 1 | import psutil 2 | import maadstml 3 | 4 | def getlisterners(): 5 | pidlist=psutil.pids() 6 | connections = psutil.net_connections() 7 | free=list() 8 | for c in connections: 9 | if c.status=="LISTEN": 10 | p = psutil.Process(c.pid) 11 | attrs=[p.name(),c.laddr.ip,c.laddr.port] 12 | free.append(attrs) 13 | return free 14 | 15 | def broadcast(httptype): 16 | freelist=getlisterners() 17 | freetml=list() 18 | 19 | for b in freelist: 20 | appname=b[0] 21 | if b[1]=='0.0.0.0': 22 | b[1]="127.0.0.1" 23 | host=httptype+"://"+b[1] 24 | 25 | port=b[2] 26 | try: 27 | if "mainhpd" in appname or "viper" in appname or "hpde" in appname: 28 | msg=maadstml.areyoubusy(host,port) 29 | if msg!=None: 30 | mlist=msg.split(",") 31 | if mlist[1]=="no": 32 | freetml.append([mlist[0],host,port,mlist[len(mlist)-1]]) 33 | 34 | except Exception as e: 35 | continue 36 | 37 | return freetml 38 | 39 | def broadcast_getavailable_viper_hpde(): 40 | freetml1=broadcast("https") 41 | freetml2=broadcast("http") 42 | 43 | mainfreetml=freetml1 + freetml2 44 | return mainfreetml 45 | 46 | available_viper_hpde_list=broadcast_getavailable_viper_hpde() 47 | -------------------------------------------------------------------------------- /cluster.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/cluster.jpg 
-------------------------------------------------------------------------------- /genericimage.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/genericimage.jpg -------------------------------------------------------------------------------- /logstreamingimage.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/logstreamingimage.jpg -------------------------------------------------------------------------------- /mirrorbrokers/docker-compose.yml: -------------------------------------------------------------------------------- 1 | version: '3.4' 2 | 3 | services: 4 | otics-mirrorbrokers: 5 | image: maadsdocker/otics-mirrorbrokers 6 | container_name: mirrorbrokers 7 | network_mode: host 8 | env_file: 9 | - viper-generic.env 10 | volumes: 11 | - mirrorbrokersvol:/otics/tml/mirrorbrokers 12 | environment: 13 | BROKERJSONFILE: > 14 | { 15 | "brokers":[{ 16 | "id":1, 17 | "brokerfrom":"pkc-6ojv2.us-west4.gcp.confluent.cloud:9092", 18 | "brokerusernamefrom":"NELJO4LM3G4IJ72A:Ar5CxLsPKErfuidEWuiom6GueiaQSKpKCV5lriBlqMucNCRfjSqyHZ4+nMOl6xSz", 19 | "brokerto":"pkc-419q3.us-east4.gcp.confluent.cloud:9092", 20 | "brokerusernameto":"MW7MSO4DV5TVHQVL:pm5j+BYyRgGl7zaE1so8cIqRUrS3gqx9X/e3SuaZ2Ef+Gdh43O911yAM8qJe8Ucf", 21 | "enabletlsfrom":"1", 22 | "enabletlsto":"1", 23 | "compressionfrom":"snappy", 24 | "compressionto":"snappy", 25 | "saslfrom":"PLAIN", 26 | "saslto":"PLAIN" 27 | }, 28 | { 29 | "id":2, 30 | "brokerfrom":"pkc-ymrq7.us-east-2.aws.confluent.cloud:9092", 31 | "brokerusernamefrom":"5R3MSIALEXOZEXCL:Yt8hNOASEjPTjmRW0eAWvv9zNwOAGTDrYXK5Yd6vTJxTGr+UuqDq/fYqgXAdMeNN", 32 | "brokerto":"pkc-pgq85.us-west-2.aws.confluent.cloud:9092", 33 | 
"brokerusernameto":"33QXYXWFM7RC6XEK:w0nulHyorxbYoYw7utHsaM9dTkJTxQMwZnbZErv8W3Ka4nwVEWkLfAjtkhJt8MxM", 34 | "enabletlsfrom":"1", 35 | "enabletlsto":"1", 36 | "compressionfrom":"snappy", 37 | "compressionto":"snappy", 38 | "saslfrom":"PLAIN", 39 | "saslto":"PLAIN" 40 | }, 41 | { 42 | "id":3, 43 | "brokerfrom":"pkc-56d1g.eastus.azure.confluent.cloud:9092", 44 | "brokerusernamefrom":"G2Z5CIXJYNDF6HPY:umT7djfREmmfVEuWaJIj9Q0sJVO6PusrMP13CIfJfDZQC3lFkzbNq7pJtrVMt2J1", 45 | "brokerto":"pkc-pgq85.us-west-2.aws.confluent.cloud:9092", 46 | "brokerusernameto":"33QXYXWFM7RC6XEK:w0nulHyorxbYoYw7utHsaM9dTkJTxQMwZnbZErv8W3Ka4nwVEWkLfAjtkhJt8MxM", 47 | "enabletlsfrom":"1", 48 | "enabletlsto":"1", 49 | "compressionfrom":"snappy", 50 | "compressionto":"snappy", 51 | "saslfrom":"PLAIN", 52 | "saslto":"PLAIN" 53 | }, 54 | { 55 | "id":4, 56 | "brokerfrom":"pkc-ymrq7.us-east-2.aws.confluent.cloud:9092", 57 | "brokerusernamefrom":"RGYVM6QYUVE5LTAY:W8Ms15WuJKofsxEbuVsvVHiVgb7ItplQY89tS9uPreBP0cpp3HtJKtg2IfiNw649", 58 | "brokerto":"pkc-419q3.us-east4.gcp.confluent.cloud:9092", 59 | "brokerusernameto":"MW7MSO4DV5TVHQVL:pm5j+BYyRgGl7zaE1so8cIqRUrS3gqx9X/e3SuaZ2Ef+Gdh43O911yAM8qJe8Ucf", 60 | "enabletlsfrom":"1", 61 | "enabletlsto":"1", 62 | "compressionfrom":"snappy", 63 | "compressionto":"snappy", 64 | "saslfrom":"PLAIN", 65 | "saslto":"PLAIN" 66 | }, 67 | { 68 | "id":5, 69 | "brokerfrom":"pkc-mz3gw.westus3.azure.confluent.cloud:9092", 70 | "brokerusernamefrom":"FNKDA5VOA56FGLRX:oLC8LuUjC5YoHLrzmbUEZoARugJyG6agbVrUOWSfJzJBKCROqwMScXScyFdnRtW8", 71 | "brokerto":"pkc-pgq85.us-west-2.aws.confluent.cloud:9092", 72 | "brokerusernameto":"33QXYXWFM7RC6XEK:w0nulHyorxbYoYw7utHsaM9dTkJTxQMwZnbZErv8W3Ka4nwVEWkLfAjtkhJt8MxM", 73 | "enabletlsfrom":"1", 74 | "enabletlsto":"1", 75 | "compressionfrom":"snappy", 76 | "compressionto":"snappy", 77 | "saslfrom":"PLAIN", 78 | "saslto":"PLAIN" 79 | }, 80 | { 81 | "id":6, 82 | "brokerfrom":"pkc-ldvmy.centralus.azure.confluent.cloud:9092", 83 | 
"brokerusernamefrom":"LCYT5Q753JI2FG4D:6/wzTsuBKkgp4jaa9SYeNxqgiQ5Sy4kr+2sVeTDdrUf0syUwMY/wUEMNoybZqwgO", 84 | "brokerto":"pkc-419q3.us-east4.gcp.confluent.cloud:9092", 85 | "brokerusernameto":"MW7MSO4DV5TVHQVL:pm5j+BYyRgGl7zaE1so8cIqRUrS3gqx9X/e3SuaZ2Ef+Gdh43O911yAM8qJe8Ucf", 86 | "enabletlsfrom":"1", 87 | "enabletlsto":"1", 88 | "compressionfrom":"snappy", 89 | "compressionto":"snappy", 90 | "saslfrom":"PLAIN", 91 | "saslto":"PLAIN" 92 | } 93 | ] 94 | } 95 | ports: 96 | - "9092:9092" 97 | volumes: 98 | mirrorbrokersvol: 99 | name: mirrorbrokers-vol 100 | -------------------------------------------------------------------------------- /mirrorbrokers/mirrorbroker.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/mirrorbrokers/mirrorbroker.pdf -------------------------------------------------------------------------------- /mirrorbrokers/mirrorbrokersvol-persistentvolumeclaim.yaml: -------------------------------------------------------------------------------- 1 | apiVersion: v1 2 | kind: PersistentVolumeClaim 3 | metadata: 4 | creationTimestamp: null 5 | labels: 6 | io.kompose.service: mirrorbrokersvol 7 | name: mirrorbrokersvol 8 | spec: 9 | accessModes: 10 | - ReadWriteOnce 11 | resources: 12 | requests: 13 | storage: 100Mi 14 | status: {} 15 | -------------------------------------------------------------------------------- /mirrorbrokers/otics-mirrorbrokers-deployment.yaml: -------------------------------------------------------------------------------- 1 | apiVersion: apps/v1 2 | kind: Deployment 3 | metadata: 4 | annotations: 5 | kompose.cmd: kompose convert 6 | kompose.version: 1.26.0 (40646f47) 7 | creationTimestamp: null 8 | labels: 9 | io.kompose.service: otics-mirrorbrokers 10 | name: otics-mirrorbrokers 11 | spec: 12 | replicas: 1 13 | selector: 14 | matchLabels: 15 | io.kompose.service: 
otics-mirrorbrokers 16 | strategy: 17 | type: Recreate 18 | template: 19 | metadata: 20 | annotations: 21 | kompose.cmd: kompose convert 22 | kompose.version: 1.26.0 (40646f47) 23 | creationTimestamp: null 24 | labels: 25 | io.kompose.service: otics-mirrorbrokers 26 | spec: 27 | containers: 28 | - env: 29 | - name: ASYNCTIMEOUT 30 | valueFrom: 31 | configMapKeyRef: 32 | key: ASYNCTIMEOUT 33 | name: viper-generic-env 34 | - name: BATCHTHREADS 35 | valueFrom: 36 | configMapKeyRef: 37 | key: BATCHTHREADS 38 | name: viper-generic-env 39 | - name: BROKERJSONFILE 40 | value: | 41 | { 42 | "brokers":[{ 43 | "id":1, 44 | "brokerfrom":"pkc-6ojv2.us-west4.gcp.confluent.cloud:9092", 45 | "brokerusernamefrom":"NELJO4LM3G4IJ72A:Ar5CxLsPKErfuidEWuiom6GueiaQSKpKCV5lriBlqMucNCRfjSqyHZ4+nMOl6xSz", 46 | "brokerto":"pkc-419q3.us-east4.gcp.confluent.cloud:9092", 47 | "brokerusernameto":"MW7MSO4DV5TVHQVL:pm5j+BYyRgGl7zaE1so8cIqRUrS3gqx9X/e3SuaZ2Ef+Gdh43O911yAM8qJe8Ucf", 48 | "enabletlsfrom":"1", 49 | "enabletlsto":"1", 50 | "compressionfrom":"snappy", 51 | "compressionto":"snappy", 52 | "saslfrom":"PLAIN", 53 | "saslto":"PLAIN" 54 | }, 55 | { 56 | "id":2, 57 | "brokerfrom":"pkc-ymrq7.us-east-2.aws.confluent.cloud:9092", 58 | "brokerusernamefrom":"5R3MSIALEXOZEXCL:Yt8hNOASEjPTjmRW0eAWvv9zNwOAGTDrYXK5Yd6vTJxTGr+UuqDq/fYqgXAdMeNN", 59 | "brokerto":"pkc-pgq85.us-west-2.aws.confluent.cloud:9092", 60 | "brokerusernameto":"33QXYXWFM7RC6XEK:w0nulHyorxbYoYw7utHsaM9dTkJTxQMwZnbZErv8W3Ka4nwVEWkLfAjtkhJt8MxM", 61 | "enabletlsfrom":"1", 62 | "enabletlsto":"1", 63 | "compressionfrom":"snappy", 64 | "compressionto":"snappy", 65 | "saslfrom":"PLAIN", 66 | "saslto":"PLAIN" 67 | }, 68 | { 69 | "id":3, 70 | "brokerfrom":"pkc-56d1g.eastus.azure.confluent.cloud:9092", 71 | "brokerusernamefrom":"G2Z5CIXJYNDF6HPY:umT7djfREmmfVEuWaJIj9Q0sJVO6PusrMP13CIfJfDZQC3lFkzbNq7pJtrVMt2J1", 72 | "brokerto":"pkc-pgq85.us-west-2.aws.confluent.cloud:9092", 73 | 
"brokerusernameto":"33QXYXWFM7RC6XEK:w0nulHyorxbYoYw7utHsaM9dTkJTxQMwZnbZErv8W3Ka4nwVEWkLfAjtkhJt8MxM", 74 | "enabletlsfrom":"1", 75 | "enabletlsto":"1", 76 | "compressionfrom":"snappy", 77 | "compressionto":"snappy", 78 | "saslfrom":"PLAIN", 79 | "saslto":"PLAIN" 80 | }, 81 | { 82 | "id":4, 83 | "brokerfrom":"pkc-ymrq7.us-east-2.aws.confluent.cloud:9092", 84 | "brokerusernamefrom":"RGYVM6QYUVE5LTAY:W8Ms15WuJKofsxEbuVsvVHiVgb7ItplQY89tS9uPreBP0cpp3HtJKtg2IfiNw649", 85 | "brokerto":"pkc-419q3.us-east4.gcp.confluent.cloud:9092", 86 | "brokerusernameto":"MW7MSO4DV5TVHQVL:pm5j+BYyRgGl7zaE1so8cIqRUrS3gqx9X/e3SuaZ2Ef+Gdh43O911yAM8qJe8Ucf", 87 | "enabletlsfrom":"1", 88 | "enabletlsto":"1", 89 | "compressionfrom":"snappy", 90 | "compressionto":"snappy", 91 | "saslfrom":"PLAIN", 92 | "saslto":"PLAIN" 93 | }, 94 | { 95 | "id":5, 96 | "brokerfrom":"pkc-mz3gw.westus3.azure.confluent.cloud:9092", 97 | "brokerusernamefrom":"FNKDA5VOA56FGLRX:oLC8LuUjC5YoHLrzmbUEZoARugJyG6agbVrUOWSfJzJBKCROqwMScXScyFdnRtW8", 98 | "brokerto":"pkc-pgq85.us-west-2.aws.confluent.cloud:9092", 99 | "brokerusernameto":"33QXYXWFM7RC6XEK:w0nulHyorxbYoYw7utHsaM9dTkJTxQMwZnbZErv8W3Ka4nwVEWkLfAjtkhJt8MxM", 100 | "enabletlsfrom":"1", 101 | "enabletlsto":"1", 102 | "compressionfrom":"snappy", 103 | "compressionto":"snappy", 104 | "saslfrom":"PLAIN", 105 | "saslto":"PLAIN" 106 | }, 107 | { 108 | "id":6, 109 | "brokerfrom":"pkc-ldvmy.centralus.azure.confluent.cloud:9092", 110 | "brokerusernamefrom":"LCYT5Q753JI2FG4D:6/wzTsuBKkgp4jaa9SYeNxqgiQ5Sy4kr+2sVeTDdrUf0syUwMY/wUEMNoybZqwgO", 111 | "brokerto":"pkc-419q3.us-east4.gcp.confluent.cloud:9092", 112 | "brokerusernameto":"MW7MSO4DV5TVHQVL:pm5j+BYyRgGl7zaE1so8cIqRUrS3gqx9X/e3SuaZ2Ef+Gdh43O911yAM8qJe8Ucf", 113 | "enabletlsfrom":"1", 114 | "enabletlsto":"1", 115 | "compressionfrom":"snappy", 116 | "compressionto":"snappy", 117 | "saslfrom":"PLAIN", 118 | "saslto":"PLAIN" 119 | } 120 | ] 121 | } 122 | - name: BROKER_HOSTPORT_FROM 123 | valueFrom: 124 | configMapKeyRef: 
125 | key: BROKER_HOSTPORT_FROM 126 | name: viper-generic-env 127 | - name: BROKER_HOSTPORT_TO 128 | valueFrom: 129 | configMapKeyRef: 130 | key: BROKER_HOSTPORT_TO 131 | name: viper-generic-env 132 | - name: BROKER_USERNAME_PASS_FROM 133 | valueFrom: 134 | configMapKeyRef: 135 | key: BROKER_USERNAME_PASS_FROM 136 | name: viper-generic-env 137 | - name: BROKER_USERNAME_PASS_TO 138 | valueFrom: 139 | configMapKeyRef: 140 | key: BROKER_USERNAME_PASS_TO 141 | name: viper-generic-env 142 | - name: CLOUD_PASSWORD 143 | valueFrom: 144 | configMapKeyRef: 145 | key: CLOUD_PASSWORD 146 | name: viper-generic-env 147 | - name: CLOUD_USERNAME 148 | valueFrom: 149 | configMapKeyRef: 150 | key: CLOUD_USERNAME 151 | name: viper-generic-env 152 | - name: COMPANYNAME 153 | valueFrom: 154 | configMapKeyRef: 155 | key: COMPANYNAME 156 | name: viper-generic-env 157 | - name: COMPRESSIONTYPE 158 | valueFrom: 159 | configMapKeyRef: 160 | key: COMPRESSIONTYPE 161 | name: viper-generic-env 162 | - name: COMPRESSION_FROM 163 | valueFrom: 164 | configMapKeyRef: 165 | key: COMPRESSION_FROM 166 | name: viper-generic-env 167 | - name: COMPRESSION_TO 168 | valueFrom: 169 | configMapKeyRef: 170 | key: COMPRESSION_TO 171 | name: viper-generic-env 172 | - name: ENABLETLS_FROM 173 | valueFrom: 174 | configMapKeyRef: 175 | key: ENABLETLS_FROM 176 | name: viper-generic-env 177 | - name: ENABLETLS_TO 178 | valueFrom: 179 | configMapKeyRef: 180 | key: ENABLETLS_TO 181 | name: viper-generic-env 182 | - name: FILEAGEMAX 183 | valueFrom: 184 | configMapKeyRef: 185 | key: FILEAGEMAX 186 | name: viper-generic-env 187 | - name: KAFKA_ADVERTISED_HOST_NAME 188 | valueFrom: 189 | configMapKeyRef: 190 | key: KAFKA_ADVERTISED_HOST_NAME 191 | name: viper-generic-env 192 | - name: KAFKA_CONNECT_BOOTSTRAP_SERVERS 193 | valueFrom: 194 | configMapKeyRef: 195 | key: KAFKA_CONNECT_BOOTSTRAP_SERVERS 196 | name: viper-generic-env 197 | - name: KAFKA_ZOOKEEPER_CONNECT 198 | valueFrom: 199 | configMapKeyRef: 200 | key: 
KAFKA_ZOOKEEPER_CONNECT 201 | name: viper-generic-env 202 | - name: KUBERNETES 203 | valueFrom: 204 | configMapKeyRef: 205 | key: KUBERNETES 206 | name: viper-generic-env 207 | - name: LOGSTREAMTOPIC 208 | valueFrom: 209 | configMapKeyRef: 210 | key: LOGSTREAMTOPIC 211 | name: viper-generic-env 212 | - name: LOGSTREAMTOPICPARTITIONS 213 | valueFrom: 214 | configMapKeyRef: 215 | key: LOGSTREAMTOPICPARTITIONS 216 | name: viper-generic-env 217 | - name: LOGSTREAMTOPICREPLICATIONFACTOR 218 | valueFrom: 219 | configMapKeyRef: 220 | key: LOGSTREAMTOPICREPLICATIONFACTOR 221 | name: viper-generic-env 222 | - name: MAXBROKERSPERCONTAINER 223 | valueFrom: 224 | configMapKeyRef: 225 | key: MAXBROKERSPERCONTAINER 226 | name: viper-generic-env 227 | - name: MAXCONSUMEMESSAGES 228 | valueFrom: 229 | configMapKeyRef: 230 | key: MAXCONSUMEMESSAGES 231 | name: viper-generic-env 232 | - name: MAXOPENREQUESTS 233 | valueFrom: 234 | configMapKeyRef: 235 | key: MAXOPENREQUESTS 236 | name: viper-generic-env 237 | - name: MAXPERCMESSAGES 238 | valueFrom: 239 | configMapKeyRef: 240 | key: MAXPERCMESSAGES 241 | name: viper-generic-env 242 | - name: MAXPREDICTIONROWS 243 | valueFrom: 244 | configMapKeyRef: 245 | key: MAXPREDICTIONROWS 246 | name: viper-generic-env 247 | - name: MAXPREPROCESSMESSAGES 248 | valueFrom: 249 | configMapKeyRef: 250 | key: MAXPREPROCESSMESSAGES 251 | name: viper-generic-env 252 | - name: MAXTRAININGROWS 253 | valueFrom: 254 | configMapKeyRef: 255 | key: MAXTRAININGROWS 256 | name: viper-generic-env 257 | - name: MAXVIPERVIZCONNECTIONS 258 | valueFrom: 259 | configMapKeyRef: 260 | key: MAXVIPERVIZCONNECTIONS 261 | name: viper-generic-env 262 | - name: MAXVIPERVIZROLLBACKOFFSET 263 | valueFrom: 264 | configMapKeyRef: 265 | key: MAXVIPERVIZROLLBACKOFFSET 266 | name: viper-generic-env 267 | - name: MICROSERVICEID 268 | valueFrom: 269 | configMapKeyRef: 270 | key: MICROSERVICEID 271 | name: viper-generic-env 272 | - name: NAME 273 | valueFrom: 274 | configMapKeyRef: 
275 | key: NAME 276 | name: viper-generic-env 277 | - name: ONPREM 278 | valueFrom: 279 | configMapKeyRef: 280 | key: ONPREM 281 | name: viper-generic-env 282 | - name: PARTITIONS 283 | valueFrom: 284 | configMapKeyRef: 285 | key: PARTITIONS 286 | name: viper-generic-env 287 | - name: PARTITION_CHANGE_PERC 288 | valueFrom: 289 | configMapKeyRef: 290 | key: PARTITION_CHANGE_PERC 291 | name: viper-generic-env 292 | - name: POLLING_ALERTS 293 | valueFrom: 294 | configMapKeyRef: 295 | key: POLLING_ALERTS 296 | name: viper-generic-env 297 | - name: REPLICATIONFACTOR_FROM 298 | valueFrom: 299 | configMapKeyRef: 300 | key: REPLICATIONFACTOR_FROM 301 | name: viper-generic-env 302 | - name: REPLICATIONFACTOR_TO 303 | valueFrom: 304 | configMapKeyRef: 305 | key: REPLICATIONFACTOR_TO 306 | name: viper-generic-env 307 | - name: SASLMECHANISM 308 | valueFrom: 309 | configMapKeyRef: 310 | key: SASLMECHANISM 311 | name: viper-generic-env 312 | - name: SASL_FROM 313 | valueFrom: 314 | configMapKeyRef: 315 | key: SASL_FROM 316 | name: viper-generic-env 317 | - name: SASL_TO 318 | valueFrom: 319 | configMapKeyRef: 320 | key: SASL_TO 321 | name: viper-generic-env 322 | - name: SERVICENAME_FROM 323 | valueFrom: 324 | configMapKeyRef: 325 | key: SERVICENAME_FROM 326 | name: viper-generic-env 327 | - name: SERVICENAME_TO 328 | valueFrom: 329 | configMapKeyRef: 330 | key: SERVICENAME_TO 331 | name: viper-generic-env 332 | - name: SSL_CLIENT_CERT_FILE 333 | valueFrom: 334 | configMapKeyRef: 335 | key: SSL_CLIENT_CERT_FILE 336 | name: viper-generic-env 337 | - name: SSL_CLIENT_KEY_FILE 338 | valueFrom: 339 | configMapKeyRef: 340 | key: SSL_CLIENT_KEY_FILE 341 | name: viper-generic-env 342 | - name: SSL_SERVER_CERT_FILE 343 | valueFrom: 344 | configMapKeyRef: 345 | key: SSL_SERVER_CERT_FILE 346 | name: viper-generic-env 347 | - name: SYNC_INTERVAL 348 | valueFrom: 349 | configMapKeyRef: 350 | key: SYNC_INTERVAL 351 | name: viper-generic-env 352 | - name: TOPICS_LIST_FROM 353 | valueFrom: 
354 | configMapKeyRef: 355 | key: TOPICS_LIST_FROM 356 | name: viper-generic-env 357 | - name: TOPIC_FILTER 358 | valueFrom: 359 | configMapKeyRef: 360 | key: TOPIC_FILTER 361 | name: viper-generic-env 362 | - name: USEHTTP 363 | valueFrom: 364 | configMapKeyRef: 365 | key: USEHTTP 366 | name: viper-generic-env 367 | - name: VIPERDEBUG 368 | valueFrom: 369 | configMapKeyRef: 370 | key: VIPERDEBUG 371 | name: viper-generic-env 372 | - name: WRITETOVIPERDB 373 | valueFrom: 374 | configMapKeyRef: 375 | key: WRITETOVIPERDB 376 | name: viper-generic-env 377 | - name: replicationchange 378 | valueFrom: 379 | configMapKeyRef: 380 | key: replicationchange 381 | name: viper-generic-env 382 | image: maadsdocker/otics-mirrorbrokers 383 | name: mirrorbrokers 384 | ports: 385 | - containerPort: 9092 386 | resources: {} 387 | volumeMounts: 388 | - mountPath: /otics/tml/mirrorbrokers/brokers 389 | name: mirrorbrokersvol 390 | restartPolicy: Always 391 | volumes: 392 | - name: mirrorbrokersvol 393 | persistentVolumeClaim: 394 | claimName: mirrorbrokersvol 395 | status: {} 396 | -------------------------------------------------------------------------------- /mirrorbrokers/otics-mirrorbrokers-service.yaml: -------------------------------------------------------------------------------- 1 | apiVersion: v1 2 | kind: Service 3 | metadata: 4 | annotations: 5 | kompose.cmd: kompose convert 6 | kompose.version: 1.26.0 (40646f47) 7 | creationTimestamp: null 8 | labels: 9 | io.kompose.service: otics-mirrorbrokers 10 | name: otics-mirrorbrokers 11 | spec: 12 | ports: 13 | - name: "9092" 14 | port: 9092 15 | targetPort: 9092 16 | selector: 17 | io.kompose.service: otics-mirrorbrokers 18 | status: 19 | loadBalancer: {} 20 | -------------------------------------------------------------------------------- /mirrorbrokers/readme: -------------------------------------------------------------------------------- 1 | Mirrorbrokers 2 | 
-------------------------------------------------------------------------------- /mirrorbrokers/viper-generic-env-configmap.yaml: -------------------------------------------------------------------------------- 1 | apiVersion: v1 2 | data: 3 | ASYNCTIMEOUT: "600" 4 | BATCHTHREADS: "3" 5 | BROKER_HOSTPORT_FROM: "" 6 | BROKER_HOSTPORT_TO: "" 7 | BROKER_USERNAME_PASS_FROM: "" 8 | BROKER_USERNAME_PASS_TO: "" 9 | CLOUD_PASSWORD: Zo0a+IDBkHlg47JbO9OEWV9/wJgL8mA6n1ZA9DJ6JqQmUkVJ11ihVhdR7rg9afQr 10 | CLOUD_USERNAME: PH4QHCVPVWB7UOLB 11 | COMPANYNAME: "" 12 | COMPRESSION_FROM: "" 13 | COMPRESSION_TO: "" 14 | COMPRESSIONTYPE: SNAPPY 15 | ENABLETLS_FROM: "" 16 | ENABLETLS_TO: "" 17 | FILEAGEMAX: "5" 18 | KAFKA_ADVERTISED_HOST_NAME: kafka 19 | KAFKA_CONNECT_BOOTSTRAP_SERVERS: pkc-6ojv2.us-west4.gcp.confluent.cloud:9092 20 | KAFKA_ZOOKEEPER_CONNECT: "" 21 | KUBERNETES: "0" 22 | LOGSTREAMTOPIC: "" 23 | LOGSTREAMTOPICPARTITIONS: "" 24 | LOGSTREAMTOPICREPLICATIONFACTOR: "" 25 | MAXBROKERSPERCONTAINER: "1" 26 | MAXCONSUMEMESSAGES: "3000" 27 | MAXOPENREQUESTS: "10" 28 | MAXPERCMESSAGES: "3000" 29 | MAXPREDICTIONROWS: "500" 30 | MAXPREPROCESSMESSAGES: "3000" 31 | MAXTRAININGROWS: "500" 32 | MAXVIPERVIZCONNECTIONS: "5" 33 | MAXVIPERVIZROLLBACKOFFSET: "300" 34 | MICROSERVICEID: "" 35 | NAME: OTICS 36 | ONPREM: "0" 37 | PARTITION_CHANGE_PERC: "0" 38 | PARTITIONS: "" 39 | POLLING_ALERTS: "120" 40 | REPLICATIONFACTOR_FROM: "" 41 | REPLICATIONFACTOR_TO: "" 42 | SASL_FROM: "" 43 | SASL_TO: "" 44 | SASLMECHANISM: PLAIN 45 | SERVICENAME_FROM: KAFKA Migration to Redpanda 46 | SERVICENAME_TO: KAFKA Migration to Redpanda 47 | SSL_CLIENT_CERT_FILE: client.cer.pem 48 | SSL_CLIENT_KEY_FILE: client.key.pem 49 | SSL_SERVER_CERT_FILE: server.cer.pem 50 | SYNC_INTERVAL: "10" 51 | TOPIC_FILTER: "" 52 | TOPICS_LIST_FROM: "" 53 | USEHTTP: "0" 54 | VIPERDEBUG: "2" 55 | WRITETOVIPERDB: "0" 56 | replicationchange: "0" 57 | kind: ConfigMap 58 | metadata: 59 | creationTimestamp: null 60 | labels: 61 | 
io.kompose.service: otics-mirrorbrokers-viper-generic-env 62 | name: viper-generic-env 63 | -------------------------------------------------------------------------------- /mirrorbrokers/viper-generic.env: -------------------------------------------------------------------------------- 1 | KAFKA_ADVERTISED_HOST_NAME=kafka 2 | KAFKA_ZOOKEEPER_CONNECT= 3 | 4 | ######################################################################### BEGIN VIPER CONFIG 5 | # ENTER SERVER FOR VIPER TO CONNECT TO 6 | KAFKA_CONNECT_BOOTSTRAP_SERVERS=pkc-6ojv2.us-west4.gcp.confluent.cloud:9092 7 | SASLMECHANISM=PLAIN 8 | COMPRESSIONTYPE=SNAPPY 9 | SSL_CLIENT_CERT_FILE=client.cer.pem 10 | SSL_CLIENT_KEY_FILE=client.key.pem 11 | SSL_SERVER_CERT_FILE=server.cer.pem 12 | 13 | CLOUD_USERNAME=PH4QHCVPVWB7UOLB 14 | CLOUD_PASSWORD=Zo0a+IDBkHlg47JbO9OEWV9/wJgL8mA6n1ZA9DJ6JqQmUkVJ11ihVhdR7rg9afQr 15 | USEHTTP=0 16 | ONPREM=0 17 | ASYNCTIMEOUT=600 18 | MICROSERVICEID= 19 | 20 | ######################################################################### END VIPER CONFIG 21 | 22 | ######################################################################### BEGIN MIRRORBROKERS CONFIG 23 | 24 | # List source username:password,username:password 25 | BROKER_USERNAME_PASS_FROM= 26 | 27 | # List source brokers host:port,host:port 28 | BROKER_HOSTPORT_FROM= 29 | 30 | # List destination username:password,username:password 31 | BROKER_USERNAME_PASS_TO= 32 | 33 | # List destination brokers host:port,host:port (if using multiple brokers, separate them with commas) 34 | BROKER_HOSTPORT_TO= 35 | 36 | # List topics to migrate, or leave blank to let VIPER get the list from the source broker(s) 37 | TOPICS_LIST_FROM= 38 | 39 | # Specify TLS: 1=TLS, 0=no TLS; use : to separate per broker, e.g. 1:1 means both source brokers need TLS 40 | ENABLETLS_FROM= 41 | ENABLETLS_TO= 42 | 43 | # Specify replication factor, use : to separate per broker, e.g.
3:3 means both source brokers have a replication factor of 3 44 | # Or leave blank to let Viper choose 45 | REPLICATIONFACTOR_FROM= 46 | REPLICATIONFACTOR_TO= 47 | 48 | # Specify compression, use : to separate compression per broker, e.g. snappy:snappy 49 | # valid are SNAPPY,GZIP,LZ4,NONE 50 | COMPRESSION_FROM= 51 | COMPRESSION_TO= 52 | 53 | # Specify SASL mechanism, use : to separate SASL per broker, e.g. plain:scram512 54 | # valid are plain, scram512, scram256 55 | SASL_FROM= 56 | SASL_TO= 57 | 58 | # If TOPICS_LIST_FROM is not empty, specify partitions to use for new topics on the destination broker 59 | # partitions=2,3:4,5 - 2 topics on 2 brokers, or leave blank for Viper to determine 60 | PARTITIONS= 61 | 62 | # Give any name for the source and destination brokers 63 | SERVICENAME_FROM=KAFKA Migration to Redpanda 64 | SERVICENAME_TO=KAFKA Migration to Redpanda 65 | 66 | # Specify a number between 0-100, a percentage by which to increase or decrease partitions, e.g. 20 67 | # increases partitions by 20% 68 | PARTITION_CHANGE_PERC=0 69 | replicationchange=0 70 | 71 | # Specify a filter for topics on the source broker to copy to the destination broker 72 | # format is: searchstring1,searchstring2,...:0 or 1:0,1,2 73 | # Middle is: 0=AND, 1=OR 74 | # Last is: 0=beginswith, 1=any, 2=endswith 75 | TOPIC_FILTER= 76 | 77 | # Specify sync interval in seconds - Viper will continuously copy data from source to destination without duplicates 78 | SYNC_INTERVAL=10 79 | 80 | FILEAGEMAX=5 81 | MAXBROKERSPERCONTAINER=1 82 | 83 | 84 | ######################################################################### END MIRRORBROKERS CONFIG 85 | 86 | 87 | 88 | WRITETOVIPERDB=0 89 | VIPERDEBUG=2 90 | MAXOPENREQUESTS=10 91 | 92 | LOGSTREAMTOPIC= 93 | LOGSTREAMTOPICPARTITIONS= 94 | LOGSTREAMTOPICREPLICATIONFACTOR= 95 | 96 | MAXTRAININGROWS=500 97 | MAXPREDICTIONROWS=500 98 | MAXPREPROCESSMESSAGES=3000 99 | MAXPERCMESSAGES=3000 100 | MAXCONSUMEMESSAGES=3000 101 | 102 | MAXVIPERVIZROLLBACKOFFSET=300 103 |
MAXVIPERVIZCONNECTIONS=5 104 | 105 | 106 | KUBERNETES=0 107 | 108 | BATCHTHREADS=3 109 | 110 | POLLING_ALERTS=120 111 | 112 | COMPANYNAME= 113 | NAME=OTICS 114 | -------------------------------------------------------------------------------- /optimizationimage.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/optimizationimage.jpg -------------------------------------------------------------------------------- /predictionimage.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/predictionimage.jpg -------------------------------------------------------------------------------- /preprocess.jpg: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/smaurice101/transactionalmachinelearning/83958ca9694a3b45e1e1ab5c4c73be4c95348482/preprocess.jpg --------------------------------------------------------------------------------
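The `TOPIC_FILTER` comment in `viper-generic.env` documents the format `searchstring1,searchstring2,...:0 or 1:0,1,2`, where the middle field selects AND/OR logic and the last field selects begins-with, anywhere, or ends-with matching. The sketch below is only an illustration of how a filter in that format could be interpreted; the function `topic_matches` is hypothetical and not part of VIPER.

```python
# Hypothetical interpreter for the TOPIC_FILTER format documented in
# viper-generic.env: "searchstring1,searchstring2,...:0 or 1:0,1,2"
# Middle field: 0=AND, 1=OR. Last field: 0=beginswith, 1=any, 2=endswith.
def topic_matches(topic: str, topic_filter: str) -> bool:
    """Return True if `topic` satisfies `topic_filter` (empty filter matches all)."""
    if not topic_filter:
        return True
    terms_part, logic_part, mode_part = topic_filter.split(":")
    terms = terms_part.split(",")
    mode = int(mode_part)  # 0=beginswith, 1=any, 2=endswith

    def term_hit(term: str) -> bool:
        if mode == 0:
            return topic.startswith(term)
        if mode == 2:
            return topic.endswith(term)
        return term in topic  # mode 1: match anywhere in the topic name

    hits = [term_hit(t) for t in terms]
    return all(hits) if logic_part == "0" else any(hits)

# Example: copy only topics that begin with "orders" OR "payments"
print(topic_matches("orders-eu", "orders,payments:1:0"))   # True
print(topic_matches("audit-log", "orders,payments:1:0"))   # False
```

An empty `TOPIC_FILTER=` falls through the first check, matching the env file's behavior of copying all topics when no filter is set.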
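`PARTITION_CHANGE_PERC` is documented as a 0-100 percentage by which partition counts are changed on the destination broker (e.g. 20 increases partitions by 20%). A minimal sketch of that arithmetic follows; the function name and the round-up behavior are assumptions for illustration, since the env file does not document how VIPER rounds fractional results.

```python
import math

# Hypothetical illustration of PARTITION_CHANGE_PERC from viper-generic.env:
# scale a topic's partition count by a 0-100 percentage.
# Rounding up is an assumption; VIPER's actual rounding is not documented here.
def adjusted_partitions(source_partitions: int, change_perc: int) -> int:
    """Return the destination partition count after applying change_perc."""
    if change_perc == 0:
        return source_partitions  # 0 (the default) leaves partitions unchanged
    return math.ceil(source_partitions * (1 + change_perc / 100))

print(adjusted_partitions(10, 20))  # 12 (the documented "increase by 20%" case)
print(adjusted_partitions(10, 0))   # 10
```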