├── README.md
├── investigateMetrics.py
├── investigateTraces.py
├── reading.py
└── workloads
    ├── boot_delete_config
    │   ├── boot-and-delete.yaml
    │   ├── commands-to-exec-nova-compute.sh
    │   ├── commands-to-exec-nova.sh
    │   ├── j_boot_delete_output.json
    │   ├── restart_nova_compute_nodes.sh
    │   ├── restart_nova_container.sh
    │   ├── trace_to_csv.py
    │   ├── trace_to_csv_1.1.py
    │   └── traceids_j_boot_delete.txt
    ├── create_delete_image_config
    │   ├── commands-to-exec-glance.sh
    │   ├── create-and-delete-image.yaml
    │   ├── j_image_output.html
    │   ├── j_image_output.json
    │   ├── list-images.yml
    │   ├── output_jasmin_image.html
    │   ├── output_jasmin_image.json
    │   ├── output_jasmin_images.json
    │   ├── restart_glance_container.sh
    │   ├── test.json
    │   ├── trace_ids_18th_nov
    │   ├── traceids_j_image.txt
    │   └── traceids_jasmin.txt
    ├── json_trace_ids.py
    ├── keystone_auth.yaml
    ├── network_create_delete_config
    │   ├── commands-to-exec-neutron.sh
    │   ├── create-and-delete-networks.yaml
    │   ├── debug.txt
    │   ├── glances_out.csv
    │   ├── restart_neutron_container.sh
    │   └── traceids_j_network.txt
    └── os-faults-latest.yaml
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
 1 | # Multi-Source Distributed System Data for AI-powered Analytics
 2 | This repository contains simple scripts for data statistics, and a link to the multi-source distributed system dataset.
 3 | 
 4 | You may find details of this dataset in the original paper:
 5 | 
 6 | *Sasho Nedelkoski, Jasmin Bogatinovski, Ajay Kumar Mandapati, Soeren Becker, Jorge Cardoso, Odej Kao, "Multi-Source Distributed System Data for AI-powered Analytics".*
 7 | 
 8 | 
 9 | 

10 | @inproceedings{nedelkoski2020multi,
11 |   title={Multi-source Distributed System Data for AI-Powered Analytics},
12 |   author={Nedelkoski, Sasho and Bogatinovski, Jasmin and Mandapati, Ajay Kumar and Becker, Soeren and Cardoso, Jorge and Kao, Odej},
13 |   booktitle={European Conference on Service-Oriented and Cloud Computing},
14 |   pages={161--176},
15 |   year={2020},
16 |   organization={Springer}
17 | }
18 |   
19 | 
20 | If you use the data, implementation, or any details of the paper, please cite!
21 | 
22 | Abstract:
23 | 
24 | In recent years there has been an increased interest in Artificial Intelligence for IT Operations (AIOps). This field utilizes monitoring data from IT systems, big data platforms, and machine learning to automate various operations and maintenance (O&M) tasks for distributed systems.
25 | The major contributions have been materialized in the form of novel algorithms.
26 | Typically, researchers took on the challenge of exploring one specific type of observability data source, such as application logs, metrics, or distributed traces, to create new algorithms.
27 | Nonetheless, due to the low signal-to-noise ratio of monitoring data, there is a consensus that only the analysis of multi-source monitoring data will enable the development of useful algorithms with better performance.
28 | Unfortunately, existing datasets usually contain only a single source of data, often logs or metrics. This limits the possibilities for greater advances in AIOps research.
29 | Thus, we generated high-quality multi-source data composed of distributed traces, application logs, and metrics from a complex distributed system. This paper provides detailed descriptions of the experiment and statistics of the data, and identifies how such data can be analyzed to support O&M tasks such as anomaly detection, root cause analysis, and remediation.
30 | 
31 | General Information:
32 | 
33 | This repository contains simple scripts for data statistics, and a link to the multi-source distributed system dataset.
34 | 
35 | The multi-source/multimodal dataset is composed of distributed traces, application logs, and metrics produced from running a complex distributed system (OpenStack). In addition, we also provide the workload and fault scripts together with the Rally report, which can serve as ground truth (all at the Zenodo link below).
We provide two datasets, which differ in how the workload was executed. The sequential_data set was generated by executing a workload of sequential user requests, and the concurrent_data set by executing a workload of concurrent user requests.
36 | 
37 | Important: The logs and the metrics are synchronized with respect to time, and both are recorded in CEST (Central European Summer Time). The traces are in UTC (Coordinated Universal Time, i.e. CEST minus 2 hours). They should be synchronized if the user develops multimodal methods. Please read the IMPORTANT_experiment_start_end.txt file before working with the data.
38 | 
39 | If you are interested in these data, please request them via Zenodo. Kindly note that your affiliation and information about the intended utilization of the dataset are required. If you do not receive any response from Zenodo within one week, please check your spam mailbox or consider resubmitting your data request with the required information.
40 | 
--------------------------------------------------------------------------------
/investigateMetrics.py:
--------------------------------------------------------------------------------
 1 | from reading import *
 2 | import matplotlib.pyplot as plt
 3 | import os
 4 | from PyPDF2 import PdfFileReader
 5 | from PyPDF2 import PdfFileWriter
 6 | from collections import defaultdict
 7 | 
 8 | import matplotlib.ticker
 9 | 
10 | """
11 | Contains scripts for the calculation and plotting of the statistical properties of the "metrics" data.
12 | """
13 | 
14 | def plotFeaturesDistributionsMetrics(nodeID, metricsData):
15 |     """
16 |     Function to plot the metrics data for nodeID.
17 |     :param nodeID: The ID of the node one wants to observe.
18 |     :param metricsData: The data whose distributions we are interested in.
19 |     :return: None. Stores an image of the distribution of the properties in a folder.
20 |     """
21 | 
22 |     pathToStoreTheImages = "./"
23 | 
24 |     plt.figure(nodeID, figsize=(12, 12))
25 |     plt.rc('xtick', labelsize=13)  # fontsize of the tick labels
26 |     plt.rc('ytick', labelsize=13)  # fontsize of the tick labels
27 |     plt.rc('font', size=13)        # controls default text sizes
28 |     plt.rc('axes', titlesize=13)   # fontsize of the axes title
29 |     plt.rc('axes', labelsize=13)   # fontsize of the x and y labels
30 | 
31 |     #for cnt, featureName in enumerate(['cpu.user', 'mem.used', 'load.cpucore', 'load.min1', 'load.min5', 'load.min15']):
32 |     for cnt, featureName in enumerate(['cpu.user', 'mem.used', 'load.min1']):
33 |         createSubplot = 230 + cnt + 1
34 |         plt.subplot(createSubplot)
35 |         plt.subplots_adjust(wspace=0.4)
36 |         plt.xlabel(featureName)
37 |         plt.hist(metricsData.loc[:, featureName], bins=100, color="black")
38 | 
39 |         #if nodeID in [117, 122, 123, 124]:
40 |         #    if featureName == "cpu.user":
41 |         #        plt.xlim((0, 80))
42 | 
43 |         #    if featureName == "mem.used":
44 |         #        plt.xlim((188463616, 3662356736))
45 | 
46 |         #    if featureName in ['load.min1', 'load.min5', 'load.min15']:
47 |         #        plt.xlim((0, 3))
48 | 
49 |         #else:
50 |         #    if featureName == "cpu.user":
51 |         #        plt.xlim((0, 80))
52 | 
53 |         #    if featureName in ['load.min1', 'load.min5', 'load.min15']:
54 |         #        plt.xlim((0, 17))
55 | 
56 |     tmpPath = pathToStoreTheImages + str(nodeID) + "_new.pdf"
57 |     #plt.suptitle("Metrics data for node: " + str(nodeID))
58 |     plt.savefig(tmpPath, bbox_inches="tight")
59 |     #cutPDF(tmpPath)
60 |     plt.close()
61 | 
62 | 
63 | def calculateStatisticalProperties(metricData):
64 |     """
65 |     Function calculating the statistical properties of the metricData for a specific node.
66 |     :param metricData: The data whose distributions we are interested in.
67 | 
68 |     :return: dict mapping each feature name to its mean, median, std, min, and max.
69 |     """
70 |     d = defaultdict(dict)
71 |     for _, featureName in enumerate(['cpu.user', 'mem.used', 'load.cpucore', 'load.min1', 'load.min5', 'load.min15']):
72 |         d[featureName]["mean"] = np.mean(metricData.loc[:, featureName])
73 |         d[featureName]["median"] = np.median(metricData.loc[:, featureName])
74 |         d[featureName]["std"] = np.std(metricData.loc[:, featureName])
75 |         d[featureName]["min"] = np.min(metricData.loc[:, featureName])
76 |         d[featureName]["max"] = np.max(metricData.loc[:, featureName])
77 |     return d
78 |     #tmpPath = path + str(nodeID) + ".csv"
79 | 
80 | 
81 | def cutPDF(filePath):
82 |     """
83 |     Helper function to trim a PDF.
84 |     :param filePath: the path where the PDF is stored.
85 |     :return: None. Writes the cropped page to 113_new.pdf.
86 |     """
87 |     pdfRead = PdfFileReader(open(filePath, "rb"))
88 |     page = pdfRead.getPage(0)
89 |     print(page.cropBox.getLowerLeft())
90 |     print(page.cropBox.getLowerRight())
91 |     print(page.cropBox.getUpperLeft())
92 |     print(page.cropBox.getUpperRight())
93 |     lower_right_new_x_coordinate = 864
94 |     lower_right_new_y_coordinate = 400
95 |     lower_left_new_x_coordinate = 0
96 |     lower_left_new_y_coordinate = 400
97 |     upper_left_new_x_coordinate = 0
98 |     upper_left_new_y_coordinate = 864
99 |     upper_right_new_x_coordinate = 864
100 |     upper_right_new_y_coordinate = 864
101 |     page.mediaBox.lowerRight = (lower_right_new_x_coordinate, lower_right_new_y_coordinate)
102 |     page.mediaBox.lowerLeft = (lower_left_new_x_coordinate, lower_left_new_y_coordinate)
103 |     page.mediaBox.upperRight = (upper_right_new_x_coordinate, upper_right_new_y_coordinate)
104 |     page.mediaBox.upperLeft = (upper_left_new_x_coordinate, upper_left_new_y_coordinate)
105 |     output = PdfFileWriter()
106 |     output.addPage(page)
107 |     with open("113_new.pdf", "wb") as out_f:
108 |         output.write(out_f)
109 | 
110 | lis = []
111 | for nodeID in [113, 117, 122, 123, 124]:
112 | #for nodeID in [113, 117]:
113 |     metricData = readMetrics(nodeID)  # note: readMetrics in reading.py also expects a path argument
114 |     lis.append(metricData.shape[0])
115 |     plotFeaturesDistributionsMetrics(nodeID, metricData)
116 |     #break
117 | 
118 | 
119 | 
--------------------------------------------------------------------------------
/investigateTraces.py:
--------------------------------------------------------------------------------
 1 | """
 2 | Contains useful functions for calculating the properties of the traces data.
 3 | """
 4 | from reading import *
 5 | from pprint import PrettyPrinter
 6 | from collections import defaultdict
 7 | import json
 8 | 
 9 | 
10 | def addOnes(dataFrame):
11 |     """
12 |     :param dataFrame: DataFrame representing traces
13 |     :return: DataFrame + a new column of ones
14 |     """
15 |     dataFrame["ones"] = np.ones(dataFrame.shape[0])
16 |     return dataFrame
17 | 
18 | 
19 | def groupCounts(dataFrame):
20 |     pom = dataFrame.loc[:, ["Base_id", "ones"]].groupby(by=["Base_id"]).count()/2  # each event yields a start and a stop row, hence /2
21 |     return pom
22 | 
23 | 
24 | def groupCountsAllFeatures(dataFrame, feature):
25 |     pom = dataFrame.loc[:, [feature, "ones"]].groupby(by=[feature]).count()/2
26 |     return pom
27 | 
28 | 
29 | def storeJsons(dic, pathToStore):
30 |     """
31 |     :param dic: dict containing the properties
32 |     :param pathToStore: should be in the form ./Traces/
33 |     :return: None. Writes Traces_allFeatures.json to pathToStore.
34 | 
35 |     """
36 |     with open(pathToStore + "Traces_allFeatures.json", "w") as file:
37 |         json.dump(dic, file)
38 | 
39 | 
40 | def calculateStats(dataFrame, feature):
41 |     d = {}
42 |     d["mean"] = float(list(dataFrame.mean().values)[0])
43 |     d["std"] = float(list(dataFrame.std().values)[0])
44 |     d["median"] = float(list(dataFrame.median().values)[0])
45 |     d["min"] = float(list(dataFrame.min().values)[0])
46 |     d["max"] = float(list(dataFrame.max().values)[0])
47 |     d["quantile05"] = float(list(dataFrame.quantile(0.05).values)[0])
48 |     d["quantile25"] = float(list(dataFrame.quantile(0.25).values)[0])
49 |     d["quantile50"] = float(list(dataFrame.quantile(0.50).values)[0])
50 |     d["quantile75"] = float(list(dataFrame.quantile(0.75).values)[0])
51 | 
    d["quantile95"] = float(list(dataFrame.quantile(0.95).values)[0])
52 |     d["numberUniqueValues"] = dataFrame.shape[0]
53 | 
54 |     return d
55 | 
56 | def tracesStats():
57 |     """
58 |     :return: None. Creates .json files with the stats of all of the operations.
59 |     """
60 |     operationName = ['boot_delete_api_faults', 'boot_delete_compute_faults_traces', 'cinder_api_faults', 'cinder_compute_faults', 'create_delete_image', 'create_delete_stack', 'network_create_delete']
61 |     dic = defaultdict(dict)
62 |     dic1 = defaultdict(list)
63 |     for operation in operationName:
64 |         first = readTraces(operation)
65 |         pom1 = addOnes(first)
66 |         pom1 = pom1.drop(["Unnamed: 0"], axis=1)
67 |         dic1[operation] = pom1
68 |         for feature in pom1.columns[:-1]:
69 |             pom2 = groupCountsAllFeatures(pom1, feature)
70 |             pom3 = calculateStats(pom2, feature)
71 |             dic[operation][feature] = pom3
72 | 
73 |     storeJsons(dic, "./")
74 | 
75 |     #lista.append(dic)
76 | 
77 | 
78 | def propertiesPerFeature():
79 |     """
80 | 
81 |     :return: DataFrame containing the number of calls per operation for each type of call:
    [rpc, wsgi and vif]
82 |     """
83 |     li = []
84 |     operationName = ['boot_delete_api_faults', 'boot_delete_compute_faults_traces', 'cinder_api_faults',
85 |                      'cinder_compute_faults', 'create_delete_image', 'create_delete_stack', 'network_create_delete']
86 |     for operation in operationName:
87 |         print("###########")
88 |         print(operation)
89 |         print("###########")
90 |         first = readTraces(operation)
91 |         oi = pd.value_counts(first.Name)
92 |         dictionary = {}
93 |         d = {}
94 |         typesOfCalls = ["rpc", "wsgi", "vif", "driver"]
95 |         for typeOfCall in typesOfCalls:
96 |             flag = True
97 |             for ind, x in enumerate(list(oi.index)):
98 |                 if typeOfCall in x:
99 |                     if flag:
100 |                         print("The operation {} has calls of type {} with name {}: {} occurrences".format(operation, typeOfCall, x, oi.loc[x]))
101 | 
102 |                         #li.append(oi.loc[x])
103 |                         d[typeOfCall] = oi.loc[x]
104 |                         flag = False
105 |                         #q = np.array(li)
106 |                 else:
107 |                     flag = True
108 |         pomm = list(d.keys())
109 |         if len(pomm) != 4:
110 |             if "rpc" not in pomm:
111 |                 d["rpc"] = 0
112 |             if "wsgi" not in pomm:
113 |                 d["wsgi"] = 0
114 |             if "vif" not in pomm:
115 |                 d["vif"] = 0
116 |             if "driver" not in pomm:
117 |                 d["driver"] = 0
118 |         dictionary[operation] = d
119 |         li.append(pd.DataFrame(dictionary))
120 |     return pd.concat(li, axis=1)
121 |     """
122 |     for operation in operationName:
123 |         print("#######################################################")
124 |         print("#######################################################")
125 |         print("The mean number of calls for operation {} type of call {} is {}".format(operation, typeOfCall, np.mean(q)))
126 |         print("The std per number of calls for operation {} type of call {} is {}".format(operation, typeOfCall, np.std(q)))
127 |         print("The median number of calls for operation {} type of call {} is {}".format(operation, typeOfCall, np.median(q)))
128 |         print("The max number of calls for operation {} type of call {} is {}".format(operation, typeOfCall, np.max(q)))
129 |         print("The min number of calls for
operation {} type of call {} is {}".format(operation, typeOfCall, np.min(q)))
130 |         print("#######################################################")
131 |         print("#######################################################")
132 |     """
133 | tracesStats()
134 | a = propertiesPerFeature()
--------------------------------------------------------------------------------
/reading.py:
--------------------------------------------------------------------------------
 1 | import pandas as pd
 2 | import os
 3 | import numpy as np
 4 | 
 5 | def listLogsProject(projectName, path):
 6 |     """
 7 |     :param projectName: one of ['haproxy', 'keystone', 'grafana', 'glance',
 8 |         'mariadb', 'kibana', 'swift', 'nova', 'ceph', 'rally', 'rabbitmq',
 9 |         'placement', 'neutron', 'heat', 'chrony', 'openvswitch', 'cinder', 'horizon']
10 |     :param path: the path whose directories are listed.
11 |     :return: list of the available logs for the project with projectName.
12 |     """
13 | 
14 |     return os.listdir(path)  # note: projectName is currently unused; the function simply lists path
15 | 
16 | 
17 | 
18 | def readMetrics(nodeID, path):
19 |     """
20 |     This function is used to read the metrics data for a specific node.
21 |     :param nodeID: allowed values [113, 117, 122, 123, 124]
22 |     :param path: path where the metrics data of the node is stored.
23 |     :return: DataFrame of size nx7, where the 7 columns represent different metrics of interest.
24 |     """
25 |     assert nodeID in [113, 117, 122, 123, 124], str(nodeID) + " is an invalid node value. Please use one of 113, 117, 122, 123, 124 as relevant metrics files."
26 |     return pd.read_csv(path)
27 | 
28 | 
29 | def readTraces(path, operation="boot_delete_api_faults"):
30 |     """
31 |     Valid operations:
32 |     ['network_create_delete', 'boot_delete_api_faults', 'create_delete_image', 'boot_delete_compute_faults_traces', 'cinder_api_faults', 'cinder_compute_faults', 'create_delete_stack']
33 |     :param operation: valid operation for the trace. It defaults to boot_delete_api_faults, the first operation in alphabetical order.
    :param path: path where the trace CSV file is stored.
35 |     :return: DataFrame containing the traces.
36 |     """
37 | 
38 |     assert operation in ['network_create_delete', 'boot_delete_api_faults', 'create_delete_image', 'boot_delete_compute_faults_traces', 'cinder_api_faults', 'cinder_compute_faults', 'create_delete_stack'], operation + " is an invalid operation. Please use one of the set {'network_create_delete.csv', 'boot_delete_api_faults.csv', 'create_delete_image.csv', 'boot_delete_compute_faults_traces.csv', 'cinder_api_faults.csv', 'cinder_compute_faults.csv', 'create_delete_stack.csv'} as relevant trace IDs."
39 |     return pd.read_csv(path)
40 | 
41 | 
42 | 
43 | def readRaw(path):
44 |     """
45 | 
46 |     :param path: path where the raw_csv log file is stored.
47 |     :return: DataFrame containing the logs.
48 |     """
49 |     return pd.read_csv(path)
50 | 
--------------------------------------------------------------------------------
/workloads/boot_delete_config/boot-and-delete.yaml:
--------------------------------------------------------------------------------
 1 | {% set flavor_name = flavor_name or "m1.medium" %}
 2 | ---
 3 | NovaServers.boot_and_delete_server:
 4 |   -
 5 |     args:
 6 |       flavor:
 7 |         name: "{{flavor_name}}"
 8 |       image:
 9 |         name: "ubuntu-18.04"
10 |       force_delete: true
11 |     runner:
12 |       type: "constant"
13 |       times: 2000
14 |       concurrency: 2
15 |     context:
16 |       users:
17 |         tenants: 3
18 |         users_per_tenant: 2
19 |     sla:
20 |       failure_rate:
21 |         max: 100
22 |     hooks:
23 |       - name: sys_call
24 |         description: Run script
25 |         args: sh ./restart_nova_container.sh
26 |         trigger:
27 |           name: event
28 |           args:
29 |             unit: iteration
30 |             at: [250,500,750,1000,1250,1500,2000,2250,2500,2750,3000,3250,3500,3750,4000,4250,4500,4750,5000,5250,5500,5750,6000,6250,6500,6750,7000]
--------------------------------------------------------------------------------
/workloads/boot_delete_config/commands-to-exec-nova-compute.sh:
--------------------------------------------------------------------------------
 1 | #!/bin/bash -x
 2 | # 
Restart the nova_compute container twice, sleeping 15 seconds between stop and start
 3 | 
 4 | for i in {1..2}
 5 | do
 6 |     echo "Restarting nova-container"
 7 |     docker stop nova_compute
 8 |     sleep 15
 9 |     docker start nova_compute
10 | done
11 | 
12 | 
--------------------------------------------------------------------------------
/workloads/boot_delete_config/commands-to-exec-nova.sh:
--------------------------------------------------------------------------------
 1 | #!/bin/bash -x
 2 | # Restart the nova_api container twice
 3 | 
 4 | for i in {1..2}
 5 | do
 6 |     echo "Restarting nova-container"
 7 |     docker restart nova_api
 8 | done
 9 | 
10 | 
--------------------------------------------------------------------------------
/workloads/boot_delete_config/restart_nova_compute_nodes.sh:
--------------------------------------------------------------------------------
 1 | #!/bin/bash +x
 2 | 
 3 | # SSH into the compute nodes and execute the commands from commands-to-exec-nova-compute.sh
 4 | cat commands-to-exec-nova-compute.sh | sshpass -p 'rally' ssh rally@wally117.cit.tu-berlin.de &
 5 | cat commands-to-exec-nova-compute.sh | sshpass -p 'rally' ssh rally@wally122.cit.tu-berlin.de &
 6 | cat commands-to-exec-nova-compute.sh | sshpass -p 'rally' ssh rally@wally124.cit.tu-berlin.de &
 7 | cat commands-to-exec-nova-compute.sh | sshpass -p 'rally' ssh rally@wally123.cit.tu-berlin.de &
 8 | cat commands-to-exec-nova-compute.sh | sshpass -p 'rally' ssh rally@wally113.cit.tu-berlin.de
--------------------------------------------------------------------------------
/workloads/boot_delete_config/restart_nova_container.sh:
--------------------------------------------------------------------------------
 1 | #!/bin/bash +x
 2 | 
 3 | # SSH into the Wally113 controller and execute the commands from commands-to-exec-nova.sh
 4 | 
 5 | cat commands-to-exec-nova.sh | sshpass -p "rally" ssh rally@wally113.cit.tu-berlin.de
 6 | 
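The Rally task in boot-and-delete.yaml injects faults by firing its sys_call hook (the container-restart script above) at the iterations listed under `at:`, so those iteration numbers can serve as ground-truth anomaly labels for the parsed traces. A minimal sketch, assuming a DataFrame with an `Iteration_id` column as produced by the trace_to_csv.py parser, and assuming (hypothetically) that each `Iteration_id` string ends in the numeric iteration index:

```python
import pandas as pd

# Iterations at which the sys_call hook restarted the nova container
# (copied from the `at:` list in boot-and-delete.yaml).
FAULT_ITERATIONS = {250, 500, 750, 1000, 1250, 1500, 2000, 2250, 2500,
                    2750, 3000, 3250, 3500, 3750, 4000, 4250, 4500, 4750,
                    5000, 5250, 5500, 5750, 6000, 6250, 6500, 6750, 7000}

def label_iterations(df: pd.DataFrame) -> pd.DataFrame:
    """Return a copy of df with a boolean `fault_injected` column.

    Assumes Iteration_id values end in the numeric iteration index,
    e.g. 'boot_delete_250' -> 250 (a hypothetical naming scheme).
    """
    iteration = df["Iteration_id"].astype(str).str.extract(r"(\d+)$")[0].astype(int)
    out = df.copy()
    out["fault_injected"] = iteration.isin(FAULT_ITERATIONS)
    return out
```

Rows flagged `fault_injected` correspond to iterations that overlap a container restart; the exact label semantics (e.g. whether neighbouring iterations are also affected) should be checked against the Rally report shipped with the dataset.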
--------------------------------------------------------------------------------
/workloads/boot_delete_config/trace_to_csv.py:
--------------------------------------------------------------------------------
 1 | import pandas as pd
 2 | import json
 3 | #from bs4 import BeautifulSoup
 4 | import fnmatch
 5 | from pprint import pprint
 6 | import csv
 7 | import seaborn as sns
 8 | from datetime import datetime
 9 | from tqdm import tqdm
10 | import os
11 | import argparse
12 | 
13 | 
14 | 
15 | # Working parser that skips DB calls and writes the remaining events to a DataFrame.
16 | 
17 | # construct the argument parser and parse the arguments
18 | ap = argparse.ArgumentParser()
19 | ap.add_argument("-p", "--path", required=True,
20 |                 help="path to the traces")
21 | ap.add_argument("-o", "--out", required=True,
22 |                 help="output path to save the CSV dump")
23 | args = vars(ap.parse_args())
24 | 
25 | print(args)
26 | 
27 | 
28 | 
29 | def create_df(data_dir, output_file):
30 |     # create an empty DataFrame before starting to fill it
31 |     df_name = pd.DataFrame(columns=['Host', 'Name', 'Service', 'Project', 'Timestamp', 'Iteration_id', 'Trace_id', 'Parent_id', 'Base_id'])
32 |     list_trace_files = os.listdir(data_dir)
33 |     # tqdm is used to visualize the progress.
34 | for x in tqdm(list_trace_files): 35 | #data_folder = 'traces_data/boot_delete_config/traces_new/api_faults/' 36 | file_to_read = data_dir+'/'+x 37 | with open(file_to_read) as trace_file: 38 | dict_train = json.load(trace_file) 39 | print("Processing trace_file:",file_to_read) 40 | #print name of the iteration_id 41 | #print(os.path.splitext(os.path.basename(file_to_read))[0]) 42 | #print(os.path.splitext(str(list_trace_files)[0])) 43 | df_name = df_name.from_dict(parser(dict_train,df_name,os.path.splitext(os.path.basename(file_to_read))[0])) 44 | 45 | print("Saving Data-Frame") 46 | df_name.to_csv(output_file) 47 | print("--done--") 48 | 49 | def parser(dict_train,dfObj,iteration_id): 50 | 51 | substring ='db' 52 | pattern = 'meta.raw_payload.*' 53 | 54 | for x in range(len(dict_train['children'])): 55 | if not dict_train['children'][x]['children']: 56 | #print("Stage-1") 57 | #print("No-children present in index", x) 58 | #print("parent_id:",dict_train['children'][x]['parent_id']) 59 | #print("trace_id:",dict_train['children'][x]['trace_id']) 60 | # print("--- Printing Payload information ---") 61 | # print("Host:",dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['info']['host']) 62 | # print("Name:",dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['name']) 63 | # print("Service:",dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['service']) 64 | # print("Timestamp:",datetime.strptime(dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['timestamp'], '%Y-%m-%dT%H:%M:%S.%f')) 65 | # print("Parent_id:",dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['parent_id']) 66 | # print("Trace_id:",dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['trace_id']) 67 | # print("Base_id:",dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['base_id']) 68 | # print("Project:",dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['project']) 69 | dfObj = 
dfObj.append({'Host': dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['info']['host'], 'Name': dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['name'], 'Service': dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['service'], 'Project': dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['project'], 'Timestamp': datetime.strptime(dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['timestamp'], '%Y-%m-%dT%H:%M:%S.%f'), 'Iteration_id': iteration_id, 'Trace_id': dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['trace_id'], 'Parent_id': dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['parent_id'], 'Base_id': dict_train['children'][x]['info']['meta.raw_payload.wsgi-start']['base_id']}, ignore_index=True) 70 | 71 | # print("Host:",dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['info']['host']) 72 | # print("Name:",dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['name']) 73 | # print("Service:",dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['service']) 74 | # print("Timestamp:",datetime.strptime(dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['timestamp'], '%Y-%m-%dT%H:%M:%S.%f')) 75 | # print("Parent_id:",dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['parent_id']) 76 | # print("Trace_id:",dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['trace_id']) 77 | # print("Base_id:",dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['base_id']) 78 | # print("Project:",dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['project']) 79 | dfObj = dfObj.append({'Host': dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['info']['host'], 'Name': dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['name'], 'Service': dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['service'], 'Project': 
dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['project'], 'Timestamp': datetime.strptime(dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['timestamp'], '%Y-%m-%dT%H:%M:%S.%f'), 'Iteration_id': iteration_id, 'Trace_id': dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['trace_id'], 'Parent_id': dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['parent_id'], 'Base_id': dict_train['children'][x]['info']['meta.raw_payload.wsgi-stop']['base_id']}, ignore_index=True) 80 | 81 | 82 | else: 83 | 84 | 85 | #print("Children present in index", x) 86 | for y in range(len(dict_train['children'][x]['children'])): 87 | #print("Stage-2") 88 | #print(len(dict_train['children'][x]['children'])) 89 | 90 | if not dict_train['children'][x]['children'][y]['children']: 91 | #print("No children to further process in Stage-1") 92 | meta_raw_payload_name_stage_1 = fnmatch.filter(dict_train['children'][x]['children'][y]['info'],pattern) 93 | #print("List of payload types:",meta_raw_payload_name_stage_1) 94 | 95 | for k in range(len(meta_raw_payload_name_stage_1)): 96 | if substring in meta_raw_payload_name_stage_1[k]: 97 | #compute something 98 | pass 99 | #print("Payload are DB-calls stage-2 , Skipping...",meta_raw_payload_name_stage_1[k]) 100 | else: 101 | 102 | # print("--- Printing Payload information ---") 103 | # print("Printing Pay-Load infomarion for:",meta_raw_payload_name_stage_1[k]) 104 | # print("Host:",dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['info']['host']) 105 | # print("Name:",dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['name']) 106 | # print("Service:",dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['service']) 107 | # print("Project:",dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['project']) 108 | # print("Timestamp:", 
datetime.strptime(dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['timestamp'], '%Y-%m-%dT%H:%M:%S.%f')) 109 | # print("Trace_id:",dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['trace_id']) 110 | # print("Parent_id:",dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['parent_id']) 111 | # print("Base_id:",dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['base_id']) 112 | dfObj = dfObj.append({'Host': dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['info']['host'], 'Name': dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['name'], 'Service': dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['service'], 'Project': dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['project'], 'Timestamp': datetime.strptime(dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['timestamp'], '%Y-%m-%dT%H:%M:%S.%f'), 'Iteration_id': iteration_id, 'Trace_id': dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['trace_id'], 'Parent_id': dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['parent_id'], 'Base_id': dict_train['children'][x]['children'][y]['info'][meta_raw_payload_name_stage_1[k]]['base_id']}, ignore_index=True) 113 | 114 | 115 | 116 | 117 | else: 118 | #print("Children present in stage-2 index", x,y) 119 | for z in range(len(dict_train['children'][x]['children'][y]['children'])): 120 | #print("Stage-3") 121 | #print(len(dict_train['children'][x]['children'][y]['children'])) 122 | 123 | if not dict_train['children'][x]['children'][y]['children'][z]['children']: 124 | #print("No children to further process in Stage-2") 125 | meta_raw_payload_name_stage_2 = 
    fnmatch.filter(dict_train['children'][x]['children'][y]['children'][z]['info'], pattern)
    for k in range(len(meta_raw_payload_name_stage_2)):
        if substring in meta_raw_payload_name_stage_2[k]:
            pass  # DB-call payload at stage 3, skipping
        else:
            p = dict_train['children'][x]['children'][y]['children'][z]['info'][meta_raw_payload_name_stage_2[k]]
            # NOTE: DataFrame.append was removed in pandas 2.0, so this script needs pandas < 2.0
            dfObj = dfObj.append({'Host': p['info']['host'], 'Name': p['name'], 'Service': p['service'], 'Project': p['project'],
                                  'Timestamp': datetime.strptime(p['timestamp'], '%Y-%m-%dT%H:%M:%S.%f'), 'Iteration_id': iteration_id,
                                  'Trace_id': p['trace_id'], 'Parent_id': p['parent_id'], 'Base_id': p['base_id']}, ignore_index=True)
else:
    # children present at stage 3, descend one level
    for a in range(len(dict_train['children'][x]['children'][y]['children'][z]['children'])):
        node4 = dict_train['children'][x]['children'][y]['children'][z]['children'][a]  # stage-4 node
        if not node4['children']:
            meta_raw_payload_name_stage_3 = fnmatch.filter(node4['info'], pattern)
            for k in range(len(meta_raw_payload_name_stage_3)):
                if substring in meta_raw_payload_name_stage_3[k]:
                    pass  # DB-call payload at stage 4, skipping
                else:
                    p = node4['info'][meta_raw_payload_name_stage_3[k]]
                    dfObj = dfObj.append({'Host': p['info']['host'], 'Name': p['name'], 'Service': p['service'], 'Project': p['project'],
                                          'Timestamp': datetime.strptime(p['timestamp'], '%Y-%m-%dT%H:%M:%S.%f'), 'Iteration_id': iteration_id,
                                          'Trace_id': p['trace_id'], 'Parent_id': p['parent_id'], 'Base_id': p['base_id']}, ignore_index=True)
        else:
            for b in range(len(node4['children'])):
                node5 = node4['children'][b]  # stage-5 node
                if not node5['children']:
                    meta_raw_payload_name_stage_4 = fnmatch.filter(node5['info'], pattern)
                    for k in range(len(meta_raw_payload_name_stage_4)):
                        if substring in meta_raw_payload_name_stage_4[k]:
                            pass  # DB-call payload at stage 5, skipping
                        else:
                            p = node5['info'][meta_raw_payload_name_stage_4[k]]
                            dfObj = dfObj.append({'Host': p['info']['host'], 'Name': p['name'], 'Service': p['service'], 'Project': p['project'],
                                                  'Timestamp': datetime.strptime(p['timestamp'], '%Y-%m-%dT%H:%M:%S.%f'), 'Iteration_id': iteration_id,
                                                  'Trace_id': p['trace_id'], 'Parent_id': p['parent_id'], 'Base_id': p['base_id']}, ignore_index=True)
                else:
                    for c in range(len(node5['children'])):
                        node6 = node5['children'][c]  # stage-6 node
                        if not node6['children']:
                            meta_raw_payload_name_stage_5 = fnmatch.filter(node6['info'], pattern)
                            for k in range(len(meta_raw_payload_name_stage_5)):
                                if substring in meta_raw_payload_name_stage_5[k]:
                                    pass  # DB-call payload at stage 6, skipping
                                else:
                                    p = node6['info'][meta_raw_payload_name_stage_5[k]]
                                    dfObj = dfObj.append({'Host': p['info']['host'], 'Name': p['name'], 'Service': p['service'], 'Project': p['project'],
                                                          'Timestamp': datetime.strptime(p['timestamp'], '%Y-%m-%dT%H:%M:%S.%f'), 'Iteration_id': iteration_id,
                                                          'Trace_id': p['trace_id'], 'Parent_id': p['parent_id'], 'Base_id': p['base_id']}, ignore_index=True)
                        else:
                            for d in range(len(node6['children'])):
                                node7 = node6['children'][d]  # stage-7 node
                                if not node7['children']:
                                    meta_raw_payload_name_stage_6 = fnmatch.filter(node7['info'], pattern)
                                    for k in range(len(meta_raw_payload_name_stage_6)):
                                        if substring in meta_raw_payload_name_stage_6[k]:
                                            pass  # DB-call payload at stage 7, skipping
                                        else:
                                            p = node7['info'][meta_raw_payload_name_stage_6[k]]
                                            dfObj = dfObj.append({'Host': p['info']['host'], 'Name': p['name'], 'Service': p['service'], 'Project': p['project'],
                                                                  'Timestamp': datetime.strptime(p['timestamp'], '%Y-%m-%dT%H:%M:%S.%f'), 'Iteration_id': iteration_id,
                                                                  'Trace_id': p['trace_id'], 'Parent_id': p['parent_id'], 'Base_id': p['base_id']}, ignore_index=True)
                                else:
                                    for e in range(len(node7['children'])):
                                        node8 = node7['children'][e]  # stage-8 node
                                        if not node8['children']:
                                            meta_raw_payload_name_stage_7 = fnmatch.filter(node8['info'], pattern)
                                            for k in range(len(meta_raw_payload_name_stage_7)):
                                                if substring in meta_raw_payload_name_stage_7[k]:
                                                    pass  # DB-call payload at stage 8, skipping
                                                else:
                                                    p = node8['info'][meta_raw_payload_name_stage_7[k]]
                                                    dfObj = dfObj.append({'Host': p['info']['host'], 'Name': p['name'], 'Service': p['service'], 'Project': p['project'],
                                                                          'Timestamp': datetime.strptime(p['timestamp'], '%Y-%m-%dT%H:%M:%S.%f'), 'Iteration_id': iteration_id,
                                                                          'Trace_id': p['trace_id'], 'Parent_id': p['parent_id'], 'Base_id': p['base_id']}, ignore_index=True)
                                        else:
                                            for f in range(len(node8['children'])):
                                                node9 = node8['children'][f]  # stage-9 node
                                                if not node9['children']:
                                                    meta_raw_payload_name_stage_8 = fnmatch.filter(node9['info'], pattern)
                                                    for k in range(len(meta_raw_payload_name_stage_8)):
                                                        if substring in meta_raw_payload_name_stage_8[k]:
                                                            pass  # DB-call payload at stage 9, skipping
                                                        else:
                                                            p = node9['info'][meta_raw_payload_name_stage_8[k]]
                                                            dfObj = dfObj.append({'Host': p['info']['host'], 'Name': p['name'], 'Service': p['service'], 'Project': p['project'],
                                                                                  'Timestamp': datetime.strptime(p['timestamp'], '%Y-%m-%dT%H:%M:%S.%f'), 'Iteration_id': iteration_id,
                                                                                  'Trace_id': p['trace_id'], 'Parent_id': p['parent_id'], 'Base_id': p['base_id']}, ignore_index=True)
                                                else:
                                                    for g in range(len(node9['children'])):
                                                        node10 = node9['children'][g]  # stage-10 node
                                                        if not node10['children']:
                                                            meta_raw_payload_name_stage_9 = fnmatch.filter(node10['info'], pattern)
                                                            for k in range(len(meta_raw_payload_name_stage_9)):
                                                                if substring in meta_raw_payload_name_stage_9[k]:
                                                                    pass  # DB-call payload at stage 10, skipping
                                                                else:
                                                                    p = node10['info'][meta_raw_payload_name_stage_9[k]]
                                                                    dfObj = dfObj.append({'Host': p['info']['host'], 'Name': p['name'], 'Service': p['service'], 'Project': p['project'],
                                                                                          'Timestamp': datetime.strptime(p['timestamp'], '%Y-%m-%dT%H:%M:%S.%f'), 'Iteration_id': iteration_id,
                                                                                          'Trace_id': p['trace_id'], 'Parent_id': p['parent_id'], 'Base_id': p['base_id']}, ignore_index=True)
                                                        else:
                                                            for h in range(len(node10['children'])):
                                                                node11 = node10['children'][h]  # stage-11 node
                                                                if not node11['children']:
                                                                    meta_raw_payload_name_stage_10 = fnmatch.filter(node11['info'], pattern)
                                                                    for k in range(len(meta_raw_payload_name_stage_10)):
                                                                        if substring in meta_raw_payload_name_stage_10[k]:
                                                                            pass  # DB-call payload at stage 11, skipping
                                                                        else:
                                                                            p = node11['info'][meta_raw_payload_name_stage_10[k]]
                                                                            dfObj = dfObj.append({'Host': p['info']['host'], 'Name': p['name'], 'Service': p['service'], 'Project': p['project'],
                                                                                                  'Timestamp': datetime.strptime(p['timestamp'], '%Y-%m-%dT%H:%M:%S.%f'), 'Iteration_id': iteration_id,
                                                                                                  'Trace_id': p['trace_id'], 'Parent_id': p['parent_id'], 'Base_id': p['base_id']}, ignore_index=True)
                                                                else:
                                                                    for i in range(len(node11['children'])):
                                                                        node12 = node11['children'][i]  # stage-12 node
                                                                        if not node12['children']:
                                                                            meta_raw_payload_name_stage_11 =
fnmatch.filter(dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'],pattern) 410 | #print("List of payload types:",meta_raw_payload_name_stage_6) 411 | for k in range(len(meta_raw_payload_name_stage_11)): 412 | if substring in meta_raw_payload_name_stage_11[k]: 413 | #compute something 414 | pass 415 | #print("Payload are DB-calls stage-12 , Skipping...",meta_raw_payload_name_stage_11[k]) 416 | else: 417 | # print("--- Printing Payload information ---") 418 | # print("Printing Pay-Load infomarion for:",meta_raw_payload_name_stage_11[k]) 419 | # print("Host:",dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['info']['host']) 420 | # print("Name:",dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['name']) 421 | # print("Service:",dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['service']) 422 | # print("Project:",dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['project']) 423 | # print("Timestamp:", 
datetime.strptime(dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['timestamp'], '%Y-%m-%dT%H:%M:%S.%f')) 424 | # print("Trace_id:",dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['trace_id']) 425 | # print("Parent_id:",dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['parent_id']) 426 | # print("Base_id:",dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['base_id']) 427 | 428 | dfObj = dfObj.append({'Host': dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['info']['host'], 'Name': dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['name'], 'Service': dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['service'], 'Project': 
dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['project'], 'Timestamp': datetime.strptime(dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['timestamp'], '%Y-%m-%dT%H:%M:%S.%f'), 'Iteration_id': iteration_id, 'Trace_id': dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['trace_id'], 'Parent_id': dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['parent_id'], 'Base_id': dict_train['children'][x]['children'][y]['children'][z]['children'][a]['children'][b]['children'][c]['children'][d]['children'][e]['children'][f]['children'][g]['children'][h]['children'][i]['info'][meta_raw_payload_name_stage_11[k]]['base_id']}, ignore_index=True) 429 | 430 | 431 | 432 | 433 | 434 | 435 | else: 436 | print("Children present in stage-12 index", x,y,z,a,b,c,d,e,f,g,h,i) 437 | return dfObj 438 | 439 | 440 | 441 | 442 | #create_df('/var/lib/rally_container/boot_delete_config/traces_new/compute_faults','dataframe_compute_faults') 443 | 444 | create_df(args["path"], args["out"]) 445 | -------------------------------------------------------------------------------- /workloads/create_delete_image_config/commands-to-exec-glance.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash -x 2 | # Restarting glance_api container 
once (the {1..1} range runs a single iteration; widen it, e.g. {1..10}, to repeat)
3 | for i in {1..1}
4 | do
5 |   echo "Restarting glance-container"
6 |   docker restart glance_api
7 | done
8 |
9 |
10 |
--------------------------------------------------------------------------------
/workloads/create_delete_image_config/create-and-delete-image.yaml:
--------------------------------------------------------------------------------
1 | ---
2 | GlanceImages.create_and_delete_image:
3 |   -
4 |     args:
5 |       image_location: "http://download.cirros-cloud.net/0.3.5/cirros-0.3.5-x86_64-disk.img"
6 |       container_format: "bare"
7 |       disk_format: "qcow2"
8 |     runner:
9 |       type: "constant"
10 |       times: 3000
11 |       concurrency: 5
12 |     context:
13 |       users:
14 |         tenants: 2
15 |         users_per_tenant: 3
16 |     sla:
17 |       failure_rate:
18 |         max: 100
19 |     hooks:
20 |       - name: sys_call
21 |         description: Run script
22 |         args: sh ./restart_glance_container.sh
23 |         trigger:
24 |           name: event
25 |           args:
26 |             unit: iteration
27 |             at: [250,500,750,1000,1250,1500,1750,2000,2250,2500,2750,3000]
--------------------------------------------------------------------------------
/workloads/create_delete_image_config/list-images.yml:
--------------------------------------------------------------------------------
1 | ---
2 | GlanceImages.list_images:
3 |   -
4 |     runner:
5 |       type: "constant"
6 |       times: 20
7 |       concurrency: 1
8 |     context:
9 |       users:
10 |         tenants: 2
11 |         users_per_tenant: 2
12 |       images:
13 |         image_url: "http://download.cirros-cloud.net/0.3.5/cirros-0.3.5-x86_64-disk.img"
14 |         disk_format: "qcow2"
15 |         container_format: "bare"
16 |         images_per_tenant: 4
17 |     sla:
18 |       failure_rate:
19 |         max: 100
20 |
21 |     hooks:
22 |       - name: sys_call
23 |         description: Run script
24 |         args: sh /home/rally/data/list_image_config/restart_glance_container.sh
25 |         trigger:
26 |           name: event
27 |           args:
28 |             unit: iteration
29 |             at: [2]
30 |
--------------------------------------------------------------------------------
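Note on the trace-flattening script above: its stage-wise traversal repeats the same `['children'][x]['children'][y]…` indexing chain once per nesting depth (stage-10 through stage-12 in this chunk) and would need yet another copy for every deeper level. A depth-agnostic recursive sketch of the same extraction (the function name `walk_trace`, the payload key pattern, and the `"db"` skip substring are assumptions standing in for the script's `pattern` and `substring` variables):

```python
import fnmatch

def walk_trace(node, pattern="meta.raw_payload.*", skip_substring="db"):
    """Recursively collect payload records from an OSProfiler trace tree.

    Mirrors the staged loops above: filter this node's 'info' keys with
    fnmatch, skip DB-call payloads, then recurse into 'children'.
    """
    rows = []
    for key in fnmatch.filter(node.get("info", {}), pattern):
        if skip_substring in key:
            continue  # DB-call payloads are skipped, as in the staged code
        payload = node["info"][key]
        rows.append({
            "Host": payload["info"]["host"],
            "Name": payload["name"],
            "Service": payload["service"],
            "Project": payload["project"],
            "Timestamp": payload["timestamp"],
            "Trace_id": payload["trace_id"],
            "Parent_id": payload["parent_id"],
            "Base_id": payload["base_id"],
        })
    for child in node.get("children", []):
        rows.extend(walk_trace(child, pattern, skip_substring))
    return rows
```

Collecting plain dicts and building the frame once with `pd.DataFrame(walk_trace(dict_train))` also avoids the per-row `dfObj.append(...)` calls, which are deprecated in recent pandas.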
/workloads/create_delete_image_config/output_jasmin_image.html:
--------------------------------------------------------------------------------
(Generated "Rally Task Report" — static HTML/AngularJS template; the markup was garbled in this dump and is elided. Recoverable structure: a "Task overview" table with columns Scenario, Load duration (s), Full duration (s), Iterations, Runner, Errors, Hooks, and Success (SLA), bound to rows like {{sc.ref}}, {{sc.load_duration | number:3}}, {{sc.full_duration | number:3}}, {{sc.iterations_count}}, {{sc.runner}}, {{sc.errors.length}}, {{sc.hooks.length}}; an "Input file" section rendering {{source}}; and per-scenario sections titled "{{scenario.cls}}.{{scenario.name}} ({{scenario.full_duration | number:3}}s)" with {{scenario.description}} and charts.)
--------------------------------------------------------------------------------
1074 | 1075 | 1082 | 1083 | -------------------------------------------------------------------------------- /workloads/create_delete_image_config/restart_glance_container.sh: -------------------------------------------------------------------------------- 1 | #!/bin/bash +x 2 | 3 | #SSH into Wally113 controller and execute commands from commands-to-exec-glance.sh 4 | 5 | cat commands-to-exec-glance.sh | sshpass -p 'rally' ssh rally@wally113.cit.tu-berlin.de 6 | -------------------------------------------------------------------------------- /workloads/create_delete_image_config/test.json: -------------------------------------------------------------------------------- 1 | { 2 | "info": { 3 | "started": 0, 4 | "last_trace_started": 8727, 5 | "finished": 10058, 6 | "name": "total" 7 | }, 8 | "stats": { 9 | "wsgi": { 10 | "count": 12, 11 | "duration": 6764 12 | } 13 | }, 14 | "children": [ 15 | { 16 | "info": { 17 | "exception": "None", 18 | "name": "wsgi", 19 | "service": "public", 20 | "started": 0, 21 | "meta.raw_payload.wsgi-stop": { 22 | "info": { 23 | "host": "wally113" 24 | }, 25 | "name": "wsgi-stop", 26 | "service": "public", 27 | "timestamp": "2019-11-18T12:19:05.147722", 28 | "trace_id": "116851fb-890f-46f7-a753-0dac94d55499", 29 | "project": "keystone", 30 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 31 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 32 | }, 33 | "finished": 102, 34 | "project": "keystone", 35 | "host": "wally113", 36 | "meta.raw_payload.wsgi-start": { 37 | "info": { 38 | "host": "wally113", 39 | "request": { 40 | "path": "/", 41 | "scheme": "http", 42 | "method": "GET", 43 | "query": "" 44 | } 45 | }, 46 | "name": "wsgi-start", 47 | "service": "public", 48 | "timestamp": "2019-11-18T12:19:05.045692", 49 | "trace_id": "116851fb-890f-46f7-a753-0dac94d55499", 50 | "project": "keystone", 51 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 52 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 53 | } 54 | }, 55 | 
"parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 56 | "trace_id": "116851fb-890f-46f7-a753-0dac94d55499", 57 | "children": [] 58 | }, 59 | { 60 | "info": { 61 | "exception": "None", 62 | "name": "wsgi", 63 | "service": "public", 64 | "started": 161, 65 | "meta.raw_payload.wsgi-stop": { 66 | "info": { 67 | "host": "wally113" 68 | }, 69 | "name": "wsgi-stop", 70 | "service": "public", 71 | "timestamp": "2019-11-18T12:19:05.608341", 72 | "trace_id": "895c37bf-1035-44ac-809b-4ceaa02c4de6", 73 | "project": "keystone", 74 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 75 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 76 | }, 77 | "finished": 562, 78 | "project": "keystone", 79 | "host": "wally113", 80 | "meta.raw_payload.wsgi-start": { 81 | "info": { 82 | "host": "wally113", 83 | "request": { 84 | "path": "/v3/auth/tokens", 85 | "scheme": "http", 86 | "method": "POST", 87 | "query": "" 88 | } 89 | }, 90 | "name": "wsgi-start", 91 | "service": "public", 92 | "timestamp": "2019-11-18T12:19:05.207285", 93 | "trace_id": "895c37bf-1035-44ac-809b-4ceaa02c4de6", 94 | "project": "keystone", 95 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 96 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 97 | } 98 | }, 99 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 100 | "trace_id": "895c37bf-1035-44ac-809b-4ceaa02c4de6", 101 | "children": [] 102 | }, 103 | { 104 | "info": { 105 | "exception": "None", 106 | "name": "wsgi", 107 | "service": "public", 108 | "started": 676, 109 | "meta.raw_payload.wsgi-stop": { 110 | "info": { 111 | "host": "wally113" 112 | }, 113 | "name": "wsgi-stop", 114 | "service": "public", 115 | "timestamp": "2019-11-18T12:19:05.772028", 116 | "trace_id": "5d42aa64-016e-4e07-b1f9-5228ed926a71", 117 | "project": "keystone", 118 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 119 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 120 | }, 121 | "finished": 726, 122 | "project": "keystone", 123 | "host": "wally113", 124 | 
"meta.raw_payload.wsgi-start": { 125 | "info": { 126 | "host": "wally113", 127 | "request": { 128 | "path": "/", 129 | "scheme": "http", 130 | "method": "GET", 131 | "query": "" 132 | } 133 | }, 134 | "name": "wsgi-start", 135 | "service": "public", 136 | "timestamp": "2019-11-18T12:19:05.722138", 137 | "trace_id": "5d42aa64-016e-4e07-b1f9-5228ed926a71", 138 | "project": "keystone", 139 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 140 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 141 | } 142 | }, 143 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 144 | "trace_id": "5d42aa64-016e-4e07-b1f9-5228ed926a71", 145 | "children": [] 146 | }, 147 | { 148 | "info": { 149 | "exception": "None", 150 | "name": "wsgi", 151 | "service": "public", 152 | "started": 857, 153 | "meta.raw_payload.wsgi-stop": { 154 | "info": { 155 | "host": "wally113" 156 | }, 157 | "name": "wsgi-stop", 158 | "service": "public", 159 | "timestamp": "2019-11-18T12:19:06.237228", 160 | "trace_id": "2c937f51-57b7-401c-84ea-0458abcac204", 161 | "project": "keystone", 162 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 163 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 164 | }, 165 | "finished": 1191, 166 | "project": "keystone", 167 | "host": "wally113", 168 | "meta.raw_payload.wsgi-start": { 169 | "info": { 170 | "host": "wally113", 171 | "request": { 172 | "path": "/v3/auth/tokens", 173 | "scheme": "http", 174 | "method": "POST", 175 | "query": "" 176 | } 177 | }, 178 | "name": "wsgi-start", 179 | "service": "public", 180 | "timestamp": "2019-11-18T12:19:05.903666", 181 | "trace_id": "2c937f51-57b7-401c-84ea-0458abcac204", 182 | "project": "keystone", 183 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 184 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 185 | } 186 | }, 187 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 188 | "trace_id": "2c937f51-57b7-401c-84ea-0458abcac204", 189 | "children": [] 190 | }, 191 | { 192 | "info": { 193 | "exception": 
"None", 194 | "name": "wsgi", 195 | "service": "api", 196 | "started": 1261, 197 | "meta.raw_payload.wsgi-stop": { 198 | "info": { 199 | "host": "wally113" 200 | }, 201 | "name": "wsgi-stop", 202 | "service": "api", 203 | "timestamp": "2019-11-18T12:19:06.558746", 204 | "trace_id": "24968a47-2031-4765-898e-69a65572f188", 205 | "project": "glance", 206 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 207 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 208 | }, 209 | "finished": 1513, 210 | "project": "glance", 211 | "host": "wally113", 212 | "meta.raw_payload.wsgi-start": { 213 | "info": { 214 | "host": "wally113", 215 | "request": { 216 | "path": "/v2/schemas/image", 217 | "scheme": "http", 218 | "method": "GET", 219 | "query": "" 220 | } 221 | }, 222 | "name": "wsgi-start", 223 | "service": "api", 224 | "timestamp": "2019-11-18T12:19:06.306821", 225 | "trace_id": "24968a47-2031-4765-898e-69a65572f188", 226 | "project": "glance", 227 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 228 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 229 | } 230 | }, 231 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 232 | "trace_id": "24968a47-2031-4765-898e-69a65572f188", 233 | "children": [ 234 | { 235 | "info": { 236 | "exception": "None", 237 | "name": "wsgi", 238 | "service": "public", 239 | "started": 1329, 240 | "meta.raw_payload.wsgi-stop": { 241 | "info": { 242 | "host": "wally113" 243 | }, 244 | "name": "wsgi-stop", 245 | "service": "public", 246 | "timestamp": "2019-11-18T12:19:06.491599", 247 | "trace_id": "7967078c-ef88-420b-aca7-1e6fdf0837b6", 248 | "project": "keystone", 249 | "parent_id": "24968a47-2031-4765-898e-69a65572f188", 250 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 251 | }, 252 | "finished": 1445, 253 | "project": "keystone", 254 | "host": "wally113", 255 | "meta.raw_payload.wsgi-start": { 256 | "info": { 257 | "host": "wally113", 258 | "request": { 259 | "path": "/v3/auth/tokens", 260 | "scheme": "http", 261 | "method": 
"GET", 262 | "query": "" 263 | } 264 | }, 265 | "name": "wsgi-start", 266 | "service": "public", 267 | "timestamp": "2019-11-18T12:19:06.375032", 268 | "trace_id": "7967078c-ef88-420b-aca7-1e6fdf0837b6", 269 | "project": "keystone", 270 | "parent_id": "24968a47-2031-4765-898e-69a65572f188", 271 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 272 | } 273 | }, 274 | "parent_id": "24968a47-2031-4765-898e-69a65572f188", 275 | "trace_id": "7967078c-ef88-420b-aca7-1e6fdf0837b6", 276 | "children": [] 277 | } 278 | ] 279 | }, 280 | { 281 | "info": { 282 | "exception": "None", 283 | "name": "wsgi", 284 | "service": "api", 285 | "started": 1617, 286 | "meta.raw_payload.wsgi-stop": { 287 | "info": { 288 | "host": "wally113" 289 | }, 290 | "name": "wsgi-stop", 291 | "service": "api", 292 | "timestamp": "2019-11-18T12:19:06.815019", 293 | "trace_id": "64395150-5826-43dd-95a5-88d615f80483", 294 | "project": "glance", 295 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 296 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 297 | }, 298 | "finished": 1769, 299 | "project": "glance", 300 | "host": "wally113", 301 | "meta.raw_payload.wsgi-start": { 302 | "info": { 303 | "host": "wally113", 304 | "request": { 305 | "path": "/v2/images", 306 | "scheme": "http", 307 | "method": "POST", 308 | "query": "" 309 | } 310 | }, 311 | "name": "wsgi-start", 312 | "service": "api", 313 | "timestamp": "2019-11-18T12:19:06.663114", 314 | "trace_id": "64395150-5826-43dd-95a5-88d615f80483", 315 | "project": "glance", 316 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 317 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 318 | } 319 | }, 320 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 321 | "trace_id": "64395150-5826-43dd-95a5-88d615f80483", 322 | "children": [] 323 | }, 324 | { 325 | "info": { 326 | "exception": "None", 327 | "name": "wsgi", 328 | "service": "api", 329 | "started": 3861, 330 | "meta.raw_payload.wsgi-stop": { 331 | "info": { 332 | "host": "wally113" 
333 | }, 334 | "name": "wsgi-stop", 335 | "service": "api", 336 | "timestamp": "2019-11-18T12:19:08.976930", 337 | "trace_id": "2420a5b6-9d3d-4a2c-a09c-f221038deb86", 338 | "project": "glance", 339 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 340 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 341 | }, 342 | "finished": 3931, 343 | "project": "glance", 344 | "host": "wally113", 345 | "meta.raw_payload.wsgi-start": { 346 | "info": { 347 | "host": "wally113", 348 | "request": { 349 | "path": "/v2/images/985aa609-db14-4b4f-abd1-f1e1521237c7", 350 | "scheme": "http", 351 | "method": "GET", 352 | "query": "" 353 | } 354 | }, 355 | "name": "wsgi-start", 356 | "service": "api", 357 | "timestamp": "2019-11-18T12:19:08.906980", 358 | "trace_id": "2420a5b6-9d3d-4a2c-a09c-f221038deb86", 359 | "project": "glance", 360 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 361 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 362 | } 363 | }, 364 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 365 | "trace_id": "2420a5b6-9d3d-4a2c-a09c-f221038deb86", 366 | "children": [] 367 | }, 368 | { 369 | "info": { 370 | "exception": "None", 371 | "name": "wsgi", 372 | "service": "api", 373 | "started": 4044, 374 | "meta.raw_payload.wsgi-stop": { 375 | "info": { 376 | "host": "wally113" 377 | }, 378 | "name": "wsgi-stop", 379 | "service": "api", 380 | "timestamp": "2019-11-18T12:19:09.189027", 381 | "trace_id": "eb43c33e-6381-4df6-b76d-55df14695cf0", 382 | "project": "glance", 383 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 384 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 385 | }, 386 | "finished": 4143, 387 | "project": "glance", 388 | "host": "wally113", 389 | "meta.raw_payload.wsgi-start": { 390 | "info": { 391 | "host": "wally113", 392 | "request": { 393 | "path": "/v2/schemas/image", 394 | "scheme": "http", 395 | "method": "GET", 396 | "query": "" 397 | } 398 | }, 399 | "name": "wsgi-start", 400 | "service": "api", 401 | "timestamp": 
"2019-11-18T12:19:09.090110", 402 | "trace_id": "eb43c33e-6381-4df6-b76d-55df14695cf0", 403 | "project": "glance", 404 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 405 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 406 | } 407 | }, 408 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 409 | "trace_id": "eb43c33e-6381-4df6-b76d-55df14695cf0", 410 | "children": [] 411 | }, 412 | { 413 | "info": { 414 | "exception": "None", 415 | "name": "wsgi", 416 | "service": "api", 417 | "started": 4642, 418 | "meta.raw_payload.wsgi-stop": { 419 | "info": { 420 | "host": "wally113" 421 | }, 422 | "name": "wsgi-stop", 423 | "service": "api", 424 | "timestamp": "2019-11-18T12:19:13.407117", 425 | "trace_id": "3ef6f663-e000-4b58-84f7-aeeb929e4dc3", 426 | "project": "glance", 427 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 428 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 429 | }, 430 | "finished": 8361, 431 | "project": "glance", 432 | "host": "wally113", 433 | "meta.raw_payload.wsgi-start": { 434 | "info": { 435 | "host": "wally113", 436 | "request": { 437 | "path": "/v2/images/985aa609-db14-4b4f-abd1-f1e1521237c7/file", 438 | "scheme": "http", 439 | "method": "PUT", 440 | "query": "" 441 | } 442 | }, 443 | "name": "wsgi-start", 444 | "service": "api", 445 | "timestamp": "2019-11-18T12:19:09.688067", 446 | "trace_id": "3ef6f663-e000-4b58-84f7-aeeb929e4dc3", 447 | "project": "glance", 448 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 449 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 450 | } 451 | }, 452 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 453 | "trace_id": "3ef6f663-e000-4b58-84f7-aeeb929e4dc3", 454 | "children": [] 455 | }, 456 | { 457 | "info": { 458 | "exception": "None", 459 | "name": "wsgi", 460 | "service": "api", 461 | "started": 8468, 462 | "meta.raw_payload.wsgi-stop": { 463 | "info": { 464 | "host": "wally113" 465 | }, 466 | "name": "wsgi-stop", 467 | "service": "api", 468 | "timestamp": 
"2019-11-18T12:19:13.652612", 469 | "trace_id": "eb1ec797-7411-4a6e-97ef-33449a5537d3", 470 | "project": "glance", 471 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 472 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 473 | }, 474 | "finished": 8606, 475 | "project": "glance", 476 | "host": "wally113", 477 | "meta.raw_payload.wsgi-start": { 478 | "info": { 479 | "host": "wally113", 480 | "request": { 481 | "path": "/v2/images/985aa609-db14-4b4f-abd1-f1e1521237c7", 482 | "scheme": "http", 483 | "method": "GET", 484 | "query": "" 485 | } 486 | }, 487 | "name": "wsgi-start", 488 | "service": "api", 489 | "timestamp": "2019-11-18T12:19:13.513819", 490 | "trace_id": "eb1ec797-7411-4a6e-97ef-33449a5537d3", 491 | "project": "glance", 492 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 493 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 494 | } 495 | }, 496 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 497 | "trace_id": "eb1ec797-7411-4a6e-97ef-33449a5537d3", 498 | "children": [] 499 | }, 500 | { 501 | "info": { 502 | "exception": "None", 503 | "name": "wsgi", 504 | "service": "api", 505 | "started": 8727, 506 | "meta.raw_payload.wsgi-stop": { 507 | "info": { 508 | "host": "wally113" 509 | }, 510 | "name": "wsgi-stop", 511 | "service": "api", 512 | "timestamp": "2019-11-18T12:19:15.103720", 513 | "trace_id": "4c6b7f84-4cd0-476a-beef-b11c77cc4033", 514 | "project": "glance", 515 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 516 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 517 | }, 518 | "finished": 10058, 519 | "project": "glance", 520 | "host": "wally113", 521 | "meta.raw_payload.wsgi-start": { 522 | "info": { 523 | "host": "wally113", 524 | "request": { 525 | "path": "/v2/images/985aa609-db14-4b4f-abd1-f1e1521237c7", 526 | "scheme": "http", 527 | "method": "DELETE", 528 | "query": "" 529 | } 530 | }, 531 | "name": "wsgi-start", 532 | "service": "api", 533 | "timestamp": "2019-11-18T12:19:13.773141", 534 | "trace_id": 
"4c6b7f84-4cd0-476a-beef-b11c77cc4033", 535 | "project": "glance", 536 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 537 | "base_id": "8f6ff588-fcb6-4983-883c-282f78922756" 538 | } 539 | }, 540 | "parent_id": "8f6ff588-fcb6-4983-883c-282f78922756", 541 | "trace_id": "4c6b7f84-4cd0-476a-beef-b11c77cc4033", 542 | "children": [] 543 | } 544 | ] 545 | } -------------------------------------------------------------------------------- /workloads/create_delete_image_config/trace_ids_18th_nov: -------------------------------------------------------------------------------- 1 | 9aa0efa3-07d0-489f-bcbf-509d3a8d7809 2 | e2fff62c-d057-4ca2-a5c1-8ba016235e0d 3 | 306a8c35-0260-4a91-a06c-294528697940 4 | 91477cd0-1993-48fb-b624-b6eefdf5ff7b 5 | 8acd2fde-389e-4ada-b47a-6544e4b5dba2 6 | f94b1484-63fd-4756-a75b-795daf67a4e3 7 | bd4a7570-ba05-4d4c-9eb8-2024a00055e2 8 | 6b65c41f-4e6e-4a86-8db4-a33a4badc9c5 9 | 985f9323-a509-4371-a7fa-33fae3aecfc1 10 | 57ecfcf7-4335-4856-af8f-d25b1f9dd922 11 | 1997f188-728e-4dd8-a7ef-d5057e27e31c 12 | fbcdc9c7-a8e2-4e65-8698-e7cb1e3c4f04 13 | d50058f8-a859-4900-9ed5-4ba65294f09d 14 | 9fa11480-a6eb-4793-a770-b77b1feb4a6d 15 | e36f581c-17f1-4610-9140-bb8272fb35fc 16 | dc0ad6c3-6294-4ccf-af77-063ce3d9ffcb 17 | fb5bef91-11eb-4957-8d48-3dc66e937cf9 18 | 900f7fc2-a913-40d2-90c9-5bc5778f2b3a 19 | f49ba9bc-bff2-4eae-a9db-02c860547296 20 | 3913e01b-6f0c-457b-b98d-f239b473c6c8 21 | 555f1899-e54f-4f7c-902f-c678c0aa3423 22 | 8f42a3c4-3854-4788-8af3-7fec8b92316d 23 | bc0684fd-46fb-44c7-a20d-80af9028a6d5 24 | 93a22503-7e89-46a0-a4e7-414320b59396 25 | e1166634-783d-43e2-b1e9-77cdee9b65ae 26 | f36afb65-65a7-40c8-8396-37a80022bb69 27 | 4d48a918-3255-4b0e-a765-eb83a878dfdc 28 | 29455f03-ba06-4e93-a282-d11bd6720427 29 | c2c841a4-eef0-41ae-b109-8522784c0579 30 | 78b77da7-9692-407e-9361-199980fb68c5 31 | fd64a7a8-5eff-4451-a78e-3d3b37de8e80 32 | ba108905-831d-4845-95bb-c5df85a66dba 33 | d1c9a800-4266-49d3-ba24-2c999362ee3d 34 | 8ec60d58-d9ba-4f3d-9717-316eea5d5436 
35 | 4d4ad136-c886-4dff-b156-2e756828bd57 36 | f8164b69-c54b-44ba-9f92-da986d191215 37 | 212fe628-335d-4bf4-a385-2762f8dc1e1a 38 | 3d800598-6375-4a0a-8b9a-f1e25f7e4405 39 | 3abe020e-2469-4ac9-86f0-bb413631d740 40 | 87ad5b7f-a100-4ba1-a55e-ee0ed80489c9 41 | -------------------------------------------------------------------------------- /workloads/create_delete_image_config/traceids_jasmin.txt: -------------------------------------------------------------------------------- 1 | ab34aad1-40b5-4ebf-bc1c-765750970524 2 | 0513b6a0-bbaf-4732-aacb-1fca26f21e01 3 | 87709321-5490-4a52-9e3d-33fdd12775a3 4 | ec64ab98-3e73-4bd6-8254-7c2ab2fdba4a 5 | 9bca023e-b83e-4ae9-91c9-e76a02584986 6 | f6aaccd6-2d7c-401c-b17b-6b81b000e809 7 | 90dc4a7e-89ea-4c3a-acee-452d4b338997 8 | 27be13af-8093-4fa8-a2e0-802ae6eefe7c 9 | 8e4a16b6-a621-4c00-9330-895ec70b441e 10 | 6af15627-b470-4d89-8b84-761ffa43fa15 11 | 828c129e-e8b3-48c9-a47b-a21bfbee9ccd 12 | 41e087fc-a6a1-4b24-b776-9ccda8032c89 13 | 7baf432c-d63e-4656-b21c-f0ed9192940d 14 | 8637da32-11bb-41fd-a560-13bb6084b793 15 | 0d1af073-ea1a-4a12-a4a6-c65cd2d98e2c 16 | e39a0b61-6fd1-4f72-adc7-85116b977db9 17 | 4eca4b60-35e8-4462-94f4-7cb03f2c9f1f 18 | cd249ced-0fb9-422c-8f63-c2ef5582fd40 19 | afcdc4bf-8164-4129-a796-f490e0d79c72 20 | 127ec9df-7148-4e4a-88bc-e687d265734b 21 | -------------------------------------------------------------------------------- /workloads/json_trace_ids.py: -------------------------------------------------------------------------------- 1 | import json 2 | import sys 3 | print(sys.argv[1]) 4 | file_trace = sys.argv[1]#'create_delete_1.json' 5 | with open(str(file_trace)) as trace_file: 6 | trace_json = json.load(trace_file) 7 | 8 | # Trace IDs for create-delete list 9 | for i in range(len(trace_json['tasks'][0]['subtasks'][0]['workloads'][0]['data'])): 10 | print(trace_json['tasks'][0]['subtasks'][0]['workloads'][0]['data'][i]['output']['complete'][0]['data']['trace_id']) 11 | 12 | 
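Note on json_trace_ids.py above: it hard-codes index 0 of `tasks`, `subtasks`, and `workloads`, and raises a KeyError on iterations without profiler output. A defensive variant covering every workload in a Rally report (the function name `iter_trace_ids` is an assumption; the key path mirrors the script):

```python
def iter_trace_ids(report):
    """Yield OSProfiler trace IDs from a parsed Rally task report,
    walking all tasks/subtasks/workloads instead of only index 0 of each."""
    for task in report.get("tasks", []):
        for subtask in task.get("subtasks", []):
            for workload in subtask.get("workloads", []):
                for item in workload.get("data", []):
                    complete = item.get("output", {}).get("complete", [])
                    if complete:  # skip iterations with no profiler output
                        yield complete[0]["data"]["trace_id"]
```

Usage follows the original script: `json.load` the report file, then `print` each ID from the generator.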
--------------------------------------------------------------------------------
/workloads/keystone_auth.yaml:
--------------------------------------------------------------------------------
1 | ---
2 | Authenticate.keystone:
3 |   -
4 |     runner:
5 |       type: "constant_for_duration"
6 |       duration: 40
7 |       concurrency: 5
8 |     context:
9 |       users:
10 |         tenants: 1
11 |         users_per_tenant: 1
12 |     hooks:
13 |       -
14 |         name: fault_injection
15 |         args:
16 |           action: restart memcached container on wally113
17 |         trigger:
18 |           name: event
19 |           args:
20 |             unit: iteration
21 |             at: [10]
22 |       -
23 |         name: fault_injection
24 |         args:
25 |           action: restart nova-api container on wally113
26 |         trigger:
27 |           name: event
28 |           args:
29 |             unit: time
30 |             at: [30]
31 |
--------------------------------------------------------------------------------
/workloads/network_create_delete_config/commands-to-exec-neutron.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash -x
2 | # Stop the neutron containers, sleep briefly, then start them again
3 | echo "Restarting neutron-containers"
4 | docker stop neutron_metadata_agent &
5 | docker stop neutron_l3_agent &
6 | docker stop neutron_dhcp_agent &
7 | docker stop neutron_openvswitch_agent &
8 | docker stop neutron_server
9 | sleep 0.1
10 | docker start neutron_metadata_agent &
11 | docker start neutron_l3_agent &
12 | docker start neutron_dhcp_agent &
13 | docker start neutron_openvswitch_agent &
14 | docker start neutron_server
15 |
--------------------------------------------------------------------------------
/workloads/network_create_delete_config/create-and-delete-networks.yaml:
--------------------------------------------------------------------------------
1 | ---
2 | NeutronNetworks.create_and_delete_networks:
3 |   -
4 |     args:
5 |       network_create_args: {}
6 |     runner:
7 |       type: "constant"
8 |       times: 6000
9 |       concurrency: 1
10 |     context:
11 |       users:
12 |         tenants: 3
13 |         users_per_tenant: 3
14 |       quotas:
15 |         neutron:
16 |           network: -1
17 |     sla:
18 |       failure_rate:
19 |         max: 100
20 |     hooks:
21 |       - name: sys_call
22 |         description: Run script
23 |         args: sh ./restart_neutron_container.sh
24 |         trigger:
25 |           name: event
26 |           args:
27 |             unit: iteration
28 |             at: [500,1000,1500,2000,2500,3000,3500,4000,4500,5000,5500,6000,6500,7000]
29 |
--------------------------------------------------------------------------------
/workloads/network_create_delete_config/restart_neutron_container.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash +x
2 |
3 | # SSH into the wally113 controller and execute the commands from commands-to-exec-neutron.sh
4 |
5 | cat commands-to-exec-neutron.sh | sshpass -p 'rally' ssh rally@wally113.cit.tu-berlin.de
6 |
--------------------------------------------------------------------------------
/workloads/os-faults-latest.yaml:
--------------------------------------------------------------------------------
1 | cloud_management:
2 |   driver: universal
3 | node_discover:
4 |   driver: node_list
5 |   args:
6 |     - ip: 130.149.249.123  # role: controller
7 |       fqdn: wally113
8 |       auth:
9 |         username: rally
10 |         private_key_file: /home/rally/.ssh/id_rsa
11 |         become_password: rally
12 |     - ip: 130.149.249.132  # role: compute
13 |       fqdn: wally122
14 |       auth:
15 |         username: rally
16 |         private_key_file: /home/rally/.ssh/id_rsa
17 |         become_password: rally
18 |     - ip: 130.149.249.134  # role: compute
19 |       fqdn: wally124
20 |       auth:
21 |         username: rally
22 |         private_key_file: /home/rally/.ssh/id_rsa
23 |         become_password: rally
24 |     - ip: 130.149.249.133  # role: compute
25 |       fqdn: wally123
26 |       auth:
27 |         username: rally
28 |         private_key_file: /home/rally/.ssh/id_rsa
29 |         become_password: rally
30 |
31 |     - ip: 130.149.249.127  # role: compute
32 |       fqdn: wally117
33 |       auth:
34 |         username: rally
35 |         private_key_file: /home/rally/.ssh/id_rsa
36 |         become_password: rally
37 |
38 | containers:
39 |   cinder_api:
40 |     driver: docker_container
41 |     args:
42 |       container_name: cinder_api
43 |       hosts:
44 |         - 130.149.249.123
45 |   cinder_volume:
46 |     driver: docker_container
47 |     args:
48 |       container_name: cinder_volume
49 |       hosts:
50 |         - 130.149.249.123
51 |         - 130.149.249.132
52 |         - 130.149.249.134
53 |         - 130.149.249.133
54 |         - 130.149.249.127
55 |
56 |   glance_api:
57 |     driver: docker_container
58 |     args:
59 |       container_name: glance_api
60 |       hosts:
61 |         - 130.149.249.123
62 |   heat_api:
63 |     driver: docker_container
64 |     args:
65 |       container_name: heat_api
66 |       hosts:
67 |         - 130.149.249.123
68 |   heat_engine:
69 |     driver: docker_container
70 |     args:
71 |       container_name: heat_engine
72 |       hosts:
73 |         - 130.149.249.123
74 |   keystone:
75 |     driver: docker_container
76 |     args:
77 |       container_name: keystone
78 |       hosts:
79 |         - 130.149.249.123
80 |   memcached:
81 |     driver: docker_container
82 |     args:
83 |       container_name: memcached
84 |       hosts:
85 |         - 130.149.249.123
86 |   neutron_server:
87 |     driver: docker_container
88 |     args:
89 |       container_name: neutron_server
90 |       hosts:
91 |         - 130.149.249.123
92 |   neutron_dhcp_agent:
93 |     driver: docker_container
94 |     args:
95 |       container_name: neutron_dhcp_agent
96 |       hosts:
97 |         - 130.149.249.123
98 |   neutron_l3_agent:
99 |     driver: docker_container
100 |     args:
101 |       container_name: neutron_l3_agent
102 |       hosts:
103 |         - 130.149.249.123
104 |   neutron_metadata_agent:
105 |     driver: docker_container
106 |     args:
107 |       container_name: neutron_metadata_agent
108 |       hosts:
109 |         - 130.149.249.123
110 |   neutron_openvswitch_agent:
111 |     driver: docker_container
112 |     args:
113 |       container_name: neutron_openvswitch_agent
114 |       hosts:
115 |         - 130.149.249.123
116 |         - 130.149.249.132
117 |         - 130.149.249.134
118 |         - 130.149.249.133
119 |         - 130.149.249.127
120 |
121 |   nova_api:
122 |     driver: docker_container
123 |     args:
124 |       container_name: nova_api
125 |       hosts:
126 |         - 130.149.249.123
127 |   nova_compute:
128 |     driver: docker_container
129 |     args:
130 |       container_name: nova_compute
131 |       hosts:
132 |         - 130.149.249.132
133 |         - 130.149.249.134
134 |         - 130.149.249.133
135 |         - 130.149.249.127
136 |   nova_conductor:
137 |     driver: docker_container
138 |     args:
139 |       container_name: nova_conductor
140 |       hosts:
141 |         - 130.149.249.123
142 |   nova_scheduler:
143 |     driver: docker_container
144 |     args:
145 |       container_name: nova_scheduler
146 |       hosts:
147 |         - 130.149.249.123
148 |   ovsdb_server:
149 |     driver: docker_container
150 |     args:
151 |       container_name: openvswitch_db
152 |       hosts:
153 |         - 130.149.249.123
154 |         - 130.149.249.132
155 |         - 130.149.249.134
156 |         - 130.149.249.133
157 |         - 130.149.249.127
158 |   ovs_vswitchd:
159 |     driver: docker_container
160 |     args:
161 |       container_name: openvswitch_vswitchd
162 |       hosts:
163 |         - 130.149.249.123
164 |         - 130.149.249.132
165 |         - 130.149.249.134
166 |         - 130.149.249.133
167 |         - 130.149.249.127
168 |   rabbitmq:
169 |     driver: docker_container
170 |     args:
171 |       container_name: rabbitmq
172 |       hosts:
173 |         - 130.149.249.123
174 |
--------------------------------------------------------------------------------