├── .ipynb_checkpoints ├── Res-Plot-checkpoint.ipynb ├── bench_test-all_service-checkpoint.ipynb ├── bench_test-single_service-checkpoint.ipynb └── data_collection_all_services-checkpoint.ipynb ├── README.md ├── Res-Plot.ipynb ├── bench_test-all_service.ipynb ├── bench_test-latency.ipynb ├── bench_test-single_service.ipynb ├── cAdvisor_metrics.csv ├── cAdvisor_query.csv ├── config.py ├── cpu-test.txt ├── data_collected ├── cpu-hog1_carts.pkl ├── cpu-hog1_catalogue.pkl ├── cpu-hog1_front-end.pkl ├── cpu-hog1_orders.pkl ├── cpu-hog1_payment.pkl ├── cpu-hog1_shipping.pkl ├── cpu-hog1_user.pkl ├── latency1_carts.pkl ├── latency1_catalogue.pkl ├── latency1_front-end.pkl ├── latency1_orders.pkl ├── latency1_payment.pkl ├── latency1_shipping-old.pkl ├── latency1_shipping.pkl ├── latency1_user.pkl ├── memory-leak1_carts.pkl ├── memory-leak1_catalogue.pkl ├── memory-leak1_front-end.pkl ├── memory-leak1_orders.pkl ├── memory-leak1_payment.pkl ├── memory-leak1_shipping.pkl └── memory-leak1_user.pkl ├── data_collection_all_services.ipynb ├── figures ├── all_service_cpu.pdf ├── all_service_mem.pdf ├── all_service_net.pdf ├── diagnosis-fram.png ├── example_metrics.pdf ├── example_metrics.png ├── service-latency_Mem-leak.pdf ├── service-latency_Mem-leak_std.pdf ├── service-latency_cpu-hog.pdf ├── service-latency_cpu-hog_std.pdf ├── service-latency_net-lat.pdf ├── service-latency_net-lat_std.pdf ├── service-single_ac3.pdf └── service-single_avg5_std.pdf ├── graph.py ├── memory-test.txt ├── metrics_causality.png ├── modules.py ├── network-test.txt ├── pa_result ├── .ipynb_checkpoints │ └── Res-Plot-New-checkpoint.ipynb ├── Res-Plot-New.ipynb ├── all_service │ ├── cpu_all_service_eta100.txt │ ├── cpu_all_service_eta1000.txt │ ├── cpu_all_service_gamma5.txt │ ├── cpu_all_service_gamma75.txt │ ├── memory_all_service_eta100.txt │ ├── memory_all_service_eta1000.txt │ ├── memory_all_service_gamma5.txt │ ├── memory_all_service_gamma75.txt │ ├── network_all_service_eta100.txt │ ├── network_all_service_eta1000.txt │ ├── network_all_service_gamma5.txt │ └── network_all_service_gamma75.txt ├── all_service_gamma_eta.pdf ├── latency │ ├── .DS_Store │ ├── cpu_latency_eta100.txt │ ├── cpu_latency_eta1000.txt │ ├── cpu_latency_gamma5.txt │ ├── cpu_latency_gamma75.txt │ ├── memory_latency_eta100.txt │ ├── memory_latency_eta1000.txt │ ├── memory_latency_gamma5.txt │ ├── memory_latency_gamma75.txt │ ├── network_latency_eta100.txt │ ├── network_latency_eta1000.txt │ ├── network_latency_gamma5.txt │ └── network_latency_gamma75.txt ├── latency_gamma_eta.pdf ├── single_service │ ├── cpu_single_service_eta100.txt │ ├── cpu_single_service_eta1000.txt │ ├── cpu_single_service_gamma5.txt │ ├── cpu_single_service_gamma75.txt │ ├── memory_single_service_eta100.txt │ ├── memory_single_service_eta1000.txt │ ├── memory_single_service_gamma5.txt │ ├── memory_single_service_gamma75.txt │ ├── network_single_service_eta100.txt │ ├── network_single_service_eta1000.txt │ ├── network_single_service_gamma5.txt │ └── network_single_service_gamma75.txt └── single_service_gamma_eta.pdf ├── requirements.txt ├── test_all_service_per.sh ├── test_latency_per.sh ├── test_single_service_per.sh ├── train_all_services.py ├── train_latency.py ├── train_single_service.py └── utils.py /README.md: -------------------------------------------------------------------------------- 1 | # CausalRCA_code 2 | 3 | ## Description 4 | 5 | This repository includes the code and data for CausalRCA. 
CausalRCA is a root cause localization framework that covers monitoring metrics collection, causal structure learning, and root cause inference. 6 | 7 | ![image](https://github.com/AXinx/CausalRCA_code/blob/master/figures/diagnosis-fram.png) 8 | 9 | ## Data 10 | 11 | We deploy the sock-shop microservice application with Kubernetes on several cloud VMs and inject anomalies to simulate performance issues in the running application. 12 | 13 | We collect data with *data_collection_all_services.ipynb* and store all data in the folder *data_collected*. 14 | 15 | We collect both service-level and resource-level data. At the service level, we collect the latency of each service. At the resource level, we collect container resource-related metrics, including CPU usage, memory usage, disk read and write, and network receive and transmit bytes. 16 | 17 | 18 | ## Code 19 | 20 | To run these Python files, first install the requirements with 21 | 22 | ``` 23 | pip install -r requirements.txt 24 | ``` 25 | 26 | We provide code for the benchmark methods and for CausalRCA, covering the three experiments in our paper: the latency, single-service, and all-service tests. 27 | 28 | For the benchmark tests, check *bench_test-latency.ipynb*, *bench_test-single_service.ipynb*, or *bench_test-all_service.ipynb*. 29 | 30 | For CausalRCA, run *train_latency.py*, *train_single_service.py*, or *train_all_services.py*. 31 | 32 | To repeat the experiments, use *test_latency_per.sh*, *test_single_service_per.sh*, and *test_all_service_per.sh*. 33 | 34 | -------------------------------------------------------------------------------- /cAdvisor_metrics.csv: -------------------------------------------------------------------------------- 1 | Metric name;Type;Description;Unit (where applicable);option parameter;additional build flag 2 | container_accelerator_duty_cycle;Gauge;Percent of time over the past sample period during which the accelerator was actively processing;percentage;accelerator; 3 | container_accelerator_memory_total_bytes;Gauge;Total accelerator memory;bytes;accelerator; 4 | container_accelerator_memory_used_bytes;Gauge;Total accelerator memory allocated;bytes;accelerator; 5 | container_blkio_device_usage_total;Counter;Blkio device bytes usage;bytes;diskIO; 6 | container_cpu_cfs_periods_total;Counter;Number of elapsed enforcement period intervals;;cpu; 7 | container_cpu_cfs_throttled_periods_total;Counter;Number of throttled period intervals;;cpu; 8 | container_cpu_cfs_throttled_seconds_total;Counter;Total time duration the container has been throttled;seconds;cpu; 9 | container_cpu_load_average_10s;Gauge;Value of container cpu load average over the last 10 seconds;;cpuLoad; 10 | container_cpu_schedstat_run_periods_total;Counter;Number of times processes of the cgroup have run on the cpu;;sched; 11 | container_cpu_schedstat_runqueue_seconds_total;Counter;Time duration processes of the container have been waiting on a runqueue;seconds;sched; 12 | container_cpu_schedstat_run_seconds_total;Counter;Time duration the processes of the container have run on the CPU;seconds;sched; 13 | container_cpu_system_seconds_total;Counter;Cumulative system cpu time consumed;seconds;cpu; 14 | container_cpu_usage_seconds_total;Counter;Cumulative cpu time consumed;seconds;cpu; 15 | container_cpu_user_seconds_total;Counter;Cumulative user cpu time consumed;seconds;cpu; 16 | container_file_descriptors;Gauge;Number of open file descriptors for the container;;process; 17 | container_fs_inodes_free;Gauge;Number of available Inodes;;disk; 18 | 
container_fs_inodes_total;Gauge;Total number of Inodes;;disk; 19 | container_fs_io_current;Gauge;Number of I/Os currently in progress;;diskIO; 20 | container_fs_io_time_seconds_total;Counter;Cumulative count of seconds spent doing I/Os;seconds;diskIO; 21 | container_fs_io_time_weighted_seconds_total;Counter;Cumulative weighted I/O time;seconds;diskIO; 22 | container_fs_limit_bytes;Gauge;Number of bytes that can be consumed by the container on this filesystem;bytes;disk; 23 | container_fs_reads_bytes_total;Counter;Cumulative count of bytes read;bytes;diskIO; 24 | container_fs_read_seconds_total;Counter;Cumulative count of seconds spent reading;;diskIO; 25 | container_fs_reads_merged_total;Counter;Cumulative count of reads merged;;diskIO; 26 | container_fs_reads_total;Counter;Cumulative count of reads completed;;diskIO; 27 | container_fs_sector_reads_total;Counter;Cumulative count of sector reads completed;;diskIO; 28 | container_fs_sector_writes_total;Counter;Cumulative count of sector writes completed;;diskIO; 29 | container_fs_usage_bytes;Gauge;Number of bytes that are consumed by the container on this filesystem;bytes;disk; 30 | container_fs_writes_bytes_total;Counter;Cumulative count of bytes written;bytes;diskIO; 31 | container_fs_write_seconds_total;Counter;Cumulative count of seconds spent writing;seconds;diskIO; 32 | container_fs_writes_merged_total;Counter;Cumulative count of writes merged;;diskIO; 33 | container_fs_writes_total;Counter;Cumulative count of writes completed;;diskIO; 34 | container_hugetlb_failcnt;Counter;Number of hugepage usage hits limits;;hugetlb; 35 | container_hugetlb_max_usage_bytes;Gauge;Maximum hugepage usages recorded;bytes;hugetlb; 36 | container_hugetlb_usage_bytes;Gauge;Current hugepage usage;bytes;hugetlb; 37 | container_last_seen;Gauge;Last time a container was seen by the exporter;timestamp;-; 38 | container_llc_occupancy_bytes;Gauge;Last level cache usage statistics for container counted with RDT Memory Bandwidth Monitoring (MBM).;bytes;resctrl; 39 | container_memory_bandwidth_bytes;Gauge;Total memory bandwidth usage statistics for container counted with RDT Memory Bandwidth Monitoring (MBM).;bytes;resctrl; 40 | container_memory_bandwidth_local_bytes;Gauge;Local memory bandwidth usage statistics for container counted with RDT Memory Bandwidth Monitoring (MBM).;bytes;resctrl; 41 | container_memory_cache;Gauge;Total page cache memory;bytes;memory; 42 | container_memory_failcnt;Counter;Number of memory usage hits limits;;memory; 43 | container_memory_failures_total;Counter;Cumulative count of memory allocation failures;;memory; 44 | container_memory_mapped_file;Gauge;Size of memory mapped files;bytes;memory; 45 | container_memory_max_usage_bytes;Gauge;Maximum memory usage recorded;bytes;memory; 46 | container_memory_migrate;Gauge;Memory migrate status;;cpuset; 47 | container_memory_numa_pages;Gauge;Number of used pages per NUMA node;;memory_numa; 48 | container_memory_rss;Gauge;Size of RSS;bytes;memory; 49 | container_memory_swap;Gauge;Container swap usage;bytes;memory; 50 | container_memory_usage_bytes;Gauge;Current memory usage, including all memory regardless of when it was accessed;bytes;memory; 51 | container_memory_working_set_bytes;Gauge;Current working set;bytes;memory; 52 | container_network_advance_tcp_stats_total;Gauge;advanced tcp connections statistic for container;;advtcp; 53 | container_network_receive_bytes_total;Counter;Cumulative count of bytes received;bytes;network; 54 | container_network_receive_errors_total;Counter;Cumulative count 
of errors encountered while receiving;;network; 55 | container_network_receive_packets_dropped_total;Counter;Cumulative count of packets dropped while receiving;;network; 56 | container_network_receive_packets_total;Counter;Cumulative count of packets received;;network; 57 | container_network_tcp6_usage_total;Gauge;tcp6 connection usage statistic for container;;tcp; 58 | container_network_tcp_usage_total;Gauge;tcp connection usage statistic for container;;tcp; 59 | container_network_transmit_bytes_total;Counter;Cumulative count of bytes transmitted;bytes;network; 60 | container_network_transmit_errors_total;Counter;Cumulative count of errors encountered while transmitting;;network; 61 | container_network_transmit_packets_dropped_total;Counter;Cumulative count of packets dropped while transmitting;;network; 62 | container_network_transmit_packets_total;Counter;Cumulative count of packets transmitted;;network; 63 | container_network_udp6_usage_total;Gauge;udp6 connection usage statistic for container;;udp; 64 | container_network_udp_usage_total;Gauge;udp connection usage statistic for container;;udp; 65 | container_oom_events_total;Counter;Count of out of memory events observed for the container;;oom_event; 66 | container_perf_events_scaling_ratio;Gauge;Scaling ratio for perf event counter (event can be identified by event label and cpu indicates the core for which event was measured). See perf event configuration.;;perf_event;libpfm 67 | container_perf_events_total;Counter;Scaled counter of perf core event (event can be identified by event label and cpu indicates the core for which event was measured). See perf event configuration.;;perf_event;libpfm 68 | container_perf_uncore_events_scaling_ratio;Gauge;"Scaling ratio for perf uncore event counter (event can be identified by event label, pmu and socket lables indicate the PMU and the CPU socket for which event was measured). See perf event configuration. Metric exists only for main cgroup (id=""/"").";;perf_event;libpfm 69 | container_perf_uncore_events_total;Counter;"Scaled counter of perf uncore event (event can be identified by event label, pmu and socket lables indicate the PMU and the CPU socket for which event was measured). See perf event configuration). Metric exists only for main cgroup (id=""/"").";;perf_event;libpfm 70 | container_processes;Gauge;Number of processes running inside the container;;process; 71 | container_referenced_bytes;Gauge;"Container referenced bytes during last measurements cycle based on Referenced field in /proc/smaps file, with /proc/PIDs/clear_refs set to 1 after defined number of cycles configured through referenced_reset_interval cAdvisor parameter. 72 | Warning: this is intrusive collection because can influence kernel page reclaim policy and add latency. 
Refer to https://github.com/brendangregg/wss#wsspl-referenced-page-flag for more details.";bytes;referenced_memory; 73 | container_sockets;Gauge;Number of open sockets for the container;;process; 74 | container_spec_cpu_period;Gauge;CPU period of the container;;-; 75 | container_spec_cpu_quota;Gauge;CPU quota of the container;;-; 76 | container_spec_cpu_shares;Gauge;CPU share of the container;;-; 77 | container_spec_memory_limit_bytes;Gauge;Memory limit for the container;bytes;-; 78 | container_spec_memory_reservation_limit_bytes;Gauge;Memory reservation limit for the container;bytes;; 79 | container_spec_memory_swap_limit_bytes;Gauge;Memory swap limit for the container;bytes;; 80 | container_start_time_seconds;Gauge;Start time of the container since unix epoch;seconds;; 81 | container_tasks_state;Gauge;Number of tasks in given state (sleeping, running, stopped, uninterruptible, or ioawaiting);;cpuLoad; 82 | container_threads;Gauge;Number of threads running inside the container;;process; 83 | container_threads_max;Gauge;Maximum number of threads allowed inside the container;;process; -------------------------------------------------------------------------------- /cAdvisor_query.csv: -------------------------------------------------------------------------------- 1 | 'container_cpu_cfs_periods_total','' 2 | 'container_cpu_cfs_throttled_periods_total', 3 | 'container_cpu_cfs_throttled_seconds_total', 4 | 'container_cpu_load_average_10s', 5 | 'container_cpu_system_seconds_total', 6 | 'container_cpu_usage_seconds_total', 7 | 'container_cpu_user_seconds_total', 8 | 'container_file_descriptors', 9 | 'container_fs_inodes_free', 10 | 'container_fs_inodes_total', 11 | 'container_fs_io_current', 12 | 'container_fs_io_time_seconds_total', 13 | 'container_fs_io_time_weighted_seconds_total', 14 | 'container_fs_limit_bytes', 15 | 'container_fs_read_seconds_total', 16 | 'container_fs_reads_bytes_total', 17 | 'container_fs_reads_merged_total', 18 | 'container_fs_reads_total', 19 | 'container_fs_sector_reads_total', 20 | 'container_fs_sector_writes_total', 21 | 'container_fs_usage_bytes', 22 | 'container_fs_write_seconds_total', 23 | 'container_fs_writes_bytes_total', 24 | 'container_fs_writes_merged_total', 25 | 'container_fs_writes_total', 26 | 'container_last_seen', 27 | 'container_memory_cache', 28 | 'container_memory_failcnt', 29 | 'container_memory_failures_total', 30 | 'container_memory_mapped_file', 31 | 'container_memory_max_usage_bytes', 32 | 'container_memory_rss', 33 | 'container_memory_swap', 34 | 'container_memory_usage_bytes', 35 | 'container_memory_working_set_bytes', 36 | 'container_network_receive_bytes_total', 37 | 'container_network_receive_errors_total', 38 | 'container_network_receive_packets_dropped_total', 39 | 'container_network_receive_packets_total', 40 | 'container_network_transmit_bytes_total', 41 | 'container_network_transmit_errors_total', 42 | 'container_network_transmit_packets_dropped_total', 43 | 'container_network_transmit_packets_total', 44 | 'container_processes', 45 | 'container_scrape_error', 46 | 'container_sockets', 47 | 'container_spec_cpu_period', 48 | 'container_spec_cpu_quota', 49 | 'container_spec_cpu_shares', 50 | 'container_spec_memory_limit_bytes', 51 | 'container_spec_memory_reservation_limit_bytes', 52 | 'container_spec_memory_swap_limit_bytes', 53 | 'container_start_time_seconds', 54 | 'container_tasks_state', 55 | 'container_threads', 56 | 'container_threads_max' -------------------------------------------------------------------------------- 
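The metric names in *cAdvisor_query.csv* above are the cAdvisor time series that the data collection pulls from the monitoring stack. As a rough illustration only, the sketch below shows how such a metric could be fetched over the Prometheus HTTP API, assuming Prometheus scrapes cAdvisor; the endpoint URL, label names, time window, and step are placeholders, and the queries actually used by this repository live in *data_collection_all_services.ipynb*.

```python
# Illustrative sketch, not the repository's collection code: pull one cAdvisor
# metric from a Prometheus server over its HTTP API. The endpoint, label names,
# window, and step below are assumptions.
import time
import requests

PROM_URL = "http://localhost:9090"  # hypothetical Prometheus endpoint


def query_range(metric, namespace="sock-shop", minutes=30, step="5s"):
    """Fetch `metric` for all containers in `namespace` over the last `minutes`."""
    end = time.time()
    start = end - 60 * minutes
    resp = requests.get(
        f"{PROM_URL}/api/v1/query_range",
        params={
            "query": f'{metric}{{namespace="{namespace}"}}',
            "start": start,
            "end": end,
            "step": step,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["data"]["result"]


if __name__ == "__main__":
    # e.g. the per-container CPU counter, one of the names in cAdvisor_query.csv
    for series in query_range("container_cpu_usage_seconds_total"):
        print(series["metric"].get("container", "?"), len(series["values"]), "samples")
```

In the repository, the series collected this way end up as the per-service *.pkl* files under *data_collected*.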
/config.py: -------------------------------------------------------------------------------- 1 | """ 2 | Contains config parameters for app 3 | """ 4 | 5 | 6 | class CONFIG: 7 | """Dataclass with app parameters""" 8 | 9 | def __init__(self): 10 | pass 11 | 12 | # You must change this to the filename you wish to use as input data! 13 | # data_filename = "alarm.csv" 14 | 15 | # Epochs 16 | epochs = 500 17 | 18 | # Batch size (note: should be divisible by sample size, otherwise throw an error) 19 | batch_size = 50 20 | 21 | # Learning rate (baseline rate = 1e-3) 22 | lr = 1e-3 23 | 24 | x_dims = 1 25 | z_dims = 1 26 | #data_variable_size = 12 27 | optimizer = "Adam" 28 | graph_threshold = 0.3 29 | tau_A = 0.0 30 | lambda_A = 0.0 31 | c_A = 1 32 | use_A_connect_loss = 0 33 | use_A_positiver_loss = 0 34 | #no_cuda = True 35 | seed = 42 36 | encoder_hidden = 64 37 | decoder_hidden = 64 38 | temp = 0.5 39 | k_max_iter = 1e2 40 | encoder = "mlp" 41 | decoder = "mlp" 42 | no_factor = False 43 | encoder_dropout = 0.0 44 | decoder_dropout = (0.0,) 45 | h_tol = 1e-8 46 | lr_decay = 200 47 | gamma = 1.0 48 | prior = False 49 | -------------------------------------------------------------------------------- /cpu-test.txt: -------------------------------------------------------------------------------- 1 | VM2 2 | 3 | sleep 3m; 4 | 5 | echo $(date +%Y-%m-%d" "%H:%M:%S); 6 | echo "front-end"; 7 | pumba stress --duration=5m --stressors="--cpu 4" k8s_front-end_front-end-6fc8dff6ff-754cl_sock-shop_bdcf7267-b66f-4972-bf79-ec5a4be4fd1b_0; 8 | 9 | echo "************" 10 | 11 | sleep 3m; 12 | 13 | echo $(date +%Y-%m-%d" "%H:%M:%S); 14 | echo "user"; 15 | pumba stress --duration=5m --stressors="--cpu 4" k8s_user_user-69c75f5cd5-bj2d5_sock-shop_9693d93a-cbde-4cfa-9c9f-afce7a6209af_0; 16 | 17 | echo "************" 18 | 19 | sleep 3m; 20 | 21 | echo $(date +%Y-%m-%d" "%H:%M:%S); 22 | echo "catalogue"; 23 | pumba stress --duration=5m --stressors="--cpu 4" k8s_catalogue_catalogue-57684ccb8d-p2hl4_sock-shop_e71f2395-aaca-4c9d-bf5a-12687a3b2140_0; 24 | 25 | echo "************" 26 | 27 | sleep 3m; 28 | 29 | echo $(date +%Y-%m-%d" "%H:%M:%S); 30 | echo "orders"; 31 | pumba stress --duration=5m --stressors="--cpu 4" k8s_orders_orders-98b4dc5bf-vqkhw_sock-shop_07e977f0-ed32-4d83-b524-9debcbfc7128_0; 32 | 33 | echo "************" 34 | 35 | ---------------------------------------------------------------------------- 36 | 37 | VM3 38 | 39 | sleep 3m; 40 | 41 | echo $(date +%Y-%m-%d" "%H:%M:%S); 42 | echo "carts"; 43 | pumba stress --duration=5m --stressors="--cpu 4" k8s_carts_carts-6958597bcf-nqm4m_sock-shop_a615e9fb-4214-48b1-a187-8eec921675ed_0; 44 | 45 | echo "************" 46 | 47 | sleep 3m; 48 | 49 | echo $(date +%Y-%m-%d" "%H:%M:%S); 50 | echo "shipping"; 51 | pumba stress --duration=5m --stressors="--cpu 4" k8s_shipping_shipping-6b88db4b4c-dtxvw_sock-shop_f55880e3-03dc-4125-8d38-d675c05f5264_0; 52 | 53 | echo "************" 54 | 55 | 56 | ---------------------------------------------------------------------------- 57 | VM1 58 | 59 | sleep 3m; 60 | 61 | echo $(date +%Y-%m-%d" "%H:%M:%S); 62 | echo "payment"; 63 | pumba stress --duration=5m --stressors="--cpu 4" k8s_payment_payment-6c59577559-xnsdh_sock-shop_ce23595c-e4e7-4e82-b8e6-f348aa3dbf88_0 64 | 65 | echo "************" 66 | 67 | -------------------------------------------------------------------------------- /data_collected/cpu-hog1_carts.pkl: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/cpu-hog1_carts.pkl -------------------------------------------------------------------------------- /data_collected/cpu-hog1_catalogue.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/cpu-hog1_catalogue.pkl -------------------------------------------------------------------------------- /data_collected/cpu-hog1_front-end.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/cpu-hog1_front-end.pkl -------------------------------------------------------------------------------- /data_collected/cpu-hog1_orders.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/cpu-hog1_orders.pkl -------------------------------------------------------------------------------- /data_collected/cpu-hog1_payment.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/cpu-hog1_payment.pkl -------------------------------------------------------------------------------- /data_collected/cpu-hog1_shipping.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/cpu-hog1_shipping.pkl -------------------------------------------------------------------------------- /data_collected/cpu-hog1_user.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/cpu-hog1_user.pkl -------------------------------------------------------------------------------- /data_collected/latency1_carts.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/latency1_carts.pkl -------------------------------------------------------------------------------- /data_collected/latency1_catalogue.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/latency1_catalogue.pkl -------------------------------------------------------------------------------- /data_collected/latency1_front-end.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/latency1_front-end.pkl -------------------------------------------------------------------------------- /data_collected/latency1_orders.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/latency1_orders.pkl -------------------------------------------------------------------------------- 
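Each *.pkl* file under *data_collected*, such as the ones listed here, holds the metrics gathered for one service under one injected fault (cpu-hog, latency, or memory-leak). The minimal sketch below shows one way to inspect such a file; it assumes, without verifying, that each pickle stores a pandas object written by *data_collection_all_services.ipynb*, so check that notebook for the exact structure.

```python
# Sketch only: inspect one collected file. Whether each pickle holds a pandas
# DataFrame (and which columns it has) is an assumption -- the structure is
# defined by data_collection_all_services.ipynb.
import pandas as pd

obj = pd.read_pickle("data_collected/cpu-hog1_carts.pkl")
print(type(obj))
if isinstance(obj, pd.DataFrame):
    print(obj.shape)
    print(list(obj.columns))
    print(obj.head())
```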
/data_collected/latency1_payment.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/latency1_payment.pkl -------------------------------------------------------------------------------- /data_collected/latency1_shipping-old.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/latency1_shipping-old.pkl -------------------------------------------------------------------------------- /data_collected/latency1_shipping.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/latency1_shipping.pkl -------------------------------------------------------------------------------- /data_collected/latency1_user.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/latency1_user.pkl -------------------------------------------------------------------------------- /data_collected/memory-leak1_carts.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/memory-leak1_carts.pkl -------------------------------------------------------------------------------- /data_collected/memory-leak1_catalogue.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/memory-leak1_catalogue.pkl -------------------------------------------------------------------------------- /data_collected/memory-leak1_front-end.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/memory-leak1_front-end.pkl -------------------------------------------------------------------------------- /data_collected/memory-leak1_orders.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/memory-leak1_orders.pkl -------------------------------------------------------------------------------- /data_collected/memory-leak1_payment.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/memory-leak1_payment.pkl -------------------------------------------------------------------------------- /data_collected/memory-leak1_shipping.pkl: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/memory-leak1_shipping.pkl -------------------------------------------------------------------------------- /data_collected/memory-leak1_user.pkl: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/data_collected/memory-leak1_user.pkl -------------------------------------------------------------------------------- /figures/all_service_cpu.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/figures/all_service_cpu.pdf -------------------------------------------------------------------------------- /figures/all_service_mem.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/figures/all_service_mem.pdf -------------------------------------------------------------------------------- /figures/all_service_net.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/figures/all_service_net.pdf -------------------------------------------------------------------------------- /figures/diagnosis-fram.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/figures/diagnosis-fram.png -------------------------------------------------------------------------------- /figures/example_metrics.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/figures/example_metrics.pdf -------------------------------------------------------------------------------- /figures/example_metrics.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/figures/example_metrics.png -------------------------------------------------------------------------------- /figures/service-latency_Mem-leak.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/figures/service-latency_Mem-leak.pdf -------------------------------------------------------------------------------- /figures/service-latency_Mem-leak_std.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/figures/service-latency_Mem-leak_std.pdf -------------------------------------------------------------------------------- /figures/service-latency_cpu-hog.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/figures/service-latency_cpu-hog.pdf -------------------------------------------------------------------------------- /figures/service-latency_cpu-hog_std.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/figures/service-latency_cpu-hog_std.pdf -------------------------------------------------------------------------------- /figures/service-latency_net-lat.pdf: 
-------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/figures/service-latency_net-lat.pdf -------------------------------------------------------------------------------- /figures/service-latency_net-lat_std.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/figures/service-latency_net-lat_std.pdf -------------------------------------------------------------------------------- /figures/service-single_ac3.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/figures/service-single_ac3.pdf -------------------------------------------------------------------------------- /figures/service-single_avg5_std.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/figures/service-single_avg5_std.pdf -------------------------------------------------------------------------------- /graph.py: -------------------------------------------------------------------------------- 1 | """ 2 | Define the interface for algorithms to access relations 3 | """ 4 | from abc import ABC 5 | from typing import Dict 6 | from typing import List 7 | from typing import Set 8 | from typing import Union 9 | 10 | import networkx as nx 11 | import json 12 | 13 | ENCODING = "UTF-8" 14 | def dump_json(filename: str, data): 15 | """ 16 | Dump data into a json file 17 | """ 18 | with open(filename, "w", encoding=ENCODING) as obj: 19 | json.dump(data, obj, ensure_ascii=False, indent=2) 20 | 21 | 22 | def load_json(filename: str): 23 | """ 24 | Load data from a json file 25 | """ 26 | with open(filename, encoding=ENCODING) as obj: 27 | return json.load(obj) 28 | 29 | 30 | 31 | class Node: 32 | """ 33 | The element of a graph 34 | """ 35 | 36 | def __init__(self, entity: str, metric: str): 37 | self._entity = entity 38 | self._metric = metric 39 | 40 | @property 41 | def entity(self) -> str: 42 | """ 43 | Entity getter 44 | """ 45 | return self._entity 46 | 47 | @property 48 | def metric(self) -> str: 49 | """ 50 | Metric getter 51 | """ 52 | return self._metric 53 | 54 | def asdict(self) -> Dict[str, str]: 55 | """ 56 | Serialized as a dict 57 | """ 58 | return {"entity": self._entity, "metric": self._metric} 59 | 60 | def __eq__(self, obj: object) -> bool: 61 | if isinstance(obj, Node): 62 | return self.entity == obj.entity and self.metric == obj.metric 63 | return False 64 | 65 | def __hash__(self) -> int: 66 | return hash((self.entity, self.metric)) 67 | 68 | def __repr__(self) -> str: 69 | return f"Node{(self.entity, self.metric)}" 70 | 71 | 72 | class LoadingInvalidGraphException(Exception): 73 | """ 74 | This exception indicates that Graph tries to load from a broken file 75 | """ 76 | 77 | 78 | class Graph(ABC): 79 | """ 80 | The abstract interface to access relations 81 | """ 82 | 83 | def __init__(self): 84 | self._nodes: Set[Node] = set() 85 | self._sorted_nodes: List[Set[Node]] = None 86 | 87 | def dump(self, filename: str) -> bool: 88 | # pylint: disable=no-self-use, unused-argument 89 | """ 90 | Dump a graph into the given file 91 | 92 | Return whether the operation succeeds 93 | """ 94 | return False 95 | 96 | 
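# The on-disk format handled by MemoryGraph.dump/load (defined further below) is a
# plain JSON object such as
#   {"nodes": [{"entity": "carts", "metric": "cpu"}, ...], "edges": [[0, 1], ...]}
# where each edge is a [cause_index, effect_index] pair into the "nodes" list and the
# node dicts come from Node.asdict(); the entity/metric values shown are illustrative only.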
@classmethod 97 | def load(cls, filename: str) -> Union["Graph", None]: 98 | # pylint: disable=unused-argument 99 | """ 100 | Load a graph from the given file 101 | 102 | Returns: 103 | - A graph, if available 104 | - None, if dump/load is not supported 105 | - Raise LoadingInvalidGraphException if the file cannot be parsed 106 | """ 107 | return None 108 | 109 | @property 110 | def nodes(self) -> Set[Node]: 111 | """ 112 | Get the set of nodes in the graph 113 | """ 114 | return self._nodes 115 | 116 | @property 117 | def topological_sort(self) -> List[Set[Node]]: 118 | """ 119 | Sort nodes with parents first 120 | 121 | The graph specifies the parents of each node. 122 | """ 123 | if self._sorted_nodes: 124 | return self._sorted_nodes 125 | 126 | degrees = {node: len(self.parents(node)) for node in self.nodes} 127 | 128 | nodes: List[Set[Node]] = [] 129 | while degrees: 130 | minimum = min(degrees.values()) 131 | node_set = {node for node, degree in degrees.items() if degree == minimum} 132 | nodes.append(node_set) 133 | for node in node_set: 134 | degrees.pop(node) 135 | for child in self.children(node): 136 | if child in degrees: 137 | degrees[child] -= 1 138 | 139 | self._sorted_nodes = nodes 140 | return nodes 141 | 142 | def children(self, node: Node, **kwargs) -> Set[Node]: 143 | """ 144 | Get the children of the given node in the graph 145 | """ 146 | raise NotImplementedError 147 | 148 | def parents(self, node: Node, **kwargs) -> Set[Node]: 149 | """ 150 | Get the parents of the given node in the graph 151 | """ 152 | raise NotImplementedError 153 | 154 | 155 | class MemoryGraph(Graph): 156 | """ 157 | Implement Graph with data in memory 158 | """ 159 | 160 | def __init__(self, graph: nx.DiGraph): 161 | """ 162 | graph: The whole graph 163 | """ 164 | super().__init__() 165 | self._graph = graph 166 | self._nodes.update(self._graph.nodes) 167 | 168 | def dump(self, filename: str) -> bool: 169 | nodes: List[Node] = list(self._graph.nodes) 170 | node_indexes = {node: index for index, node in enumerate(nodes)} 171 | edges = [ 172 | (node_indexes[cause], node_indexes[effect]) 173 | for cause, effect in self._graph.edges 174 | ] 175 | data = dict(nodes=[node.asdict() for node in nodes], edges=edges) 176 | dump_json(filename=filename, data=data) 177 | 178 | @classmethod 179 | def load(cls, filename: str) -> Union["MemoryGraph", None]: 180 | data: dict = load_json(filename=filename) 181 | if "nodes" not in data or "edges" not in data: 182 | raise LoadingInvalidGraphException(filename) 183 | nodes: List[Node] = [Node(**node) for node in data["nodes"]] 184 | graph = nx.DiGraph() 185 | graph.add_nodes_from(nodes) 186 | graph.add_edges_from( 187 | (nodes[cause], nodes[effect]) for cause, effect in data["edges"] 188 | ) 189 | return MemoryGraph(graph) 190 | 191 | def children(self, node: Node, **kwargs) -> Set[Node]: 192 | if not self._graph.has_node(node): 193 | return set() 194 | return set(self._graph.successors(node)) 195 | 196 | def parents(self, node: Node, **kwargs) -> Set[Node]: 197 | if not self._graph.has_node(node): 198 | return set() 199 | return set(self._graph.predecessors(node)) 200 | -------------------------------------------------------------------------------- /memory-test.txt: -------------------------------------------------------------------------------- 1 | VM2 2 | 3 | sleep 3m; 4 | 5 | echo $(date +%Y-%m-%d" "%H:%M:%S); 6 | echo "front-end"; 7 | pumba stress --duration=5m --stressors="-m 1 --vm-bytes 500M" 
k8s_front-end_front-end-6fc8dff6ff-754cl_sock-shop_bdcf7267-b66f-4972-bf79-ec5a4be4fd1b_0; 8 | 9 | echo "************" 10 | 11 | sleep 3m; 12 | 13 | echo $(date +%Y-%m-%d" "%H:%M:%S); 14 | echo "user"; 15 | pumba stress --duration=5m --stressors="-m 1 --vm-bytes 100M" k8s_user_user-69c75f5cd5-bj2d5_sock-shop_9693d93a-cbde-4cfa-9c9f-afce7a6209af_0; 16 | 17 | echo "************" 18 | 19 | sleep 3m; 20 | 21 | echo $(date +%Y-%m-%d" "%H:%M:%S); 22 | echo "catalogue"; 23 | pumba stress --duration=5m --stressors="-m 1 --vm-bytes 100M" k8s_catalogue_catalogue-57684ccb8d-p2hl4_sock-shop_e71f2395-aaca-4c9d-bf5a-12687a3b2140_0; 24 | 25 | echo "************" 26 | 27 | sleep 3m; 28 | 29 | echo $(date +%Y-%m-%d" "%H:%M:%S); 30 | echo "orders"; 31 | pumba stress --duration=5m --stressors="-m 1 --vm-bytes 100M" k8s_orders_orders-98b4dc5bf-vqkhw_sock-shop_07e977f0-ed32-4d83-b524-9debcbfc7128_0; 32 | 33 | echo "************" 34 | 35 | ---------------------------------------------------------------------------- 36 | 37 | VM3 38 | 39 | sleep 3m; 40 | 41 | echo $(date +%Y-%m-%d" "%H:%M:%S); 42 | echo "carts"; 43 | pumba stress --duration=5m --stressors="-m 1 --vm-bytes 100M" k8s_carts_carts-6958597bcf-nqm4m_sock-shop_a615e9fb-4214-48b1-a187-8eec921675ed_0; 44 | 45 | echo "************" 46 | 47 | sleep 3m; 48 | 49 | echo $(date +%Y-%m-%d" "%H:%M:%S); 50 | echo "shipping"; 51 | pumba stress --duration=5m --stressors="-m 1 --vm-bytes 100M" k8s_shipping_shipping-6b88db4b4c-dtxvw_sock-shop_f55880e3-03dc-4125-8d38-d675c05f5264_0; 52 | 53 | echo "************" 54 | 55 | 56 | ---------------------------------------------------------------------------- 57 | VM1 58 | 59 | sleep 3m; 60 | 61 | echo $(date +%Y-%m-%d" "%H:%M:%S); 62 | echo "payment"; 63 | pumba stress --duration=5m --stressors="-m 1 --vm-bytes 100M" k8s_payment_payment-6c59577559-xnsdh_sock-shop_ce23595c-e4e7-4e82-b8e6-f348aa3dbf88_0; 64 | 65 | echo "************" 66 | 67 | -------------------------------------------------------------------------------- /metrics_causality.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/metrics_causality.png -------------------------------------------------------------------------------- /modules.py: -------------------------------------------------------------------------------- 1 | import torch 2 | import torch.nn as nn 3 | import torch.nn.functional as F 4 | import math 5 | 6 | from torch.autograd import Variable 7 | from utils import my_softmax, get_offdiag_indices, gumbel_softmax, preprocess_adj, preprocess_adj_new, preprocess_adj_new1, gauss_sample_z, my_normalize 8 | 9 | _EPS = 1e-10 10 | 11 | 12 | class MLPEncoder(nn.Module): 13 | """MLP encoder module.""" 14 | def __init__(self, n_in, n_xdims, n_hid, n_out, adj_A, batch_size, do_prob=0., factor=True, tol = 0.1): 15 | super(MLPEncoder, self).__init__() 16 | 17 | self.adj_A = nn.Parameter(Variable(torch.from_numpy(adj_A).double(), requires_grad=True)) 18 | self.factor = factor 19 | 20 | self.Wa = nn.Parameter(torch.zeros(n_out), requires_grad=True) 21 | self.fc1 = nn.Linear(n_xdims, n_hid, bias = True) 22 | self.fc2 = nn.Linear(n_hid, n_out, bias = True) 23 | self.dropout_prob = do_prob 24 | self.batch_size = batch_size 25 | self.z = nn.Parameter(torch.tensor(tol)) 26 | self.z_positive = nn.Parameter(torch.ones_like(torch.from_numpy(adj_A)).double()) 27 | self.init_weights() 28 | 29 | def init_weights(self): 30 | for m in 
self.modules(): 31 | if isinstance(m, nn.Linear): 32 | nn.init.xavier_normal_(m.weight.data) 33 | elif isinstance(m, nn.BatchNorm1d): 34 | m.weight.data.fill_(1) 35 | m.bias.data.zero_() 36 | 37 | 38 | def forward(self, inputs): 39 | if torch.sum(self.adj_A != self.adj_A): 40 | print('nan error \n') 41 | 42 | # to amplify the value of A and accelerate convergence. 43 | adj_A1 = torch.sinh(3.*self.adj_A) 44 | 45 | # adj_Aforz = I-A^T 46 | adj_Aforz = preprocess_adj_new(adj_A1) 47 | 48 | adj_A = torch.eye(adj_A1.size()[0]).double() 49 | H1 = F.relu((self.fc1(inputs))) 50 | x = (self.fc2(H1)) 51 | logits = torch.matmul(adj_Aforz, x+self.Wa) -self.Wa 52 | 53 | return x, logits, adj_A1, adj_A, self.z, self.z_positive, self.adj_A, self.Wa 54 | 55 | class SEMEncoder(nn.Module): 56 | """SEM encoder module.""" 57 | def __init__(self, n_in, n_hid, n_out, adj_A, batch_size, do_prob=0., factor=True, tol = 0.1): 58 | super(SEMEncoder, self).__init__() 59 | 60 | self.factor = factor 61 | self.adj_A = nn.Parameter(Variable(torch.from_numpy(adj_A).double(), requires_grad = True)) 62 | self.dropout_prob = do_prob 63 | self.batch_size = batch_size 64 | 65 | def init_weights(self): 66 | nn.init.xavier_normal(self.adj_A.data) 67 | 68 | def forward(self, inputs): 69 | 70 | if torch.sum(self.adj_A != self.adj_A): 71 | print('nan error \n') 72 | 73 | adj_A1 = torch.sinh(3.*self.adj_A) 74 | 75 | # adj_A = I-A^T, adj_A_inv = (I-A^T)^(-1) 76 | adj_A = preprocess_adj_new((adj_A1)) 77 | adj_A_inv = preprocess_adj_new1((adj_A1)) 78 | 79 | meanF = torch.matmul(adj_A_inv, torch.mean(torch.matmul(adj_A, inputs), 0)) 80 | logits = torch.matmul(adj_A, inputs-meanF) 81 | 82 | return inputs-meanF, logits, adj_A1, adj_A, self.z, self.z_positive, self.adj_A 83 | 84 | 85 | class MLPDecoder(nn.Module): 86 | """MLP decoder module.""" 87 | 88 | def __init__(self, n_in_node, n_in_z, n_out, encoder, data_variable_size, batch_size, n_hid, 89 | do_prob=0.): 90 | super(MLPDecoder, self).__init__() 91 | 92 | self.out_fc1 = nn.Linear(n_in_z, n_hid, bias = True) 93 | self.out_fc2 = nn.Linear(n_hid, n_out, bias = True) 94 | 95 | self.batch_size = batch_size 96 | self.data_variable_size = data_variable_size 97 | 98 | self.dropout_prob = do_prob 99 | 100 | self.init_weights() 101 | 102 | def init_weights(self): 103 | for m in self.modules(): 104 | if isinstance(m, nn.Linear): 105 | nn.init.xavier_normal_(m.weight.data) 106 | m.bias.data.fill_(0.0) 107 | elif isinstance(m, nn.BatchNorm1d): 108 | m.weight.data.fill_(1) 109 | m.bias.data.zero_() 110 | 111 | def forward(self, inputs, input_z, n_in_node, origin_A, adj_A_tilt, Wa): 112 | 113 | #adj_A_new1 = (I-A^T)^(-1) 114 | adj_A_new1 = preprocess_adj_new1(origin_A) 115 | mat_z = torch.matmul(adj_A_new1, input_z+Wa)-Wa 116 | 117 | H3 = F.relu(self.out_fc1((mat_z))) 118 | out = self.out_fc2(H3) 119 | 120 | return mat_z, out, adj_A_tilt 121 | 122 | class SEMDecoder(nn.Module): 123 | """SEM decoder module.""" 124 | 125 | def __init__(self, n_in_node, n_in_z, n_out, encoder, data_variable_size, batch_size, n_hid, 126 | do_prob=0.): 127 | super(SEMDecoder, self).__init__() 128 | 129 | self.batch_size = batch_size 130 | self.data_variable_size = data_variable_size 131 | 132 | print('Using learned interaction net decoder.') 133 | 134 | self.dropout_prob = do_prob 135 | 136 | def forward(self, inputs, input_z, n_in_node, origin_A, adj_A_tilt, Wa): 137 | 138 | # adj_A_new1 = (I-A^T)^(-1) 139 | adj_A_new1 = preprocess_adj_new1(origin_A) 140 | mat_z = torch.matmul(adj_A_new1, input_z + Wa) 141 | out = 
mat_z 142 | 143 | return mat_z, out-Wa, adj_A_tilt 144 | 145 | -------------------------------------------------------------------------------- /network-test.txt: -------------------------------------------------------------------------------- 1 | VM2 2 | 3 | sleep 3m; 4 | 5 | echo $(date +%Y-%m-%d" "%H:%M:%S); 6 | echo "front-end"; 7 | pumba netem --tc-image gaiadocker/iproute2 --duration 5m delay --time 100 k8s_front-end_front-end-6fc8dff6ff-754cl_sock-shop_bdcf7267-b66f-4972-bf79-ec5a4be4fd1b_0; 8 | 9 | echo "************" 10 | 11 | sleep 3m; 12 | 13 | echo $(date +%Y-%m-%d" "%H:%M:%S); 14 | echo "user"; 15 | pumba netem --tc-image gaiadocker/iproute2 --duration 5m delay --time 100 k8s_user_user-69c75f5cd5-bj2d5_sock-shop_9693d93a-cbde-4cfa-9c9f-afce7a6209af_0; 16 | 17 | echo "************" 18 | 19 | sleep 3m; 20 | 21 | echo $(date +%Y-%m-%d" "%H:%M:%S); 22 | echo "catalogue"; 23 | pumba netem --tc-image gaiadocker/iproute2 --duration 5m delay --time 100 k8s_catalogue_catalogue-57684ccb8d-p2hl4_sock-shop_e71f2395-aaca-4c9d-bf5a-12687a3b2140_0; 24 | 25 | echo "************" 26 | 27 | sleep 3m; 28 | 29 | echo $(date +%Y-%m-%d" "%H:%M:%S); 30 | echo "orders"; 31 | pumba netem --tc-image gaiadocker/iproute2 --duration 5m delay --time 100 k8s_orders_orders-98b4dc5bf-vqkhw_sock-shop_07e977f0-ed32-4d83-b524-9debcbfc7128_0; 32 | 33 | echo "************" 34 | 35 | ---------------------------------------------------------------------------- 36 | 37 | VM3 38 | 39 | sleep 3m; 40 | 41 | echo $(date +%Y-%m-%d" "%H:%M:%S); 42 | echo "carts"; 43 | pumba netem --tc-image gaiadocker/iproute2 --duration 5m delay --time 100 k8s_carts_carts-6958597bcf-nqm4m_sock-shop_a615e9fb-4214-48b1-a187-8eec921675ed_0; 44 | 45 | echo "************" 46 | 47 | sleep 3m; 48 | 49 | echo $(date +%Y-%m-%d" "%H:%M:%S); 50 | echo "shipping"; 51 | pumba netem --tc-image gaiadocker/iproute2 --duration 5m delay --time 100 k8s_shipping_shipping-6b88db4b4c-dtxvw_sock-shop_f55880e3-03dc-4125-8d38-d675c05f5264_0; 52 | 53 | echo "************" 54 | 55 | 56 | ---------------------------------------------------------------------------- 57 | VM1 58 | 59 | sleep 3m; 60 | 61 | echo $(date +%Y-%m-%d" "%H:%M:%S); 62 | echo "payment"; 63 | pumba netem --tc-image gaiadocker/iproute2 --duration 5m delay --time 100 k8s_payment_payment-6c59577559-xnsdh_sock-shop_ce23595c-e4e7-4e82-b8e6-f348aa3dbf88_0; 64 | 65 | echo "************" 66 | 67 | -------------------------------------------------------------------------------- /pa_result/all_service_gamma_eta.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/pa_result/all_service_gamma_eta.pdf -------------------------------------------------------------------------------- /pa_result/latency/.DS_Store: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/pa_result/latency/.DS_Store -------------------------------------------------------------------------------- /pa_result/latency/cpu_latency_eta1000.txt: -------------------------------------------------------------------------------- 1 | 2023-03-14 05:24:07 2 | 0 3 | 1 4 | front-end 5 | [(5, 0.3507134396203577), (4, 0.23324668996647654), (3, 0.22486731970180426), (0, 0.08149969603883109), (1, 0.039681209472752214), (6, 0.0383292039095177), (2, 0.03166244129026052)] 6 | 2 7 | front-end 8 | [(5, 
0.34782990238851885), (4, 0.24166202790227617), (3, 0.22591436078745558), (0, 0.08155374627587188), (1, 0.03918751903375715), (2, 0.031926221806060154), (6, 0.031926221806060154)] 9 | 3 10 | front-end 11 | 4 12 | front-end 13 | [(3, 0.2723004976861766), (4, 0.2364128341771112), (1, 0.20052517066804576), (6, 0.20052517066804576), (0, 0.030078775600206872), (2, 0.030078775600206872), (5, 0.030078775600206872)] 14 | 5 15 | front-end 16 | [(5, 0.26625328776628326), (4, 0.21756435241585026), (0, 0.21357295777250548), (1, 0.20722692708888618), (6, 0.03559203352393582), (3, 0.029895220716269443), (2, 0.029895220716269443)] 17 | 6 18 | front-end 19 | [(5, 0.33691647841047245), (4, 0.24683803640606714), (3, 0.2299390296961218), (0, 0.07916313800264098), (2, 0.03785081599237365), (1, 0.03743540700404941), (6, 0.03185709448827457)] 20 | 7 21 | front-end 22 | [(5, 0.34848809731827163), (4, 0.24102847510537845), (3, 0.2255903678876614), (0, 0.08172086657321334), (1, 0.03934394313070837), (2, 0.031914124992383455), (6, 0.031914124992383455)] 23 | 8 24 | front-end 25 | 9 26 | front-end 27 | [(3, 0.43143457222586873), (4, 0.24345906574235243), (5, 0.15637097852456755), (0, 0.07268426304085165), (1, 0.03201704015545318), (2, 0.03201704015545318), (6, 0.03201704015545318)] 28 | 10 29 | front-end 30 | [(5, 0.3989035883111588), (3, 0.3276620998323566), (0, 0.09405031786572149), (4, 0.059324420143633094), (1, 0.04397507539750856), (6, 0.04181387676960026), (2, 0.034270621680021124)] 31 | -----one service finish----- 32 | 1 33 | 1 34 | user 35 | [(3, 0.2409563926887428), (1, 0.23187416709183728), (5, 0.22795699770007527), (0, 0.19791263176781876), (4, 0.03817459619501966), (6, 0.0334383197913334), (2, 0.02968689476517282)] 36 | 2 37 | user 38 | [(4, 0.4173723363310744), (6, 0.1730099289056279), (1, 0.13053746264198826), (2, 0.10034575600602315), (0, 0.09233383310168165), (5, 0.05853886715386703), (3, 0.027861815859737554)] 39 | 3 40 | user 41 | [(4, 0.3426850102931869), (6, 0.24686260410871658), (0, 0.2081820565483413), (5, 0.10858840360300176), (1, 0.0312273084822512), (2, 0.0312273084822512), (3, 0.0312273084822512)] 42 | 4 43 | user 44 | [(5, 0.37739020205895485), (2, 0.3027772389211271), (4, 0.09069890232966324), (6, 0.0786130169068281), (3, 0.06459012007189735), (1, 0.05369226504947772), (0, 0.03223825466205145)] 45 | 5 46 | user 47 | [(3, 0.23417013977189577), (1, 0.22774877706248658), (5, 0.2261116517582385), (0, 0.19413208945897567), (4, 0.051677886213051145), (6, 0.037039642316505805), (2, 0.029119813418846358)] 48 | 6 49 | user 50 | [(5, 0.3386799489951891), (3, 0.24818596717089655), (0, 0.20725434267268517), (1, 0.08264521696315624), (4, 0.05395829119064605), (6, 0.038188081606524225), (2, 0.03108815140090278)] 51 | 7 52 | user 53 | [(5, 0.29888351435207244), (2, 0.19353271765478536), (4, 0.19353271765478536), (6, 0.19353271765478536), (1, 0.062458517387135884), (0, 0.02902990764821781), (3, 0.02902990764821781)] 54 | 8 55 | user 56 | [(5, 0.3537894800276551), (4, 0.2581869419410722), (2, 0.21639439348366762), (3, 0.07425170747995471), (0, 0.03245915902255015), (1, 0.03245915902255015), (6, 0.03245915902255015)] 57 | 9 58 | user 59 | [(6, 0.4701352036709886), (5, 0.14560808444520115), (4, 0.13331724810137724), (2, 0.09671297404629751), (3, 0.07738876587497756), (1, 0.047086877448357115), (0, 0.02975084641280082)] 60 | 10 61 | user 62 | [(4, 0.2278163800195075), (0, 0.17334108488904118), (1, 0.17334108488904118), (3, 0.17334108488904118), (5, 0.17334108488904118), (6, 0.0528181176909715), (2, 
0.026001162733356183)] 63 | -----one service finish----- 64 | 2 65 | 1 66 | catalogue 67 | [(1, 0.27049076613635886), (2, 0.21351494104266427), (6, 0.21308703136190274), (4, 0.19679825543394866), (0, 0.04706952939494067), (3, 0.029519738315092307), (5, 0.029519738315092307)] 68 | 2 69 | catalogue 70 | [(2, 0.3938815716957046), (6, 0.21277922539016272), (4, 0.16293542726311433), (1, 0.11444938548944955), (0, 0.05695722051998983), (3, 0.029498584820789542), (5, 0.029498584820789542)] 71 | 3 72 | catalogue 73 | [(1, 0.2693332188098872), (2, 0.21509752083475675), (6, 0.21225734232519783), (4, 0.19667717442877566), (0, 0.04763159127274983), (3, 0.029501576164316357), (5, 0.029501576164316357)] 74 | 4 75 | catalogue 76 | 5 77 | catalogue 78 | [(1, 0.240721375128446), (0, 0.2353538381120744), (4, 0.21686616915778328), (6, 0.19556510668905802), (2, 0.04810319057748047), (3, 0.03405555433179916), (5, 0.02933476600335871)] 79 | 6 80 | catalogue 81 | [(1, 0.2682735186961407), (2, 0.21388562143071502), (6, 0.20973224915490196), (4, 0.20289191172061166), (0, 0.04611546977916863), (3, 0.029550614609231085), (5, 0.029550614609231085)] 82 | 7 83 | catalogue 84 | 8 85 | catalogue 86 | [(1, 0.27425509168250367), (2, 0.21501434824654525), (6, 0.2079892612792513), (4, 0.1968479699866148), (0, 0.04683893780910073), (3, 0.02952719549799223), (5, 0.02952719549799223)] 87 | 9 88 | catalogue 89 | [(1, 0.26929097693439696), (2, 0.21493079405202647), (6, 0.21260590865028361), (4, 0.19671892313558276), (0, 0.04743772028703538), (3, 0.02950783847033742), (5, 0.02950783847033742)] 90 | 10 91 | catalogue 92 | -----one service finish----- 93 | 3 94 | 1 95 | orders 96 | [(2, 0.22144161032801016), (4, 0.1911335339578159), (0, 0.1780075320345981), (3, 0.1780075320345981), (5, 0.1780075320345981), (1, 0.026701129805189718), (6, 0.026701129805189718)] 97 | 2 98 | orders 99 | [(2, 0.2560889942902941), (5, 0.21886979624869113), (1, 0.21376998820596285), (3, 0.20596578135657795), (0, 0.040271297279536586), (4, 0.03548663107488276), (6, 0.029547511544054675)] 100 | 3 101 | orders 102 | [(2, 0.4052799782688064), (5, 0.22691916741763685), (1, 0.1550357958827596), (0, 0.0926636017767977), (3, 0.05172984166411437), (4, 0.037994291846174685), (6, 0.030377323143710527)] 103 | 4 104 | orders 105 | 5 106 | orders 107 | [(2, 0.4168930381284276), (5, 0.23447380056010156), (1, 0.15850587452620632), (0, 0.08730885460829298), (3, 0.04062527887055987), (4, 0.031096576653205865), (6, 0.031096576653205865)] 108 | 6 109 | orders 110 | [(2, 0.26052201116272633), (5, 0.21603786368591094), (3, 0.21131494728748185), (0, 0.20240295136974362), (1, 0.04443652798233008), (4, 0.03589034004686934), (6, 0.029395358464937942)] 111 | 7 112 | orders 113 | [(1, 0.26545490461315435), (4, 0.22696132739370342), (5, 0.19353271765478536), (6, 0.19353271765478536), (0, 0.06245851738713588), (2, 0.02902990764821781), (3, 0.02902990764821781)] 114 | 8 115 | orders 116 | [(2, 0.33495239965728363), (5, 0.23769109750961626), (1, 0.2181677464206133), (3, 0.09030564059447671), (0, 0.04831992377801636), (4, 0.039601268744265755), (6, 0.030961923295728142)] 117 | 9 118 | orders 119 | [(2, 0.25457821418137094), (5, 0.21939684237221677), (1, 0.21567844451543183), (3, 0.20601950457899143), (0, 0.03970317486529403), (4, 0.03504238196173015), (6, 0.029581437524964872)] 120 | 10 121 | orders 122 | [(2, 0.22109166384408305), (4, 0.19148348044174304), (0, 0.1780075320345981), (3, 0.1780075320345981), (5, 0.1780075320345981), (1, 0.026701129805189718), (6, 0.026701129805189718)] 123 | 
-----one service finish----- 124 | 4 125 | 1 126 | carts 127 | [(6, 0.38118163985636305), (2, 0.2335932064798735), (5, 0.15468554291133552), (4, 0.10612056173231707), (3, 0.06486950858751298), (0, 0.029774770216298922), (1, 0.029774770216298922)] 128 | 2 129 | carts 130 | [(0, 0.14285714285714288), (1, 0.14285714285714288), (2, 0.14285714285714288), (3, 0.14285714285714288), (4, 0.14285714285714288), (5, 0.14285714285714288), (6, 0.14285714285714288)] 131 | 3 132 | carts 133 | [(6, 0.38118163985636305), (2, 0.2335932064798735), (5, 0.15468554291133552), (4, 0.10612056173231707), (3, 0.06486950858751298), (0, 0.029774770216298922), (1, 0.029774770216298922)] 134 | 4 135 | carts 136 | [(2, 0.4105558842744868), (5, 0.24827774383227108), (4, 0.1540926557156568), (3, 0.09293620302811094), (0, 0.03137917104982479), (1, 0.03137917104982479), (6, 0.03137917104982479)] 137 | 5 138 | carts 139 | [(6, 0.38118163985636305), (2, 0.2335932064798735), (5, 0.15468554291133554), (3, 0.10612056173231707), (0, 0.06486950858751298), (1, 0.029774770216298922), (4, 0.029774770216298922)] 140 | 6 141 | carts 142 | [(6, 0.38118163985636305), (2, 0.2335932064798735), (5, 0.15468554291133552), (4, 0.10612056173231707), (3, 0.06486950858751298), (0, 0.029774770216298922), (1, 0.029774770216298922)] 143 | 7 144 | carts 145 | [(6, 0.28944109459420325), (0, 0.1877257776625807), (1, 0.1877257776625807), (2, 0.1877257776625807), (5, 0.09106383911928063), (3, 0.02815886664938711), (4, 0.02815886664938711)] 146 | 8 147 | carts 148 | [(0, 0.14285714285714288), (1, 0.14285714285714288), (2, 0.14285714285714288), (3, 0.14285714285714288), (4, 0.14285714285714288), (5, 0.14285714285714288), (6, 0.14285714285714288)] 149 | 9 150 | carts 151 | [(2, 0.37562566070730374), (1, 0.18884773866007648), (6, 0.15269531993891064), (5, 0.11742283293546682), (3, 0.08423791978416603), (0, 0.05457071853702918), (4, 0.026599809437047046)] 152 | 10 153 | carts 154 | [(6, 0.36673806474597065), (1, 0.19921437905010433), (2, 0.16018996142794664), (5, 0.11787643670073882), (4, 0.08061819055265669), (3, 0.04728105349249183), (0, 0.02808191403009103)] 155 | -----one service finish----- 156 | 5 157 | 1 158 | payment 159 | [(2, 0.2870560289952357), (6, 0.2649610701610829), (4, 0.25796631584623564), (0, 0.07835188619998341), (5, 0.042872183219422454), (3, 0.037084102870752055), (1, 0.031708412707287935)] 160 | 2 161 | payment 162 | [(4, 0.28683689363024073), (2, 0.2716254489616403), (6, 0.2602304459440479), (0, 0.07282997218338727), (5, 0.044357899883625214), (1, 0.03205966969852928), (3, 0.03205966969852928)] 163 | 3 164 | payment 165 | [(2, 0.31589483726090617), (4, 0.24441743046071485), (6, 0.2326031609461311), (0, 0.07216097863781397), (3, 0.06134007037249665), (5, 0.04254136017054697), (1, 0.03104216215139027)] 166 | 4 167 | payment 168 | [(6, 0.31058158313415357), (4, 0.2551112830123772), (2, 0.24112257902983011), (0, 0.07984700171836802), (5, 0.04322901526096787), (3, 0.038525935444631246), (1, 0.03158260239967191)] 169 | 5 170 | payment 171 | [(6, 0.3682587152205526), (2, 0.2322704481607923), (3, 0.2120942094701663), (4, 0.09193423288691409), (0, 0.03181413142052496), (1, 0.03181413142052496), (5, 0.03181413142052496)] 172 | 6 173 | payment 174 | 7 175 | payment 176 | [(2, 0.2900781053958258), (6, 0.26291743029887793), (4, 0.26151905098101275), (0, 0.07781534884181657), (5, 0.043889594393427434), (1, 0.03189023504451982), (3, 0.03189023504451982)] 177 | 8 178 | payment 179 | 9 180 | payment 181 | [(2, 0.31720358237902135), (4, 
0.2538415174234074), (6, 0.24815531987516512), (0, 0.06391578985902657), (3, 0.043074414489109376), (1, 0.041729014277349395), (5, 0.032080361696920766)] 182 | 10 183 | payment 184 | [(6, 0.2062875721429131), (4, 0.1892971277660644), (2, 0.18656097582284076), (5, 0.18644453262320415), (3, 0.1780075320345981), (0, 0.02670112980518972), (1, 0.02670112980518972)] 185 | -----one service finish----- 186 | 6 187 | 1 188 | shipping 189 | [(4, 0.18148104255096673), (0, 0.15893571989301616), (1, 0.15893571989301616), (2, 0.15893571989301616), (3, 0.15893571989301616), (6, 0.15893571989301616), (5, 0.023840357983952433)] 190 | 2 191 | shipping 192 | [(3, 0.23148372449844254), (0, 0.22755437566264056), (1, 0.2114300757787961), (2, 0.21095441790361044), (5, 0.045247344975160954), (6, 0.044235120914080994), (4, 0.029094940267268526)] 193 | 3 194 | shipping 195 | [(0, 0.31333610619629876), (2, 0.25829749449876077), (1, 0.2570226258568031), (4, 0.059820062975919824), (3, 0.0465815208064134), (5, 0.032471094832902156), (6, 0.032471094832902156)] 196 | 4 197 | shipping 198 | [(3, 0.22737087630131456), (2, 0.22363136937604994), (1, 0.22065457845670552), (0, 0.21680898201398086), (5, 0.041403104858797245), (4, 0.04079772846716047), (6, 0.0293333605259913)] 199 | 5 200 | shipping 201 | [(0, 0.20628757214291316), (4, 0.20628757214291316), (1, 0.17800753203459813), (2, 0.17800753203459813), (6, 0.17800753203459813), (3, 0.026701129805189728), (5, 0.026701129805189728)] 202 | 6 203 | shipping 204 | [(0, 0.24571537812323563), (3, 0.22465451525690897), (2, 0.21352492550741387), (1, 0.19294597271906955), (5, 0.04825247202035107), (4, 0.045964840465160514), (6, 0.02894189590786044)] 205 | 7 206 | shipping 207 | [(0, 0.20628757214291316), (4, 0.20628757214291316), (1, 0.17800753203459813), (2, 0.17800753203459813), (6, 0.17800753203459813), (3, 0.026701129805189728), (5, 0.026701129805189728)] 208 | 8 209 | shipping 210 | [(3, 0.2062875721429131), (4, 0.2062875721429131), (1, 0.17800753203459813), (2, 0.17800753203459813), (6, 0.17800753203459813), (0, 0.026701129805189728), (5, 0.026701129805189728)] 211 | 9 212 | shipping 213 | [(3, 0.22678026721800254), (0, 0.22562382490758787), (1, 0.22307257072095657), (2, 0.21257865939684173), (5, 0.04142175903541057), (4, 0.04120356383960564), (6, 0.02931935488159511)] 214 | 10 215 | shipping 216 | [(3, 0.2302635534630637), (2, 0.22209568768149024), (0, 0.2215914549611425), (1, 0.21625068676373924), (5, 0.04129292315913406), (4, 0.039112969129922626), (6, 0.029392724841507657)] 217 | -----one service finish----- 218 | 2023-03-14 06:13:09 219 | -------------------------------------------------------------------------------- /pa_result/latency/memory_latency_eta1000.txt: -------------------------------------------------------------------------------- 1 | 2023-03-14 11:07:58 2 | 0 3 | 1 4 | front-end 5 | [(1, 0.42193962308692595), (5, 0.20022147872265242), (3, 0.15903887133263428), (0, 0.1085389964321062), (4, 0.05019458680888534), (2, 0.030033221808397872), (6, 0.030033221808397872)] 6 | 2 7 | front-end 8 | [(1, 0.252812301531703), (0, 0.2318838046769136), (4, 0.2245423963227171), (5, 0.20052517066804582), (2, 0.03007877560020688), (3, 0.03007877560020688), (6, 0.03007877560020688)] 9 | 3 10 | front-end 11 | 4 12 | front-end 13 | [(1, 0.3572418610876725), (0, 0.22896263717910667), (5, 0.20702812459898542), (3, 0.09375708057872388), (4, 0.05090185917581607), (2, 0.03105421868984782), (6, 0.03105421868984782)] 14 | 5 15 | front-end 16 | [(3, 0.31773970648015853), (5, 
0.24163150761935412), (1, 0.2337161090195107), (0, 0.08700913159545126), (4, 0.0578064008942691), (2, 0.031048572195628125), (6, 0.031048572195628125)] 17 | 6 18 | front-end 19 | [(3, 0.25331833671295073), (1, 0.22068253995383927), (0, 0.21415842603686702), (5, 0.20426133883507594), (4, 0.0486413712317912), (2, 0.029468993614737937), (6, 0.029468993614737937)] 20 | 7 21 | front-end 22 | [(3, 0.3285129987739799), (4, 0.25148926148883133), (1, 0.21721303777136253), (0, 0.09374684899029959), (5, 0.04662336006603064), (2, 0.031207246454748046), (6, 0.031207246454748046)] 23 | 8 24 | front-end 25 | [(1, 0.26835203370829785), (4, 0.22182779263351662), (0, 0.2190586761895191), (5, 0.2005251706680458), (2, 0.030078775600206872), (3, 0.030078775600206872), (6, 0.030078775600206872)] 26 | 9 27 | front-end 28 | [(1, 0.42176978098596046), (5, 0.2001687357196758), (3, 0.15923143531598177), (0, 0.10873485050344832), (4, 0.05004457675903099), (2, 0.030025310357951376), (6, 0.030025310357951376)] 29 | 10 30 | front-end 31 | [(1, 0.3546683676760959), (0, 0.23109240926140334), (5, 0.2068764230408309), (3, 0.09407994213698248), (4, 0.05121993097243811), (2, 0.03103146345612464), (6, 0.03103146345612464)] 32 | -----one service finish----- 33 | 1 34 | 1 35 | user 36 | [(4, 0.18148104255096673), (0, 0.1589357198930162), (1, 0.1589357198930162), (2, 0.1589357198930162), (3, 0.1589357198930162), (5, 0.1589357198930162), (6, 0.023840357983952433)] 37 | 2 38 | user 39 | [(0, 0.3756256607073037), (1, 0.1888477386600765), (2, 0.1526953199389106), (5, 0.1174228329354668), (3, 0.08423791978416602), (4, 0.05457071853702917), (6, 0.02659980943704704)] 40 | 3 41 | user 42 | [(3, 0.22781638001950755), (0, 0.1733410848890412), (1, 0.1733410848890412), (2, 0.1733410848890412), (5, 0.1733410848890412), (4, 0.05281811769097149), (6, 0.026001162733356186)] 43 | 4 44 | user 45 | 5 46 | user 47 | [(3, 0.22781638001950755), (0, 0.1733410848890412), (1, 0.1733410848890412), (2, 0.1733410848890412), (5, 0.1733410848890412), (4, 0.05281811769097149), (6, 0.026001162733356186)] 48 | 6 49 | user 50 | [(3, 0.26545490461315435), (0, 0.22696132739370342), (2, 0.19353271765478536), (5, 0.19353271765478536), (4, 0.062458517387135884), (1, 0.02902990764821781), (6, 0.02902990764821781)] 51 | 7 52 | user 53 | [(3, 0.2988835143520724), (1, 0.19353271765478536), (2, 0.19353271765478536), (5, 0.19353271765478536), (4, 0.062458517387135884), (0, 0.02902990764821781), (6, 0.02902990764821781)] 54 | 8 55 | user 56 | [(3, 0.38136555785356213), (2, 0.20818205654834127), (5, 0.20818205654834127), (1, 0.06990785604262646), (4, 0.06990785604262646), (0, 0.031227308482251196), (6, 0.031227308482251196)] 57 | 9 58 | user 59 | [(3, 0.2062875721429131), (4, 0.2062875721429131), (0, 0.17800753203459815), (2, 0.17800753203459815), (5, 0.17800753203459815), (1, 0.026701129805189728), (6, 0.026701129805189728)] 60 | 10 61 | user 62 | [(6, 0.38118163985636305), (5, 0.2335932064798735), (4, 0.15468554291133552), (3, 0.10612056173231707), (1, 0.06486950858751298), (0, 0.029774770216298922), (2, 0.029774770216298922)] 63 | -----one service finish----- 64 | 2 65 | 1 66 | catalogue 67 | [(2, 0.23950674534469846), (0, 0.23178473615634212), (3, 0.2146574681606026), (1, 0.19353271765478539), (4, 0.062458517387135884), (5, 0.029029907648217813), (6, 0.029029907648217813)] 68 | 2 69 | catalogue 70 | 3 71 | catalogue 72 | [(2, 0.20628757214291313), (3, 0.20628757214291313), (1, 0.17800753203459813), (5, 0.17800753203459813), (6, 0.17800753203459813), (0, 
0.026701129805189724), (4, 0.026701129805189724)] 73 | 4 74 | catalogue 75 | [(2, 0.18148104255096673), (1, 0.15893571989301616), (3, 0.15893571989301616), (4, 0.15893571989301616), (5, 0.15893571989301616), (6, 0.15893571989301616), (0, 0.023840357983952433)] 76 | 5 77 | catalogue 78 | [(3, 0.18148104255096673), (0, 0.1589357198930162), (1, 0.1589357198930162), (2, 0.1589357198930162), (5, 0.1589357198930162), (6, 0.1589357198930162), (4, 0.023840357983952433)] 79 | 6 80 | catalogue 81 | [(4, 0.41627637822757724), (5, 0.19849846810865945), (3, 0.15468554291133552), (1, 0.10612056173231707), (0, 0.06486950858751298), (2, 0.029774770216298926), (6, 0.029774770216298926)] 82 | 7 83 | catalogue 84 | [(0, 0.2062875721429131), (4, 0.2062875721429131), (1, 0.17800753203459815), (3, 0.17800753203459815), (5, 0.17800753203459815), (2, 0.026701129805189728), (6, 0.026701129805189728)] 85 | 8 86 | catalogue 87 | 9 88 | catalogue 89 | [(3, 0.20480755393112582), (2, 0.19634991097742294), (0, 0.1733410848890412), (1, 0.1733410848890412), (5, 0.1733410848890412), (4, 0.0528181176909715), (6, 0.02600116273335619)] 90 | 10 91 | catalogue 92 | [(3, 0.26545490461315435), (2, 0.2269613273937034), (1, 0.1935327176547853), (5, 0.1935327176547853), (4, 0.06245851738713588), (0, 0.029029907648217806), (6, 0.029029907648217806)] 93 | -----one service finish----- 94 | 3 95 | 1 96 | orders 97 | [(2, 0.24853934885703907), (3, 0.22665525951604207), (0, 0.2217056074857998), (1, 0.21286362976513862), (4, 0.03007871812532687), (5, 0.03007871812532687), (6, 0.03007871812532687)] 98 | 2 99 | orders 100 | 3 101 | orders 102 | [(2, 0.25421072377251724), (3, 0.22315639346783697), (0, 0.213830691843468), (1, 0.2062557898978725), (6, 0.04325960550410523), (4, 0.02964339775709997), (5, 0.02964339775709997)] 103 | 4 104 | orders 105 | [(2, 0.26578279896830886), (3, 0.21902425292477315), (0, 0.21758610348855842), (1, 0.20737051781773888), (4, 0.03007877560020687), (5, 0.03007877560020687), (6, 0.03007877560020687)] 106 | 5 107 | orders 108 | [(2, 0.2606902910276311), (3, 0.2320981174286571), (1, 0.20205048429732064), (0, 0.19612612021097328), (4, 0.041691119624752984), (5, 0.037924949379018885), (6, 0.029418918031645997)] 109 | 6 110 | orders 111 | [(2, 0.2574434058915298), (3, 0.22561137623372238), (0, 0.21845838280837557), (1, 0.20825050826575153), (4, 0.030078775600206872), (5, 0.030078775600206872), (6, 0.030078775600206872)] 112 | 7 113 | orders 114 | [(2, 0.21821492942824733), (3, 0.19436021485757893), (0, 0.17800753203459815), (1, 0.17800753203459815), (5, 0.17800753203459815), (4, 0.026701129805189728), (6, 0.026701129805189728)] 115 | 8 116 | orders 117 | [(2, 0.24902428530582515), (3, 0.22684951957467167), (0, 0.2211265496916705), (1, 0.21276349105185227), (4, 0.030078718125326875), (5, 0.030078718125326875), (6, 0.030078718125326875)] 118 | 9 119 | orders 120 | [(2, 0.2515660758979455), (3, 0.22759662161628194), (0, 0.21714820001038995), (1, 0.21345294809940213), (4, 0.03007871812532687), (5, 0.03007871812532687), (6, 0.03007871812532687)] 121 | 10 122 | orders 123 | [(2, 0.18148104255096673), (0, 0.1589357198930162), (1, 0.1589357198930162), (3, 0.1589357198930162), (5, 0.1589357198930162), (6, 0.1589357198930162), (4, 0.023840357983952433)] 124 | -----one service finish----- 125 | 4 126 | 1 127 | carts 128 | [(6, 0.4309903417679248), (5, 0.2029179516387443), (4, 0.15391374984403305), (2, 0.09536500430980882), (0, 0.06008373909600417), (1, 0.02836460667174249), (3, 0.02836460667174249)] 129 | 2 130 | carts 131 | [(6, 
0.43087830582540154), (5, 0.20298288156106897), (4, 0.15399886214398692), (2, 0.09534303194333466), (0, 0.06007382532067891), (1, 0.028361546602764534), (3, 0.028361546602764534)] 132 | 3 133 | carts 134 | [(5, 0.4323352675159298), (6, 0.21480227748707156), (4, 0.16559870125110507), (2, 0.06968546475664793), (0, 0.06059960902500663), (1, 0.028489339982119482), (3, 0.028489339982119482)] 135 | 4 136 | carts 137 | [(6, 0.48291805094972423), (4, 0.20868027273917547), (2, 0.1320746916150176), (0, 0.07686694205658348), (3, 0.03911314067525761), (1, 0.030173450982120795), (5, 0.030173450982120795)] 138 | 5 139 | carts 140 | [(2, 0.3006781965822102), (6, 0.2969137211206225), (5, 0.21102823667402532), (0, 0.07139982704510246), (4, 0.056671547575831885), (1, 0.03165423550110381), (3, 0.03165423550110381)] 141 | 6 142 | carts 143 | [(6, 0.4307805784615028), (5, 0.20303951154391453), (4, 0.154073103394459), (2, 0.09532387357878434), (0, 0.0600651787379145), (1, 0.028358877141712393), (3, 0.028358877141712393)] 144 | 7 145 | carts 146 | [(4, 0.2199245259649718), (6, 0.19265061832085445), (0, 0.17800753203459813), (3, 0.17800753203459813), (5, 0.17800753203459813), (1, 0.026701129805189728), (2, 0.026701129805189728)] 147 | 8 148 | carts 149 | [(6, 0.42381941741520035), (5, 0.20196686317496168), (4, 0.15383514134580284), (2, 0.09951398582731398), (0, 0.06463534125602793), (1, 0.028114625490346573), (3, 0.028114625490346573)] 150 | 9 151 | carts 152 | [(5, 0.47101194781677613), (4, 0.17256107226189135), (6, 0.1536117023156344), (2, 0.07852446704549186), (0, 0.06480466944801669), (1, 0.02974307055609475), (3, 0.02974307055609475)] 153 | 10 154 | carts 155 | [(6, 0.4073803094493798), (5, 0.19846820949515814), (4, 0.15398874553215153), (2, 0.10627026590869047), (0, 0.07234109983236706), (3, 0.03399079931302463), (1, 0.027560570469228312)] 156 | -----one service finish----- 157 | 5 158 | 1 159 | payment 160 | [(5, 0.3492810902005588), (0, 0.2249767983070573), (2, 0.21852948288171703), (3, 0.09003597204394931), (6, 0.05510246409360829), (1, 0.031037096236554637), (4, 0.031037096236554637)] 161 | 2 162 | payment 163 | [(2, 0.4425301564652881), (0, 0.21163312437031911), (5, 0.15324192630903963), (3, 0.08036437957433454), (6, 0.04181472281780032), (1, 0.041597699036229965), (4, 0.028817991426988387)] 164 | 3 165 | payment 166 | [(2, 0.452597027253839), (0, 0.21522902040046285), (5, 0.14759163630585848), (3, 0.07673302480027355), (6, 0.049525907882906714), (1, 0.02916169167832974), (4, 0.02916169167832974)] 167 | 4 168 | payment 169 | [(6, 0.3980758715272962), (3, 0.368581445001603), (0, 0.08835892557302019), (1, 0.03624593947452015), (2, 0.03624593947452015), (4, 0.03624593947452015), (5, 0.03624593947452015)] 170 | 5 171 | payment 172 | [(3, 0.23995507846601039), (2, 0.22696132739370345), (0, 0.21903254380192938), (5, 0.19353271765478536), (6, 0.06245851738713589), (1, 0.029029907648217813), (4, 0.029029907648217813)] 173 | 6 174 | payment 175 | [(2, 0.20628757214291304), (6, 0.20628757214291304), (0, 0.17800753203459813), (3, 0.17800753203459813), (5, 0.17800753203459813), (1, 0.02670112980518972), (4, 0.02670112980518972)] 176 | 7 177 | payment 178 | [(6, 0.28699478796632116), (5, 0.2854143837904498), (2, 0.2804224693517475), (3, 0.04662435040849042), (0, 0.03351466949433038), (1, 0.03351466949433038), (4, 0.03351466949433038)] 179 | 8 180 | payment 181 | [(2, 0.45748477222398937), (0, 0.21016841161410788), (5, 0.14425922794708065), (3, 0.07061917264052753), (6, 0.05886724317248391), (1, 
0.029300586200905372), (4, 0.029300586200905372)] 182 | 9 183 | payment 184 | [(0, 0.3796629590260059), (3, 0.35704570063385516), (2, 0.07545075720761389), (5, 0.062195338413456126), (4, 0.046655011165626724), (6, 0.04424053616325328), (1, 0.034749697390188995)] 185 | 10 186 | payment 187 | [(4, 0.18148104255096673), (0, 0.1589357198930162), (1, 0.1589357198930162), (2, 0.1589357198930162), (3, 0.1589357198930162), (5, 0.1589357198930162), (6, 0.023840357983952433)] 188 | -----one service finish----- 189 | 6 190 | 1 191 | shipping 192 | [(2, 0.3942953816109072), (5, 0.30279198269084323), (6, 0.10756665999294607), (4, 0.06360057919728876), (0, 0.057083703368632464), (3, 0.0417132095727332), (1, 0.032948483566648964)] 193 | 2 194 | shipping 195 | [(2, 0.3895566369906633), (5, 0.3135243752035392), (6, 0.10190453041032377), (4, 0.06993070149010495), (0, 0.05214590848737556), (3, 0.03972838095306868), (1, 0.03320946646492454)] 196 | 3 197 | shipping 198 | [(2, 0.26221782898761054), (4, 0.22757666421215292), (0, 0.2149383713838119), (5, 0.19912115788784496), (6, 0.03640963016222615), (1, 0.02986817368317675), (3, 0.02986817368317675)] 199 | 4 200 | shipping 201 | 5 202 | shipping 203 | [(2, 0.3946734431557999), (5, 0.27882377162035293), (3, 0.1336507347385084), (0, 0.07243453846392861), (4, 0.05183632842793979), (1, 0.036618676170115494), (6, 0.031962507423354734)] 204 | 6 205 | shipping 206 | [(2, 0.3965131534101357), (5, 0.30089608992651296), (3, 0.11192132430265385), (4, 0.06205344474395692), (0, 0.055778922444708084), (6, 0.039874518962751446), (1, 0.032962546209281164)] 207 | 7 208 | shipping 209 | [(4, 0.17025592806394388), (0, 0.17016083438003904), (2, 0.1589357198930162), (3, 0.1589357198930162), (5, 0.1589357198930162), (6, 0.1589357198930162), (1, 0.023840357983952433)] 210 | 8 211 | shipping 212 | [(2, 0.2145671214452592), (0, 0.18705657440040455), (4, 0.1822377238199041), (6, 0.18180229557436017), (5, 0.17733061547883092), (3, 0.030406076959416502), (1, 0.026599592321824645)] 213 | 9 214 | shipping 215 | [(4, 0.20015597401532614), (2, 0.19762761134312706), (0, 0.1905800667159856), (6, 0.17733984570227593), (5, 0.17733984570227593), (3, 0.030355679665667976), (1, 0.026600976855341397)] 216 | 10 217 | shipping 218 | [(2, 0.3286220482792417), (5, 0.25458915769808643), (6, 0.20600914710983065), (3, 0.094061898241267), (0, 0.04618420102179618), (4, 0.03963217558330347), (1, 0.030901372066474603)] 219 | -----one service finish----- 220 | 2023-03-14 11:56:50 221 | -------------------------------------------------------------------------------- /pa_result/latency/network_latency_eta1000.txt: -------------------------------------------------------------------------------- 1 | 2023-03-14 16:43:24 2 | 0 3 | 1 4 | front-end 5 | [(3, 0.329409340773434), (1, 0.2360161812467078), (6, 0.2002309817964963), (5, 0.10845435492472719), (0, 0.06581984671968596), (2, 0.030034647269474447), (4, 0.030034647269474447)] 6 | 2 7 | front-end 8 | [(3, 0.20628757214291313), (5, 0.20628757214291313), (1, 0.17800753203459813), (2, 0.17800753203459813), (6, 0.17800753203459813), (0, 0.026701129805189724), (4, 0.026701129805189724)] 9 | 3 10 | front-end 11 | [(5, 0.43277999993373373), (3, 0.2617035579554863), (0, 0.13138175731088644), (2, 0.07561970249985225), (1, 0.032838327433347185), (4, 0.032838327433347185), (6, 0.032838327433347185)] 12 | 4 13 | front-end 14 | [(2, 0.2364128341771112), (3, 0.2364128341771112), (5, 0.2364128341771112), (6, 0.2005251706680458), (0, 0.030078775600206872), (1, 0.030078775600206872), 
(4, 0.030078775600206872)] 15 | 5 16 | front-end 17 | [(5, 0.32815513927264506), (6, 0.26103612113516406), (3, 0.20805664971453233), (0, 0.08516330040084792), (2, 0.043584071449100405), (4, 0.042796220570530376), (1, 0.031208497457179855)] 18 | 6 19 | front-end 20 | [(5, 0.2654549046131544), (3, 0.22696132739370345), (1, 0.19353271765478536), (6, 0.19353271765478536), (0, 0.06245851738713588), (2, 0.02902990764821781), (4, 0.02902990764821781)] 21 | 7 22 | front-end 23 | [(1, 0.3933771010245893), (3, 0.19641390722479402), (4, 0.15529355166020992), (5, 0.11836502205626757), (0, 0.06965114154715149), (2, 0.03977737833831385), (6, 0.027121898148673974)] 24 | 8 25 | front-end 26 | [(2, 0.22781638001950752), (1, 0.1733410848890412), (3, 0.1733410848890412), (4, 0.1733410848890412), (6, 0.1733410848890412), (5, 0.05281811769097149), (0, 0.026001162733356186)] 27 | 9 28 | front-end 29 | [(1, 0.3590480388731828), (6, 0.181987093698495), (3, 0.16272468675617052), (4, 0.12353532382192652), (5, 0.08863474727046376), (0, 0.05677204552498716), (2, 0.027298064054774255)] 30 | 10 31 | front-end 32 | [(1, 0.22781638001950752), (3, 0.17334108488904118), (4, 0.17334108488904118), (5, 0.17334108488904118), (6, 0.17334108488904118), (2, 0.05281811769097149), (0, 0.026001162733356183)] 33 | -----one service finish----- 34 | 1 35 | 1 36 | user 37 | [(6, 0.3254914010566658), (2, 0.18463668471924016), (4, 0.18463668471924016), (5, 0.12803714324734466), (0, 0.0914143952165899), (3, 0.0580881883330334), (1, 0.02769550270788603)] 38 | 2 39 | user 40 | [(2, 0.4204216728223541), (6, 0.20898221071396256), (5, 0.16260313595099482), (0, 0.07437927767344274), (3, 0.07026119482609955), (1, 0.0352991608208576), (4, 0.028053347192288587)] 41 | 3 42 | user 43 | [(2, 0.3076251884134296), (4, 0.23045162126315474), (3, 0.1921319695776652), (6, 0.10105597917941844), (5, 0.09675430654985319), (0, 0.04316113957982898), (1, 0.028819795436649788)] 44 | 4 45 | user 46 | [(2, 0.35904803887318276), (4, 0.181987093698495), (6, 0.16272468675617052), (5, 0.12353532382192653), (0, 0.08863474727046378), (1, 0.05677204552498717), (3, 0.02729806405477426)] 47 | 5 48 | user 49 | [(2, 0.3957069767215311), (4, 0.19249764345638556), (6, 0.17831828852994933), (5, 0.13064341236009422), (0, 0.045084385895124035), (1, 0.028874646518457843), (3, 0.028874646518457843)] 50 | 6 51 | user 52 | [(2, 0.45276285199743455), (6, 0.226191515748751), (5, 0.17049761075786293), (0, 0.06311473854673937), (1, 0.029144427649737403), (3, 0.029144427649737403), (4, 0.029144427649737403)] 53 | 7 54 | user 55 | [(2, 0.3047735612146317), (4, 0.233056502560225), (3, 0.19206169211999294), (6, 0.09864818509920831), (5, 0.09703578951003831), (0, 0.045615015677904776), (1, 0.028809253817998946)] 56 | 8 57 | user 58 | [(2, 0.3590480388731828), (4, 0.181987093698495), (6, 0.16272468675617052), (5, 0.12353532382192653), (0, 0.08863474727046376), (3, 0.056772045524987165), (1, 0.02729806405477426)] 59 | 9 60 | user 61 | [(2, 0.45276285199743455), (6, 0.226191515748751), (5, 0.17049761075786293), (0, 0.06311473854673937), (1, 0.029144427649737403), (3, 0.029144427649737403), (4, 0.029144427649737403)] 62 | 10 63 | user 64 | [(2, 0.4475699657469712), (6, 0.19319226342913562), (5, 0.14130482934343655), (0, 0.0987355960542465), (3, 0.06156781096589782), (1, 0.028814767230156083), (4, 0.028814767230156083)] 65 | -----one service finish----- 66 | 2 67 | 1 68 | catalogue 69 | [(3, 0.30961416596794505), (6, 0.24073204592591704), (4, 0.19565554493356205), (5, 0.09791026518165472), (2, 
0.06351482758631687), (1, 0.06322481866456996), (0, 0.029348331740034313)] 70 | 2 71 | catalogue 72 | [(3, 0.3687772842279966), (6, 0.2363783525652416), (4, 0.14772018542958854), (5, 0.10153788101443155), (2, 0.06374019911086695), (1, 0.05241885893419067), (0, 0.029427238717684143)] 73 | 3 74 | catalogue 75 | [(3, 0.40056747798322984), (6, 0.2430461437397048), (5, 0.11433349471239558), (4, 0.11198254073527131), (2, 0.0684471786965042), (0, 0.03081158206644717), (1, 0.03081158206644717)] 76 | 4 77 | catalogue 78 | [(5, 0.2487629571655979), (6, 0.22926342461108792), (3, 0.18803720296788323), (4, 0.18803720296788323), (2, 0.05976265913008986), (1, 0.05793097271227543), (0, 0.02820558044518249)] 79 | 5 80 | catalogue 81 | [(5, 0.2495922086911604), (2, 0.24579087495937121), (3, 0.194739071047228), (4, 0.194739071047228), (0, 0.04405961479064092), (6, 0.041868298807287235), (1, 0.029210860657084208)] 82 | 6 83 | catalogue 84 | [(5, 0.2792191169526763), (1, 0.1823761919426405), (3, 0.1823761919426405), (6, 0.1823761919426405), (2, 0.08925362756641578), (0, 0.05704225086159036), (4, 0.02735642879139608)] 85 | 7 86 | catalogue 87 | [(5, 0.25003015371908377), (6, 0.2278569886529322), (3, 0.18798666368389955), (4, 0.18798666368389955), (2, 0.05973811850766386), (1, 0.05820341219993612), (0, 0.028197999552584938)] 88 | 8 89 | catalogue 90 | [(5, 0.2716988361385928), (6, 0.2080112061162219), (3, 0.18865207565906844), (4, 0.18865207565906844), (2, 0.07398553352227902), (0, 0.04070246155590923), (1, 0.028297811348860273)] 91 | 9 92 | catalogue 93 | [(5, 0.24946286265498008), (6, 0.22883355922184523), (3, 0.18813535307283294), (4, 0.18813535307283294), (2, 0.059810331439655716), (1, 0.05740223757692817), (0, 0.02822030296092495)] 94 | 10 95 | catalogue 96 | -----one service finish----- 97 | 3 98 | 1 99 | orders 100 | [(6, 0.44026894993612214), (1, 0.292264462030387), (5, 0.1292626309912971), (0, 0.034550989260548455), (2, 0.034550989260548455), (3, 0.034550989260548455), (4, 0.034550989260548455)] 101 | 2 102 | orders 103 | [(6, 0.44009830436783465), (1, 0.22435468722707824), (5, 0.14264381420962186), (0, 0.08296886119741256), (3, 0.04096414133833223), (2, 0.03736371500913157), (4, 0.03160647665058891)] 104 | 3 105 | orders 106 | [(6, 0.2723004976861766), (1, 0.2364128341771112), (3, 0.20052517066804576), (4, 0.20052517066804576), (0, 0.030078775600206872), (2, 0.030078775600206872), (5, 0.030078775600206872)] 107 | 4 108 | orders 109 | [(6, 0.45483766269747844), (1, 0.2776957492690307), (5, 0.1292626309912971), (0, 0.03455098926054846), (2, 0.03455098926054846), (3, 0.03455098926054846), (4, 0.03455098926054846)] 110 | 5 111 | orders 112 | [(6, 0.432745254448745), (1, 0.23532900584337962), (5, 0.1362892302448562), (0, 0.08303170255377422), (3, 0.04071413781722244), (2, 0.040140028626488085), (4, 0.031750640465534524)] 113 | 6 114 | orders 115 | [(6, 0.43522173079833293), (1, 0.22736817971907053), (5, 0.1380623861182508), (0, 0.08521185935291427), (3, 0.04291534958258469), (2, 0.03968586077215687), (4, 0.03153463365668996)] 116 | 7 117 | orders 118 | [(6, 0.35328257491283127), (1, 0.24808909337742688), (3, 0.2269991471621368), (5, 0.07425170747995473), (0, 0.03245915902255015), (2, 0.03245915902255015), (4, 0.03245915902255015)] 119 | 8 120 | orders 121 | [(6, 0.2723004976861766), (1, 0.2364128341771112), (3, 0.20052517066804576), (4, 0.20052517066804576), (0, 0.030078775600206872), (2, 0.030078775600206872), (5, 0.030078775600206872)] 122 | 9 123 | orders 124 | [(6, 0.3309402087321899), (1, 
0.26642208528150485), (3, 0.224374827953064), (5, 0.07327181093215825), (2, 0.04062253129985289), (0, 0.03218426790061513), (4, 0.03218426790061513)] 125 | 10 126 | orders 127 | [(1, 0.43410507158601713), (3, 0.2038162278360899), (6, 0.1896687942625573), (5, 0.08069260378909511), (0, 0.030572434175413492), (2, 0.030572434175413492), (4, 0.030572434175413492)] 128 | -----one service finish----- 129 | 4 130 | 1 131 | carts 132 | [(5, 0.4574595218219463), (1, 0.1949305291490513), (2, 0.14310373305615703), (4, 0.0991455028644781), (3, 0.04689180557405156), (0, 0.029234453767157737), (6, 0.029234453767157737)] 133 | 2 134 | carts 135 | [(5, 0.4197606482685633), (2, 0.2535921074904804), (3, 0.1431437996027125), (0, 0.08762627161916783), (1, 0.03195905767302535), (4, 0.03195905767302535), (6, 0.03195905767302535)] 136 | 3 137 | carts 138 | [(2, 0.3579984547357211), (0, 0.21931971735962644), (4, 0.20404796302832198), (5, 0.12681228151358553), (1, 0.030607194454248306), (3, 0.030607194454248306), (6, 0.030607194454248306)] 139 | 4 140 | carts 141 | [(3, 0.3381037345640901), (2, 0.2507148559642551), (4, 0.2143322415946346), (1, 0.08834220996869588), (0, 0.04563041803909891), (5, 0.03143826993461274), (6, 0.03143826993461274)] 142 | 5 143 | carts 144 | [(2, 0.3554274375085961), (0, 0.21713619619408603), (4, 0.2024949450045138), (5, 0.12693009798464377), (3, 0.03726283980680626), (1, 0.03037424175067708), (6, 0.03037424175067708)] 145 | 6 146 | carts 147 | [(1, 0.42062997052266), (5, 0.25402103270583565), (2, 0.16315858024057955), (4, 0.06616752326755734), (0, 0.03200763108778918), (3, 0.03200763108778918), (6, 0.03200763108778918)] 148 | 7 149 | carts 150 | [(2, 0.33811423793203305), (0, 0.20826208933038326), (4, 0.20105038400153075), (5, 0.12193984464808257), (3, 0.07184054744525013), (1, 0.029396448321360177), (6, 0.029396448321360177)] 151 | 8 152 | carts 153 | [(1, 0.3870418138399692), (5, 0.2229271740156597), (2, 0.1615778097892161), (4, 0.10491561384632901), (0, 0.0643215225147396), (3, 0.029608032997043187), (6, 0.029608032997043187)] 154 | 9 155 | carts 156 | [(2, 0.275524499925663), (0, 0.20416060232424138), (3, 0.19763633933383914), (4, 0.19109847819272593), (5, 0.07425053676571271), (1, 0.028664771728908897), (6, 0.028664771728908897)] 157 | 10 158 | carts 159 | [(2, 0.2794132712655861), (0, 0.20433782399657882), (3, 0.19015702285998906), (4, 0.19015702285998906), (5, 0.07888775215986027), (1, 0.028523553428998365), (6, 0.028523553428998365)] 160 | -----one service finish----- 161 | 5 162 | 1 163 | payment 164 | [(5, 0.39178179625422416), (2, 0.34209757530797436), (0, 0.08986294947569), (1, 0.07241328263132021), (3, 0.03461479877693041), (4, 0.03461479877693041), (6, 0.03461479877693041)] 165 | 2 166 | payment 167 | [(5, 0.49864326615001947), (0, 0.16582182955259292), (1, 0.13902787986408252), (2, 0.07754997360175477), (3, 0.05747342616456725), (4, 0.03074181233349153), (6, 0.03074181233349153)] 168 | 3 169 | payment 170 | [(5, 0.5149042181813179), (0, 0.16297932996206083), (1, 0.13292712273635907), (2, 0.07571581282839657), (3, 0.05075267997974946), (4, 0.03136041815605813), (6, 0.03136041815605813)] 171 | 4 172 | payment 173 | [(5, 0.448433368102048), (0, 0.19763369876964054), (3, 0.14220555299405305), (2, 0.09960780997275244), (1, 0.05436493488885133), (4, 0.02887731763632735), (6, 0.02887731763632735)] 174 | 5 175 | payment 176 | [(5, 0.4849506968822753), (0, 0.2130246899184954), (3, 0.1456047225592063), (1, 0.04906569463191312), (2, 0.04677797478361205), (4, 0.030288110612248904), (6, 
0.030288110612248904)] 177 | 6 178 | payment 179 | [(5, 0.1708955173933549), (1, 0.16952124505062802), (0, 0.1589357198930162), (2, 0.1589357198930162), (4, 0.1589357198930162), (6, 0.1589357198930162), (3, 0.023840357983952433)] 180 | 7 181 | payment 182 | [(5, 0.4907878063551053), (0, 0.2147349730824297), (3, 0.14536266146059595), (1, 0.05009461019159779), (2, 0.03809325845654061), (4, 0.030463345226865297), (6, 0.030463345226865297)] 183 | 8 184 | payment 185 | [(5, 0.4918049263350703), (1, 0.15460180618903271), (0, 0.15204343913260804), (3, 0.09428541077490467), (2, 0.0462926882558587), (4, 0.03048586465626284), (6, 0.03048586465626284)] 186 | 9 187 | payment 188 | [(5, 0.4570927526502856), (2, 0.22338338899431529), (1, 0.1068136932971631), (0, 0.10096416854322925), (3, 0.047242669843171), (4, 0.032251663335917884), (6, 0.032251663335917884)] 189 | 10 190 | payment 191 | [(5, 0.34588016732427534), (2, 0.19209558222525402), (6, 0.19209558222525402), (0, 0.11873171131090607), (3, 0.07870924846515288), (1, 0.043673371115369584), (4, 0.02881433733378811)] 192 | -----one service finish----- 193 | 6 194 | 1 195 | shipping 196 | [(6, 0.21135927932087537), (2, 0.19597221925332242), (0, 0.1757954010399974), (1, 0.1757954010399974), (4, 0.1757954010399974), (5, 0.03891298814981027), (3, 0.026369310155999613)] 197 | 2 198 | shipping 199 | [(6, 0.25141341098236425), (2, 0.22255383916674681), (1, 0.18658286042428057), (4, 0.18658286042428057), (5, 0.06582130711408743), (0, 0.059058292824598416), (3, 0.027987429063642093)] 200 | 3 201 | shipping 202 | [(6, 0.29637010099017974), (2, 0.2451584993262698), (4, 0.1931054457626051), (0, 0.09779551787485402), (5, 0.09434675315651096), (1, 0.04425786602518965), (3, 0.028965816864390773)] 203 | 4 204 | shipping 205 | [(4, 0.34406528618908244), (2, 0.2217182826103886), (1, 0.2003381108669737), (6, 0.10873073222521817), (0, 0.048013873568678055), (5, 0.04708299790961284), (3, 0.030050716630046066)] 206 | 5 207 | shipping 208 | [(5, 0.18148104255096673), (1, 0.1589357198930162), (2, 0.1589357198930162), (3, 0.1589357198930162), (4, 0.1589357198930162), (6, 0.1589357198930162), (0, 0.023840357983952436)] 209 | 6 210 | shipping 211 | [(2, 0.2013667313213764), (6, 0.19979073358717234), (1, 0.1733410848890412), (3, 0.1733410848890412), (4, 0.1733410848890412), (5, 0.05281811769097149), (0, 0.026001162733356186)] 212 | 7 213 | shipping 214 | [(2, 0.3261518836003567), (4, 0.31823888433429043), (6, 0.14468258729457384), (5, 0.06772825876632199), (0, 0.06132271132694906), (3, 0.051047225628964764), (1, 0.030828449048543264)] 215 | 8 216 | shipping 217 | [(6, 0.2973768248213641), (2, 0.24614069100177374), (4, 0.1936726152639527), (0, 0.09829375731626692), (5, 0.09133235541317805), (1, 0.04413286389387147), (3, 0.029050892289592914)] 218 | 9 219 | shipping 220 | 10 221 | shipping 222 | -----one service finish----- 223 | 2023-03-14 17:32:49 224 | -------------------------------------------------------------------------------- /pa_result/latency_gamma_eta.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/pa_result/latency_gamma_eta.pdf -------------------------------------------------------------------------------- /pa_result/single_service/cpu_single_service_eta1000.txt: -------------------------------------------------------------------------------- 1 | 2023-03-14 02:44:05 2 | 0 3 | 1 4 | front-end_ctn_write 5 | front-end_ctn_read 6 | 
front-end 7 | [(2, 0.2671544743143986), (0, 0.23264937323352425), (1, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.03489740598502864)] 8 | 2 9 | front-end_ctn_write 10 | front-end_ctn_read 11 | front-end 12 | 3 13 | front-end_ctn_write 14 | front-end_ctn_read 15 | front-end 16 | [(0, 0.36975502727891413), (1, 0.2740195533569939), (2, 0.2740195533569939), (3, 0.04110293300354909), (4, 0.04110293300354909)] 17 | 4 18 | front-end_ctn_write 19 | front-end_ctn_read 20 | front-end 21 | [(2, 0.45849394164652296), (1, 0.28117342573551113), (0, 0.1483732325096285), (4, 0.06978338624801077), (3, 0.042176013860326676)] 22 | 5 23 | front-end_ctn_write 24 | front-end_ctn_read 25 | front-end 26 | [(0, 0.3629782311425121), (1, 0.2683700294005322), (2, 0.2683700294005322), (4, 0.06002620564634367), (3, 0.04025550441007983)] 27 | 6 28 | front-end_ctn_write 29 | front-end_ctn_read 30 | front-end 31 | [(2, 0.45977772560702784), (1, 0.2818259308604502), (0, 0.14855505440596967), (3, 0.06756739949748479), (4, 0.042273889629067536)] 32 | 7 33 | front-end_ctn_write 34 | front-end_ctn_read 35 | front-end 36 | [(2, 0.32188745496186943), (3, 0.32188745496186943), (1, 0.2740193000586625), (0, 0.04110289500879938), (4, 0.04110289500879938)] 37 | 8 38 | front-end_ctn_write 39 | front-end_ctn_read 40 | front-end 41 | [(0, 0.35413349561556273), (1, 0.261812424760226), (2, 0.261812424760226), (3, 0.08296979114995144), (4, 0.0392718637140339)] 42 | 9 43 | front-end_ctn_write 44 | front-end_ctn_read 45 | front-end 46 | [(0, 0.36098254317830303), (1, 0.2668183093239413), (2, 0.2668183093239413), (3, 0.06535809177522324), (4, 0.0400227463985912)] 47 | 10 48 | front-end_ctn_write 49 | front-end_ctn_read 50 | front-end 51 | [(0, 0.361314619575405), (1, 0.26707393581185446), (2, 0.26707393581185446), (3, 0.06447641842910792), (4, 0.04006109037177817)] 52 | -----one service finish----- 53 | 1 54 | 1 55 | user_ctn_read 56 | user 57 | [(2, 0.2161666639933777), (0, 0.1888755026521981), (1, 0.1888755026521981), (3, 0.1888755026521981), (5, 0.1888755026521981), (4, 0.028331325397829718)] 58 | 2 59 | user_ctn_read 60 | user 61 | [(2, 0.3433467128859532), (3, 0.2960271848201741), (1, 0.24870765675439502), (0, 0.037306148513159255), (4, 0.037306148513159255), (5, 0.037306148513159255)] 62 | 3 63 | user_ctn_read 64 | user 65 | [(4, 0.23749949595918743), (5, 0.23627287487718843), (2, 0.2325586525810361), (3, 0.21202714798375918), (1, 0.049837756401265025), (0, 0.031804072197563885)] 66 | 4 67 | user_ctn_read 68 | user 69 | 5 70 | user_ctn_read 71 | user 72 | [(2, 0.2873222850143301), (0, 0.21596294393505147), (1, 0.21596294393505147), (3, 0.21596294393505147), (4, 0.03239444159025773), (5, 0.03239444159025773)] 73 | 6 74 | user_ctn_read 75 | user 76 | [(2, 0.2829548920274886), (0, 0.21252927485343637), (1, 0.21252927485343637), (3, 0.21252927485343637), (5, 0.04757789218418696), (4, 0.03187939122801547)] 77 | 7 78 | user_ctn_read 79 | user 80 | [(2, 0.2161666639933777), (0, 0.1888755026521981), (3, 0.1888755026521981), (4, 0.1888755026521981), (5, 0.1888755026521981), (1, 0.028331325397829718)] 81 | 8 82 | user_ctn_read 83 | user 84 | [(1, 0.5067186849417253), (3, 0.2660292202604417), (5, 0.08900652327637662), (4, 0.05843680544332411), (0, 0.039904383039066245), (2, 0.039904383039066245)] 85 | 9 86 | user_ctn_read 87 | user 88 | [(2, 0.33249187041323663), (3, 0.28036390226058916), (1, 0.23728961125685588), (5, 0.07866773269226168), (0, 0.03559344168852839), (4, 0.03559344168852839)] 89 | 10 90 | user_ctn_read 91 | user 92 | [(2, 
0.2873222850143301), (0, 0.21596294393505147), (1, 0.21596294393505147), (3, 0.21596294393505147), (4, 0.03239444159025773), (5, 0.03239444159025773)] 93 | -----one service finish----- 94 | 2 95 | 1 96 | catalogue_ctn_write 97 | catalogue_ctn_read 98 | catalogue 99 | [(3, 0.32796615169393817), (4, 0.30155967463833844), (0, 0.26756991444648953), (2, 0.06276877205426043), (1, 0.04013548716697343)] 100 | 2 101 | catalogue_ctn_write 102 | catalogue_ctn_read 103 | catalogue 104 | [(0, 0.2671544743143986), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 105 | 3 106 | catalogue_ctn_write 107 | catalogue_ctn_read 108 | catalogue 109 | [(3, 0.36367111952720643), (0, 0.2689200278860017), (4, 0.2689200278860017), (2, 0.05815082051788992), (1, 0.04033800418290027)] 110 | 4 111 | catalogue_ctn_write 112 | catalogue_ctn_read 113 | catalogue 114 | [(0, 0.2671544743143986), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 115 | 5 116 | catalogue_ctn_write 117 | catalogue_ctn_read 118 | catalogue 119 | [(3, 0.613863411631167), (0, 0.18668363515409364), (1, 0.10692323292102818), (2, 0.04626486014685561), (4, 0.04626486014685561)] 120 | 6 121 | catalogue_ctn_write 122 | catalogue_ctn_read 123 | catalogue 124 | [(4, 0.32360023307661717), (3, 0.3046426263868276), (0, 0.2670105876510494), (2, 0.06469496473784829), (1, 0.04005158814765741)] 125 | 7 126 | catalogue_ctn_write 127 | catalogue_ctn_read 128 | catalogue 129 | [(0, 0.4136556450796955), (4, 0.3739912513055149), (3, 0.08979193355048855), (1, 0.07781851933912784), (2, 0.044742650725173284)] 130 | 8 131 | catalogue_ctn_write 132 | catalogue_ctn_read 133 | catalogue 134 | [(0, 0.2671544743143986), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 135 | 9 136 | catalogue_ctn_write 137 | catalogue_ctn_read 138 | catalogue 139 | [(3, 0.3634512127384522), (0, 0.2687448492369341), (4, 0.2687448492369341), (2, 0.058747361402139674), (1, 0.04031172738554011)] 140 | 10 141 | catalogue_ctn_write 142 | catalogue_ctn_read 143 | catalogue 144 | [(0, 0.2671544743143986), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 145 | -----one service finish----- 146 | 3 147 | 1 148 | orders_ctn_read 149 | orders 150 | [(2, 0.2873222850143301), (0, 0.21596294393505147), (1, 0.21596294393505147), (5, 0.21596294393505147), (3, 0.03239444159025773), (4, 0.03239444159025773)] 151 | 2 152 | orders_ctn_read 153 | orders 154 | [(2, 0.45735036279462654), (1, 0.32525160826613536), (4, 0.09606275219647857), (0, 0.040445092247586495), (3, 0.040445092247586495), (5, 0.040445092247586495)] 155 | 3 156 | orders_ctn_read 157 | orders 158 | [(2, 0.2161666639933777), (0, 0.1888755026521981), (1, 0.1888755026521981), (3, 0.1888755026521981), (5, 0.1888755026521981), (4, 0.028331325397829718)] 159 | 4 160 | orders_ctn_read 161 | orders 162 | [(2, 0.45735036279462654), (1, 0.32525160826613536), (4, 0.09606275219647857), (0, 0.040445092247586495), (3, 0.040445092247586495), (5, 0.040445092247586495)] 163 | 5 164 | orders_ctn_read 165 | orders 166 | [(2, 0.3755661614169698), (0, 0.23728961125685585), (1, 0.23728961125685585), (4, 0.07866773269226165), (3, 0.035593441688528384), (5, 0.035593441688528384)] 167 | 6 168 | orders_ctn_read 169 | orders 170 | [(2, 0.45735036279462654), (1, 0.32525160826613536), (4, 0.09606275219647857), (0, 0.040445092247586495), (3, 0.040445092247586495), 
(5, 0.040445092247586495)] 171 | 7 172 | orders_ctn_read 173 | orders 174 | [(2, 0.27764766752696973), (3, 0.20878901666033023), (4, 0.20878901666033023), (5, 0.20878901666033023), (1, 0.06466692999299019), (0, 0.03131835249904954)] 175 | 8 176 | orders_ctn_read 177 | orders 178 | [(3, 0.4699374248149906), (5, 0.21387026212064666), (4, 0.1577260878010212), (2, 0.06662207919257045), (1, 0.058100817125989405), (0, 0.03374332894478173)] 179 | 9 180 | orders_ctn_read 181 | orders 182 | [(2, 0.3539150865905991), (1, 0.24083472383684298), (0, 0.23029366943484128), (4, 0.09587742500947001), (5, 0.04453504471302023), (3, 0.0345440504152262)] 183 | 10 184 | orders_ctn_read 185 | orders 186 | [(4, 0.25164261447469083), (1, 0.23440876832927504), (2, 0.2331967900804672), (0, 0.21596294393505147), (3, 0.03239444159025773), (5, 0.03239444159025773)] 187 | -----one service finish----- 188 | 4 189 | 1 190 | carts_ctn_read 191 | carts 192 | [(2, 0.41574848938044867), (3, 0.22598492171942097), (1, 0.16479373125577654), (4, 0.08662576387157528), (5, 0.07294935551486538), (0, 0.03389773825791315)] 193 | 2 194 | carts_ctn_read 195 | carts 196 | [(2, 0.4020816963341794), (3, 0.22138100615172365), (1, 0.16001352676620728), (4, 0.11268659494013945), (5, 0.07063002488499176), (0, 0.033207150922758555)] 197 | 3 198 | carts_ctn_read 199 | carts 200 | [(2, 0.4764474168249641), (1, 0.2160389100460662), (4, 0.10928760178800646), (5, 0.09116064731510354), (0, 0.07310346066486889), (3, 0.03396196336099074)] 201 | 4 202 | carts_ctn_read 203 | carts 204 | [(2, 0.48888857631837096), (3, 0.25651979969786914), (1, 0.13915771411971894), (0, 0.03847796995468037), (4, 0.03847796995468037), (5, 0.03847796995468037)] 205 | 5 206 | carts_ctn_read 207 | carts 208 | [(2, 0.417459058711447), (3, 0.33059599855460514), (5, 0.07898279643399791), (4, 0.06884811433789831), (1, 0.06550295979068216), (0, 0.03861107217136946)] 209 | 6 210 | carts_ctn_read 211 | carts 212 | [(2, 0.479510222065053), (3, 0.30309174899570884), (1, 0.09606275219647858), (0, 0.040445092247586495), (4, 0.040445092247586495), (5, 0.040445092247586495)] 213 | 7 214 | carts_ctn_read 215 | carts 216 | [(2, 0.47639624998009555), (1, 0.21602200833446794), (4, 0.11042169698630107), (5, 0.09009901096329302), (0, 0.07310021958442713), (3, 0.03396081415141538)] 217 | 8 218 | carts_ctn_read 219 | carts 220 | [(2, 0.48888857631837096), (3, 0.25651979969786914), (1, 0.13915771411971894), (0, 0.03847796995468037), (4, 0.03847796995468037), (5, 0.03847796995468037)] 221 | 9 222 | carts_ctn_read 223 | carts 224 | [(3, 0.3626907079226023), (2, 0.2926380928778693), (4, 0.12179904601936202), (5, 0.11370739928143256), (1, 0.07472993764255986), (0, 0.03443481625617402)] 225 | 10 226 | carts_ctn_read 227 | carts 228 | [(2, 0.3324918704132366), (3, 0.2803639022605891), (4, 0.23728961125685585), (1, 0.07866773269226167), (0, 0.035593441688528384), (5, 0.035593441688528384)] 229 | -----one service finish----- 230 | 5 231 | 1 232 | payment_ctn_write 233 | payment_ctn_read 234 | payment 235 | [(0, 0.2671544743143986), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 236 | 2 237 | payment_ctn_write 238 | payment_ctn_read 239 | payment 240 | [(0, 0.2671544743143986), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 241 | 3 242 | payment_ctn_write 243 | payment_ctn_read 244 | payment 245 | [(4, 0.5418161682868553), (3, 0.2171075633602206), (2, 0.10558816041615456), (1, 
0.0931546756829628), (0, 0.04233343225380674)] 246 | 4 247 | payment_ctn_write 248 | payment_ctn_read 249 | payment 250 | [(0, 0.2671544743143986), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 251 | 5 252 | payment_ctn_write 253 | payment_ctn_read 254 | payment 255 | [(0, 0.2671544743143986), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 256 | 6 257 | payment_ctn_write 258 | payment_ctn_read 259 | payment 260 | [(0, 0.2671544743143986), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 261 | 7 262 | payment_ctn_write 263 | payment_ctn_read 264 | payment 265 | [(3, 0.3113114120347152), (0, 0.3046340924416547), (2, 0.2618125117236332), (1, 0.0829701070414521), (4, 0.03927187675854498)] 266 | 8 267 | payment_ctn_write 268 | payment_ctn_read 269 | payment 270 | [(0, 0.2671544743143986), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 271 | 9 272 | payment_ctn_write 273 | payment_ctn_read 274 | payment 275 | 10 276 | payment_ctn_write 277 | payment_ctn_read 278 | payment 279 | [(0, 0.35413349561556273), (3, 0.261812424760226), (4, 0.261812424760226), (1, 0.08296979114995144), (2, 0.0392718637140339)] 280 | -----one service finish----- 281 | 6 282 | 1 283 | shipping_ctn_write 284 | shipping_ctn_read 285 | shipping 286 | [(3, 0.4981904620335506), (2, 0.35534720641467527), (0, 0.04882077718392473), (1, 0.04882077718392473), (4, 0.04882077718392473)] 287 | 2 288 | shipping_ctn_write 289 | shipping_ctn_read 290 | shipping 291 | [(2, 0.2499246840233927), (3, 0.2498791635245301), (0, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 292 | 3 293 | shipping_ctn_write 294 | shipping_ctn_read 295 | shipping 296 | [(3, 0.3327978163784649), (2, 0.29472721848542316), (4, 0.2902691751185132), (0, 0.04110289500879938), (1, 0.04110289500879938)] 297 | 4 298 | shipping_ctn_write 299 | shipping_ctn_read 300 | shipping 301 | [(3, 0.33042994197471154), (2, 0.29495995817353565), (4, 0.29240430983415405), (0, 0.04110289500879938), (1, 0.04110289500879938)] 302 | 5 303 | shipping_ctn_write 304 | shipping_ctn_read 305 | shipping 306 | [(3, 0.25228154974547135), (2, 0.2475222978024515), (0, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 307 | 6 308 | shipping_ctn_write 309 | shipping_ctn_read 310 | shipping 311 | [(3, 0.49859367949946987), (2, 0.35494398894875595), (0, 0.04882077718392473), (1, 0.04882077718392473), (4, 0.04882077718392473)] 312 | 7 313 | shipping_ctn_write 314 | shipping_ctn_read 315 | shipping 316 | [(3, 0.25333839210705433), (2, 0.24646545544086854), (0, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 317 | 8 318 | shipping_ctn_write 319 | shipping_ctn_read 320 | shipping 321 | [(3, 0.25313403508377175), (2, 0.24666981246415107), (0, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 322 | 9 323 | shipping_ctn_write 324 | shipping_ctn_read 325 | shipping 326 | [(3, 0.3337778777506682), (2, 0.2944450767464809), (4, 0.2895712554852522), (0, 0.04110289500879938), (1, 0.04110289500879938)] 327 | 10 328 | shipping_ctn_write 329 | shipping_ctn_read 330 | shipping 331 | [(3, 0.34962221342757804), (2, 0.29415269649616077), (4, 0.2740193000586625), (0, 0.04110289500879938), (1, 0.04110289500879938)] 332 | -----one service finish----- 333 | 2023-03-14 02:57:46 334 | 
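
(A minimal parsing sketch, not a file from this repository.) Each result dump above and below follows the same layout: an opening and a closing timestamp, an outer integer counter (apparently the index of the service under test), an inner counter 1-10 (apparently the repetition), one or more metric/service name lines, and, when inference produced a ranking for that repetition, a Python-style list of `(index, score)` pairs sorted by inferred root-cause score; `-----one service finish-----` closes each service block. The helper below, with the hypothetical name `parse_result_file` and an example path taken from the files shown here, collects those ranked lists and is only a sketch assuming that visible layout:

```python
# Sketch only: parse one of the pa_result dumps into
# {last name line seen before each ranking: [ranked (index, score) lists, ...]}.
# Assumes nothing beyond the layout visible in these files.
import ast
from collections import defaultdict

def parse_result_file(path):
    rankings = defaultdict(list)      # name -> one ranked list per repetition
    current_name = None
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if not line or line.startswith("-----") or line.isdigit():
                continue              # blank lines, separators, bare counters
            if line.startswith("[("):
                # e.g. "[(2, 0.2671...), (0, 0.2326...), ...]"
                if current_name is not None:
                    rankings[current_name].append(ast.literal_eval(line))
            elif not line[0].isdigit():
                # metric/service name such as "front-end" or "user_ctn_read";
                # timestamp lines start with a digit and are skipped here
                current_name = line
    return dict(rankings)

if __name__ == "__main__":
    res = parse_result_file("pa_result/single_service/cpu_single_service_eta1000.txt")
    for name, runs in res.items():
        top1 = [run[0][0] for run in runs]   # highest-scored index per repetition
        print(name, "runs:", len(runs), "top-1 indices:", top1)
```
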
-------------------------------------------------------------------------------- /pa_result/single_service/memory_single_service_eta1000.txt: -------------------------------------------------------------------------------- 1 | 2023-03-14 04:13:53 2 | 0 3 | 1 4 | front-end_ctn_write 5 | front-end_ctn_read 6 | front-end 7 | [(4, 0.26715447431439865), (1, 0.23264937323352428), (2, 0.23264937323352428), (3, 0.23264937323352428), (0, 0.03489740598502865)] 8 | 2 9 | front-end_ctn_write 10 | front-end_ctn_read 11 | front-end 12 | [(1, 0.4401782017592745), (2, 0.3636580715644612), (4, 0.104803174919066), (0, 0.04568027587859916), (3, 0.04568027587859916)] 13 | 3 14 | front-end_ctn_write 15 | front-end_ctn_read 16 | front-end 17 | [(4, 0.39643454158164143), (1, 0.38842283816568535), (2, 0.12597195212695372), (0, 0.04458533406285982), (3, 0.04458533406285982)] 18 | 4 19 | front-end_ctn_write 20 | front-end_ctn_read 21 | front-end 22 | [(2, 0.44695191878094614), (1, 0.27553011662952515), (4, 0.14644796327584), (0, 0.08974048381925999), (3, 0.04132951749442878)] 23 | 5 24 | front-end_ctn_write 25 | front-end_ctn_read 26 | front-end 27 | [(3, 0.3218874549618694), (4, 0.3218874549618694), (1, 0.2740193000586625), (0, 0.04110289500879938), (2, 0.04110289500879938)] 28 | 6 29 | front-end_ctn_write 30 | front-end_ctn_read 31 | front-end 32 | [(1, 0.5164900857974944), (4, 0.20853597315392147), (0, 0.1446484930638321), (3, 0.08914631565165089), (2, 0.0411791323331012)] 33 | 7 34 | front-end_ctn_write 35 | front-end_ctn_read 36 | front-end 37 | [(2, 0.44695191878094614), (1, 0.27553011662952515), (4, 0.14644796327584), (0, 0.08974048381925999), (3, 0.04132951749442878)] 38 | 8 39 | front-end_ctn_write 40 | front-end_ctn_read 41 | front-end 42 | [(1, 0.3541334956155628), (2, 0.261812424760226), (3, 0.261812424760226), (4, 0.08296979114995144), (0, 0.039271863714033906)] 43 | 9 44 | front-end_ctn_write 45 | front-end_ctn_read 46 | front-end 47 | [(4, 0.26715447431439865), (1, 0.23264937323352428), (2, 0.23264937323352428), (3, 0.23264937323352428), (0, 0.03489740598502865)] 48 | 10 49 | front-end_ctn_write 50 | front-end_ctn_read 51 | front-end 52 | [(2, 0.4176020205902834), (1, 0.33617227862482474), (3, 0.10827583687643702), (4, 0.09504956741453664), (0, 0.04290029649391823)] 53 | -----one service finish----- 54 | 1 55 | 1 56 | user_ctn_read 57 | user 58 | [(5, 0.3259139368989766), (2, 0.2699200602065594), (3, 0.23069664649381952), (4, 0.08092883756671895), (0, 0.057936021859852596), (1, 0.03460449697407293)] 59 | 2 60 | user_ctn_read 61 | user 62 | [(2, 0.26841690920938754), (4, 0.2242486245949614), (3, 0.2113379940479622), (5, 0.2113379940479622), (0, 0.05295777899253235), (1, 0.031700699107194336)] 63 | 3 64 | user_ctn_read 65 | user 66 | [(2, 0.27764766752696973), (3, 0.20878901666033023), (4, 0.20878901666033023), (5, 0.20878901666033023), (0, 0.06466692999299019), (1, 0.03131835249904954)] 67 | 4 68 | user_ctn_read 69 | user 70 | [(2, 0.26759981678971884), (4, 0.2237882891146647), (5, 0.22150594841166757), (3, 0.21400340163900292), (0, 0.041002033799095575), (1, 0.03210051024585044)] 71 | 5 72 | user_ctn_read 73 | user 74 | [(4, 0.3888451236752908), (2, 0.3372692712869557), (1, 0.085421413860875), (0, 0.08269204955552273), (5, 0.068242664620295), (3, 0.0375294770010607)] 75 | 6 76 | user_ctn_read 77 | user 78 | [(0, 0.3324918704132366), (2, 0.2803639022605891), (4, 0.23728961125685585), (1, 0.07866773269226167), (3, 0.035593441688528384), (5, 0.035593441688528384)] 79 | 7 80 | user_ctn_read 81 | user 82 
| [(4, 0.33022609167125033), (2, 0.29342928221663866), (3, 0.2417562512079529), (0, 0.06206149954177208), (1, 0.036263437681192945), (5, 0.036263437681192945)] 83 | 8 84 | user_ctn_read 85 | user 86 | [(0, 0.2161666639933777), (2, 0.1888755026521981), (3, 0.1888755026521981), (4, 0.1888755026521981), (5, 0.1888755026521981), (1, 0.028331325397829718)] 87 | 9 88 | user_ctn_read 89 | user 90 | [(0, 0.2776476675269697), (2, 0.20878901666033023), (4, 0.20878901666033023), (5, 0.20878901666033023), (1, 0.06466692999299019), (3, 0.03131835249904954)] 91 | 10 92 | user_ctn_read 93 | user 94 | [(2, 0.2776476675269697), (3, 0.20878901666033023), (4, 0.20878901666033023), (5, 0.20878901666033023), (0, 0.06466692999299019), (1, 0.03131835249904954)] 95 | -----one service finish----- 96 | 2 97 | 1 98 | catalogue_ctn_write 99 | catalogue_ctn_read 100 | catalogue 101 | [(2, 0.46200362025391745), (4, 0.32596477255599315), (3, 0.09330134296481374), (1, 0.07396936724266993), (0, 0.04476089698260565)] 102 | 2 103 | catalogue_ctn_write 104 | catalogue_ctn_read 105 | catalogue 106 | [(0, 0.2671544743143986), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 107 | 3 108 | catalogue_ctn_write 109 | catalogue_ctn_read 110 | catalogue 111 | [(0, 0.4401782017592745), (3, 0.3636580715644612), (1, 0.10480317491906602), (2, 0.045680275878599165), (4, 0.045680275878599165)] 112 | 4 113 | catalogue_ctn_write 114 | catalogue_ctn_read 115 | catalogue 116 | [(3, 0.44695191878094614), (4, 0.27553011662952515), (2, 0.14644796327583995), (1, 0.08974048381925999), (0, 0.04132951749442878)] 117 | 5 118 | catalogue_ctn_write 119 | catalogue_ctn_read 120 | catalogue 121 | 6 122 | catalogue_ctn_write 123 | catalogue_ctn_read 124 | catalogue 125 | [(2, 0.3218129573490603), (3, 0.29413254712730946), (4, 0.2618125117236331), (1, 0.08297010704145208), (0, 0.039271876758544974)] 126 | 7 127 | catalogue_ctn_write 128 | catalogue_ctn_read 129 | catalogue 130 | [(3, 0.3200914099943234), (2, 0.3056525791874785), (4, 0.26593079128654173), (0, 0.06843560083867498), (1, 0.03988961869298127)] 131 | 8 132 | catalogue_ctn_write 133 | catalogue_ctn_read 134 | catalogue 135 | [(3, 0.3472109482604896), (0, 0.2965639616632492), (2, 0.2740193000586625), (1, 0.04110289500879938), (4, 0.04110289500879938)] 136 | 9 137 | catalogue_ctn_write 138 | catalogue_ctn_read 139 | catalogue 140 | [(2, 0.35413349561556273), (3, 0.261812424760226), (4, 0.261812424760226), (1, 0.08296979114995144), (0, 0.0392718637140339)] 141 | 10 142 | catalogue_ctn_write 143 | catalogue_ctn_read 144 | catalogue 145 | [(0, 0.2671544743143986), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 146 | -----one service finish----- 147 | 3 148 | 1 149 | orders_ctn_read 150 | orders 151 | [(5, 0.2651462783597737), (2, 0.23092731484405032), (3, 0.21278166635918425), (4, 0.21278166635918425), (0, 0.04644582412392986), (1, 0.03191724995387764)] 152 | 2 153 | orders_ctn_read 154 | orders 155 | [(2, 0.27764766752696973), (3, 0.20878901666033023), (4, 0.20878901666033023), (5, 0.20878901666033023), (0, 0.06466692999299019), (1, 0.03131835249904954)] 156 | 3 157 | orders_ctn_read 158 | orders 159 | [(2, 0.2447639582325552), (4, 0.23444835617597337), (5, 0.22549148908603409), (3, 0.21154480852329588), (1, 0.05201966670364688), (0, 0.03173172127849439)] 160 | 4 161 | orders_ctn_read 162 | orders 163 | [(2, 0.37556616141696986), (3, 0.23728961125685585), (4, 0.23728961125685585), (0, 
0.07866773269226167), (1, 0.035593441688528384), (5, 0.035593441688528384)] 164 | 5 165 | orders_ctn_read 166 | orders 167 | [(4, 0.3186290480744075), (2, 0.3063998932782637), (3, 0.24234142704017264), (0, 0.059927203495104435), (1, 0.036351214056025906), (5, 0.036351214056025906)] 168 | 6 169 | orders_ctn_read 170 | orders 171 | [(2, 0.3176334281274044), (4, 0.2952223445464213), (3, 0.23728961125685588), (0, 0.07866773269226168), (1, 0.035593441688528384), (5, 0.035593441688528384)] 172 | 7 173 | orders_ctn_read 174 | orders 175 | [(2, 0.4713503280781408), (5, 0.1655928435242005), (4, 0.14148956606871843), (0, 0.11575390314813921), (1, 0.07211882845335464), (3, 0.03369453072744638)] 176 | 8 177 | orders_ctn_read 178 | orders 179 | [(2, 0.27764766752696973), (3, 0.20878901666033023), (4, 0.20878901666033023), (5, 0.20878901666033023), (0, 0.06466692999299019), (1, 0.03131835249904954)] 180 | 9 181 | orders_ctn_read 182 | orders 183 | [(2, 0.2161666639933777), (0, 0.1888755026521981), (3, 0.1888755026521981), (4, 0.1888755026521981), (5, 0.1888755026521981), (1, 0.028331325397829718)] 184 | 10 185 | orders_ctn_read 186 | orders 187 | [(2, 0.45683616331833937), (4, 0.20769150631554234), (5, 0.15477965831619617), (0, 0.09362521049602271), (1, 0.05386544090144506), (3, 0.03320202065245442)] 188 | -----one service finish----- 189 | 4 190 | 1 191 | carts_ctn_read 192 | carts 193 | 2 194 | carts_ctn_read 195 | carts 196 | [(0, 0.37270418800812), (2, 0.37270418800812), (1, 0.08881784203719967), (4, 0.08881784203719967), (3, 0.03847796995468037), (5, 0.03847796995468037)] 197 | 3 198 | carts_ctn_read 199 | carts 200 | [(0, 0.37270418800812), (2, 0.37270418800811994), (1, 0.08881784203719965), (5, 0.08881784203719965), (3, 0.03847796995468036), (4, 0.03847796995468036)] 201 | 4 202 | carts_ctn_read 203 | carts 204 | [(1, 0.25164261447469083), (2, 0.25164261447469083), (0, 0.21596294393505147), (4, 0.21596294393505147), (3, 0.03239444159025773), (5, 0.03239444159025773)] 205 | 5 206 | carts_ctn_read 207 | carts 208 | [(2, 0.2161666639933777), (0, 0.1888755026521981), (3, 0.1888755026521981), (4, 0.1888755026521981), (5, 0.1888755026521981), (1, 0.028331325397829718)] 209 | 6 210 | carts_ctn_read 211 | carts 212 | [(2, 0.3324918704132366), (1, 0.2803639022605891), (0, 0.23728961125685585), (4, 0.07866773269226168), (3, 0.035593441688528384), (5, 0.035593441688528384)] 213 | 7 214 | carts_ctn_read 215 | carts 216 | [(0, 0.33748792381028425), (4, 0.2868857708235624), (2, 0.24206150995721504), (1, 0.06094634242177383), (3, 0.036309226493582265), (5, 0.036309226493582265)] 217 | 8 218 | carts_ctn_read 219 | carts 220 | [(0, 0.37270418800812), (2, 0.37270418800812), (1, 0.08881784203719967), (4, 0.08881784203719967), (3, 0.03847796995468037), (5, 0.03847796995468037)] 221 | 9 222 | carts_ctn_read 223 | carts 224 | [(2, 0.2776476675269697), (0, 0.2087890166603302), (1, 0.2087890166603302), (3, 0.2087890166603302), (5, 0.06466692999299016), (4, 0.03131835249904953)] 225 | 10 226 | carts_ctn_read 227 | carts 228 | [(0, 0.37270418800812), (2, 0.37270418800812), (1, 0.08881784203719965), (5, 0.08881784203719965), (3, 0.03847796995468036), (4, 0.03847796995468036)] 229 | -----one service finish----- 230 | 5 231 | 1 232 | payment_ctn_write 233 | payment_ctn_read 234 | payment 235 | [(2, 0.35413349561556273), (3, 0.261812424760226), (4, 0.261812424760226), (1, 0.08296979114995144), (0, 0.0392718637140339)] 236 | 2 237 | payment_ctn_write 238 | payment_ctn_read 239 | payment 240 | [(0, 0.2671544743143986), (2, 
0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 241 | 3 242 | payment_ctn_write 243 | payment_ctn_read 244 | payment 245 | [(3, 0.613863411631167), (0, 0.18668363515409364), (1, 0.10692323292102818), (2, 0.04626486014685561), (4, 0.04626486014685561)] 246 | 4 247 | payment_ctn_write 248 | payment_ctn_read 249 | payment 250 | [(4, 0.34203456475491334), (0, 0.3017403451688254), (2, 0.2740193000586625), (1, 0.04110289500879938), (3, 0.04110289500879938)] 251 | 5 252 | payment_ctn_write 253 | payment_ctn_read 254 | payment 255 | [(0, 0.2671544743143986), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 256 | 6 257 | payment_ctn_write 258 | payment_ctn_read 259 | payment 260 | [(3, 0.4008982829442757), (0, 0.3693224123414632), (1, 0.09807116588039942), (4, 0.08793273735363404), (2, 0.04377540148022769)] 261 | 7 262 | payment_ctn_write 263 | payment_ctn_read 264 | payment 265 | [(0, 0.4401782017592745), (3, 0.3636580715644612), (1, 0.10480317491906602), (2, 0.045680275878599165), (4, 0.045680275878599165)] 266 | 8 267 | payment_ctn_write 268 | payment_ctn_read 269 | payment 270 | [(3, 0.5698367111888665), (0, 0.1975459282371802), (1, 0.12410761356216896), (2, 0.06472077163260569), (4, 0.04378897537917873)] 271 | 9 272 | payment_ctn_write 273 | payment_ctn_read 274 | payment 275 | [(0, 0.25581808806468787), (4, 0.24398575948323503), (2, 0.23264937323352425), (3, 0.23264937323352425), (1, 0.03489740598502864)] 276 | 10 277 | payment_ctn_write 278 | payment_ctn_read 279 | payment 280 | [(0, 0.2671544743143986), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 281 | -----one service finish----- 282 | 6 283 | 1 284 | shipping_ctn_write 285 | shipping_ctn_read 286 | shipping 287 | [(3, 0.3443889598225398), (0, 0.29938595010119895), (4, 0.2740193000586625), (1, 0.04110289500879938), (2, 0.04110289500879938)] 288 | 2 289 | shipping_ctn_write 290 | shipping_ctn_read 291 | shipping 292 | [(2, 0.2671544743143986), (0, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 293 | 3 294 | shipping_ctn_write 295 | shipping_ctn_read 296 | shipping 297 | [(0, 0.2539239404433908), (3, 0.24587990710453209), (2, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 298 | 4 299 | shipping_ctn_write 300 | shipping_ctn_read 301 | shipping 302 | [(0, 0.2671544743143986), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 303 | 5 304 | shipping_ctn_write 305 | shipping_ctn_read 306 | shipping 307 | [(4, 0.31702965717625464), (2, 0.30396984850967157), (0, 0.27753374027403893), (3, 0.06126555621341327), (1, 0.04020119782662163)] 308 | 6 309 | shipping_ctn_write 310 | shipping_ctn_read 311 | shipping 312 | [(3, 0.42179777224424175), (0, 0.38203850107949394), (1, 0.10480317491906602), (2, 0.04568027587859916), (4, 0.04568027587859916)] 313 | 7 314 | shipping_ctn_write 315 | shipping_ctn_read 316 | shipping 317 | [(0, 0.2539847111028863), (3, 0.24581913644503658), (2, 0.23264937323352425), (4, 0.23264937323352425), (1, 0.03489740598502864)] 318 | 8 319 | shipping_ctn_write 320 | shipping_ctn_read 321 | shipping 322 | [(4, 0.5869130037745139), (3, 0.18399579284259618), (2, 0.10681929372794159), (1, 0.077571359338134), (0, 0.04470055031681428)] 323 | 9 324 | shipping_ctn_write 325 | shipping_ctn_read 326 | shipping 327 | [(4, 0.5361581522885163), (3, 
0.215358732640642), (2, 0.1141829233307139), (1, 0.09223322896629514), (0, 0.04206696277383269)] 328 | 10 329 | shipping_ctn_write 330 | shipping_ctn_read 331 | shipping 332 | [(3, 0.34268785874689056), (0, 0.30108705117684825), (4, 0.2740193000586625), (1, 0.04110289500879938), (2, 0.04110289500879938)] 333 | -----one service finish----- 334 | 2023-03-14 04:27:27 335 | -------------------------------------------------------------------------------- /pa_result/single_service/memory_single_service_gamma75.txt: -------------------------------------------------------------------------------- 1 | 2023-03-14 03:22:30 2 | 0 3 | 1 4 | front-end_ctn_write 5 | front-end_ctn_read 6 | front-end 7 | [(2, 0.43593332440565297), (1, 0.327407487731715), (4, 0.0967866587797515), (3, 0.0964675465361559), (0, 0.04340498254672468)] 8 | 2 9 | front-end_ctn_write 10 | front-end_ctn_read 11 | front-end 12 | [(4, 0.2530373964024693), (1, 0.24676645114545356), (0, 0.23264937323352425), (3, 0.23264937323352425), (2, 0.03489740598502865)] 13 | 3 14 | front-end_ctn_write 15 | front-end_ctn_read 16 | front-end 17 | [(1, 0.5164900857974944), (4, 0.20853597315392147), (0, 0.1446484930638321), (3, 0.08914631565165089), (2, 0.0411791323331012)] 18 | 4 19 | front-end_ctn_write 20 | front-end_ctn_read 21 | front-end 22 | 5 23 | front-end_ctn_write 24 | front-end_ctn_read 25 | front-end 26 | [(2, 0.44518832684893905), (1, 0.3241504424025957), (3, 0.10460226620069116), (4, 0.08233144815074721), (0, 0.04372751639702685)] 27 | 6 28 | front-end_ctn_write 29 | front-end_ctn_read 30 | front-end 31 | [(2, 0.43583542274742104), (1, 0.3274455589217482), (4, 0.09677561924055697), (3, 0.09654161018003371), (0, 0.043401788910240026)] 32 | 7 33 | front-end_ctn_write 34 | front-end_ctn_read 35 | front-end 36 | [(1, 0.5164900857974944), (4, 0.20853597315392147), (0, 0.1446484930638321), (3, 0.08914631565165089), (2, 0.0411791323331012)] 37 | 8 38 | front-end_ctn_write 39 | front-end_ctn_read 40 | front-end 41 | [(2, 0.43599370471427873), (1, 0.32670456466940506), (3, 0.09726280318624599), (4, 0.09666821782984819), (0, 0.04337070960022198)] 42 | 9 43 | front-end_ctn_write 44 | front-end_ctn_read 45 | front-end 46 | [(2, 0.4358070573563996), (1, 0.32703012516194), (3, 0.0970909008698844), (4, 0.09669380207099815), (0, 0.043378114540778015)] 47 | 10 48 | front-end_ctn_write 49 | front-end_ctn_read 50 | front-end 51 | [(1, 0.5164900857974944), (4, 0.20853597315392147), (0, 0.14464849306383207), (3, 0.08914631565165089), (2, 0.04117913233310119)] 52 | -----one service finish----- 53 | 1 54 | 1 55 | user_ctn_read 56 | user 57 | [(4, 0.3493565722739658), (2, 0.27695567299896945), (3, 0.24289185691468326), (0, 0.057928340737976515), (1, 0.0364337785372025), (5, 0.0364337785372025)] 58 | 2 59 | user_ctn_read 60 | user 61 | [(2, 0.31549396955563563), (5, 0.27993284921249334), (3, 0.23054507797369175), (4, 0.08229347268037857), (0, 0.05715286888174701), (1, 0.03458176169605376)] 62 | 3 63 | user_ctn_read 64 | user 65 | [(2, 0.32245756954616833), (5, 0.2750493871444268), (3, 0.23132228128569673), (4, 0.07277841370744426), (0, 0.06369400612340949), (1, 0.03469834219285452)] 66 | 4 67 | user_ctn_read 68 | user 69 | [(4, 0.366657809137519), (2, 0.3625525106840934), (0, 0.10086017034219287), (5, 0.06941600969856819), (1, 0.06283564311298974), (3, 0.037677857024636746)] 70 | 5 71 | user_ctn_read 72 | user 73 | [(5, 0.4528641459360306), (3, 0.23998537646083037), (2, 0.15509195057752467), (0, 0.06513681165829582), (4, 0.050923908898194016), (1, 
0.03599780646912456)] 74 | 6 75 | user_ctn_read 76 | user 77 | [(2, 0.3132699232785554), (5, 0.2810281111810693), (3, 0.23012633824891285), (4, 0.08377100095366634), (0, 0.057285675600459204), (1, 0.034518950737336934)] 78 | 7 79 | user_ctn_read 80 | user 81 | [(4, 0.3910452027213846), (2, 0.3483955593409751), (0, 0.11357941553530357), (1, 0.07062261704591175), (3, 0.0381786026782125), (5, 0.0381786026782125)] 82 | 8 83 | user_ctn_read 84 | user 85 | [(2, 0.373234054948349), (5, 0.3081279705479703), (4, 0.12096418940190154), (0, 0.09720436397680741), (1, 0.06496310807916862), (3, 0.03550631304580322)] 86 | 9 87 | user_ctn_read 88 | user 89 | [(5, 0.5509909353557858), (4, 0.17232905076490376), (2, 0.13808737285043782), (0, 0.06434632170081869), (1, 0.03712315966402694), (3, 0.03712315966402694)] 90 | 10 91 | user_ctn_read 92 | user 93 | [(2, 0.3780840295673031), (5, 0.31505811651735055), (4, 0.12462325734557589), (0, 0.08709382723244112), (1, 0.059124276570618965), (3, 0.0360164927667105)] 94 | -----one service finish----- 95 | 2 96 | 1 97 | catalogue_ctn_write 98 | catalogue_ctn_read 99 | catalogue 100 | [(3, 0.5243584449097527), (4, 0.2116803238839984), (0, 0.1453790947023043), (1, 0.07706951102136636), (2, 0.041512625482578266)] 101 | 2 102 | catalogue_ctn_write 103 | catalogue_ctn_read 104 | catalogue 105 | [(3, 0.4310859502021189), (0, 0.35846061603328255), (1, 0.09276280033250453), (2, 0.07283996253104333), (4, 0.044850670901050764)] 106 | 3 107 | catalogue_ctn_write 108 | catalogue_ctn_read 109 | catalogue 110 | [(4, 0.5164900857974944), (3, 0.2085359731539215), (2, 0.14464849306383207), (1, 0.08914631565165088), (0, 0.04117913233310119)] 111 | 4 112 | catalogue_ctn_write 113 | catalogue_ctn_read 114 | catalogue 115 | [(3, 0.626330917700737), (0, 0.18748748236255217), (1, 0.09210351900681714), (2, 0.04703904046494691), (4, 0.04703904046494691)] 116 | 5 117 | catalogue_ctn_write 118 | catalogue_ctn_read 119 | catalogue 120 | [(2, 0.4167165406866054), (4, 0.3734704296115094), (3, 0.09085378747644822), (1, 0.07407203935357204), (0, 0.04488720287186494)] 121 | 6 122 | catalogue_ctn_write 123 | catalogue_ctn_read 124 | catalogue 125 | [(4, 0.5164900857974944), (3, 0.20853597315392142), (2, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 126 | 7 127 | catalogue_ctn_write 128 | catalogue_ctn_read 129 | catalogue 130 | [(3, 0.4236292470575037), (2, 0.40250223515232375), (0, 0.07979347160391298), (1, 0.04703752309312984), (4, 0.04703752309312984)] 131 | 8 132 | catalogue_ctn_write 133 | catalogue_ctn_read 134 | catalogue 135 | [(4, 0.5164900857974944), (3, 0.20853597315392142), (2, 0.14464849306383204), (1, 0.08914631565165088), (0, 0.04117913233310119)] 136 | 9 137 | catalogue_ctn_write 138 | catalogue_ctn_read 139 | catalogue 140 | [(2, 0.40730491694061555), (4, 0.36334056117071034), (3, 0.11320287524062089), (0, 0.07235316343843794), (1, 0.04379848320961526)] 141 | 10 142 | catalogue_ctn_write 143 | catalogue_ctn_read 144 | catalogue 145 | [(2, 0.4152155070386622), (4, 0.37356098553316797), (3, 0.08632783418296694), (1, 0.0800888574528634), (0, 0.04480681579233959)] 146 | -----one service finish----- 147 | 3 148 | 1 149 | orders_ctn_read 150 | orders 151 | [(4, 0.3437744467904903), (2, 0.2805258608489612), (3, 0.24203050169526308), (0, 0.06106004015670651), (1, 0.03630457525428947), (5, 0.03630457525428947)] 152 | 2 153 | orders_ctn_read 154 | orders 155 | [(2, 0.3114530178022429), (4, 0.2834198934163577), (3, 0.23033938599937354), (5, 0.07842935523624966), (0, 
0.06180743964587025), (1, 0.03455090789990604)] 156 | 3 157 | orders_ctn_read 158 | orders 159 | [(4, 0.3116219111820942), (2, 0.30324476622162166), (3, 0.23810391563147243), (0, 0.05916730379805618), (5, 0.05214651582203469), (1, 0.03571558734472086)] 160 | 4 161 | orders_ctn_read 162 | orders 163 | [(5, 0.4249421706487247), (2, 0.32649283747947305), (0, 0.10523618041641428), (1, 0.06576322973527139), (3, 0.03878279086005831), (4, 0.03878279086005831)] 164 | 5 165 | orders_ctn_read 166 | orders 167 | [(4, 0.33385817280241137), (2, 0.2789975998714143), (3, 0.23728961125685585), (1, 0.06317249297273264), (5, 0.05108868140805741), (0, 0.035593441688528384)] 168 | 6 169 | orders_ctn_read 170 | orders 171 | [(4, 0.5113442657909334), (5, 0.16845126452056833), (0, 0.16461807121058816), (1, 0.08475678830577767), (2, 0.03541480508606618), (3, 0.03541480508606618)] 172 | 7 173 | orders_ctn_read 174 | orders 175 | [(4, 0.6006722778376464), (0, 0.13637421034972133), (1, 0.09769804537373787), (5, 0.08603963308621115), (2, 0.03960791667634155), (3, 0.03960791667634155)] 176 | 8 177 | orders_ctn_read 178 | orders 179 | [(4, 0.6148540112856042), (0, 0.11386184774236004), (5, 0.1016680845672094), (1, 0.0888447574529137), (2, 0.04038564947595633), (3, 0.04038564947595633)] 180 | 9 181 | orders_ctn_read 182 | orders 183 | [(2, 0.3207813284570688), (4, 0.29207444421675693), (3, 0.23728961125685585), (1, 0.07866773269226167), (0, 0.035593441688528384), (5, 0.035593441688528384)] 184 | 10 185 | orders_ctn_read 186 | orders 187 | [(4, 0.44754489265560865), (3, 0.23785180569388867), (2, 0.13095724484249682), (5, 0.08501454884419543), (1, 0.06295373710972715), (0, 0.03567777085408331)] 188 | -----one service finish----- 189 | 4 190 | 1 191 | carts_ctn_read 192 | carts 193 | [(3, 0.47892569212384467), (5, 0.21715629835602868), (2, 0.1201415136240084), (4, 0.07618977205705765), (1, 0.07350911010036389), (0, 0.034077613738696604)] 194 | 2 195 | carts_ctn_read 196 | carts 197 | [(0, 0.5705161756696883), (1, 0.22806749678241875), (4, 0.08723713746144997), (2, 0.038059730028814326), (3, 0.038059730028814326), (5, 0.038059730028814326)] 198 | 3 199 | carts_ctn_read 200 | carts 201 | [(2, 0.44056012504461173), (4, 0.20111221810401686), (0, 0.15042659248178392), (5, 0.1072244683298822), (1, 0.06818465499489275), (3, 0.03249194104481256)] 202 | 4 203 | carts_ctn_read 204 | carts 205 | [(2, 0.5247933967211246), (4, 0.19849121748787177), (0, 0.09782579783433215), (5, 0.07544303592378522), (1, 0.06748463425970756), (3, 0.03596191777317869)] 206 | 5 207 | carts_ctn_read 208 | carts 209 | [(0, 0.570261083999306), (1, 0.22840460831372178), (4, 0.08719219693478457), (2, 0.03804737025072926), (3, 0.03804737025072926), (5, 0.03804737025072926)] 210 | 6 211 | carts_ctn_read 212 | carts 213 | [(0, 0.5705186013367683), (1, 0.2280642907991216), (4, 0.08723756503358138), (2, 0.038059847610176264), (3, 0.038059847610176264), (5, 0.038059847610176264)] 214 | 7 215 | carts_ctn_read 216 | carts 217 | [(0, 0.5834035906239646), (1, 0.2308634005916724), (4, 0.06953345042529191), (2, 0.03873318611969035), (3, 0.03873318611969035), (5, 0.03873318611969035)] 218 | 8 219 | carts_ctn_read 220 | carts 221 | [(2, 0.5501279545475827), (4, 0.15755203108249546), (5, 0.13430819807474445), (1, 0.07044732847393699), (0, 0.05049009243355285), (3, 0.03707439538768751)] 222 | 9 223 | carts_ctn_read 224 | carts 225 | [(2, 0.5514864703616372), (4, 0.15115772795944013), (5, 0.1390078517123544), (1, 0.07062672832844906), (0, 0.050585087220304616), (3, 
0.03713613441781474)] 226 | 10 227 | carts_ctn_read 228 | carts 229 | [(0, 0.5838729903519467), (1, 0.23068461629799242), (4, 0.06917035574172081), (2, 0.038757345869446694), (3, 0.038757345869446694), (5, 0.038757345869446694)] 230 | -----one service finish----- 231 | 5 232 | 1 233 | payment_ctn_write 234 | payment_ctn_read 235 | payment 236 | [(4, 0.34531213603812727), (0, 0.2984627738856115), (2, 0.2740193000586625), (1, 0.04110289500879938), (3, 0.04110289500879938)] 237 | 2 238 | payment_ctn_write 239 | payment_ctn_read 240 | payment 241 | [(3, 0.6089604823821166), (0, 0.19259125824357315), (2, 0.08388092001648675), (4, 0.06858354315833741), (1, 0.045983796199486074)] 242 | 3 243 | payment_ctn_write 244 | payment_ctn_read 245 | payment 246 | [(0, 0.3248182482452184), (4, 0.3056401841444445), (2, 0.2679786287597896), (3, 0.0613661445365791), (1, 0.04019679431396845)] 247 | 4 248 | payment_ctn_write 249 | payment_ctn_read 250 | payment 251 | [(2, 0.3979855663626718), (4, 0.3803777618468116), (3, 0.10827582312588582), (0, 0.06913874621730697), (1, 0.04422210244732385)] 252 | 5 253 | payment_ctn_write 254 | payment_ctn_read 255 | payment 256 | [(4, 0.4318438844719544), (0, 0.3598927792820103), (1, 0.09158725549627687), (2, 0.07170022893400785), (3, 0.04497585181575059)] 257 | 6 258 | payment_ctn_write 259 | payment_ctn_read 260 | payment 261 | [(3, 0.4220378048369303), (2, 0.3709174948243539), (4, 0.0909349849069731), (0, 0.0710639021047799), (1, 0.04504581332696284)] 262 | 7 263 | payment_ctn_write 264 | payment_ctn_read 265 | payment 266 | [(3, 0.5777906342648721), (0, 0.2040342231415822), (1, 0.11638928480514624), (2, 0.057590631187132826), (4, 0.04419522660126671)] 267 | 8 268 | payment_ctn_write 269 | payment_ctn_read 270 | payment 271 | [(3, 0.4309611575377728), (2, 0.3671393795050009), (4, 0.07992465681868313), (0, 0.07663121657302994), (1, 0.04534358956551322)] 272 | 9 273 | payment_ctn_write 274 | payment_ctn_read 275 | payment 276 | [(2, 0.32455300084366695), (4, 0.30575635242175125), (0, 0.2679131736288291), (3, 0.06159049706142843), (1, 0.04018697604432437)] 277 | 10 278 | payment_ctn_write 279 | payment_ctn_read 280 | payment 281 | [(0, 0.39346630643430586), (4, 0.3789468414444357), (1, 0.12138607084148006), (2, 0.06230599641902398), (3, 0.04389478486075449)] 282 | -----one service finish----- 283 | 6 284 | 1 285 | shipping_ctn_write 286 | shipping_ctn_read 287 | shipping 288 | [(0, 0.4241031438922821), (3, 0.4164958712023578), (2, 0.06347633226776801), (1, 0.047962326318796114), (4, 0.047962326318796114)] 289 | 2 290 | shipping_ctn_write 291 | shipping_ctn_read 292 | shipping 293 | [(3, 0.32743408424181036), (4, 0.2862612723513288), (0, 0.2853677067636014), (1, 0.060711499883178505), (2, 0.04022543676008101)] 294 | 3 295 | shipping_ctn_write 296 | shipping_ctn_read 297 | shipping 298 | [(2, 0.4254384980899788), (4, 0.35659422540865504), (3, 0.09412087855439616), (1, 0.07941996100232371), (0, 0.044426436944646396)] 299 | 4 300 | shipping_ctn_write 301 | shipping_ctn_read 302 | shipping 303 | [(4, 0.6369579769511625), (3, 0.16541571113340378), (2, 0.09081944744475724), (1, 0.05908172302796055), (0, 0.047725141442715856)] 304 | 5 305 | shipping_ctn_write 306 | shipping_ctn_read 307 | shipping 308 | [(0, 0.4240588387155112), (3, 0.4168794876528121), (2, 0.06309276437259806), (1, 0.04798445462953935), (4, 0.04798445462953935)] 309 | 6 310 | shipping_ctn_write 311 | shipping_ctn_read 312 | shipping 313 | [(0, 0.4258873826889018), (4, 0.3679679991602787), (2, 0.08975888033180132), 
(3, 0.07128812262179428), (1, 0.04509761519722377)] 314 | 7 315 | shipping_ctn_write 316 | shipping_ctn_read 317 | shipping 318 | [(2, 0.40475756213420044), (4, 0.38186841696849805), (3, 0.11425738779550479), (0, 0.054431719648954956), (1, 0.04468491345284182)] 319 | 8 320 | shipping_ctn_write 321 | shipping_ctn_read 322 | shipping 323 | [(0, 0.42355115575099356), (3, 0.4168308272723844), (2, 0.06372165082553981), (1, 0.04794818307554115), (4, 0.04794818307554115)] 324 | 9 325 | shipping_ctn_write 326 | shipping_ctn_read 327 | shipping 328 | [(3, 0.41180729274647304), (0, 0.3920289805772627), (1, 0.10480317491906602), (2, 0.045680275878599165), (4, 0.045680275878599165)] 329 | 10 330 | shipping_ctn_write 331 | shipping_ctn_read 332 | shipping 333 | [(2, 0.40252946747148355), (4, 0.3800312502948642), (3, 0.09406702335803356), (1, 0.07891627299314323), (0, 0.044455985882475624)] 334 | -----one service finish----- 335 | 2023-03-14 03:58:16 336 | -------------------------------------------------------------------------------- /pa_result/single_service/network_single_service_eta100.txt: -------------------------------------------------------------------------------- 1 | 2023-03-14 05:28:24 2 | 0 3 | 1 4 | front-end_ctn_write 5 | front-end_ctn_read 6 | front-end 7 | [(1, 0.3218874549618694), (2, 0.3218874549618694), (3, 0.2740193000586625), (0, 0.04110289500879938), (4, 0.04110289500879938)] 8 | 2 9 | front-end_ctn_write 10 | front-end_ctn_read 11 | front-end 12 | [(0, 0.49706309772887675), (1, 0.356474570719349), (2, 0.048820777183924725), (3, 0.048820777183924725), (4, 0.048820777183924725)] 13 | 3 14 | front-end_ctn_write 15 | front-end_ctn_read 16 | front-end 17 | [(4, 0.2671544743143986), (0, 0.23264937323352425), (1, 0.23264937323352425), (2, 0.23264937323352425), (3, 0.03489740598502865)] 18 | 4 19 | front-end_ctn_write 20 | front-end_ctn_read 21 | front-end 22 | [(0, 0.4519605504444568), (1, 0.3732623198689283), (3, 0.08081588352022145), (2, 0.04698062308319679), (4, 0.04698062308319679)] 23 | 5 24 | front-end_ctn_write 25 | front-end_ctn_read 26 | front-end 27 | [(4, 0.2671544743143986), (0, 0.23264937323352425), (1, 0.23264937323352425), (2, 0.23264937323352425), (3, 0.03489740598502865)] 28 | 6 29 | front-end_ctn_write 30 | front-end_ctn_read 31 | front-end 32 | [(0, 0.36975502727891413), (2, 0.2740195533569939), (4, 0.2740195533569939), (1, 0.041102933003549086), (3, 0.041102933003549086)] 33 | 7 34 | front-end_ctn_write 35 | front-end_ctn_read 36 | front-end 37 | [(0, 0.4296643460200568), (1, 0.39756485029709804), (4, 0.07855790828153456), (2, 0.047106447700655314), (3, 0.047106447700655314)] 38 | 8 39 | front-end_ctn_write 40 | front-end_ctn_read 41 | front-end 42 | 9 43 | front-end_ctn_write 44 | front-end_ctn_read 45 | front-end 46 | [(1, 0.35413349561556273), (3, 0.261812424760226), (4, 0.261812424760226), (0, 0.08296979114995144), (2, 0.0392718637140339)] 47 | 10 48 | front-end_ctn_write 49 | front-end_ctn_read 50 | front-end 51 | [(0, 0.36975502727891413), (2, 0.2740195533569939), (4, 0.2740195533569939), (1, 0.041102933003549086), (3, 0.041102933003549086)] 52 | -----one service finish----- 53 | 1 54 | 1 55 | user_ctn_mem 56 | user_ctn_read 57 | user 58 | [(0, 0.4443304173442074), (2, 0.35700276540530057), (1, 0.0840922735049128), (4, 0.06904177319785515), (3, 0.045532770547724054)] 59 | 2 60 | user_ctn_mem 61 | user_ctn_read 62 | user 63 | [(2, 0.5164900857974946), (1, 0.20853597315392136), (4, 0.144648493063832), (3, 0.08914631565165086), (0, 0.04117913233310119)] 64 | 3 65 
| user_ctn_mem 66 | user_ctn_read 67 | user 68 | [(0, 0.46888903405188), (2, 0.36441656510050396), (4, 0.07171112820567865), (1, 0.04749163632096875), (3, 0.04749163632096875)] 69 | 4 70 | user_ctn_mem 71 | user_ctn_read 72 | user 73 | [(3, 0.25572688791957077), (2, 0.24407695962835213), (1, 0.23264937323352428), (4, 0.23264937323352428), (0, 0.03489740598502865)] 74 | 5 75 | user_ctn_mem 76 | user_ctn_read 77 | user 78 | [(3, 0.2671544743143986), (1, 0.23264937323352425), (2, 0.23264937323352425), (4, 0.23264937323352425), (0, 0.03489740598502864)] 79 | 6 80 | user_ctn_mem 81 | user_ctn_read 82 | user 83 | [(1, 0.5058366043915514), (2, 0.3088883474324119), (3, 0.06980556522475695), (4, 0.06913623083641796), (0, 0.04633325211486179)] 84 | 7 85 | user_ctn_mem 86 | user_ctn_read 87 | user 88 | [(3, 0.2671544743143986), (1, 0.23264937323352425), (2, 0.23264937323352425), (4, 0.23264937323352425), (0, 0.03489740598502864)] 89 | 8 90 | user_ctn_mem 91 | user_ctn_read 92 | user 93 | [(1, 0.4993011007997413), (2, 0.3045351725239943), (3, 0.07537881113353292), (4, 0.07510463966413225), (0, 0.04568027587859916)] 94 | 9 95 | user_ctn_mem 96 | user_ctn_read 97 | user 98 | [(0, 0.44650711279430955), (2, 0.35614209393370905), (1, 0.08204480034079709), (4, 0.06969579583959183), (3, 0.045610197091592565)] 99 | 10 100 | user_ctn_mem 101 | user_ctn_read 102 | user 103 | [(2, 0.2671544743143986), (1, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (0, 0.03489740598502864)] 104 | -----one service finish----- 105 | 2 106 | 1 107 | catalogue_ctn_write 108 | catalogue_ctn_read 109 | catalogue 110 | [(2, 0.6138634116311671), (4, 0.18668363515409364), (0, 0.10692323292102819), (1, 0.046264860146855616), (3, 0.046264860146855616)] 111 | 2 112 | catalogue_ctn_write 113 | catalogue_ctn_read 114 | catalogue 115 | [(3, 0.583387939525771), (1, 0.2271982690783448), (0, 0.10046691727657621), (2, 0.044473437059653995), (4, 0.044473437059653995)] 116 | 3 117 | catalogue_ctn_write 118 | catalogue_ctn_read 119 | catalogue 120 | [(3, 0.44695191878094614), (4, 0.27553011662952515), (1, 0.14644796327584), (2, 0.08974048381925999), (0, 0.04132951749442878)] 121 | 4 122 | catalogue_ctn_write 123 | catalogue_ctn_read 124 | catalogue 125 | [(1, 0.4993011007997414), (3, 0.30453517252399437), (0, 0.10480317491906602), (2, 0.045680275878599165), (4, 0.045680275878599165)] 126 | 5 127 | catalogue_ctn_write 128 | catalogue_ctn_read 129 | catalogue 130 | [(3, 0.5721927678549036), (1, 0.22485811791642682), (0, 0.11529032180689729), (2, 0.04382939621088616), (4, 0.04382939621088616)] 131 | 6 132 | catalogue_ctn_write 133 | catalogue_ctn_read 134 | catalogue 135 | [(2, 0.6519163831518229), (4, 0.16582600835991876), (0, 0.08484132846479421), (1, 0.048708140011732036), (3, 0.048708140011732036)] 136 | 7 137 | catalogue_ctn_write 138 | catalogue_ctn_read 139 | catalogue 140 | [(2, 0.614866190937218), (4, 0.15244752535196757), (3, 0.10556525653062054), (0, 0.08077416835489229), (1, 0.04634685882530173)] 141 | 8 142 | catalogue_ctn_write 143 | catalogue_ctn_read 144 | catalogue 145 | [(0, 0.4819021480281392), (2, 0.344233925662358), (4, 0.07978821304718242), (1, 0.047037856631160224), (3, 0.047037856631160224)] 146 | 9 147 | catalogue_ctn_write 148 | catalogue_ctn_read 149 | catalogue 150 | [(0, 0.6864552439080985), (2, 0.16010578329854483), (1, 0.0511463242644522), (3, 0.0511463242644522), (4, 0.0511463242644522)] 151 | 10 152 | catalogue_ctn_write 153 | catalogue_ctn_read 154 | catalogue 155 | [(2, 0.651110272339743), 
(4, 0.11242682775096902), (3, 0.10250640156607882), (0, 0.0853026573334825), (1, 0.04865384100972667)] 156 | -----one service finish----- 157 | 3 158 | 1 159 | orders_ctn_read 160 | orders 161 | [(1, 0.5851807786375871), (0, 0.2085497816790593), (5, 0.08990301632824217), (2, 0.03878880778503709), (3, 0.03878880778503709), (4, 0.03878880778503709)] 162 | 2 163 | orders_ctn_read 164 | orders 165 | [(2, 0.46420036765160866), (5, 0.20288455672359915), (4, 0.1496553964772206), (1, 0.10302443780975831), (0, 0.04672834936671527), (3, 0.03350689197109804)] 166 | 3 167 | orders_ctn_read 168 | orders 169 | [(5, 0.41331837780048736), (2, 0.23635077797902312), (4, 0.15953734785886806), (1, 0.10788174433741242), (0, 0.04870133107087125), (3, 0.03421042095333783)] 170 | 4 171 | orders_ctn_read 172 | orders 173 | [(5, 0.522466912609427), (0, 0.21798333956817406), (1, 0.12864849014987942), (2, 0.059180952526484185), (3, 0.03586015257301766), (4, 0.03586015257301766)] 174 | 5 175 | orders_ctn_read 176 | orders 177 | [(3, 0.36367302194417367), (2, 0.352827543489645), (0, 0.15200887703723817), (5, 0.05733882088379005), (1, 0.03707586832257658), (4, 0.03707586832257658)] 178 | 6 179 | orders_ctn_read 180 | orders 181 | [(1, 0.25381198199318683), (2, 0.2415463092275777), (3, 0.21247551706141743), (4, 0.21247551706141743), (0, 0.04781934709718803), (5, 0.03187132755921262)] 182 | 7 183 | orders_ctn_read 184 | orders 185 | [(5, 0.4130835271563209), (2, 0.23632265058605287), (4, 0.15943785090171184), (1, 0.10784296771726264), (0, 0.049112814804548835), (3, 0.03420018883410298)] 186 | 8 187 | orders_ctn_read 188 | orders 189 | [(1, 0.2161666639933778), (0, 0.18887550265219816), (2, 0.18887550265219816), (3, 0.18887550265219816), (4, 0.18887550265219816), (5, 0.02833132539782973)] 190 | 9 191 | orders_ctn_read 192 | orders 193 | [(2, 0.4523622044525494), (5, 0.20452466690092755), (4, 0.1536169648837772), (1, 0.10851725446474325), (0, 0.04791348877145666), (3, 0.033065420526546034)] 194 | 10 195 | orders_ctn_read 196 | orders 197 | [(1, 0.4284084767299287), (2, 0.2417387580962365), (4, 0.1486293230741947), (5, 0.09544957919275573), (0, 0.0507356504964864), (3, 0.0350382124103979)] 198 | -----one service finish----- 199 | 4 200 | 1 201 | carts_ctn_mem 202 | carts_ctn_read 203 | carts 204 | [(0, 0.4623732899739073), (2, 0.3657684499215278), (1, 0.07753046362981297), (3, 0.047163898237376), (4, 0.047163898237376)] 205 | 2 206 | carts_ctn_mem 207 | carts_ctn_read 208 | carts 209 | [(0, 0.5280658205553943), (2, 0.32547184789283146), (1, 0.04882077718392473), (3, 0.04882077718392473), (4, 0.04882077718392473)] 210 | 3 211 | carts_ctn_mem 212 | carts_ctn_read 213 | carts 214 | [(2, 0.2671544743143986), (0, 0.23264937323352425), (1, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.03489740598502864)] 215 | 4 216 | carts_ctn_mem 217 | carts_ctn_read 218 | carts 219 | [(2, 0.5164900857974944), (4, 0.20853597315392144), (3, 0.14464849306383204), (1, 0.08914631565165086), (0, 0.04117913233310119)] 220 | 5 221 | carts_ctn_mem 222 | carts_ctn_read 223 | carts 224 | 6 225 | carts_ctn_mem 226 | carts_ctn_read 227 | carts 228 | [(2, 0.35413349561556273), (0, 0.261812424760226), (3, 0.261812424760226), (1, 0.08296979114995144), (4, 0.0392718637140339)] 229 | 7 230 | carts_ctn_mem 231 | carts_ctn_read 232 | carts 233 | [(2, 0.5164900857974944), (4, 0.20853597315392142), (3, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 234 | 8 235 | carts_ctn_mem 236 | carts_ctn_read 237 | carts 238 | [(2, 
0.5164900857974944), (4, 0.20853597315392142), (3, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 239 | 9 240 | carts_ctn_mem 241 | carts_ctn_read 242 | carts 243 | [(0, 0.32140559129397656), (2, 0.306737730180702), (3, 0.26696733376270504), (1, 0.06484424469821055), (4, 0.040045100064405766)] 244 | 10 245 | carts_ctn_mem 246 | carts_ctn_read 247 | carts 248 | [(2, 0.5164900857974944), (4, 0.20853597315392144), (3, 0.14464849306383204), (1, 0.08914631565165088), (0, 0.04117913233310119)] 249 | -----one service finish----- 250 | 5 251 | 1 252 | payment_ctn_mem 253 | payment_ctn_write 254 | payment_ctn_read 255 | payment 256 | [(3, 0.49504836053941226), (0, 0.3565700847910523), (2, 0.09489604195087767), (1, 0.053485512718657854)] 257 | 2 258 | payment_ctn_mem 259 | payment_ctn_write 260 | payment_ctn_read 261 | payment 262 | [(0, 0.4474234880921389), (3, 0.4135229779862842), (2, 0.08495612501908376), (1, 0.054097408902493135)] 263 | 3 264 | payment_ctn_mem 265 | payment_ctn_write 266 | payment_ctn_read 267 | payment 268 | [(3, 0.48447504385448875), (0, 0.3488228137534127), (2, 0.11437872032908668), (1, 0.05232342206301191)] 269 | 4 270 | payment_ctn_mem 271 | payment_ctn_write 272 | payment_ctn_read 273 | payment 274 | [(3, 0.692302036580213), (2, 0.1875095111996546), (0, 0.0600942261100662), (1, 0.0600942261100662)] 275 | 5 276 | payment_ctn_mem 277 | payment_ctn_write 278 | payment_ctn_read 279 | payment 280 | 6 281 | payment_ctn_mem 282 | payment_ctn_write 283 | payment_ctn_read 284 | payment 285 | [(3, 0.6704179631556049), (2, 0.21285294798994395), (0, 0.058364544427225655), (1, 0.058364544427225655)] 286 | 7 287 | payment_ctn_mem 288 | payment_ctn_write 289 | payment_ctn_read 290 | payment 291 | [(0, 0.48094835757728877), (3, 0.40713302732044454), (1, 0.0559593075511334), (2, 0.0559593075511334)] 292 | 8 293 | payment_ctn_mem 294 | payment_ctn_write 295 | payment_ctn_read 296 | payment 297 | [(3, 0.3493408776261355), (0, 0.3026321499413323), (1, 0.3026321499413323), (2, 0.045394822491199846)] 298 | 9 299 | payment_ctn_mem 300 | payment_ctn_write 301 | payment_ctn_read 302 | payment 303 | [(3, 0.6704179631556049), (2, 0.21285294798994395), (0, 0.058364544427225655), (1, 0.058364544427225655)] 304 | 10 305 | payment_ctn_mem 306 | payment_ctn_write 307 | payment_ctn_read 308 | payment 309 | -----one service finish----- 310 | 6 311 | 1 312 | shipping_ctn_mem 313 | shipping_ctn_write 314 | shipping_ctn_read 315 | shipping 316 | [(1, 0.4844750438544888), (3, 0.3488228137534127), (0, 0.11437872032908669), (2, 0.05232342206301191)] 317 | 2 318 | shipping_ctn_mem 319 | shipping_ctn_write 320 | shipping_ctn_read 321 | shipping 322 | 3 323 | shipping_ctn_mem 324 | shipping_ctn_write 325 | shipping_ctn_read 326 | shipping 327 | [(1, 0.3493408776261355), (2, 0.30263214994133236), (3, 0.30263214994133236), (0, 0.04539482249119986)] 328 | 4 329 | shipping_ctn_mem 330 | shipping_ctn_write 331 | shipping_ctn_read 332 | shipping 333 | [(1, 0.6178445252575919), (0, 0.20564434754342936), (2, 0.12198159569902098), (3, 0.054529531499957644)] 334 | 5 335 | shipping_ctn_mem 336 | shipping_ctn_write 337 | shipping_ctn_read 338 | shipping 339 | 6 340 | shipping_ctn_mem 341 | shipping_ctn_write 342 | shipping_ctn_read 343 | shipping 344 | [(0, 0.3493408776261355), (2, 0.30263214994133236), (3, 0.30263214994133236), (1, 0.04539482249119986)] 345 | 7 346 | shipping_ctn_mem 347 | shipping_ctn_write 348 | shipping_ctn_read 349 | shipping 350 | 8 351 | shipping_ctn_mem 352 | 
shipping_ctn_write 353 | shipping_ctn_read 354 | shipping 355 | [(0, 0.3493408776261355), (2, 0.30263214994133236), (3, 0.30263214994133236), (1, 0.04539482249119986)] 356 | 9 357 | shipping_ctn_mem 358 | shipping_ctn_write 359 | shipping_ctn_read 360 | shipping 361 | 10 362 | shipping_ctn_mem 363 | shipping_ctn_write 364 | shipping_ctn_read 365 | shipping 366 | [(0, 0.4440406924488666), (2, 0.4440406924488666), (1, 0.0559593075511334), (3, 0.0559593075511334)] 367 | -----one service finish----- 368 | 2023-03-14 05:44:07 369 | -------------------------------------------------------------------------------- /pa_result/single_service/network_single_service_eta1000.txt: -------------------------------------------------------------------------------- 1 | 2023-03-14 05:44:07 2 | 0 3 | 1 4 | front-end_ctn_write 5 | front-end_ctn_read 6 | front-end 7 | [(0, 0.36975502727891413), (2, 0.2740195533569939), (4, 0.2740195533569939), (1, 0.041102933003549086), (3, 0.041102933003549086)] 8 | 2 9 | front-end_ctn_write 10 | front-end_ctn_read 11 | front-end 12 | [(1, 0.35413349561556273), (3, 0.261812424760226), (4, 0.261812424760226), (0, 0.08296979114995144), (2, 0.0392718637140339)] 13 | 3 14 | front-end_ctn_write 15 | front-end_ctn_read 16 | front-end 17 | [(0, 0.7679786244215081), (1, 0.058005343894623), (2, 0.058005343894623), (3, 0.058005343894623), (4, 0.058005343894623)] 18 | 4 19 | front-end_ctn_write 20 | front-end_ctn_read 21 | front-end 22 | [(1, 0.34559750437261433), (4, 0.27809011834774316), (3, 0.26505078437140833), (0, 0.07150397525252292), (2, 0.03975761765571126)] 23 | 5 24 | front-end_ctn_write 25 | front-end_ctn_read 26 | front-end 27 | [(1, 0.35413349561556273), (3, 0.261812424760226), (4, 0.261812424760226), (0, 0.08296979114995144), (2, 0.0392718637140339)] 28 | 6 29 | front-end_ctn_write 30 | front-end_ctn_read 31 | front-end 32 | [(1, 0.35413349561556273), (3, 0.261812424760226), (4, 0.261812424760226), (0, 0.08296979114995144), (2, 0.0392718637140339)] 33 | 7 34 | front-end_ctn_write 35 | front-end_ctn_read 36 | front-end 37 | [(1, 0.42586684834892935), (4, 0.3938832243319475), (0, 0.08696808867102904), (2, 0.04664091932404698), (3, 0.04664091932404698)] 38 | 8 39 | front-end_ctn_write 40 | front-end_ctn_read 41 | front-end 42 | [(2, 0.6484300690388577), (4, 0.1395786209162655), (1, 0.11504303251119571), (0, 0.0484741387668406), (3, 0.0484741387668406)] 43 | 9 44 | front-end_ctn_write 45 | front-end_ctn_read 46 | front-end 47 | [(1, 0.35413349561556273), (3, 0.261812424760226), (4, 0.261812424760226), (0, 0.08296979114995144), (2, 0.0392718637140339)] 48 | 10 49 | front-end_ctn_write 50 | front-end_ctn_read 51 | front-end 52 | [(0, 0.2671544743143986), (1, 0.23264937323352425), (2, 0.23264937323352425), (4, 0.23264937323352425), (3, 0.03489740598502864)] 53 | -----one service finish----- 54 | 1 55 | 1 56 | user_ctn_mem 57 | user_ctn_read 58 | user 59 | [(2, 0.256596870868619), (3, 0.2432069766793039), (1, 0.23264937323352425), (4, 0.23264937323352425), (0, 0.03489740598502864)] 60 | 2 61 | user_ctn_mem 62 | user_ctn_read 63 | user 64 | [(2, 0.3218874549618694), (3, 0.3218874549618694), (4, 0.2740193000586625), (0, 0.04110289500879938), (1, 0.04110289500879938)] 65 | 3 66 | user_ctn_mem 67 | user_ctn_read 68 | user 69 | [(1, 0.39968215925886397), (2, 0.3836343504308451), (0, 0.1276859834377792), (3, 0.04449875343625588), (4, 0.04449875343625588)] 70 | 4 71 | user_ctn_mem 72 | user_ctn_read 73 | user 74 | 5 75 | user_ctn_mem 76 | user_ctn_read 77 | user 78 | [(4, 
0.35413349561556273), (1, 0.261812424760226), (2, 0.261812424760226), (3, 0.08296979114995144), (0, 0.0392718637140339)] 79 | 6 80 | user_ctn_mem 81 | user_ctn_read 82 | user 83 | 7 84 | user_ctn_mem 85 | user_ctn_read 86 | user 87 | [(0, 0.2671544743143986), (1, 0.23264937323352425), (2, 0.23264937323352425), (4, 0.23264937323352425), (3, 0.03489740598502864)] 88 | 8 89 | user_ctn_mem 90 | user_ctn_read 91 | user 92 | 9 93 | user_ctn_mem 94 | user_ctn_read 95 | user 96 | [(0, 0.2671544743143986), (1, 0.23264937323352425), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.03489740598502864)] 97 | 10 98 | user_ctn_mem 99 | user_ctn_read 100 | user 101 | [(1, 0.40399508284201624), (2, 0.3665731517297255), (0, 0.11248118414516561), (4, 0.07315630311537648), (3, 0.04379427816771636)] 102 | -----one service finish----- 103 | 2 104 | 1 105 | catalogue_ctn_write 106 | catalogue_ctn_read 107 | catalogue 108 | 2 109 | catalogue_ctn_write 110 | catalogue_ctn_read 111 | catalogue 112 | [(2, 0.6138634116311671), (4, 0.18668363515409359), (0, 0.10692323292102816), (1, 0.0462648601468556), (3, 0.0462648601468556)] 113 | 3 114 | catalogue_ctn_write 115 | catalogue_ctn_read 116 | catalogue 117 | [(0, 0.6986594486686147), (2, 0.14513492000987374), (1, 0.05206854377383721), (3, 0.05206854377383721), (4, 0.05206854377383721)] 118 | 4 119 | catalogue_ctn_write 120 | catalogue_ctn_read 121 | catalogue 122 | [(3, 0.35413349561556273), (0, 0.261812424760226), (4, 0.261812424760226), (1, 0.08296979114995144), (2, 0.0392718637140339)] 123 | 5 124 | catalogue_ctn_write 125 | catalogue_ctn_read 126 | catalogue 127 | [(2, 0.5492569603282005), (1, 0.2199037332251837), (0, 0.14565537068983725), (3, 0.0425919678783893), (4, 0.0425919678783893)] 128 | 6 129 | catalogue_ctn_write 130 | catalogue_ctn_read 131 | catalogue 132 | [(0, 0.4605344963345401), (2, 0.39300317211368574), (1, 0.04882077718392473), (3, 0.04882077718392473), (4, 0.04882077718392473)] 133 | 7 134 | catalogue_ctn_write 135 | catalogue_ctn_read 136 | catalogue 137 | [(2, 0.5492569603282006), (4, 0.21990373322518367), (0, 0.14565537068983722), (1, 0.04259196787838931), (3, 0.04259196787838931)] 138 | 8 139 | catalogue_ctn_write 140 | catalogue_ctn_read 141 | catalogue 142 | [(0, 0.6276171095266273), (2, 0.2063818751520442), (3, 0.07172598403546208), (1, 0.04713751564293324), (4, 0.04713751564293324)] 143 | 9 144 | catalogue_ctn_write 145 | catalogue_ctn_read 146 | catalogue 147 | [(4, 0.2671544743143986), (0, 0.23264937323352425), (1, 0.23264937323352425), (2, 0.23264937323352425), (3, 0.03489740598502865)] 148 | 10 149 | catalogue_ctn_write 150 | catalogue_ctn_read 151 | catalogue 152 | [(2, 0.4605344963345401), (0, 0.3930031721136858), (1, 0.04882077718392473), (3, 0.04882077718392473), (4, 0.04882077718392473)] 153 | -----one service finish----- 154 | 3 155 | 1 156 | orders_ctn_read 157 | orders 158 | [(1, 0.2630232483936169), (2, 0.23287685803872976), (4, 0.2127072653941202), (5, 0.2127072653941202), (0, 0.04677927297029493), (3, 0.031906089809118035)] 159 | 2 160 | orders_ctn_read 161 | orders 162 | [(1, 0.2829633077924856), (2, 0.2125355537618461), (4, 0.2125355537618461), (5, 0.2125355537618461), (0, 0.04754969785769926), (3, 0.03188033306427692)] 163 | 3 164 | orders_ctn_read 165 | orders 166 | [(1, 0.5980742618810065), (0, 0.19115863763668683), (5, 0.09238389090487466), (2, 0.03946106985914402), (3, 0.03946106985914402), (4, 0.03946106985914402)] 167 | 4 168 | orders_ctn_read 169 | orders 170 | 5 171 | orders_ctn_read 172 | orders 173 | 
[(5, 0.5004585080021208), (1, 0.22650708295420116), (4, 0.1454128839770682), (0, 0.05067630102482594), (2, 0.042033849797756775), (3, 0.034911374244027155)] 174 | 6 175 | orders_ctn_read 176 | orders 177 | [(3, 0.5011615668320927), (1, 0.22599706566312552), (0, 0.1261461395816993), (5, 0.07667040603548946), (2, 0.03501241094379656), (4, 0.03501241094379656)] 178 | 7 179 | orders_ctn_read 180 | orders 181 | [(1, 0.38391085202692554), (4, 0.24326378281234412), (5, 0.24326378281234412), (0, 0.05658244750468295), (2, 0.036489567421851624), (3, 0.036489567421851624)] 182 | 8 183 | orders_ctn_read 184 | orders 185 | [(1, 0.42599247122010986), (2, 0.22964880787975048), (4, 0.15233998113765404), (5, 0.10082725617784748), (0, 0.05674416240267555), (3, 0.03444732118196257)] 186 | 9 187 | orders_ctn_read 188 | orders 189 | [(5, 0.44315058292808923), (4, 0.2360169729732062), (0, 0.17200318363245998), (1, 0.07802416857428282), (2, 0.03540254594598093), (3, 0.03540254594598093)] 190 | 10 191 | orders_ctn_read 192 | orders 193 | [(2, 0.4670312806657079), (3, 0.246179629938935), (0, 0.1760082559228364), (1, 0.036926944490840254), (4, 0.036926944490840254), (5, 0.036926944490840254)] 194 | -----one service finish----- 195 | 4 196 | 1 197 | carts_ctn_mem 198 | carts_ctn_read 199 | carts 200 | [(0, 0.5280658205553943), (2, 0.32547184789283146), (1, 0.04882077718392473), (3, 0.04882077718392473), (4, 0.04882077718392473)] 201 | 2 202 | carts_ctn_mem 203 | carts_ctn_read 204 | carts 205 | [(0, 0.4947477849136569), (2, 0.3587898835345689), (1, 0.04882077718392473), (3, 0.04882077718392473), (4, 0.04882077718392473)] 206 | 3 207 | carts_ctn_mem 208 | carts_ctn_read 209 | carts 210 | [(3, 0.44017820175927447), (2, 0.36365807156446117), (1, 0.104803174919066), (0, 0.04568027587859916), (4, 0.04568027587859916)] 211 | 4 212 | carts_ctn_mem 213 | carts_ctn_read 214 | carts 215 | [(2, 0.5164900857974944), (4, 0.20853597315392142), (3, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 216 | 5 217 | carts_ctn_mem 218 | carts_ctn_read 219 | carts 220 | [(0, 0.3240060768960269), (2, 0.30461677186384106), (3, 0.2671758969620285), (1, 0.06412486973379944), (4, 0.040076384544304275)] 221 | 6 222 | carts_ctn_mem 223 | carts_ctn_read 224 | carts 225 | [(1, 0.2671544743143986), (0, 0.23264937323352425), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.03489740598502864)] 226 | 7 227 | carts_ctn_mem 228 | carts_ctn_read 229 | carts 230 | [(0, 0.3246059540837357), (2, 0.3040307978321103), (4, 0.26718195095231356), (1, 0.06410400448899349), (3, 0.04007729264284704)] 231 | 8 232 | carts_ctn_mem 233 | carts_ctn_read 234 | carts 235 | [(2, 0.36975502727891413), (0, 0.2740195533569939), (1, 0.2740195533569939), (3, 0.04110293300354909), (4, 0.04110293300354909)] 236 | 9 237 | carts_ctn_mem 238 | carts_ctn_read 239 | carts 240 | [(2, 0.5278590150774326), (3, 0.21299493718708), (4, 0.14560476094342664), (1, 0.07187911923028603), (0, 0.041662167561774745)] 241 | 10 242 | carts_ctn_mem 243 | carts_ctn_read 244 | carts 245 | [(2, 0.5164900857974944), (3, 0.20853597315392142), (4, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 246 | -----one service finish----- 247 | 5 248 | 1 249 | payment_ctn_mem 250 | payment_ctn_write 251 | payment_ctn_read 252 | payment 253 | [(3, 0.32626026569460087), (2, 0.325712761872867), (0, 0.30263214994133236), (1, 0.04539482249119986)] 254 | 2 255 | payment_ctn_mem 256 | payment_ctn_write 257 | payment_ctn_read 258 | payment 259 | [(3, 
0.6704179631556049), (2, 0.21285294798994395), (0, 0.058364544427225655), (1, 0.058364544427225655)] 260 | 3 261 | payment_ctn_mem 262 | payment_ctn_write 263 | payment_ctn_read 264 | payment 265 | [(3, 0.32659210483501727), (2, 0.32538092273245056), (0, 0.30263214994133236), (1, 0.04539482249119986)] 266 | 4 267 | payment_ctn_mem 268 | payment_ctn_write 269 | payment_ctn_read 270 | payment 271 | [(3, 0.48447504385448875), (0, 0.3488228137534127), (2, 0.11437872032908668), (1, 0.05232342206301191)] 272 | 5 273 | payment_ctn_mem 274 | payment_ctn_write 275 | payment_ctn_read 276 | payment 277 | [(3, 0.48447504385448875), (0, 0.3488228137534127), (2, 0.11437872032908668), (1, 0.05232342206301191)] 278 | 6 279 | payment_ctn_mem 280 | payment_ctn_write 281 | payment_ctn_read 282 | payment 283 | [(3, 0.6704179631556049), (2, 0.21285294798994395), (0, 0.058364544427225655), (1, 0.058364544427225655)] 284 | 7 285 | payment_ctn_mem 286 | payment_ctn_write 287 | payment_ctn_read 288 | payment 289 | [(0, 0.45347785405889407), (3, 0.4094297199976508), (2, 0.08286465406281776), (1, 0.05422777188063726)] 290 | 8 291 | payment_ctn_mem 292 | payment_ctn_write 293 | payment_ctn_read 294 | payment 295 | 9 296 | payment_ctn_mem 297 | payment_ctn_write 298 | payment_ctn_read 299 | payment 300 | [(3, 0.48447504385448875), (0, 0.3488228137534127), (2, 0.11437872032908668), (1, 0.05232342206301191)] 301 | 10 302 | payment_ctn_mem 303 | payment_ctn_write 304 | payment_ctn_read 305 | payment 306 | [(3, 0.48447504385448875), (0, 0.3488228137534127), (2, 0.11437872032908668), (1, 0.05232342206301191)] 307 | -----one service finish----- 308 | 6 309 | 1 310 | shipping_ctn_mem 311 | shipping_ctn_write 312 | shipping_ctn_read 313 | shipping 314 | [(1, 0.4440406924488666), (2, 0.4440406924488666), (0, 0.0559593075511334), (3, 0.0559593075511334)] 315 | 2 316 | shipping_ctn_mem 317 | shipping_ctn_write 318 | shipping_ctn_read 319 | shipping 320 | [(1, 0.4440406924488666), (2, 0.4440406924488666), (0, 0.0559593075511334), (3, 0.0559593075511334)] 321 | 3 322 | shipping_ctn_mem 323 | shipping_ctn_write 324 | shipping_ctn_read 325 | shipping 326 | 4 327 | shipping_ctn_mem 328 | shipping_ctn_write 329 | shipping_ctn_read 330 | shipping 331 | [(0, 0.3493408776261355), (2, 0.30263214994133236), (3, 0.30263214994133236), (1, 0.04539482249119986)] 332 | 5 333 | shipping_ctn_mem 334 | shipping_ctn_write 335 | shipping_ctn_read 336 | shipping 337 | [(1, 0.4440406924488666), (2, 0.4440406924488666), (0, 0.0559593075511334), (3, 0.0559593075511334)] 338 | 6 339 | shipping_ctn_mem 340 | shipping_ctn_write 341 | shipping_ctn_read 342 | shipping 343 | 7 344 | shipping_ctn_mem 345 | shipping_ctn_write 346 | shipping_ctn_read 347 | shipping 348 | [(0, 0.3493408776261355), (2, 0.30263214994133236), (3, 0.30263214994133236), (1, 0.04539482249119986)] 349 | 8 350 | shipping_ctn_mem 351 | shipping_ctn_write 352 | shipping_ctn_read 353 | shipping 354 | 9 355 | shipping_ctn_mem 356 | shipping_ctn_write 357 | shipping_ctn_read 358 | shipping 359 | [(1, 0.3493408776261355), (2, 0.30263214994133236), (3, 0.30263214994133236), (0, 0.04539482249119986)] 360 | 10 361 | shipping_ctn_mem 362 | shipping_ctn_write 363 | shipping_ctn_read 364 | shipping 365 | [(0, 0.3493408776261355), (2, 0.30263214994133236), (3, 0.30263214994133236), (1, 0.04539482249119986)] 366 | -----one service finish----- 367 | 2023-03-14 05:57:39 368 | -------------------------------------------------------------------------------- 
/pa_result/single_service/network_single_service_gamma5.txt: -------------------------------------------------------------------------------- 1 | 2023-03-14 04:27:27 2 | 0 3 | 1 4 | front-end_ctn_write 5 | front-end_ctn_read 6 | front-end 7 | [(0, 0.43749274363981977), (1, 0.3900149736094806), (4, 0.07824434739988331), (2, 0.04712396767540818), (3, 0.04712396767540818)] 8 | 2 9 | front-end_ctn_write 10 | front-end_ctn_read 11 | front-end 12 | [(2, 0.5164900857974944), (4, 0.20853597315392142), (3, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 13 | 3 14 | front-end_ctn_write 15 | front-end_ctn_read 16 | front-end 17 | [(1, 0.5833562721700816), (0, 0.22732116332509789), (4, 0.08400645026338066), (2, 0.06082798117801918), (3, 0.044488133063420726)] 18 | 4 19 | front-end_ctn_write 20 | front-end_ctn_read 21 | front-end 22 | [(0, 0.6771881242625891), (2, 0.10179104243539981), (1, 0.09211631677496036), (4, 0.07843242255060284), (3, 0.05047209397644791)] 23 | 5 24 | front-end_ctn_write 25 | front-end_ctn_read 26 | front-end 27 | [(1, 0.5829273512199146), (0, 0.22723050385265345), (4, 0.08494358954288946), (2, 0.06043525754421072), (3, 0.04446329784033192)] 28 | 6 29 | front-end_ctn_write 30 | front-end_ctn_read 31 | front-end 32 | [(4, 0.2503224095830336), (0, 0.2494814379648893), (1, 0.23264937323352425), (2, 0.23264937323352425), (3, 0.03489740598502865)] 33 | 7 34 | front-end_ctn_write 35 | front-end_ctn_read 36 | front-end 37 | [(2, 0.5164900857974944), (4, 0.20853597315392142), (3, 0.14464849306383204), (1, 0.08914631565165088), (0, 0.04117913233310119)] 38 | 8 39 | front-end_ctn_write 40 | front-end_ctn_read 41 | front-end 42 | [(2, 0.6024275855251331), (1, 0.2020176047018922), (4, 0.10441202734502497), (0, 0.045571391213974896), (3, 0.045571391213974896)] 43 | 9 44 | front-end_ctn_write 45 | front-end_ctn_read 46 | front-end 47 | [(1, 0.5827426937598598), (0, 0.22719381490828303), (4, 0.08497461257642572), (2, 0.060635952061304024), (3, 0.04445292669412751)] 48 | 10 49 | front-end_ctn_write 50 | front-end_ctn_read 51 | front-end 52 | [(1, 0.5827013233533577), (0, 0.22718251688348054), (4, 0.08545757770395025), (2, 0.06020838383351396), (3, 0.04445019822569756)] 53 | -----one service finish----- 54 | 1 55 | 1 56 | user_ctn_mem 57 | user_ctn_read 58 | user 59 | [(2, 0.5566000799057892), (1, 0.22130955254624413), (4, 0.10845495081654567), (3, 0.07059814324249071), (0, 0.04303727348893031)] 60 | 2 61 | user_ctn_mem 62 | user_ctn_read 63 | user 64 | [(2, 0.5454817258759063), (4, 0.19288867592573067), (3, 0.12528902056921062), (0, 0.09378787453629124), (1, 0.04255270309286134)] 65 | 3 66 | user_ctn_mem 67 | user_ctn_read 68 | user 69 | [(2, 0.5164900857974944), (1, 0.20853597315392144), (4, 0.1446484930638321), (3, 0.08914631565165088), (0, 0.04117913233310119)] 70 | 4 71 | user_ctn_mem 72 | user_ctn_read 73 | user 74 | [(0, 0.4374721538866981), (2, 0.3577178498607213), (4, 0.08875953059349406), (3, 0.07087580400146729), (1, 0.04517466165761932)] 75 | 5 76 | user_ctn_mem 77 | user_ctn_read 78 | user 79 | [(1, 0.3224230480340931), (2, 0.3065547754111652), (3, 0.26733059134997994), (0, 0.06359199650226467), (4, 0.040099588702497)] 80 | 6 81 | user_ctn_mem 82 | user_ctn_read 83 | user 84 | [(1, 0.4053296669592293), (2, 0.3642856552627443), (0, 0.11600246083693068), (4, 0.07063966327903823), (3, 0.04374255366205747)] 85 | 7 86 | user_ctn_mem 87 | user_ctn_read 88 | user 89 | [(0, 0.4344726903318292), (2, 0.3571714942655823), (1, 0.09043270367950158), (4, 
0.07295255975542105), (3, 0.04497055196766591)] 90 | 8 91 | user_ctn_mem 92 | user_ctn_read 93 | user 94 | [(0, 0.43556536565732296), (2, 0.3572529757200562), (4, 0.08995581366356889), (3, 0.07218790881699773), (1, 0.04503793614205431)] 95 | 9 96 | user_ctn_mem 97 | user_ctn_read 98 | user 99 | [(2, 0.5921840570270902), (3, 0.18324770384723066), (0, 0.13462756429761644), (1, 0.04497033741403135), (4, 0.04497033741403135)] 100 | 10 101 | user_ctn_mem 102 | user_ctn_read 103 | user 104 | [(0, 0.4347377655806662), (2, 0.35725694504384237), (4, 0.09047779494683372), (3, 0.07253684708653946), (1, 0.044990647342118106)] 105 | -----one service finish----- 106 | 2 107 | 1 108 | catalogue_ctn_write 109 | catalogue_ctn_read 110 | catalogue 111 | [(2, 0.6066944970156707), (4, 0.1904965293634469), (0, 0.11114878028253167), (1, 0.04583009666917541), (3, 0.04583009666917541)] 112 | 2 113 | catalogue_ctn_write 114 | catalogue_ctn_read 115 | catalogue 116 | [(3, 0.583387939525771), (1, 0.2271982690783448), (0, 0.1004669172765762), (2, 0.04447343705965399), (4, 0.04447343705965399)] 117 | 3 118 | catalogue_ctn_write 119 | catalogue_ctn_read 120 | catalogue 121 | [(3, 0.5713342768779391), (1, 0.22467576129327024), (0, 0.11642797709393012), (2, 0.04378099236743029), (4, 0.04378099236743029)] 122 | 4 123 | catalogue_ctn_write 124 | catalogue_ctn_read 125 | catalogue 126 | [(1, 0.6632372926833998), (0, 0.1883179458415517), (2, 0.04948158715834954), (3, 0.04948158715834954), (4, 0.04948158715834954)] 127 | 5 128 | catalogue_ctn_write 129 | catalogue_ctn_read 130 | catalogue 131 | [(2, 0.6047920734947618), (4, 0.19150259217352728), (0, 0.11227337932522961), (1, 0.0457159775032406), (3, 0.0457159775032406)] 132 | 6 133 | catalogue_ctn_write 134 | catalogue_ctn_read 135 | catalogue 136 | [(4, 0.251043771463169), (0, 0.24876007608475392), (1, 0.23264937323352425), (2, 0.23264937323352425), (3, 0.03489740598502864)] 137 | 7 138 | catalogue_ctn_write 139 | catalogue_ctn_read 140 | catalogue 141 | [(3, 0.5711737312833898), (1, 0.22464162062348142), (0, 0.11664073404202253), (2, 0.04377195702555317), (4, 0.04377195702555317)] 142 | 8 143 | catalogue_ctn_write 144 | catalogue_ctn_read 145 | catalogue 146 | [(2, 0.6053314260403039), (4, 0.19121760776122887), (0, 0.11195440908815893), (1, 0.04574827855515422), (3, 0.04574827855515422)] 147 | 9 148 | catalogue_ctn_write 149 | catalogue_ctn_read 150 | catalogue 151 | [(0, 0.537594875231304), (2, 0.1753369098174229), (1, 0.15237409013619796), (3, 0.09252795224745816), (4, 0.042166172567617066)] 152 | 10 153 | catalogue_ctn_write 154 | catalogue_ctn_read 155 | catalogue 156 | [(3, 0.583387939525771), (1, 0.2271982690783448), (0, 0.10046691727657618), (2, 0.04447343705965399), (4, 0.04447343705965399)] 157 | -----one service finish----- 158 | 3 159 | 1 160 | orders_ctn_read 161 | orders 162 | [(2, 0.4799960186885895), (1, 0.21068090093420067), (4, 0.13741194861130052), (5, 0.0896666055677141), (0, 0.04816377005978415), (3, 0.03408075613841116)] 163 | 2 164 | orders_ctn_read 165 | orders 166 | [(5, 0.6393555743979406), (0, 0.19352158354032906), (1, 0.041780710515432605), (2, 0.041780710515432605), (3, 0.041780710515432605), (4, 0.041780710515432605)] 167 | 3 168 | orders_ctn_read 169 | orders 170 | [(3, 0.36287830330423193), (2, 0.32964165394806527), (0, 0.14217849232384475), (5, 0.0725238168243802), (4, 0.05679020507109985), (1, 0.03598752852837801)] 171 | 4 172 | orders_ctn_read 173 | orders 174 | [(1, 0.31450532765634254), (2, 0.28069296007864786), (3, 0.23046012530602092), 
(0, 0.08353658591310943), (4, 0.056235982249975995), (5, 0.03456901879590314)] 175 | 5 176 | orders_ctn_read 177 | orders 178 | [(5, 0.5699778534209895), (0, 0.22877881131227318), (1, 0.08714235581553713), (2, 0.03803365981706675), (3, 0.03803365981706675), (4, 0.03803365981706675)] 179 | 6 180 | orders_ctn_read 181 | orders 182 | [(3, 0.3791164374643352), (2, 0.33738547841744904), (0, 0.1469300062407122), (1, 0.06241706678698826), (4, 0.03707550554525759), (5, 0.03707550554525759)] 183 | 7 184 | orders_ctn_read 185 | orders 186 | [(5, 0.47632407695373163), (2, 0.2504167456354665), (0, 0.1605716418748418), (1, 0.03756251184531998), (3, 0.03756251184531998), (4, 0.03756251184531998)] 187 | 8 188 | orders_ctn_read 189 | orders 190 | [(2, 0.46318686639943113), (5, 0.2030822557492024), (4, 0.15005709268044837), (1, 0.10354393978304352), (0, 0.04665911281634991), (3, 0.03347073257152475)] 191 | 9 192 | orders_ctn_read 193 | orders 194 | [(1, 0.42552600172048005), (2, 0.24554474487990113), (4, 0.14838767304851974), (5, 0.09510661404835295), (0, 0.050358436337153735), (3, 0.03507652996559247)] 195 | 10 196 | orders_ctn_read 197 | orders 198 | [(2, 0.44342468357382275), (3, 0.23617986843613642), (0, 0.1720619879578108), (5, 0.0609954777449237), (4, 0.05191100202188589), (1, 0.03542698026542047)] 199 | -----one service finish----- 200 | 4 201 | 1 202 | carts_ctn_mem 203 | carts_ctn_read 204 | carts 205 | [(2, 0.5164900857974944), (4, 0.20853597315392142), (3, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 206 | 2 207 | carts_ctn_mem 208 | carts_ctn_read 209 | carts 210 | [(2, 0.5164900857974944), (4, 0.20853597315392142), (3, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 211 | 3 212 | carts_ctn_mem 213 | carts_ctn_read 214 | carts 215 | [(2, 0.5164900857974944), (4, 0.20853597315392142), (3, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 216 | 4 217 | carts_ctn_mem 218 | carts_ctn_read 219 | carts 220 | [(2, 0.5164900857974944), (4, 0.20853597315392142), (3, 0.14464849306383204), (1, 0.08914631565165088), (0, 0.04117913233310119)] 221 | 5 222 | carts_ctn_mem 223 | carts_ctn_read 224 | carts 225 | [(2, 0.32848448012780235), (3, 0.3011961218595133), (4, 0.26763762008787295), (0, 0.06253613491163051), (1, 0.04014564301318095)] 226 | 6 227 | carts_ctn_mem 228 | carts_ctn_read 229 | carts 230 | [(2, 0.5164900857974944), (4, 0.20853597315392142), (3, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 231 | 7 232 | carts_ctn_mem 233 | carts_ctn_read 234 | carts 235 | [(2, 0.5164900857974944), (4, 0.20853597315392142), (3, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 236 | 8 237 | carts_ctn_mem 238 | carts_ctn_read 239 | carts 240 | [(2, 0.5164900857974944), (4, 0.20853597315392142), (3, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 241 | 9 242 | carts_ctn_mem 243 | carts_ctn_read 244 | carts 245 | [(2, 0.2671544743143986), (1, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.23264937323352425), (0, 0.03489740598502864)] 246 | 10 247 | carts_ctn_mem 248 | carts_ctn_read 249 | carts 250 | [(2, 0.5164900857974944), (4, 0.20853597315392142), (3, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 251 | -----one service finish----- 252 | 5 253 | 1 254 | payment_ctn_mem 255 | payment_ctn_write 256 | payment_ctn_read 257 | payment 258 | [(3, 0.6853141043827661), (2, 0.19562624733322898), (0, 0.05952982414200252), (1, 
0.05952982414200252)] 259 | 2 260 | payment_ctn_mem 261 | payment_ctn_write 262 | payment_ctn_read 263 | payment 264 | [(3, 0.6704179631556048), (2, 0.21285294798994397), (0, 0.05836454442722565), (1, 0.05836454442722565)] 265 | 3 266 | payment_ctn_mem 267 | payment_ctn_write 268 | payment_ctn_read 269 | payment 270 | [(3, 0.6804112592767355), (2, 0.20130744688224686), (0, 0.05914064692050879), (1, 0.05914064692050879)] 271 | 4 272 | payment_ctn_mem 273 | payment_ctn_write 274 | payment_ctn_read 275 | payment 276 | [(3, 0.6795910236036184), (2, 0.2022568116869543), (0, 0.05907608235471362), (1, 0.05907608235471362)] 277 | 5 278 | payment_ctn_mem 279 | payment_ctn_write 280 | payment_ctn_read 281 | payment 282 | [(0, 0.44867204559032836), (3, 0.4120331618907587), (2, 0.08521337706701884), (1, 0.054081415451894106)] 283 | 6 284 | payment_ctn_mem 285 | payment_ctn_write 286 | payment_ctn_read 287 | payment 288 | [(0, 0.44716906124484906), (3, 0.4122054026074267), (2, 0.08663235772320196), (1, 0.05399317842452224)] 289 | 7 290 | payment_ctn_mem 291 | payment_ctn_write 292 | payment_ctn_read 293 | payment 294 | [(3, 0.6807651517581746), (2, 0.20089774535598826), (0, 0.0591685514429186), (1, 0.0591685514429186)] 295 | 8 296 | payment_ctn_mem 297 | payment_ctn_write 298 | payment_ctn_read 299 | payment 300 | [(3, 0.6704179631556049), (2, 0.21285294798994395), (0, 0.058364544427225655), (1, 0.058364544427225655)] 301 | 9 302 | payment_ctn_mem 303 | payment_ctn_write 304 | payment_ctn_read 305 | payment 306 | [(3, 0.6922830049202551), (2, 0.18753164844180195), (0, 0.0600926733189715), (1, 0.0600926733189715)] 307 | 10 308 | payment_ctn_mem 309 | payment_ctn_write 310 | payment_ctn_read 311 | payment 312 | [(0, 0.4488949004924264), (1, 0.41207933958357745), (2, 0.0849265092172428), (3, 0.05409925070675342)] 313 | -----one service finish----- 314 | 6 315 | 1 316 | shipping_ctn_mem 317 | shipping_ctn_write 318 | shipping_ctn_read 319 | shipping 320 | [(1, 0.5150200693289297), (3, 0.3730614851315925), (0, 0.05595922276973888), (2, 0.05595922276973888)] 321 | 2 322 | shipping_ctn_mem 323 | shipping_ctn_write 324 | shipping_ctn_read 325 | shipping 326 | [(1, 0.5150200693289297), (3, 0.3730614851315925), (0, 0.05595922276973888), (2, 0.05595922276973888)] 327 | 3 328 | shipping_ctn_mem 329 | shipping_ctn_write 330 | shipping_ctn_read 331 | shipping 332 | [(1, 0.3493408776261355), (2, 0.30263214994133236), (3, 0.30263214994133236), (0, 0.04539482249119986)] 333 | 4 334 | shipping_ctn_mem 335 | shipping_ctn_write 336 | shipping_ctn_read 337 | shipping 338 | 5 339 | shipping_ctn_mem 340 | shipping_ctn_write 341 | shipping_ctn_read 342 | shipping 343 | [(1, 0.3493408776261355), (2, 0.30263214994133236), (3, 0.30263214994133236), (0, 0.04539482249119986)] 344 | 6 345 | shipping_ctn_mem 346 | shipping_ctn_write 347 | shipping_ctn_read 348 | shipping 349 | 7 350 | shipping_ctn_mem 351 | shipping_ctn_write 352 | shipping_ctn_read 353 | shipping 354 | [(0, 0.3493408776261355), (2, 0.30263214994133236), (3, 0.30263214994133236), (1, 0.04539482249119986)] 355 | 8 356 | shipping_ctn_mem 357 | shipping_ctn_write 358 | shipping_ctn_read 359 | shipping 360 | [(1, 0.5150200693289297), (3, 0.3730614851315925), (0, 0.05595922276973888), (2, 0.05595922276973888)] 361 | 9 362 | shipping_ctn_mem 363 | shipping_ctn_write 364 | shipping_ctn_read 365 | shipping 366 | 10 367 | shipping_ctn_mem 368 | shipping_ctn_write 369 | shipping_ctn_read 370 | shipping 371 | [(1, 0.791544237249603), (0, 0.0694852542501323), (2, 
0.0694852542501323), (3, 0.0694852542501323)] 372 | -----one service finish----- 373 | 2023-03-14 04:52:31 374 | -------------------------------------------------------------------------------- /pa_result/single_service/network_single_service_gamma75.txt: -------------------------------------------------------------------------------- 1 | 2023-03-14 04:52:31 2 | 0 3 | 1 4 | front-end_ctn_write 5 | front-end_ctn_read 6 | front-end 7 | [(0, 0.7679786244215081), (1, 0.058005343894623), (2, 0.058005343894623), (3, 0.058005343894623), (4, 0.058005343894623)] 8 | 2 9 | front-end_ctn_write 10 | front-end_ctn_read 11 | front-end 12 | [(2, 0.5164900857974944), (4, 0.20853597315392142), (3, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 13 | 3 14 | front-end_ctn_write 15 | front-end_ctn_read 16 | front-end 17 | [(0, 0.6748311189165678), (2, 0.10251682902393636), (1, 0.09299897515013812), (4, 0.07935062381794378), (3, 0.05030245309141389)] 18 | 4 19 | front-end_ctn_write 20 | front-end_ctn_read 21 | front-end 22 | [(1, 0.5818029306093048), (0, 0.22698610479655199), (4, 0.08815730516440427), (2, 0.05865623197806889), (3, 0.04439742745167012)] 23 | 5 24 | front-end_ctn_write 25 | front-end_ctn_read 26 | front-end 27 | [(0, 0.43863660010796807), (4, 0.3543730917363935), (3, 0.08901151003230723), (1, 0.07292986135584102), (2, 0.04504893676749019)] 28 | 6 29 | front-end_ctn_write 30 | front-end_ctn_read 31 | front-end 32 | [(2, 0.5164900857974944), (4, 0.20853597315392142), (3, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 33 | 7 34 | front-end_ctn_write 35 | front-end_ctn_read 36 | front-end 37 | [(2, 0.3405700299842187), (0, 0.28754676176888444), (1, 0.2669558087782021), (4, 0.06488402815196452), (3, 0.04004337131673032)] 38 | 8 39 | front-end_ctn_write 40 | front-end_ctn_read 41 | front-end 42 | [(1, 0.5889906551628162), (0, 0.22227234411300834), (4, 0.08721420597834144), (2, 0.056717607287891256), (3, 0.04480518745794284)] 43 | 9 44 | front-end_ctn_write 45 | front-end_ctn_read 46 | front-end 47 | [(2, 0.5164900857974944), (4, 0.20853597315392142), (3, 0.1446484930638321), (1, 0.08914631565165088), (0, 0.04117913233310119)] 48 | 10 49 | front-end_ctn_write 50 | front-end_ctn_read 51 | front-end 52 | [(1, 0.4008891181827779), (0, 0.4006287018418299), (2, 0.08054385366484539), (4, 0.07239480823725304), (3, 0.04554351807329381)] 53 | -----one service finish----- 54 | 1 55 | 1 56 | user_ctn_mem 57 | user_ctn_read 58 | user 59 | [(2, 0.5456331059749362), (4, 0.19280454408334122), (3, 0.1251887138346297), (0, 0.0938134570539781), (1, 0.04256017905311468)] 60 | 2 61 | user_ctn_mem 62 | user_ctn_read 63 | user 64 | [(2, 0.5454412285702079), (4, 0.19291117855936687), (3, 0.12531585606117623), (0, 0.09378103314025538), (1, 0.04255070366899348)] 65 | 3 66 | user_ctn_mem 67 | user_ctn_read 68 | user 69 | [(0, 0.43397329160316545), (2, 0.35702938448930105), (1, 0.09079383423216103), (4, 0.07326966848083258), (3, 0.0449338211945399)] 70 | 4 71 | user_ctn_mem 72 | user_ctn_read 73 | user 74 | [(1, 0.40589436655928296), (2, 0.3637553665900216), (0, 0.11553693169608718), (4, 0.07106891870089375), (3, 0.04374441645371452)] 75 | 5 76 | user_ctn_mem 77 | user_ctn_read 78 | user 79 | [(1, 0.40409066778639924), (2, 0.37831389868616544), (0, 0.12870009022154447), (3, 0.04444767165294541), (4, 0.04444767165294541)] 80 | 6 81 | user_ctn_mem 82 | user_ctn_read 83 | user 84 | [(1, 0.4060124072761325), (2, 0.3635906264278701), (0, 0.11576627943525107), (4, 0.0708888017957135), 
(3, 0.043741885065032814)] 85 | 7 86 | user_ctn_mem 87 | user_ctn_read 88 | user 89 | [(2, 0.5448712283680303), (4, 0.19322770773460177), (3, 0.12569362725415442), (0, 0.09368485028141167), (1, 0.04252258636180199)] 90 | 8 91 | user_ctn_mem 92 | user_ctn_read 93 | user 94 | [(2, 0.5164900857974944), (1, 0.20853597315392144), (4, 0.1446484930638321), (3, 0.08914631565165088), (0, 0.04117913233310119)] 95 | 9 96 | user_ctn_mem 97 | user_ctn_read 98 | user 99 | [(2, 0.5456258118301347), (4, 0.19280859850924192), (3, 0.1251935468599993), (0, 0.09381222404586202), (1, 0.04255981875476189)] 100 | 10 101 | user_ctn_mem 102 | user_ctn_read 103 | user 104 | [(2, 0.5452202722536644), (4, 0.19303392210109033), (3, 0.1254622826683723), (0, 0.09374372422576949), (1, 0.04253979875110365)] 105 | -----one service finish----- 106 | 2 107 | 1 108 | catalogue_ctn_write 109 | catalogue_ctn_read 110 | catalogue 111 | [(4, 0.2515542502852423), (0, 0.24824959726268053), (1, 0.23264937323352422), (2, 0.23264937323352422), (3, 0.03489740598502864)] 112 | 2 113 | catalogue_ctn_write 114 | catalogue_ctn_read 115 | catalogue 116 | [(0, 0.2501840959959007), (1, 0.24961975155202212), (2, 0.23264937323352425), (3, 0.23264937323352425), (4, 0.03489740598502864)] 117 | 3 118 | catalogue_ctn_write 119 | catalogue_ctn_read 120 | catalogue 121 | [(2, 0.2511347660897829), (1, 0.24866908145813996), (0, 0.23264937323352425), (4, 0.23264937323352425), (3, 0.03489740598502864)] 122 | 4 123 | catalogue_ctn_write 124 | catalogue_ctn_read 125 | catalogue 126 | 5 127 | catalogue_ctn_write 128 | catalogue_ctn_read 129 | catalogue 130 | [(2, 0.6039435332725055), (4, 0.19195055897279806), (0, 0.1127754205576719), (1, 0.045665243598512276), (3, 0.045665243598512276)] 131 | 6 132 | catalogue_ctn_write 133 | catalogue_ctn_read 134 | catalogue 135 | [(2, 0.2521037784277622), (4, 0.24770006912016068), (0, 0.23264937323352425), (1, 0.23264937323352425), (3, 0.03489740598502865)] 136 | 7 137 | catalogue_ctn_write 138 | catalogue_ctn_read 139 | catalogue 140 | [(4, 0.5492569603282005), (3, 0.2199037332251837), (1, 0.14565537068983725), (0, 0.0425919678783893), (2, 0.0425919678783893)] 141 | 8 142 | catalogue_ctn_write 143 | catalogue_ctn_read 144 | catalogue 145 | [(2, 0.4132887235660018), (4, 0.363344799461453), (3, 0.11257800786183862), (0, 0.06666191885832176), (1, 0.04412655025238481)] 146 | 9 147 | catalogue_ctn_write 148 | catalogue_ctn_read 149 | catalogue 150 | 10 151 | catalogue_ctn_write 152 | catalogue_ctn_read 153 | catalogue 154 | [(0, 0.25061841401203777), (1, 0.24918543353588501), (2, 0.23264937323352425), (4, 0.23264937323352425), (3, 0.03489740598502864)] 155 | -----one service finish----- 156 | 3 157 | 1 158 | orders_ctn_read 159 | orders 160 | [(5, 0.5698484252463927), (0, 0.22894977799766633), (1, 0.08711959973789057), (2, 0.038027399006016815), (3, 0.038027399006016815), (4, 0.038027399006016815)] 161 | 2 162 | orders_ctn_read 163 | orders 164 | [(3, 0.3653047466782713), (2, 0.3246443297671809), (0, 0.14084628194587961), (1, 0.07415323836720232), (5, 0.05917620824922582), (4, 0.03587519499223998)] 165 | 3 166 | orders_ctn_read 167 | orders 168 | [(5, 0.6394264315501195), (0, 0.19343389940105118), (1, 0.041784917262207365), (2, 0.041784917262207365), (3, 0.041784917262207365), (4, 0.041784917262207365)] 169 | 4 170 | orders_ctn_read 171 | orders 172 | [(3, 0.3629046133972714), (2, 0.32614417312729266), (0, 0.1397754005008086), (1, 0.07562796513175), (5, 0.05971175809752695), (4, 0.035836089745350505)] 173 | 5 174 | 
orders_ctn_read 175 | orders 176 | [(5, 0.6394564742483129), (0, 0.1933967211066471), (1, 0.04178670116126003), (2, 0.04178670116126003), (3, 0.04178670116126003), (4, 0.04178670116126003)] 177 | 6 178 | orders_ctn_read 179 | orders 180 | [(1, 0.3152186997134145), (2, 0.27888605865567134), (3, 0.23005483462183415), (0, 0.08539527971420283), (5, 0.05593690210160202), (4, 0.03450822519327513)] 181 | 7 182 | orders_ctn_read 183 | orders 184 | [(1, 0.5123248786370262), (4, 0.21968429403456216), (5, 0.14520281964575504), (0, 0.046850845233987776), (2, 0.0405437500129101), (3, 0.03539341243575882)] 185 | 8 186 | orders_ctn_read 187 | orders 188 | [(2, 0.47854795918569165), (5, 0.2113057692063332), (4, 0.15390809117860052), (1, 0.07655969903340819), (0, 0.04557526670462927), (3, 0.034103214691337265)] 189 | 9 190 | orders_ctn_read 191 | orders 192 | [(5, 0.5699664264036484), (0, 0.22879390654106815), (1, 0.08714034621437022), (2, 0.038033106946971075), (3, 0.038033106946971075), (4, 0.038033106946971075)] 193 | 10 194 | orders_ctn_read 195 | orders 196 | [(5, 0.5989897360776161), (0, 0.2151554862313825), (1, 0.06724554786530552), (2, 0.039536409941898644), (3, 0.039536409941898644), (4, 0.039536409941898644)] 197 | -----one service finish----- 198 | 4 199 | 1 200 | carts_ctn_mem 201 | carts_ctn_read 202 | carts 203 | [(2, 0.5299765771909428), (4, 0.21376492896691088), (3, 0.1457115084334407), (1, 0.06879400546378774), (0, 0.04175297994491779)] 204 | 2 205 | carts_ctn_mem 206 | carts_ctn_read 207 | carts 208 | [(3, 0.40403299002465803), (2, 0.37879438623478445), (0, 0.12822994319809464), (1, 0.044471340271231466), (4, 0.044471340271231466)] 209 | 3 210 | carts_ctn_mem 211 | carts_ctn_read 212 | carts 213 | [(0, 0.4386080280041627), (2, 0.355596926312561), (1, 0.08904245423379469), (3, 0.0716348180987926), (4, 0.04511777335068912)] 214 | 4 215 | carts_ctn_mem 216 | carts_ctn_read 217 | carts 218 | [(2, 0.5341359945713127), (1, 0.21471101158958986), (4, 0.11727234119739828), (3, 0.09190767303751934), (0, 0.04197297960417979)] 219 | 5 220 | carts_ctn_mem 221 | carts_ctn_read 222 | carts 223 | [(0, 0.444621345626494), (2, 0.3516351291481151), (1, 0.0869842376242169), (3, 0.0715228734668997), (4, 0.04523641413427421)] 224 | 6 225 | carts_ctn_mem 226 | carts_ctn_read 227 | carts 228 | [(2, 0.555027697707703), (4, 0.2209040789801532), (3, 0.10816336128592755), (1, 0.07294439317751149), (0, 0.04296046884870478)] 229 | 7 230 | carts_ctn_mem 231 | carts_ctn_read 232 | carts 233 | [(2, 0.5475016850010053), (4, 0.21896549834766119), (3, 0.13207890564801578), (1, 0.05887513555081187), (0, 0.04257877545250596)] 234 | 8 235 | carts_ctn_mem 236 | carts_ctn_read 237 | carts 238 | [(2, 0.5290482516589), (4, 0.2134297123618907), (3, 0.14566747327140922), (1, 0.07014142694766964), (0, 0.04171313576013054)] 239 | 9 240 | carts_ctn_mem 241 | carts_ctn_read 242 | carts 243 | [(2, 0.526457173167383), (4, 0.21247472733538791), (3, 0.1455217493777742), (1, 0.0739441552446783), (0, 0.04160219487477654)] 244 | 10 245 | carts_ctn_mem 246 | carts_ctn_read 247 | carts 248 | [(2, 0.5269628837063534), (4, 0.2126633531899743), (3, 0.14555282933468225), (1, 0.07319711730585757), (0, 0.04162381646313264)] 249 | -----one service finish----- 250 | 5 251 | 1 252 | payment_ctn_mem 253 | payment_ctn_write 254 | payment_ctn_read 255 | payment 256 | [(3, 0.6838157437558374), (2, 0.19736366687974471), (0, 0.059410294682208965), (1, 0.059410294682208965)] 257 | 2 258 | payment_ctn_mem 259 | payment_ctn_write 260 | payment_ctn_read 261 | 
payment 262 | [(3, 0.7122362007668318), (2, 0.1642267032842205), (0, 0.06176854797447384), (1, 0.06176854797447384)] 263 | 3 264 | payment_ctn_mem 265 | payment_ctn_write 266 | payment_ctn_read 267 | payment 268 | [(3, 0.7095571814041762), (2, 0.1673670500172389), (0, 0.06153788428929251), (1, 0.06153788428929251)] 269 | 4 270 | payment_ctn_mem 271 | payment_ctn_write 272 | payment_ctn_read 273 | payment 274 | [(3, 0.7119336218524696), (2, 0.16458156494879395), (0, 0.061742406599368214), (1, 0.061742406599368214)] 275 | 5 276 | payment_ctn_mem 277 | payment_ctn_write 278 | payment_ctn_read 279 | payment 280 | [(3, 0.7166516332533726), (2, 0.15904311705723895), (0, 0.062152624844694235), (1, 0.062152624844694235)] 281 | 6 282 | payment_ctn_mem 283 | payment_ctn_write 284 | payment_ctn_read 285 | payment 286 | [(3, 0.7092338825931171), (2, 0.16774578036247906), (0, 0.0615101685222019), (1, 0.0615101685222019)] 287 | 7 288 | payment_ctn_mem 289 | payment_ctn_write 290 | payment_ctn_read 291 | payment 292 | [(0, 0.4469819085844217), (3, 0.41234779282442824), (2, 0.08668007808687204), (1, 0.053990220504278014)] 293 | 8 294 | payment_ctn_mem 295 | payment_ctn_write 296 | payment_ctn_read 297 | payment 298 | [(2, 0.7374671820105041), (3, 0.10084626564623059), (1, 0.09765568605374511), (0, 0.06403086628952022)] 299 | 9 300 | payment_ctn_mem 301 | payment_ctn_write 302 | payment_ctn_read 303 | payment 304 | [(3, 0.3276570240620703), (0, 0.3243160035053975), (1, 0.30263214994133236), (2, 0.04539482249119986)] 305 | 10 306 | payment_ctn_mem 307 | payment_ctn_write 308 | payment_ctn_read 309 | payment 310 | [(0, 0.7548971866077623), (1, 0.11371697038835715), (2, 0.06569292150194028), (3, 0.06569292150194028)] 311 | -----one service finish----- 312 | 6 313 | 1 314 | shipping_ctn_mem 315 | shipping_ctn_write 316 | shipping_ctn_read 317 | shipping 318 | [(1, 0.6178445252575919), (0, 0.20564434754342936), (2, 0.12198159569902098), (3, 0.054529531499957644)] 319 | 2 320 | shipping_ctn_mem 321 | shipping_ctn_write 322 | shipping_ctn_read 323 | shipping 324 | [(1, 0.6628097999410686), (0, 0.14603167296539293), (2, 0.13339484554271083), (3, 0.05776368155082773)] 325 | 3 326 | shipping_ctn_mem 327 | shipping_ctn_write 328 | shipping_ctn_read 329 | shipping 330 | [(1, 0.5150200693289297), (3, 0.3730614851315925), (0, 0.05595922276973888), (2, 0.05595922276973888)] 331 | 4 332 | shipping_ctn_mem 333 | shipping_ctn_write 334 | shipping_ctn_read 335 | shipping 336 | [(1, 0.5150200693289297), (3, 0.3730614851315925), (0, 0.05595922276973888), (2, 0.05595922276973888)] 337 | 5 338 | shipping_ctn_mem 339 | shipping_ctn_write 340 | shipping_ctn_read 341 | shipping 342 | [(2, 0.6639703803172465), (3, 0.14446363311510627), (0, 0.13371398564467274), (1, 0.05785200092297459)] 343 | 6 344 | shipping_ctn_mem 345 | shipping_ctn_write 346 | shipping_ctn_read 347 | shipping 348 | 7 349 | shipping_ctn_mem 350 | shipping_ctn_write 351 | shipping_ctn_read 352 | shipping 353 | [(1, 0.6178445252575919), (0, 0.20564434754342942), (2, 0.121981595699021), (3, 0.05452953149995766)] 354 | 8 355 | shipping_ctn_mem 356 | shipping_ctn_write 357 | shipping_ctn_read 358 | shipping 359 | 9 360 | shipping_ctn_mem 361 | shipping_ctn_write 362 | shipping_ctn_read 363 | shipping 364 | [(1, 0.6178445252575919), (0, 0.20564434754342942), (2, 0.121981595699021), (3, 0.05452953149995766)] 365 | 10 366 | shipping_ctn_mem 367 | shipping_ctn_write 368 | shipping_ctn_read 369 | shipping 370 | [(2, 0.7228063680591811), (0, 0.15180120441588416), (1, 
0.06269621376246746), (3, 0.06269621376246746)] 371 | -----one service finish----- 372 | 2023-03-14 05:28:24 373 | -------------------------------------------------------------------------------- /pa_result/single_service_gamma_eta.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AXinx/CausalRCA_code/21f4eb0d8139aa5046402f15e66fd56b65fd3844/pa_result/single_service_gamma_eta.pdf -------------------------------------------------------------------------------- /requirements.txt: -------------------------------------------------------------------------------- 1 | matplotlib==3.4.3 2 | networkx==2.6.3 3 | numpy==1.20.3 4 | pandas==1.3.4 5 | requests==2.26.0 6 | scikit_learn==1.2.2 7 | scikit_network==0.24.0 8 | scipy==1.7.1 9 | torch==1.10.2 10 | tqdm==4.62.3 11 | -------------------------------------------------------------------------------- /test_all_service_per.sh: -------------------------------------------------------------------------------- 1 | output=cpu_all_service_gamma5.txt; 2 | 3 | echo $(date +%Y-%m-%d" "%H:%M:%S) | tee -a $output; 4 | 5 | for((i=0;i<=6;i++)); 6 | do 7 | echo $i | tee -a $output; 8 | for((j=1;j<=10;j++)); 9 | do 10 | echo $j | tee -a $output; 11 | python train_all_services.py --indx=$i --atype='cpu-hog1_' --gamma=0.5 | tee -a $output; 12 | done; 13 | echo "-----one service finish-----" | tee -a $output; 14 | done 15 | echo $(date +%Y-%m-%d" "%H:%M:%S) | tee -a $output; 16 | -------------------------------------------------------------------------------- /test_latency_per.sh: -------------------------------------------------------------------------------- 1 | output=cpu_latency_gamma5.txt; 2 | 3 | echo $(date +%Y-%m-%d" "%H:%M:%S) | tee -a $output; 4 | 5 | for((i=0;i<=6;i++)); 6 | do 7 | echo $i | tee -a $output; 8 | for((j=1;j<=10;j++)); 9 | do 10 | echo $j | tee -a $output; 11 | python train_latency.py --indx=$i --atype='cpu-hog1_' --gamma=0.5 | tee -a $output; 12 | done; 13 | echo "-----one service finish-----" | tee -a $output; 14 | done 15 | echo $(date +%Y-%m-%d" "%H:%M:%S) | tee -a $output; 16 | -------------------------------------------------------------------------------- /test_single_service_per.sh: -------------------------------------------------------------------------------- 1 | output=memory_single_service_eta1000.txt; 2 | 3 | echo $(date +%Y-%m-%d" "%H:%M:%S) | tee -a $output; 4 | 5 | for((i=6;i<=6;i++)); 6 | do 7 | echo $i | tee -a $output; 8 | for((j=1;j<=10;j++)); 9 | do 10 | echo $j | tee -a $output; 11 | python train_single_service.py --indx=$i --atype='memory-leak1_' --eta=1000 | tee -a $output; 12 | done; 13 | echo "-----one service finish-----" | tee -a $output; 14 | done 15 | echo $(date +%Y-%m-%d" "%H:%M:%S) | tee -a $output; 16 | -------------------------------------------------------------------------------- /train_all_services.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # -*- coding: utf-8 -*- 3 | """ 4 | Created on Sun Apr 10 20:33:33 2022 5 | 6 | @author: ruyuexin 7 | """ 8 | 9 | import time, datetime 10 | import requests 11 | import pickle as pkl 12 | import numpy as np 13 | import pandas as pd 14 | import matplotlib.pyplot as plt 15 | from tqdm.notebook import tqdm, trange 16 | import json 17 | import networkx as nx 18 | import os 19 | # import torch 20 | import torch.optim as optim 21 | from torch.optim import lr_scheduler 22 | import math 23 | 24 | # import numpy as np 25 | from utils 
import * 26 | from modules import * 27 | #from paras import * 28 | from config import CONFIG 29 | 30 | import warnings 31 | warnings.filterwarnings('ignore') 32 | 33 | import argparse 34 | 35 | parser = argparse.ArgumentParser() 36 | parser.add_argument('--indx', type=int, default=0, help='index') 37 | parser.add_argument('--atype', type=str, default='cpu-hog1_', help='anomaly type') 38 | parser.add_argument('--gamma', type=float, default=0.25, help='gamma') 39 | parser.add_argument('--eta', type=int, default=10, help='eta') 40 | args = parser.parse_args() 41 | 42 | 43 | CONFIG.cuda = torch.cuda.is_available() 44 | CONFIG.factor = not CONFIG.no_factor 45 | 46 | # torch.manual_seed(CONFIG.seed) 47 | # if CONFIG.cuda: 48 | # torch.cuda.manual_seed(CONFIG.seed) 49 | 50 | from sklearn.metrics import confusion_matrix 51 | from sklearn.metrics import precision_recall_fscore_support 52 | 53 | #f = open('./data_old/collected_data_n_shipping_mem_rt.pkl', 'rb') 54 | 55 | #f = open('collected_data_all_cpu.pkl', 'rb') 56 | names = ['front-end', 'user', 'catalogue', 'orders', 'carts', 'payment', 'shipping'] 57 | metrics = ['ctn_latency', 'ctn_cpu', 'ctn_mem', 'ctn_write', 'ctn_read', 'ctn_net_in', 'ctn_net_out'] 58 | 59 | idx = args.indx 60 | atype = args.atype 61 | #idx = 2 62 | f = open('./data_collected/'+atype+names[idx]+'.pkl', 'rb') 63 | data = pkl.load(f) 64 | 65 | #data = data.iloc[:,1:] 66 | data_sample_size = data.shape[0] 67 | data_variable_size = data.shape[1] 68 | # torch.manual_seed(CONFIG.seed) 69 | # if CONFIG.cuda: 70 | # torch.cuda.manual_seed(CONFIG.seed) 71 | 72 | # ================================================ 73 | # get data: experiments = {synthetic SEM, ALARM} 74 | # ================================================ 75 | train_data = data 76 | 77 | #=================================== 78 | # load modules 79 | #=================================== 80 | # Generate off-diagonal interaction graph 81 | off_diag = np.ones([data_variable_size, data_variable_size]) - np.eye(data_variable_size) 82 | 83 | # add adjacency matrix A 84 | num_nodes = data_variable_size 85 | adj_A = np.zeros((num_nodes, num_nodes)) 86 | 87 | 88 | if CONFIG.encoder == 'mlp': 89 | encoder = MLPEncoder(data_variable_size * CONFIG.x_dims, CONFIG.x_dims, CONFIG.encoder_hidden, 90 | int(CONFIG.z_dims), adj_A, 91 | batch_size = CONFIG.batch_size, 92 | do_prob = CONFIG.encoder_dropout, factor = CONFIG.factor).double() 93 | elif CONFIG.encoder == 'sem': 94 | encoder = SEMEncoder(data_variable_size * CONFIG.x_dims, CONFIG.encoder_hidden, 95 | int(CONFIG.z_dims), adj_A, 96 | batch_size = CONFIG.batch_size, 97 | do_prob = CONFIG.encoder_dropout, factor = CONFIG.factor).double() 98 | 99 | if CONFIG.decoder == 'mlp': 100 | decoder = MLPDecoder(data_variable_size * CONFIG.x_dims, 101 | CONFIG.z_dims, CONFIG.x_dims, encoder, 102 | data_variable_size = data_variable_size, 103 | batch_size = CONFIG.batch_size, 104 | n_hid=CONFIG.decoder_hidden, 105 | do_prob=CONFIG.decoder_dropout).double() 106 | elif CONFIG.decoder == 'sem': 107 | decoder = SEMDecoder(data_variable_size * CONFIG.x_dims, 108 | CONFIG.z_dims, 2, encoder, 109 | data_variable_size = data_variable_size, 110 | batch_size = CONFIG.batch_size, 111 | n_hid=CONFIG.decoder_hidden, 112 | do_prob=CONFIG.decoder_dropout).double() 113 | 114 | #=================================== 115 | # set up training parameters 116 | #=================================== 117 | if CONFIG.optimizer == 'Adam': 118 | optimizer = optim.Adam(list(encoder.parameters()) + 
list(decoder.parameters()),lr=CONFIG.lr) 119 | elif CONFIG.optimizer == 'LBFGS': 120 | optimizer = optim.LBFGS(list(encoder.parameters()) + list(decoder.parameters()), 121 | lr=CONFIG.lr) 122 | elif CONFIG.optimizer == 'SGD': 123 | optimizer = optim.SGD(list(encoder.parameters()) + list(decoder.parameters()), 124 | lr=CONFIG.lr) 125 | 126 | scheduler = lr_scheduler.StepLR(optimizer, step_size=CONFIG.lr_decay, 127 | gamma=CONFIG.gamma) 128 | 129 | # Linear indices of an upper triangular mx, used for acc calculation 130 | triu_indices = get_triu_offdiag_indices(data_variable_size) 131 | tril_indices = get_tril_offdiag_indices(data_variable_size) 132 | 133 | if CONFIG.prior: 134 | prior = np.array([0.91, 0.03, 0.03, 0.03]) # hard coded for now 135 | print("Using prior") 136 | print(prior) 137 | log_prior = torch.DoubleTensor(np.log(prior)) 138 | log_prior = torch.unsqueeze(log_prior, 0) 139 | log_prior = torch.unsqueeze(log_prior, 0) 140 | log_prior = Variable(log_prior) 141 | 142 | if CONFIG.cuda: 143 | log_prior = log_prior.cuda() 144 | 145 | if CONFIG.cuda: 146 | encoder.cuda() 147 | decoder.cuda() 148 | triu_indices = triu_indices.cuda() 149 | tril_indices = tril_indices.cuda() 150 | 151 | # compute constraint h(A) value 152 | def _h_A(A, m): 153 | expm_A = matrix_poly(A*A, m) 154 | h_A = torch.trace(expm_A) - m 155 | return h_A 156 | 157 | prox_plus = torch.nn.Threshold(0.,0.) 158 | 159 | def stau(w, tau): 160 | w1 = prox_plus(torch.abs(w)-tau) 161 | return torch.sign(w)*w1 162 | 163 | 164 | def update_optimizer(optimizer, original_lr, c_A): 165 | '''related LR to c_A, whenever c_A gets big, reduce LR proportionally''' 166 | MAX_LR = 1e-2 167 | MIN_LR = 1e-4 168 | 169 | estimated_lr = original_lr / (math.log10(c_A) + 1e-10) 170 | if estimated_lr > MAX_LR: 171 | lr = MAX_LR 172 | elif estimated_lr < MIN_LR: 173 | lr = MIN_LR 174 | else: 175 | lr = estimated_lr 176 | 177 | # set LR 178 | for parame_group in optimizer.param_groups: 179 | parame_group['lr'] = lr 180 | 181 | return optimizer, lr 182 | 183 | #=================================== 184 | # training: 185 | #=================================== 186 | def train(epoch, best_val_loss, lambda_A, c_A, optimizer): 187 | t = time.time() 188 | nll_train = [] 189 | kl_train = [] 190 | mse_train = [] 191 | shd_trian = [] 192 | 193 | encoder.train() 194 | decoder.train() 195 | scheduler.step() 196 | 197 | # update optimizer 198 | optimizer, lr = update_optimizer(optimizer, CONFIG.lr, c_A) 199 | 200 | for i in range(1): 201 | data = train_data[i*data_sample_size:(i+1)*data_sample_size] 202 | data = torch.tensor(data.to_numpy().reshape(data_sample_size,data_variable_size,1)) 203 | if CONFIG.cuda: 204 | data = data.cuda() 205 | data = Variable(data).double() 206 | 207 | optimizer.zero_grad() 208 | 209 | enc_x, logits, origin_A, adj_A_tilt_encoder, z_gap, z_positive, myA, Wa = encoder(data) # logits is of size: [num_sims, z_dims] 210 | edges = logits 211 | #print(origin_A) 212 | dec_x, output, adj_A_tilt_decoder = decoder(data, edges, data_variable_size * CONFIG.x_dims, origin_A, adj_A_tilt_encoder, Wa) 213 | 214 | if torch.sum(output != output): 215 | print('nan error\n') 216 | 217 | target = data 218 | preds = output 219 | variance = 0. 
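# Loss overview: the code below builds the ELBO (reconstruction NLL + KL) and then
# augments it with an L1 sparsity term tau_A * sum(|A|), the penalty
# lambda_A * h(A) + 0.5 * c_A * h(A)^2 on the constraint value h(A) computed by _h_A
# above, a 100 * trace(A*A) term that suppresses diagonal (self-loop) entries, and the
# optional A_connect / A_positive terms gated by the CONFIG flags.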
220 | 221 | # reconstruction accuracy loss 222 | loss_nll = nll_gaussian(preds, target, variance) 223 | 224 | # KL loss 225 | loss_kl = kl_gaussian_sem(logits) 226 | 227 | # ELBO loss: 228 | loss = loss_kl + loss_nll 229 | # add A loss 230 | one_adj_A = origin_A # torch.mean(adj_A_tilt_decoder, dim =0) 231 | sparse_loss = CONFIG.tau_A * torch.sum(torch.abs(one_adj_A)) 232 | 233 | # other loss term 234 | if CONFIG.use_A_connect_loss: 235 | connect_gap = A_connect_loss(one_adj_A, CONFIG.graph_threshold, z_gap) 236 | loss += lambda_A * connect_gap + 0.5 * c_A * connect_gap * connect_gap 237 | 238 | if CONFIG.use_A_positiver_loss: 239 | positive_gap = A_positive_loss(one_adj_A, z_positive) 240 | loss += .1 * (lambda_A * positive_gap + 0.5 * c_A * positive_gap * positive_gap) 241 | 242 | # compute h(A) 243 | h_A = _h_A(origin_A, data_variable_size) 244 | loss += lambda_A * h_A + 0.5 * c_A * h_A * h_A + 100. * torch.trace(origin_A*origin_A) + sparse_loss #+ 0.01 * torch.sum(variance * variance) 245 | 246 | #print(loss) 247 | loss.backward() 248 | loss = optimizer.step() 249 | 250 | myA.data = stau(myA.data, CONFIG.tau_A*lr) 251 | 252 | if torch.sum(origin_A != origin_A): 253 | print('nan error\n') 254 | 255 | # compute metrics 256 | graph = origin_A.data.clone().cpu().numpy() 257 | graph[np.abs(graph) < CONFIG.graph_threshold] = 0 258 | 259 | mse_train.append(F.mse_loss(preds, target).item()) 260 | nll_train.append(loss_nll.item()) 261 | kl_train.append(loss_kl.item()) 262 | 263 | return np.mean(np.mean(kl_train) + np.mean(nll_train)), np.mean(nll_train), np.mean(mse_train), graph, origin_A 264 | 265 | #=================================== 266 | # main 267 | #=================================== 268 | 269 | gamma = args.gamma 270 | eta = args.eta 271 | 272 | t_total = time.time() 273 | best_ELBO_loss = np.inf 274 | best_NLL_loss = np.inf 275 | best_MSE_loss = np.inf 276 | best_epoch = 0 277 | best_ELBO_graph = [] 278 | best_NLL_graph = [] 279 | best_MSE_graph = [] 280 | # optimizer step on hyparameters 281 | c_A = CONFIG.c_A 282 | lambda_A = CONFIG.lambda_A 283 | h_A_new = torch.tensor(1.) 
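# Outer augmented-Lagrangian schedule (controlled by --gamma and --eta, which the
# test_*_per.sh scripts sweep): after each inner training run, h(A) is re-evaluated;
# while it has not dropped to at most gamma * (previous h(A)), the quadratic penalty
# weight c_A is multiplied by eta and training repeats. Once that inner loop exits,
# lambda_A is increased by c_A * h(A), and the whole procedure stops when
# h(A) <= h_tol or k_max_iter outer steps have run.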
284 | h_tol = CONFIG.h_tol 285 | k_max_iter = int(CONFIG.k_max_iter) 286 | h_A_old = np.inf 287 | 288 | E_loss = [] 289 | N_loss = [] 290 | M_loss = [] 291 | start_time = time.time() 292 | try: 293 | for step_k in range(k_max_iter): 294 | #print(step_k) 295 | while c_A < 1e+20: 296 | for epoch in range(CONFIG.epochs): 297 | #print(epoch) 298 | ELBO_loss, NLL_loss, MSE_loss, graph, origin_A = train(epoch, best_ELBO_loss, lambda_A, c_A, optimizer) 299 | E_loss.append(ELBO_loss) 300 | N_loss.append(NLL_loss) 301 | M_loss.append(MSE_loss) 302 | if ELBO_loss < best_ELBO_loss: 303 | best_ELBO_loss = ELBO_loss 304 | best_epoch = epoch 305 | best_ELBO_graph = graph 306 | 307 | if NLL_loss < best_NLL_loss: 308 | best_NLL_loss = NLL_loss 309 | best_epoch = epoch 310 | best_NLL_graph = graph 311 | 312 | if MSE_loss < best_MSE_loss: 313 | best_MSE_loss = MSE_loss 314 | best_epoch = epoch 315 | best_MSE_graph = graph 316 | 317 | #print("Optimization Finished!") 318 | #print("Best Epoch: {:04d}".format(best_epoch)) 319 | if ELBO_loss > 2 * best_ELBO_loss: 320 | break 321 | 322 | # update parameters 323 | A_new = origin_A.data.clone() 324 | h_A_new = _h_A(A_new, data_variable_size) 325 | if h_A_new.item() > gamma * h_A_old: 326 | c_A*=eta 327 | else: 328 | break 329 | 330 | # update parameters 331 | # h_A, adj_A are computed in loss anyway, so no need to store 332 | h_A_old = h_A_new.item() 333 | lambda_A += c_A * h_A_new.item() 334 | 335 | if h_A_new.item() <= h_tol: 336 | break 337 | 338 | #print("Steps: {:04d}".format(step_k)) 339 | #print("Best Epoch: {:04d}".format(best_epoch)) 340 | 341 | # test() 342 | #print (best_ELBO_graph) 343 | #print(best_NLL_graph) 344 | #print (best_MSE_graph) 345 | 346 | graph = origin_A.data.clone().cpu().numpy() 347 | graph[np.abs(graph) < 0.1] = 0 348 | graph[np.abs(graph) < 0.2] = 0 349 | graph[np.abs(graph) < 0.3] = 0 350 | 351 | except KeyboardInterrupt: 352 | print('Done!') 353 | 354 | end_time = time.time() 355 | #print("Time spent: ",end_time-start_time) 356 | print(names[idx]) 357 | adj = graph 358 | #print(adj) 359 | org_G = nx.from_numpy_matrix(adj, parallel_edges=True, create_using=nx.DiGraph) 360 | pos=nx.circular_layout(org_G) 361 | nx.draw(org_G, pos=pos, with_labels=True) 362 | plt.savefig("metrics_causality.png") 363 | 364 | # PageRank in networkx 365 | #G = nx.from_numpy_matrix(adj.T, parallel_edges=True, create_using=nx.DiGraph) 366 | #scores = nx.pagerank(G, max_iter=1000) 367 | #print(sorted(scores.items(), key=lambda item:item[1], reverse=True)) 368 | 369 | # PageRank 370 | from sknetwork.ranking import PageRank 371 | pagerank = PageRank() 372 | scores = pagerank.fit_transform(np.abs(adj.T)) 373 | #print(scores) 374 | #cmap = plt.cm.coolwarm 375 | 376 | score_dict = {} 377 | for i,s in enumerate(scores): 378 | score_dict[i] = s 379 | #print(sorted(score_dict.items(), key=lambda item:item[1], reverse=True)) 380 | sorted_dict = sorted(score_dict.items(), key=lambda item:item[1], reverse=True) 381 | for i in range(len(sorted_dict)): 382 | print(i+1, sorted_dict[i]) 383 | -------------------------------------------------------------------------------- /train_latency.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # -*- coding: utf-8 -*- 3 | """ 4 | Created on Sun Apr 10 20:33:33 2022 5 | 6 | @author: ruyuexin 7 | """ 8 | 9 | import time, datetime 10 | import requests 11 | import pickle as pkl 12 | import numpy as np 13 | import pandas as pd 14 | import matplotlib.pyplot as plt 15 | from 
tqdm.notebook import tqdm, trange 16 | import json 17 | import networkx as nx 18 | import os 19 | # import torch 20 | import torch.optim as optim 21 | from torch.optim import lr_scheduler 22 | import math 23 | 24 | # import numpy as np 25 | from utils import * 26 | from modules import * 27 | #from paras import * 28 | from config import CONFIG 29 | 30 | import warnings 31 | warnings.filterwarnings('ignore') 32 | 33 | import argparse 34 | 35 | parser = argparse.ArgumentParser() 36 | parser.add_argument('--indx', type=int, default=0, help='index') 37 | parser.add_argument('--atype', type=str, default='cpu-hog1_', help='anomaly type') 38 | parser.add_argument('--gamma', type=float, default=0.25, help='gamma') 39 | parser.add_argument('--eta', type=int, default=10, help='eta') 40 | args = parser.parse_args() 41 | 42 | CONFIG.cuda = torch.cuda.is_available() 43 | CONFIG.factor = not CONFIG.no_factor 44 | 45 | # torch.manual_seed(CONFIG.seed) 46 | # if CONFIG.cuda: 47 | # torch.cuda.manual_seed(CONFIG.seed) 48 | 49 | from sklearn.metrics import confusion_matrix 50 | from sklearn.metrics import precision_recall_fscore_support 51 | 52 | #f = open('./data_old/collected_data_n_shipping_mem_rt.pkl', 'rb') 53 | 54 | names = ['front-end', 'user', 'catalogue', 'orders', 'carts', 'payment', 'shipping'] 55 | metrics = ['ctn_latency', 'ctn_cpu', 'ctn_mem', 'ctn_write', 'ctn_read', 'ctn_net_in', 'ctn_net_out'] 56 | 57 | idx = args.indx 58 | atype = args.atype 59 | #idx = 3 60 | f = open('./data_collected/'+atype+names[idx]+'.pkl', 'rb') 61 | #f = open('without-stress.pkl', 'rb') 62 | all_data = pkl.load(f) 63 | 64 | name=[i+'_'+'ctn_latency' for i in names] 65 | data = all_data[name] 66 | 67 | #data = data.iloc[:,1:] 68 | data_sample_size = data.shape[0] 69 | data_variable_size = data.shape[1] 70 | # torch.manual_seed(CONFIG.seed) 71 | # if CONFIG.cuda: 72 | # torch.cuda.manual_seed(CONFIG.seed) 73 | 74 | # ================================================ 75 | # get data: experiments = {synthetic SEM, ALARM} 76 | # ================================================ 77 | train_data = data 78 | 79 | #=================================== 80 | # load modules 81 | #=================================== 82 | # Generate off-diagonal interaction graph 83 | off_diag = np.ones([data_variable_size, data_variable_size]) - np.eye(data_variable_size) 84 | 85 | # add adjacency matrix A 86 | num_nodes = data_variable_size 87 | adj_A = np.zeros((num_nodes, num_nodes)) 88 | 89 | 90 | if CONFIG.encoder == 'mlp': 91 | encoder = MLPEncoder(data_variable_size * CONFIG.x_dims, CONFIG.x_dims, CONFIG.encoder_hidden, 92 | int(CONFIG.z_dims), adj_A, 93 | batch_size = CONFIG.batch_size, 94 | do_prob = CONFIG.encoder_dropout, factor = CONFIG.factor).double() 95 | elif CONFIG.encoder == 'sem': 96 | encoder = SEMEncoder(data_variable_size * CONFIG.x_dims, CONFIG.encoder_hidden, 97 | int(CONFIG.z_dims), adj_A, 98 | batch_size = CONFIG.batch_size, 99 | do_prob = CONFIG.encoder_dropout, factor = CONFIG.factor).double() 100 | 101 | if CONFIG.decoder == 'mlp': 102 | decoder = MLPDecoder(data_variable_size * CONFIG.x_dims, 103 | CONFIG.z_dims, CONFIG.x_dims, encoder, 104 | data_variable_size = data_variable_size, 105 | batch_size = CONFIG.batch_size, 106 | n_hid=CONFIG.decoder_hidden, 107 | do_prob=CONFIG.decoder_dropout).double() 108 | elif CONFIG.decoder == 'sem': 109 | decoder = SEMDecoder(data_variable_size * CONFIG.x_dims, 110 | CONFIG.z_dims, 2, encoder, 111 | data_variable_size = data_variable_size, 112 | batch_size = CONFIG.batch_size, 113 | 
n_hid=CONFIG.decoder_hidden, 114 | do_prob=CONFIG.decoder_dropout).double() 115 | 116 | #=================================== 117 | # set up training parameters 118 | #=================================== 119 | if CONFIG.optimizer == 'Adam': 120 | optimizer = optim.Adam(list(encoder.parameters()) + list(decoder.parameters()),lr=CONFIG.lr) 121 | elif CONFIG.optimizer == 'LBFGS': 122 | optimizer = optim.LBFGS(list(encoder.parameters()) + list(decoder.parameters()), 123 | lr=CONFIG.lr) 124 | elif CONFIG.optimizer == 'SGD': 125 | optimizer = optim.SGD(list(encoder.parameters()) + list(decoder.parameters()), 126 | lr=CONFIG.lr) 127 | 128 | scheduler = lr_scheduler.StepLR(optimizer, step_size=CONFIG.lr_decay, 129 | gamma=CONFIG.gamma) 130 | 131 | # Linear indices of an upper triangular mx, used for acc calculation 132 | triu_indices = get_triu_offdiag_indices(data_variable_size) 133 | tril_indices = get_tril_offdiag_indices(data_variable_size) 134 | 135 | if CONFIG.prior: 136 | prior = np.array([0.91, 0.03, 0.03, 0.03]) # hard coded for now 137 | print("Using prior") 138 | print(prior) 139 | log_prior = torch.DoubleTensor(np.log(prior)) 140 | log_prior = torch.unsqueeze(log_prior, 0) 141 | log_prior = torch.unsqueeze(log_prior, 0) 142 | log_prior = Variable(log_prior) 143 | 144 | if CONFIG.cuda: 145 | log_prior = log_prior.cuda() 146 | 147 | if CONFIG.cuda: 148 | encoder.cuda() 149 | decoder.cuda() 150 | triu_indices = triu_indices.cuda() 151 | tril_indices = tril_indices.cuda() 152 | 153 | # compute constraint h(A) value 154 | def _h_A(A, m): 155 | expm_A = matrix_poly(A*A, m) 156 | h_A = torch.trace(expm_A) - m 157 | return h_A 158 | 159 | prox_plus = torch.nn.Threshold(0.,0.) 160 | 161 | def stau(w, tau): 162 | w1 = prox_plus(torch.abs(w)-tau) 163 | return torch.sign(w)*w1 164 | 165 | 166 | def update_optimizer(optimizer, original_lr, c_A): 167 | '''related LR to c_A, whenever c_A gets big, reduce LR proportionally''' 168 | MAX_LR = 1e-2 169 | MIN_LR = 1e-4 170 | 171 | estimated_lr = original_lr / (math.log10(c_A) + 1e-10) 172 | if estimated_lr > MAX_LR: 173 | lr = MAX_LR 174 | elif estimated_lr < MIN_LR: 175 | lr = MIN_LR 176 | else: 177 | lr = estimated_lr 178 | 179 | # set LR 180 | for parame_group in optimizer.param_groups: 181 | parame_group['lr'] = lr 182 | 183 | return optimizer, lr 184 | 185 | #=================================== 186 | # training: 187 | #=================================== 188 | def train(epoch, best_val_loss, lambda_A, c_A, optimizer): 189 | t = time.time() 190 | nll_train = [] 191 | kl_train = [] 192 | mse_train = [] 193 | shd_trian = [] 194 | 195 | encoder.train() 196 | decoder.train() 197 | scheduler.step() 198 | 199 | # update optimizer 200 | optimizer, lr = update_optimizer(optimizer, CONFIG.lr, c_A) 201 | 202 | for i in range(1): 203 | data = train_data[i*data_sample_size:(i+1)*data_sample_size] 204 | data = torch.tensor(data.to_numpy().reshape(data_sample_size,data_variable_size,1)) 205 | if CONFIG.cuda: 206 | data = data.cuda() 207 | data = Variable(data).double() 208 | 209 | optimizer.zero_grad() 210 | 211 | enc_x, logits, origin_A, adj_A_tilt_encoder, z_gap, z_positive, myA, Wa = encoder(data) # logits is of size: [num_sims, z_dims] 212 | edges = logits 213 | #print(origin_A) 214 | dec_x, output, adj_A_tilt_decoder = decoder(data, edges, data_variable_size * CONFIG.x_dims, origin_A, adj_A_tilt_encoder, Wa) 215 | 216 | if torch.sum(output != output): 217 | print('nan error\n') 218 | 219 | target = data 220 | preds = output 221 | variance = 0. 
222 | 223 | # reconstruction accuracy loss 224 | loss_nll = nll_gaussian(preds, target, variance) 225 | 226 | # KL loss 227 | loss_kl = kl_gaussian_sem(logits) 228 | 229 | # ELBO loss: 230 | loss = loss_kl + loss_nll 231 | # add A loss 232 | one_adj_A = origin_A # torch.mean(adj_A_tilt_decoder, dim =0) 233 | sparse_loss = CONFIG.tau_A * torch.sum(torch.abs(one_adj_A)) 234 | 235 | # other loss term 236 | if CONFIG.use_A_connect_loss: 237 | connect_gap = A_connect_loss(one_adj_A, CONFIG.graph_threshold, z_gap) 238 | loss += lambda_A * connect_gap + 0.5 * c_A * connect_gap * connect_gap 239 | 240 | if CONFIG.use_A_positiver_loss: 241 | positive_gap = A_positive_loss(one_adj_A, z_positive) 242 | loss += .1 * (lambda_A * positive_gap + 0.5 * c_A * positive_gap * positive_gap) 243 | 244 | # compute h(A) 245 | h_A = _h_A(origin_A, data_variable_size) 246 | loss += lambda_A * h_A + 0.5 * c_A * h_A * h_A + 100. * torch.trace(origin_A*origin_A) + sparse_loss #+ 0.01 * torch.sum(variance * variance) 247 | 248 | #print(loss) 249 | loss.backward() 250 | loss = optimizer.step() 251 | 252 | myA.data = stau(myA.data, CONFIG.tau_A*lr) 253 | 254 | if torch.sum(origin_A != origin_A): 255 | print('nan error\n') 256 | 257 | # compute metrics 258 | graph = origin_A.data.clone().cpu().numpy() 259 | graph[np.abs(graph) < CONFIG.graph_threshold] = 0 260 | 261 | mse_train.append(F.mse_loss(preds, target).item()) 262 | nll_train.append(loss_nll.item()) 263 | kl_train.append(loss_kl.item()) 264 | 265 | return np.mean(np.mean(kl_train) + np.mean(nll_train)), np.mean(nll_train), np.mean(mse_train), graph, origin_A 266 | 267 | #=================================== 268 | # main 269 | #=================================== 270 | 271 | gamma = args.gamma 272 | eta = args.eta 273 | 274 | t_total = time.time() 275 | best_ELBO_loss = np.inf 276 | best_NLL_loss = np.inf 277 | best_MSE_loss = np.inf 278 | best_epoch = 0 279 | best_ELBO_graph = [] 280 | best_NLL_graph = [] 281 | best_MSE_graph = [] 282 | # optimizer step on hyparameters 283 | c_A = CONFIG.c_A 284 | lambda_A = CONFIG.lambda_A 285 | h_A_new = torch.tensor(1.) 
286 | h_tol = CONFIG.h_tol 287 | k_max_iter = int(CONFIG.k_max_iter) 288 | h_A_old = np.inf 289 | 290 | E_loss = [] 291 | N_loss = [] 292 | M_loss = [] 293 | start_time = time.time() 294 | try: 295 | for step_k in range(k_max_iter): 296 | #print(step_k) 297 | while c_A < 1e+20: 298 | for epoch in range(CONFIG.epochs): 299 | #print(epoch) 300 | ELBO_loss, NLL_loss, MSE_loss, graph, origin_A = train(epoch, best_ELBO_loss, lambda_A, c_A, optimizer) 301 | E_loss.append(ELBO_loss) 302 | N_loss.append(NLL_loss) 303 | M_loss.append(MSE_loss) 304 | if ELBO_loss < best_ELBO_loss: 305 | best_ELBO_loss = ELBO_loss 306 | best_epoch = epoch 307 | best_ELBO_graph = graph 308 | 309 | if NLL_loss < best_NLL_loss: 310 | best_NLL_loss = NLL_loss 311 | best_epoch = epoch 312 | best_NLL_graph = graph 313 | 314 | if MSE_loss < best_MSE_loss: 315 | best_MSE_loss = MSE_loss 316 | best_epoch = epoch 317 | best_MSE_graph = graph 318 | 319 | #print("Optimization Finished!") 320 | #print("Best Epoch: {:04d}".format(best_epoch)) 321 | if ELBO_loss > 2 * best_ELBO_loss: 322 | break 323 | 324 | # update parameters 325 | A_new = origin_A.data.clone() 326 | h_A_new = _h_A(A_new, data_variable_size) 327 | if h_A_new.item() > gamma * h_A_old: 328 | c_A*=eta 329 | else: 330 | break 331 | 332 | # update parameters 333 | # h_A, adj_A are computed in loss anyway, so no need to store 334 | h_A_old = h_A_new.item() 335 | lambda_A += c_A * h_A_new.item() 336 | 337 | if h_A_new.item() <= h_tol: 338 | break 339 | 340 | #print("Steps: {:04d}".format(step_k)) 341 | #print("Best Epoch: {:04d}".format(best_epoch)) 342 | 343 | # test() 344 | #print (best_ELBO_graph) 345 | #print(best_NLL_graph) 346 | #print (best_MSE_graph) 347 | 348 | graph = origin_A.data.clone().cpu().numpy() 349 | graph[np.abs(graph) < 0.1] = 0 350 | graph[np.abs(graph) < 0.2] = 0 351 | graph[np.abs(graph) < 0.3] = 0 352 | 353 | except KeyboardInterrupt: 354 | print('Done!') 355 | 356 | end_time = time.time() 357 | #print("Time spent: ",end_time-start_time) 358 | print(names[idx]) 359 | adj = graph 360 | #print(adj) 361 | org_G = nx.from_numpy_matrix(adj, parallel_edges=True, create_using=nx.DiGraph) 362 | pos=nx.circular_layout(org_G) 363 | nx.draw(org_G, pos=pos, with_labels=True) 364 | plt.savefig("metrics_causality.png") 365 | 366 | f = open('cpu-hog_front-end_adj.pkl', 'wb') 367 | pkl.dump(adj, f) 368 | 369 | # PageRank in networkx 370 | #G = nx.from_numpy_matrix(adj.T, parallel_edges=True, create_using=nx.DiGraph) 371 | #scores = nx.pagerank(G, max_iter=1000) 372 | #print(sorted(scores.items(), key=lambda item:item[1], reverse=True)) 373 | 374 | # PageRank 375 | from sknetwork.ranking import PageRank 376 | pagerank = PageRank() 377 | scores = pagerank.fit_transform(np.abs(adj.T)) # add abd 378 | #print(scores) 379 | #cmap = plt.cm.coolwarm 380 | 381 | score_dict = {} 382 | for i,s in enumerate(scores): 383 | score_dict[i] = s 384 | print(sorted(score_dict.items(), key=lambda item:item[1], reverse=True)) 385 | -------------------------------------------------------------------------------- /train_single_service.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python3 2 | # -*- coding: utf-8 -*- 3 | """ 4 | Created on Sun Apr 10 20:33:33 2022 5 | 6 | @author: ruyuexin 7 | """ 8 | 9 | import time, datetime 10 | import requests 11 | import pickle as pkl 12 | import numpy as np 13 | import pandas as pd 14 | import matplotlib.pyplot as plt 15 | from tqdm.notebook import tqdm, trange 16 | import json 17 | import 
networkx as nx 18 | import os 19 | # import torch 20 | import torch.optim as optim 21 | from torch.optim import lr_scheduler 22 | import math 23 | 24 | # import numpy as np 25 | from utils import * 26 | from modules import * 27 | #from paras import * 28 | from config import CONFIG 29 | 30 | import warnings 31 | warnings.filterwarnings('ignore') 32 | 33 | import argparse 34 | 35 | parser = argparse.ArgumentParser() 36 | parser.add_argument('--indx', type=int, default=0, help='index') 37 | parser.add_argument('--atype', type=str, default='cpu-hog1_', help='anomaly type') 38 | parser.add_argument('--gamma', type=float, default=0.25, help='gamma') 39 | parser.add_argument('--eta', type=int, default=10, help='eta') 40 | args = parser.parse_args() 41 | 42 | 43 | CONFIG.cuda = torch.cuda.is_available() 44 | CONFIG.factor = not CONFIG.no_factor 45 | 46 | # torch.manual_seed(CONFIG.seed) 47 | # if CONFIG.cuda: 48 | # torch.cuda.manual_seed(CONFIG.seed) 49 | 50 | from sklearn.metrics import confusion_matrix 51 | from sklearn.metrics import precision_recall_fscore_support 52 | 53 | #f = open('./data_old/collected_data_n_shipping_mem_rt.pkl', 'rb') 54 | 55 | names = ['front-end', 'user', 'catalogue', 'orders', 'carts', 'payment', 'shipping'] 56 | metrics = ['ctn_latency', 'ctn_cpu', 'ctn_mem', 'ctn_write', 'ctn_read', 'ctn_net_in', 'ctn_net_out'] 57 | 58 | idx = args.indx 59 | atype = args.atype 60 | 61 | #idx = 6 62 | 63 | f = open('./data_collected/'+atype+names[idx]+'.pkl', 'rb') 64 | all_data = pkl.load(f) 65 | 66 | name=[] 67 | for i in metrics: 68 | n = names[idx]+'_'+i 69 | try: 70 | if len(all_data[[n]]) != 0 : 71 | name.append(n) 72 | except: 73 | print(n) 74 | data = all_data[name] 75 | 76 | #data = data.iloc[:,1:] 77 | data_sample_size = data.shape[0] 78 | data_variable_size = data.shape[1] 79 | # torch.manual_seed(CONFIG.seed) 80 | # if CONFIG.cuda: 81 | # torch.cuda.manual_seed(CONFIG.seed) 82 | 83 | # ================================================ 84 | # get data: experiments = {synthetic SEM, ALARM} 85 | # ================================================ 86 | train_data = data 87 | 88 | #=================================== 89 | # load modules 90 | #=================================== 91 | # Generate off-diagonal interaction graph 92 | off_diag = np.ones([data_variable_size, data_variable_size]) - np.eye(data_variable_size) 93 | 94 | # add adjacency matrix A 95 | num_nodes = data_variable_size 96 | adj_A = np.zeros((num_nodes, num_nodes)) 97 | 98 | 99 | if CONFIG.encoder == 'mlp': 100 | encoder = MLPEncoder(data_variable_size * CONFIG.x_dims, CONFIG.x_dims, CONFIG.encoder_hidden, 101 | int(CONFIG.z_dims), adj_A, 102 | batch_size = CONFIG.batch_size, 103 | do_prob = CONFIG.encoder_dropout, factor = CONFIG.factor).double() 104 | elif CONFIG.encoder == 'sem': 105 | encoder = SEMEncoder(data_variable_size * CONFIG.x_dims, CONFIG.encoder_hidden, 106 | int(CONFIG.z_dims), adj_A, 107 | batch_size = CONFIG.batch_size, 108 | do_prob = CONFIG.encoder_dropout, factor = CONFIG.factor).double() 109 | 110 | if CONFIG.decoder == 'mlp': 111 | decoder = MLPDecoder(data_variable_size * CONFIG.x_dims, 112 | CONFIG.z_dims, CONFIG.x_dims, encoder, 113 | data_variable_size = data_variable_size, 114 | batch_size = CONFIG.batch_size, 115 | n_hid=CONFIG.decoder_hidden, 116 | do_prob=CONFIG.decoder_dropout).double() 117 | elif CONFIG.decoder == 'sem': 118 | decoder = SEMDecoder(data_variable_size * CONFIG.x_dims, 119 | CONFIG.z_dims, 2, encoder, 120 | data_variable_size = data_variable_size, 121 | batch_size = 
CONFIG.batch_size, 122 | n_hid=CONFIG.decoder_hidden, 123 | do_prob=CONFIG.decoder_dropout).double() 124 | 125 | #=================================== 126 | # set up training parameters 127 | #=================================== 128 | if CONFIG.optimizer == 'Adam': 129 | optimizer = optim.Adam(list(encoder.parameters()) + list(decoder.parameters()),lr=CONFIG.lr) 130 | elif CONFIG.optimizer == 'LBFGS': 131 | optimizer = optim.LBFGS(list(encoder.parameters()) + list(decoder.parameters()), 132 | lr=CONFIG.lr) 133 | elif CONFIG.optimizer == 'SGD': 134 | optimizer = optim.SGD(list(encoder.parameters()) + list(decoder.parameters()), 135 | lr=CONFIG.lr) 136 | 137 | scheduler = lr_scheduler.StepLR(optimizer, step_size=CONFIG.lr_decay, 138 | gamma=CONFIG.gamma) 139 | 140 | # Linear indices of an upper triangular mx, used for acc calculation 141 | triu_indices = get_triu_offdiag_indices(data_variable_size) 142 | tril_indices = get_tril_offdiag_indices(data_variable_size) 143 | 144 | if CONFIG.prior: 145 | prior = np.array([0.91, 0.03, 0.03, 0.03]) # hard coded for now 146 | print("Using prior") 147 | print(prior) 148 | log_prior = torch.DoubleTensor(np.log(prior)) 149 | log_prior = torch.unsqueeze(log_prior, 0) 150 | log_prior = torch.unsqueeze(log_prior, 0) 151 | log_prior = Variable(log_prior) 152 | 153 | if CONFIG.cuda: 154 | log_prior = log_prior.cuda() 155 | 156 | if CONFIG.cuda: 157 | encoder.cuda() 158 | decoder.cuda() 159 | triu_indices = triu_indices.cuda() 160 | tril_indices = tril_indices.cuda() 161 | 162 | # compute constraint h(A) value 163 | def _h_A(A, m): 164 | expm_A = matrix_poly(A*A, m) 165 | h_A = torch.trace(expm_A) - m 166 | return h_A 167 | 168 | prox_plus = torch.nn.Threshold(0.,0.) 169 | 170 | def stau(w, tau): 171 | w1 = prox_plus(torch.abs(w)-tau) 172 | return torch.sign(w)*w1 173 | 174 | 175 | def update_optimizer(optimizer, original_lr, c_A): 176 | '''related LR to c_A, whenever c_A gets big, reduce LR proportionally''' 177 | MAX_LR = 1e-2 178 | MIN_LR = 1e-4 179 | 180 | estimated_lr = original_lr / (math.log10(c_A) + 1e-10) 181 | if estimated_lr > MAX_LR: 182 | lr = MAX_LR 183 | elif estimated_lr < MIN_LR: 184 | lr = MIN_LR 185 | else: 186 | lr = estimated_lr 187 | 188 | # set LR 189 | for parame_group in optimizer.param_groups: 190 | parame_group['lr'] = lr 191 | 192 | return optimizer, lr 193 | 194 | #=================================== 195 | # training: 196 | #=================================== 197 | def train(epoch, best_val_loss, lambda_A, c_A, optimizer): 198 | t = time.time() 199 | nll_train = [] 200 | kl_train = [] 201 | mse_train = [] 202 | shd_trian = [] 203 | 204 | encoder.train() 205 | decoder.train() 206 | scheduler.step() 207 | 208 | # update optimizer 209 | optimizer, lr = update_optimizer(optimizer, CONFIG.lr, c_A) 210 | 211 | for i in range(1): 212 | data = train_data[i*data_sample_size:(i+1)*data_sample_size] 213 | data = torch.tensor(data.to_numpy().reshape(data_sample_size,data_variable_size,1)) 214 | if CONFIG.cuda: 215 | data = data.cuda() 216 | data = Variable(data).double() 217 | 218 | optimizer.zero_grad() 219 | 220 | enc_x, logits, origin_A, adj_A_tilt_encoder, z_gap, z_positive, myA, Wa = encoder(data) # logits is of size: [num_sims, z_dims] 221 | edges = logits 222 | #print(origin_A) 223 | dec_x, output, adj_A_tilt_decoder = decoder(data, edges, data_variable_size * CONFIG.x_dims, origin_A, adj_A_tilt_encoder, Wa) 224 | 225 | if torch.sum(output != output): 226 | print('nan error\n') 227 | 228 | target = data 229 | preds = output 230 | 
variance = 0. 231 | 232 | # reconstruction accuracy loss 233 | loss_nll = nll_gaussian(preds, target, variance) 234 | 235 | # KL loss 236 | loss_kl = kl_gaussian_sem(logits) 237 | 238 | # ELBO loss: 239 | loss = loss_kl + loss_nll 240 | # add A loss 241 | one_adj_A = origin_A # torch.mean(adj_A_tilt_decoder, dim =0) 242 | sparse_loss = CONFIG.tau_A * torch.sum(torch.abs(one_adj_A)) 243 | 244 | # other loss term 245 | if CONFIG.use_A_connect_loss: 246 | connect_gap = A_connect_loss(one_adj_A, CONFIG.graph_threshold, z_gap) 247 | loss += lambda_A * connect_gap + 0.5 * c_A * connect_gap * connect_gap 248 | 249 | if CONFIG.use_A_positiver_loss: 250 | positive_gap = A_positive_loss(one_adj_A, z_positive) 251 | loss += .1 * (lambda_A * positive_gap + 0.5 * c_A * positive_gap * positive_gap) 252 | 253 | # compute h(A) 254 | h_A = _h_A(origin_A, data_variable_size) 255 | loss += lambda_A * h_A + 0.5 * c_A * h_A * h_A + 100. * torch.trace(origin_A*origin_A) + sparse_loss #+ 0.01 * torch.sum(variance * variance) 256 | 257 | #print(loss) 258 | loss.backward() 259 | loss = optimizer.step() 260 | 261 | myA.data = stau(myA.data, CONFIG.tau_A*lr) 262 | 263 | if torch.sum(origin_A != origin_A): 264 | print('nan error\n') 265 | 266 | # compute metrics 267 | graph = origin_A.data.clone().cpu().numpy() 268 | graph[np.abs(graph) < CONFIG.graph_threshold] = 0 269 | 270 | mse_train.append(F.mse_loss(preds, target).item()) 271 | nll_train.append(loss_nll.item()) 272 | kl_train.append(loss_kl.item()) 273 | 274 | return np.mean(np.mean(kl_train) + np.mean(nll_train)), np.mean(nll_train), np.mean(mse_train), graph, origin_A 275 | 276 | #=================================== 277 | # main 278 | #=================================== 279 | 280 | gamma = args.gamma 281 | eta = args.eta 282 | 283 | t_total = time.time() 284 | best_ELBO_loss = np.inf 285 | best_NLL_loss = np.inf 286 | best_MSE_loss = np.inf 287 | best_epoch = 0 288 | best_ELBO_graph = [] 289 | best_NLL_graph = [] 290 | best_MSE_graph = [] 291 | # optimizer step on hyparameters 292 | c_A = CONFIG.c_A 293 | lambda_A = CONFIG.lambda_A 294 | h_A_new = torch.tensor(1.) 
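# Output layout: driven by test_single_service_per.sh, each invocation of this script
# contributes one block to pa_result/single_service/*.txt -- any per-service metric
# columns missing from the pickle (printed by the except branch above), then the
# service name, then the PageRank ranking printed at the end as (column index, score)
# pairs sorted by score; the run counters, timestamps, and the
# "-----one service finish-----" separators come from the shell script.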
295 | h_tol = CONFIG.h_tol 296 | k_max_iter = int(CONFIG.k_max_iter) 297 | h_A_old = np.inf 298 | 299 | E_loss = [] 300 | N_loss = [] 301 | M_loss = [] 302 | start_time = time.time() 303 | try: 304 | for step_k in range(k_max_iter): 305 | #print(step_k) 306 | while c_A < 1e+20: 307 | for epoch in range(CONFIG.epochs): 308 | #print(epoch) 309 | ELBO_loss, NLL_loss, MSE_loss, graph, origin_A = train(epoch, best_ELBO_loss, lambda_A, c_A, optimizer) 310 | E_loss.append(ELBO_loss) 311 | N_loss.append(NLL_loss) 312 | M_loss.append(MSE_loss) 313 | if ELBO_loss < best_ELBO_loss: 314 | best_ELBO_loss = ELBO_loss 315 | best_epoch = epoch 316 | best_ELBO_graph = graph 317 | 318 | if NLL_loss < best_NLL_loss: 319 | best_NLL_loss = NLL_loss 320 | best_epoch = epoch 321 | best_NLL_graph = graph 322 | 323 | if MSE_loss < best_MSE_loss: 324 | best_MSE_loss = MSE_loss 325 | best_epoch = epoch 326 | best_MSE_graph = graph 327 | 328 | #print("Optimization Finished!") 329 | #print("Best Epoch: {:04d}".format(best_epoch)) 330 | if ELBO_loss > 2 * best_ELBO_loss: 331 | break 332 | 333 | # update parameters 334 | A_new = origin_A.data.clone() 335 | h_A_new = _h_A(A_new, data_variable_size) 336 | if h_A_new.item() > gamma * h_A_old: 337 | c_A*=eta 338 | else: 339 | break 340 | 341 | # update parameters 342 | # h_A, adj_A are computed in loss anyway, so no need to store 343 | h_A_old = h_A_new.item() 344 | lambda_A += c_A * h_A_new.item() 345 | 346 | if h_A_new.item() <= h_tol: 347 | break 348 | 349 | #print("Steps: {:04d}".format(step_k)) 350 | #print("Best Epoch: {:04d}".format(best_epoch)) 351 | 352 | # test() 353 | #print (best_ELBO_graph) 354 | #print(best_NLL_graph) 355 | #print (best_MSE_graph) 356 | 357 | graph = origin_A.data.clone().cpu().numpy() 358 | graph[np.abs(graph) < 0.1] = 0 359 | graph[np.abs(graph) < 0.2] = 0 360 | graph[np.abs(graph) < 0.3] = 0 361 | 362 | except KeyboardInterrupt: 363 | print('Done!') 364 | 365 | end_time = time.time() 366 | #print("Time spent: ",end_time-start_time) 367 | print(names[idx]) 368 | adj = graph 369 | #print(adj) 370 | org_G = nx.from_numpy_matrix(adj, parallel_edges=True, create_using=nx.DiGraph) 371 | pos=nx.circular_layout(org_G) 372 | nx.draw(org_G, pos=pos, with_labels=True) 373 | plt.savefig("metrics_causality.png") 374 | 375 | # PageRank in networkx 376 | #G = nx.from_numpy_matrix(adj.T, parallel_edges=True, create_using=nx.DiGraph) 377 | #scores = nx.pagerank(G, max_iter=1000) 378 | #print(sorted(scores.items(), key=lambda item:item[1], reverse=True)) 379 | 380 | # PageRank 381 | from sknetwork.ranking import PageRank 382 | pagerank = PageRank() 383 | scores = pagerank.fit_transform(np.abs(adj.T)) 384 | #print(scores) 385 | #cmap = plt.cm.coolwarm 386 | 387 | score_dict = {} 388 | for i,s in enumerate(scores): 389 | score_dict[i] = s 390 | print(sorted(score_dict.items(), key=lambda item:item[1], reverse=True)) 391 | --------------------------------------------------------------------------------
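The rankings produced by the training scripts (and recorded in the pa_result/*.txt files above) are lists of (column index, score) pairs, so reading them requires knowing which metric column each index denotes. The following sketch is a hypothetical helper that is not part of the repository: it rebuilds the column order used by train_single_service.py for one service, assuming the pickles in data_collected hold pandas DataFrames with one column per collected metric (which is how the training scripts index them); the name ranked_metric_names and the example call are illustrative only.

```
import pickle as pkl

names = ['front-end', 'user', 'catalogue', 'orders', 'carts', 'payment', 'shipping']
metrics = ['ctn_latency', 'ctn_cpu', 'ctn_mem', 'ctn_write', 'ctn_read', 'ctn_net_in', 'ctn_net_out']

def ranked_metric_names(idx, atype, ranking, data_dir='./data_collected'):
    """Map (column index, score) pairs back to metric column names for one service.

    Mirrors the column selection in train_single_service.py: keep the per-service
    columns that actually exist in the pickle, in `metrics` order.
    """
    with open(data_dir + '/' + atype + names[idx] + '.pkl', 'rb') as f:
        all_data = pkl.load(f)  # a pandas DataFrame, one column per collected metric
    cols = [names[idx] + '_' + m for m in metrics if names[idx] + '_' + m in all_data.columns]
    return [(cols[i], score) for i, score in ranking]

# Example: for a ranking line taken from a pa_result/single_service/*.txt block of
# service index 1 ('user') that was produced from memory-leak1_user.pkl, call e.g.
#   ranked_metric_names(1, 'memory-leak1_', [(2, 0.55), (4, 0.19), (3, 0.13), (0, 0.09), (1, 0.04)])
```

For train_latency.py the mapping is simpler: its columns are the seven per-service ctn_latency series in the order of `names`, so index i in its printed ranking corresponds directly to names[i]; for train_all_services.py the ranking runs over all columns of the loaded DataFrame in whatever order the pickle stores them.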