├── .gitignore
├── README.md
├── clustering.py
├── database
│   └── builder.py
├── demo.py
├── drawing.py
├── peewee_models.py
├── track.py
├── trajectory.py
└── trajectory_tracking.py
/.gitignore:
--------------------------------------------------------------------------------
1 | .idea/*
2 | *.csv
3 | *.txt
4 | *.db
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # TrajectoryTracking
2 | The main goal of this project is the cluster analysis of general 2D trajectories. In particular, the trajectories are acquired by sensors installed on the shopping carts in a supermarket during business hours. Analysing the trajectories of customers inside the target supermarket is useful for retailers to improve the shopping experience.
3 | 
4 | ## How does it work?
5 | ### 1) Database building
6 | The system collects the trajectories acquired by the shopping carts. These data are stored in a relational database.
7 | 
8 | #### Database-builder script
9 | Note 1: Before using the `builder.py` script, make sure that a folder called `sqlite` exists in the root folder of the `builder.py` script (just create it if necessary). It is the folder where you will find the resulting `.db` files (you can change the result folder name at any time by editing the `builder.py` script).
10 | 
11 | In the `builder.py` script there are two editable sections: the `DATASET` section and the `MODEL` section.
12 | * __Dataset section__: Here there are three fields to customize. `dataset_folder` is the folder containing the dataset file, `dataset_file` is the name of the dataset file, and `dataset_ext` specifies the text-file extension of the dataset.
13 | 
14 | * __Model section__: This is the core of the builder script. It is necessary to customize the `class` used to create the model. Then, in the body of the `build(line)` function, it is possible to specify how to save each `line` of the dataset (note that the `build` function must return an instance created from the class assigned to the `model` variable).
15 | 
16 | Note 2: Nothing else in the builder script needs to be edited, except the `DATASET` and `MODEL` sections.
17 | 
18 | ### 2) Identifying the origin area and trajectories
19 | The data collected in the database are extracted and analyzed. The positions are then used to identify the actual trajectories, divided into two categories. The first one is the `trajectory`: a run beginning in an area called the `origin area` (or, simply, `origin`), where almost every cart starts its run (it is usually the place of the supermarket where the carts are collected). In the second one, `sub-trajectories` are computed by taking a `trajectory` and breaking it into pieces that begin and end within some other areas, called `control areas` (or, simply, `controls`).
20 | 
21 | ![Market map](http://i.imgur.com/gQs181S.png)
22 | 
23 | Example of market map: Origin area and control areas highlighted
24 | 
25 | ### 3) Filtering
26 | Three filtering operations are applied, one during (_"positions out-of-bounds"_) and two after (_"densities of points"_ and _"Kalman filter"_) the trajectory identification process.
27 | * __Positions out-of-bounds__: Some cart positions may be located outside the bounds of the map (positions with negative x and/or y coordinates). Out-of-bounds positions are skipped ("jumped") during the trajectory identification process.
28 | * __Densities of points__: Due to the inaccuracy of the GPS or to the chaotic behaviour of the customers, some trajectories may be characterized by segments with a high density of sparse points within a small area. These segments are smoothed by removing the superfluous densities of points.
29 | * __Kalman filter__: A standard Kalman filter is then applied to each trajectory in order to smooth it, removing the noise and the inaccuracies of the geo-positioning system.
30 | 
31 | ### 4) Clustering
32 | Clustering is the final part of the process. The sub-trajectories are grouped into clusters, following two different algorithms inherited from the [TrajectoryClustering](https://github.com/bednarikjan/TrajectoryClustering) work by [Jan Bednarik](https://github.com/bednarikjan).
33 | 
34 | * __Agglomerative clustering__: A bottom-up approach to hierarchical clustering (also called hierarchical cluster analysis or HCA), a method of cluster analysis which seeks to build a hierarchy of clusters.
35 | In agglomerative clustering, each observation starts in its own cluster, and pairs of clusters are merged as one moves up the hierarchy.
36 | * __Spectral clustering__: The spectrum (eigenvalues) of the similarity matrix of the data is used to perform dimensionality reduction before clustering in fewer dimensions. The similarity matrix is provided as an input and consists of a quantitative assessment of the relative similarity of each pair of points in the dataset.
37 | 
38 | The innovation of this project mainly relies on the concepts of `macro-cluster` (a simplified version of the [Partition-and-Group Framework](http://hanj.cs.illinois.edu/pdf/sigmod07_jglee.pdf)) and `track` (an ordered set of sub-trajectories belonging to the same main trajectory but to different clusters). The macro-clustering process is implemented in order to find the most popular patterns in the supermarket by finding the most frequent patterns of tracks.
39 | 
40 | __Macro clustering__: A further clustering of sub-trajectories. After the individual (agglomerative or spectral) clustering, the sub-trajectories belonging to the same trajectory are combined in sequence, in order to recover the original and complete trajectory of the cart. In this way, the rebuilt trajectory becomes a `track` that belongs to a certain `macro-cluster`, that is, a `set of ordered (sub)clusters`. Tracks of the same macro-cluster are then gathered to determine which kinds of paths are most frequently followed in the supermarket. A minimal sketch of this counting step is shown below.
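
To make the `track`/`macro-cluster` idea concrete, here is a minimal, self-contained sketch of the counting step. It is not the project's actual code: the `group_tracks` and `macro_clusters` helpers and the sample data are hypothetical, and each sub-trajectory is assumed to be reduced to a `(track_id, cluster_idx)` pair.

```python
from collections import Counter


def group_tracks(sub_trajectories):
    """Group sub-trajectories by track id, preserving their order of arrival."""
    tracks = {}
    for track_id, cluster_idx in sub_trajectories:
        tracks.setdefault(track_id, []).append(cluster_idx)
    return tracks


def macro_clusters(tracks):
    """Count how many tracks share the same ordered sequence of cluster indices."""
    return Counter(tuple(sequence) for sequence in tracks.values())


# Hypothetical input: sub-trajectories given as (track_id, cluster_idx) pairs
subs = [(0, 3), (0, 1), (0, 2), (1, 3), (1, 1), (1, 2), (2, 0), (2, 2)]
for code, count in macro_clusters(group_tracks(subs)).most_common():
    print("%s appears in %d track(s)" % (code, count))  # (3, 1, 2) is the most popular path
```
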
41 | 
42 | ![Track example](http://i.imgur.com/5mk5FKT.png)
43 | 
44 | Example of track: Composed of four sub-trajectories (`yellow -> green -> orange -> blue`), it belongs to the macro-cluster `yellow-green-orange-blue` (the red dots inside the red ellipses have been inserted to simulate and highlight the complete trajectory)
45 | 
46 | ## Dependencies
47 | * Python 2.7
48 | * tkinter
49 | * peewee 2.10.1
50 | * filterpy 0.1.5
51 | 
52 | ## Supervisors
53 | * Daniele Liciotti | [GitHub](https://github.com/danielelic)
54 | * Marina Paolanti | [GitHub](https://github.com/marinapaolanti)
55 | 
56 | ## Authors
57 | * Matteo Camerlengo | [GitHub](https://github.com/MatteoCamerlengo)
58 | * Francesco Di Girolamo | [GitHub](https://github.com/francescodigirolamo)
59 | * Andrea Perrello | [GitHub](https://github.com/AndreaPerrello)
60 | 
--------------------------------------------------------------------------------
/clustering.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import math
3 | from scipy.cluster.vq import kmeans
4 | from scipy.cluster.vq import kmeans2
5 | import random
6 | from scipy import spatial
7 | from trajectory import euclidean_distance
8 | 
9 | 
10 | class Clustering:
11 |     """A class implementing trajectory clustering."""
12 | 
13 |     def __init__(self, alpha=0.88, w=2.0, stdNN=2, stdMin=0.4, stdMax=20.0):
14 |         """ Constructor
15 | 
16 |         Arguments:
17 |         alpha -- robustness against outliers (see [1])
18 |         w -- neighborhood (see [1])
19 |         stdNN -- number of nearest neighbors to compute standard deviation used in similarity measure (see [1])
20 |         stdMin -- minimum value for clipping (see [1])
21 |         stdMax -- maximum value for clipping (see [1])
22 | 
23 |         [1] Clustering of Vehicle Trajectories (Stefan Atev)
24 |         """
25 |         self.trajectories = []
26 |         self.distMat = np.zeros((0, 0))
27 |         self.stdDevs = np.zeros((0, 0))
28 |         self.alpha = alpha
29 |         self.w = w
30 |         self.stdNN = stdNN
31 |         self.stdMin = stdMin
32 |         self.stdMax = stdMax
33 | 
34 |     def std(self, tidx):
35 |         return self.stdDevs[tidx]
36 | 
37 |     def modHausDist(self, t1idx, t2idx):
38 |         """Computes the modified Hausdorff distance."""
39 |         t1 = self.trajectories[t1idx]
40 |         t2 = self.trajectories[t2idx]
41 | 
42 |         distances = np.zeros(len(t1))
43 |         t1pointsRelPos = [t1.getPrefixSum()[i] / t1.length() for i in range(len(t1))]
44 |         t2pointsRelPos = [t2.getPrefixSum()[i] / t2.length() for i in range(len(t2))]
45 | 
46 |         for i in range(len(t1)):
47 |             pt1 = t1.getPoints()[i]
48 | 
49 |             # Find corresponding point pt2 in t2 for point pt1 = t1[i]
50 |             pt2idx = np.argmin(
51 |                 np.array([abs(t1pointsRelPos[i] - t2pointsRelPos[j]) for j in range(len(t2pointsRelPos))]))
52 |             pt2 = t2.getPoints()[pt2idx]
53 | 
54 |             # Get set of points sp2 of t2 within neighborhood of point pt2
55 |             ps = t2.getPrefixSum()
56 |             tmp = [abs(ps[j] - ps[pt2idx]) - (self.w / 2.0) for j in range(len(ps))]
57 |             neighborhoodIdxs = [j for j in range(len(tmp)) if tmp[j] <= 0]
58 | 
59 |             # Find minimum Euclidean distance between point pt1 and set of points sp2
60 |             dist = float("inf")
61 |             for idx in neighborhoodIdxs:
62 |                 newdist = euclidean_distance(pt1, t2.getPoints()[idx])
63 |                 if newdist < dist:
64 |                     dist = newdist
65 | 
66 |             distances[i] = dist
67 | 
68 |         # Find the distance that is worse than self.alpha percent of the other distances
69 |         distances = np.sort(distances)
70 | 
71 |         # return distances[int(round((len(distances) - 1) * self.alpha))]
72 |         return distances[min(int(len(distances) * self.alpha), len(distances) - 1)]
73 | 
74 |     def
createDistanceMatrix(self): 75 | size = len(self.trajectories) 76 | self.distMat = np.ones((size, size)) 77 | 78 | for r in range(size): 79 | for c in range(size): 80 | dist = self.modHausDist(r, c) 81 | self.distMat[r, c] *= dist 82 | 83 | def createStdDevs(self): 84 | rowSortedDistMat = np.copy(self.distMat) 85 | rowSortedDistMat.sort(axis=1) 86 | 87 | self.stdDevs = rowSortedDistMat[:, min(self.stdNN, rowSortedDistMat.shape[1] - 1)] 88 | for i in range(len(self.stdDevs)): 89 | self.stdDevs[i] = max(self.stdMin, min(self.stdMax, self.stdDevs[i])) 90 | 91 | def similarity(self, t1idx, t2idx): 92 | """A function computes the similarity measure of trajectories t1 and t2 93 | according to paper 'Clustering of Vehicle Trajectories (Stefan Atev)' 94 | """ 95 | return math.exp( 96 | -(self.distMat[t1idx, t2idx] * self.distMat[t2idx, t1idx]) / (2 * self.std(t1idx) * self.std(t2idx))) 97 | 98 | def similarityDummy(self, t1idx, t2idx): 99 | """DEBUG VERSION 100 | A function computes the similarity measure of trajectories t1 and t2 as 101 | a simple average Euclidian distance of corresponding point pairs""" 102 | t1 = self.trajectories[t1idx] 103 | t2 = self.trajectories[t2idx] 104 | 105 | tlen = min([len(t1), len(t2)]) 106 | 107 | dist = 0 108 | for i in range(tlen): 109 | dist += math.sqrt( 110 | (t1.getPoints()[i][0] - t2.getPoints()[i][0]) ** 2 + (t1.getPoints()[i][1] - t2.getPoints()[i][1]) ** 2) 111 | 112 | return 1.0 / (dist / float(tlen) + 1e-6) 113 | 114 | def clusterAgglomerative(self, trajectories, cn): 115 | """ 116 | input: A list 'trajectories' of trajectories given as lists of 117 | objects of class Trajectory. The number of desired clusters 'nc'. 118 | output: / 119 | The function performs agglomerative clustering of trajectories 120 | and for each trajectory sets an index t.ci denoting estimated cluster. 121 | """ 122 | self.trajectories = trajectories 123 | 124 | # Update a distance matrix and std deviations 125 | self.createDistanceMatrix() 126 | 127 | clusters = [[i] for i in range(len(trajectories))] 128 | 129 | while len(clusters) > cn: 130 | affMat = np.zeros((len(clusters), len(clusters))) 131 | for r in range(affMat.shape[0] - 1): 132 | for c in range(r + 1, affMat.shape[1]): 133 | ## count inter-cluster average distance 134 | dist = 0 135 | 136 | for t1idx in clusters[r]: 137 | for t2idx in clusters[c]: 138 | # distance of trajectory t1 (t1 in tA) and trajectory t2 (t2 in tB) 139 | dist += 1 / ((self.distMat[t1idx, t2idx] * self.distMat[t2idx, t1idx]) + 1e-6) 140 | 141 | dist *= 1.0 / (len(clusters[r]) * len(clusters[c])) 142 | affMat[r, c] = dist 143 | 144 | # Find two closest clusters and merge them 145 | # First trajectory is given by row index, second trajectory is given by column index of affinity matrix 146 | t1idx = np.argmax(affMat) / affMat.shape[1] 147 | t2idx = np.argmax(affMat) % affMat.shape[0] 148 | 149 | clusters[t1idx].extend(clusters[t2idx]) 150 | clusters = [clusters[i] for i in range(len(clusters)) if i != t2idx] 151 | 152 | # Assign an estimated cluster index to each trajectory 153 | for i in range(len(clusters)): 154 | for j in clusters[i]: 155 | self.trajectories[j].setClusterIdx(i) 156 | 157 | def clusterSpectral(self, trajectories, clusters=-1): 158 | """ 159 | input: 160 | trajectories - a list 'trajectories' of trajectories given as lists of 161 | points given as tuples (x, y). 162 | clusters - A number of clusters. 
If the value is not specified, the 163 | algorithm estimates the best number itself 164 | output: 165 | g - Number of Centroids 166 | The function performs spectral clustering of trajectories 167 | and for each trajectory sets an index t.ci denoting estimated cluster. 168 | the function estimates the number of resulting clusters automatically. 169 | """ 170 | # Need to be assigned as am object variable - other support functions use it (createStdDevs(), etc.)! 171 | self.trajectories = trajectories 172 | 173 | # Update a distance matrix and std deviations 174 | self.createDistanceMatrix() 175 | 176 | self.createStdDevs() 177 | 178 | # Compute affinity matrix 179 | K = np.zeros((len(trajectories), len(trajectories))) 180 | for r in range(len(trajectories)): 181 | for c in range(len(trajectories)): 182 | K[r, c] = self.similarity(r, c) 183 | 184 | # Diagonal matrix W for normalization 185 | W = np.diag(1.0 / np.sqrt(np.sum(K, 1))) 186 | 187 | # Normalized affinity matrix 188 | L = np.dot(np.dot(W, K), W) 189 | 190 | # Eigendecomposition 191 | Eval, Evec = np.linalg.eig(L) 192 | 193 | gMin, gMax = 0, 0 194 | for val in Eval: 195 | if val > 0.8: 196 | gMax += 1 197 | if val > 0.99: 198 | gMin += 1 199 | 200 | # Sort eigenvalues and eigenvectors according to descending eigenvalue 201 | Eval, Evec = zip(*sorted(zip(Eval, Evec.T), reverse=True)) 202 | Evec = np.array(Evec).T 203 | 204 | g = clusters 205 | if g == -1: 206 | ## Estimate the number of clusters 207 | # Distortion scores for different number of clusters g 208 | rhog = np.zeros(gMax - gMin + 1) 209 | 210 | for g in range(gMin, gMax + 1): 211 | V = np.copy(Evec[:, 0:g]) 212 | S = np.diag(1.0 / np.sqrt(np.sum(np.multiply(V, V), 1))) 213 | R = np.dot(S, V) 214 | 215 | # k-means clustering of the row vectors of R 216 | cb, wcScatt = kmeans(R, g, iter=20, thresh=1e-05) # cb = codebook (centroids = rows of cb) 217 | 218 | # compute distortion score rho_g (withit class scatter / sum(within class scatter, total scatter)) 219 | totScatt = np.sum([np.linalg.norm(r - c) for r in R for c in cb]) 220 | rhog[g - gMin] = wcScatt / (totScatt - wcScatt) 221 | 222 | # Best number of centroids. 
223 |             g = gMin + np.argmin(rhog)
224 | 
225 |         # Perform classification of trajectories using k-means clustering
226 |         V = np.copy(Evec[:, 0:g])
227 |         S = np.diag(1.0 / np.sqrt(np.sum(np.multiply(V, V), 1)))
228 |         R = np.dot(S, V)
229 | 
230 |         ## Find g initial centroids (rows)
231 |         initCentroids = np.zeros((g, R.shape[1]))
232 |         # Matrix of distance of each observation (rows) to each initial centroid (columns)
233 |         initCentroidsDist = np.zeros((R.shape[0], g))
234 | 
235 |         initCentroids[0] = R[random.randint(0, R.shape[0] - 1)]
236 |         for i in range(g - 1):
237 |             # get each observation's distance to the new centroid
238 |             initCentroidsDist[:, i] = [spatial.distance.euclidean(obs, initCentroids[i]) for obs in R]
239 | 
240 |             # get the observation which has the worst minimal distance to some already existing centroid
241 |             newidx = np.argmax(np.min(initCentroidsDist[:, :(i + 1)], 1))
242 |             initCentroids[i + 1] = R[newidx]
243 | 
244 |         centroids, labels = kmeans2(R, initCentroids, iter=10, thresh=1e-05, minit='matrix', missing='warn')
245 | 
246 |         assert (len(trajectories) == len(labels))
247 | 
248 |         for trajLab in zip(trajectories, labels):
249 |             trajLab[0].setClusterIdx(trajLab[1])
250 | 
251 |         return g
252 | 
--------------------------------------------------------------------------------
/database/builder.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/env python2
2 | # -*- coding: utf-8 -*-
3 | 
4 | ###################################################################
5 | # Script for converting a dataset from a dataset file to .sqlite
6 | # and saving it inside a database file.
7 | ###################################################################
8 | 
9 | import os
10 | 
11 | from peewee import *
12 | 
13 | ###################################################################
14 | # DATASET
15 | 
16 | dataset_folder = "dataset"  # Define dataset folder name
17 | dataset_file = "AOIs"  # Define dataset file name
18 | dataset_ext = ".csv"  # Define dataset file extension
19 | 
20 | ###################################################################
21 | 
22 | result_folder = "sqlite"  # Result folder name. Editing it is unnecessary.
23 | 
24 | ###################################################################
25 | ###################################################################
26 | PROJECT_ROOT = os.path.dirname(os.path.realpath(__file__))
27 | 
28 | sql_filepath = os.path.join(PROJECT_ROOT, result_folder, dataset_file + ".db")
29 | dataset_filepath = os.path.join(PROJECT_ROOT, dataset_folder, dataset_file + dataset_ext)
30 | 
31 | try:
32 |     open(dataset_filepath, "rb")
33 | except IOError:
34 |     print("Dataset '" + dataset_file + dataset_ext + "' does not exist inside folder '" + dataset_folder + "'")
35 |     exit()
36 | 
37 | # IF YOU NEED TO BUILD A SINGLE DB CONTAINING ALL YOUR TABLES, YOU MAY
38 | # REPLACE THE VARIABLE sql_filepath WITH A STRING CONTAINING THE STATIC FILE PATH
39 | # OF YOUR SQL .DB FILE. THIS WILL PREVENT THE SCRIPT FROM CREATING A BRAND NEW .DB FILE.
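# For example (hypothetical path, shown purely as an illustration -- adjust to your setup):
# sql_filepath = "/absolute/path/to/my_single_database.db"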
40 | db = SqliteDatabase(sql_filepath) 41 | 42 | 43 | class BaseModel(Model): 44 | class Meta: 45 | database = db 46 | 47 | 48 | ################################################################### 49 | ################################################################### 50 | # MODEL 51 | 52 | # Define the class from which to create the model (note: must extend "BaseModel") 53 | 54 | # example 55 | class Aoi(BaseModel): 56 | "Regioni di interesse che costituiscono la mappa" 57 | id = IntegerField(primary_key=True) 58 | x_min = FloatField() 59 | x_max = FloatField() 60 | y_min = FloatField() 61 | y_max = FloatField() 62 | 63 | 64 | # Set "model" variable equal to the defined class name 65 | # model = Cart 66 | model = Aoi 67 | 68 | 69 | def build(line): 70 | "Main build function. Each line is an array of values reflecting the structure of the dataset" 71 | 72 | return model(id=line[0], x_min=line[1], x_max=line[7], y_min=line[2], y_max=line[8]) 73 | 74 | 75 | ################################################################### 76 | ################################################################### 77 | db.connect() 78 | db.create_tables([model], safe=True) 79 | 80 | i = 0 81 | with open(dataset_filepath, "rb") as features: 82 | database = features.readlines() 83 | for line in database: 84 | print(str(i) + "/" + str(len(database))) 85 | 86 | build(line.split(",")).save(force_insert=True) 87 | i = i + 1 88 | features.close() 89 | ################################################################### 90 | ################################################################### 91 | -------------------------------------------------------------------------------- /demo.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python2 2 | # -*- coding: utf-8 -*- 3 | 4 | from Tkinter import * 5 | 6 | import operator 7 | from collections import OrderedDict 8 | 9 | from clustering import Clustering 10 | from drawing import Map 11 | from peewee_models import Cart, Aoi 12 | from track import Track 13 | from trajectory import Trajectory 14 | 15 | ########################### 16 | # 1) Global variables # 17 | ########################### 18 | 19 | MAX_CLUSTERS = 8 # MAX_CLUSTERS <= 10 20 | MAX_CLUSTERS_USER_DEFINED = False 21 | 22 | colors = {"purple": "#A020F0", 23 | "orange": "#FF8C00", 24 | "red": "#FF0000", 25 | "yellow": "#FFFF00", 26 | "green": "#228B22", 27 | "lime": "#7FFF00", 28 | "cyan": "#00FFFF", 29 | "blue": "#4169E1", 30 | "pink": "#FF69B4", 31 | "gray": "#2F4F4F"} 32 | 33 | COLOR_BLACK = "#000000" 34 | 35 | # Origin area 36 | origin = Aoi(x_min=0.1, x_max=14., y_min=28.5, y_max=35.18) 37 | # Extended origin area 38 | extended_origin = Aoi(x_min=0.1, x_max=17., y_min=26.5, y_max=35.18) 39 | 40 | # Control areas 41 | controls = { 42 | "c1": Aoi(x_min=41.18, x_max=44.23, y_min=19.53, y_max=21.49), 43 | # "c2": Aoi(x_min=31.13, x_max=34.28, y_min=19.53, y_max=21.49), 44 | "c3": Aoi(x_min=31.13, x_max=34.24, y_min=9.55, y_max=12.43), 45 | # "c4": Aoi(x_min=41.26, x_max=44.22, y_min=9.55, y_max=12.43), 46 | # "c5": Aoi(x_min=0.74, x_max=4.4, y_min=18.74, y_max=22.00), 47 | # "c6": Aoi(x_min=8.1, x_max=11.15, y_min=18.63, y_max=22.00), 48 | "c7": Aoi(x_min=8.1, x_max=11.88, y_min=9.08, y_max=12.03), 49 | # "c8": Aoi(x_min=19.08, x_max=22.12, y_min=9.08, y_max=11.35) 50 | } 51 | 52 | # List of all trajectories 53 | trajectories = [] 54 | # Index of the current trajectory to draw 55 | trajectory_index = 0 56 | # List of clusters 57 | clusters = Clustering() 58 | # 
Index of the current cluster to draw 59 | cluster_index = 0 60 | # Number of Trajectories per Cluster 61 | ntc = [] 62 | # List of tracks 63 | tracks = [] 64 | # Index of the current track to draw 65 | track_index = 0 66 | # Flag to prevent tracks from being computed again 67 | tracks_computed = False 68 | # List of macro-clusters 69 | macro_clusters = {} 70 | # Index of the current macro-clusters to draw 71 | macro_index = 0 72 | 73 | ########################## 74 | # 2) Drawing the map # 75 | ########################## 76 | 77 | # Inizializza la mappa 78 | tkmaster = Tk(className="Demo") 79 | map = Map(tkmaster, scale=18, width=1200, height=640, bg="#FFFFFF") 80 | map.pack(expand=True, fill="both", side="right") 81 | 82 | # Disegna la mappa 83 | map.draw_init(Aoi.select(), origin, controls) 84 | 85 | ########################## 86 | # 3) Selecting carts # 87 | ########################## 88 | 89 | # Preleva la lista dei singoli carrelli (len(carts_id) = 16) 90 | carts = Cart.select().where(Cart.tag_id == "0x00205EFE0E93") \ 91 | .group_by(Cart.tag_id) 92 | 93 | 94 | ######################################################################################################################## 95 | # LEGEND 96 | ######################################################################################################################## 97 | def show_legend(): 98 | map.clear_log() 99 | map.log(txt=">> Legend (keys)\n\n") 100 | map.log(txt="1: Compute trajectories\n") 101 | map.log(txt="2: Draw single trajectory\n") 102 | map.log(txt="3: Draw all trajectories\n") 103 | map.log(txt="4: Clustering (agglomerative)\n") 104 | map.log(txt="5: Clustering (spectral)\n") 105 | map.log(txt="6: Draw single cluster\n") 106 | map.log(txt="7: Draw all clusters\n") 107 | map.log(txt="8: Compute tracks\n") 108 | map.log(txt="9: Draw single track\n") 109 | map.log(txt="0: Draw macro cluster\n") 110 | map.log(txt="L: Show legend\n") 111 | 112 | 113 | show_legend() 114 | 115 | 116 | ######################################################################################################################## 117 | 118 | ######################################################################################################################## 119 | # FUNCTIONS # 120 | ######################################################################################################################## 121 | 122 | def compute_trajectories(event): 123 | map.clear_log() 124 | map.log(txt=">> 1: Compute trajectories\n\n") 125 | 126 | global trajectory_index, ntc 127 | trajectory_index = 0 128 | ntc = [] 129 | progress_carts = 0 130 | trajectories[:] = [] 131 | 132 | # For each cart: 133 | for cart in carts: 134 | # ProgressBar 135 | map.log(txt="Progress:\t" + '{0:.3g}'.format(100 * (float(progress_carts) / float(carts.count()))) + "%\n") 136 | map.update() 137 | 138 | # Get all the cart's instances ordered by time, 139 | # deleting the one out-of-bounds: 140 | instances = list( 141 | Cart.select() 142 | .order_by(Cart.time_stamp.desc()) 143 | .where(Cart.tag_id == cart.tag_id) 144 | .where(Cart.x > 0.).where(Cart.y > 0.) 145 | ) 146 | 147 | # Divide all the instances in trajectories which are origin2origin or origin2control or 148 | # control2control, and build the array of trajectories. 149 | # NB: if the last run does not reach a control or the origin, it is not taken. 
150 | 151 | # Minimum length di of an origin2origin trajectory 152 | complete_min_run_length = 25 153 | # Minimum length di of an origin2control trajectory 154 | middle_min_run_length = 15 155 | # Maximum length di of a trajectory 156 | max_run_length = 350 157 | 158 | # Index of the beginning run instance 159 | begin = 0 160 | # Index of the current run instance 161 | i = 0 162 | # Flag: run has started 163 | has_run_started = False 164 | 165 | # For each instance: 166 | for instance in instances: 167 | # If the started run has not reached the origin or a control 168 | # or if the non-started run is inside the origin or a control: 169 | if (not instance.inside(origin) and not instance.multinside(controls) and has_run_started) \ 170 | or (instance.inside(origin) and not has_run_started) \ 171 | or (instance.multinside(controls) and not has_run_started): 172 | pass 173 | else: 174 | # If it needs to start the run: 175 | if not instance.inside(origin) and not instance.multinside(controls) and not has_run_started: 176 | # Start the run 177 | has_run_started = True 178 | # Save the begin index (exception check: run starts outside the origin/control) 179 | if i > 0: 180 | begin = i - 1 181 | else: 182 | begin = 0 183 | # If it needs to stop the run: 184 | else: 185 | # Stops the run 186 | has_run_started = False 187 | # Save the run interval 188 | run = instances[begin:i] 189 | 190 | # If the run is an endingrun (inside the origin): 191 | if instance.inside(origin): 192 | trajectory = Trajectory(run) 193 | # If the trajcetory is between complete_min_run_length and max_run_length 194 | if (complete_min_run_length < trajectory.prefixSum[len(trajectory.prefixSum) - 1] < \ 195 | max_run_length) and \ 196 | ((str(instances[begin].time_stamp - instances[i].time_stamp)) < str(3)): 197 | # Pulisce la traiettoria 198 | trajectory.clean() 199 | # Filtra la traiettoria attraverso un filtro di Kalman 200 | trajectory.filter() 201 | # Aggiunge la traiettoria alla lista 202 | trajectories.append(trajectory) 203 | # If the run is a middlerun (inside a control): 204 | else: 205 | trajectory = Trajectory(run) 206 | # If the trajcetory is between middle_min_run_length and max_run_length 207 | if (middle_min_run_length < trajectory.prefixSum[len(trajectory.prefixSum) - 1] < \ 208 | max_run_length) and \ 209 | ((str(instances[begin].time_stamp - instances[i].time_stamp)) < str(3)): 210 | # Pulisce la traiettoria 211 | trajectory.clean() 212 | # Filtra la traiettoria attraverso un filtro di Kalman 213 | trajectory.filter() 214 | # Aggiunge la traiettoria alla lista 215 | trajectories.append(trajectory) 216 | i += 1 217 | progress_carts += 1 218 | 219 | map.log(txt="Progress: 100%\n\n") 220 | map.log(txt="Number of trajectories:\t" + str(len(trajectories)) + "\n") 221 | map.log(txt="\nComputing tracks..\n") 222 | 223 | trajectory_index = len(trajectories) - 1 224 | 225 | # Set the track attribute to each trajectory to find the complete macro-trajectories 226 | n_track = -1 227 | flag = False 228 | for trajectory in trajectories: 229 | # Descendent order 230 | stop = trajectory.run[0].inside(extended_origin) 231 | start = trajectory.run[len(trajectory.run) - 1].inside(extended_origin) 232 | if start: 233 | trajectory.track = n_track 234 | n_track += 1 235 | flag = True 236 | else: 237 | if stop: 238 | if flag: 239 | flag = False 240 | else: 241 | n_track += 1 242 | trajectory.track = n_track 243 | 244 | map.log(txt="Tracks computed!\n") 245 | 246 | 247 | def draw_single_trajectory(event): 248 | global 
trajectory_index 249 | 250 | map.clear_log() 251 | map.log(txt=">> 2: Draw single trajectory\n\n") 252 | 253 | map.draw_init(Aoi.select(), origin, controls) 254 | 255 | if len(trajectories) > 0: 256 | map.draw_trajectory(trajectories[trajectory_index], color="red") 257 | 258 | map.log(txt="Cart id: " + str(trajectories[trajectory_index].run[0].tag_id) + "\n") 259 | map.log(txt= 260 | "Start:\t" 261 | + str(trajectories[trajectory_index].run[len(trajectories[trajectory_index].run) - 1].time_stamp) 262 | + "\n" 263 | ) 264 | map.log(txt="End:\t" + str(trajectories[trajectory_index].run[0].time_stamp) + "\n") 265 | 266 | if trajectory_index >= 0: 267 | trajectory_index -= 1 268 | else: 269 | trajectory_index = len(trajectories) - 1 270 | else: 271 | map.log(txt="Error: No trajectories computed.\n") 272 | 273 | 274 | def draw_all_trajectories(event): 275 | map.clear_log() 276 | map.log(txt=">> 3: Draw all trajectories\n\n") 277 | 278 | map.draw_init(Aoi.select(), origin, controls) 279 | 280 | if len(trajectories) == 0: 281 | map.log(txt="Error: No trajectories computed.\n") 282 | else: 283 | for trajectory in trajectories: 284 | map.draw_trajectory(trajectory, color="red") 285 | 286 | map.log(txt="N. of trajectories:\t" + str(len(trajectories)) + "\n") 287 | 288 | 289 | def cluster_trajectories_agglomerative(event): 290 | map.clear_log() 291 | map.log(txt=">> 4: Clustering (agglomerative)\n\n") 292 | map.update() 293 | 294 | global cluster_index, ntc 295 | cluster_index = 0 296 | 297 | if len(trajectories) == 0: 298 | map.log(txt="Error: No trajectories computed.\n") 299 | else: 300 | map.log(txt="Clustering..\n\n") 301 | map.update() 302 | 303 | # Clustering 304 | clusters.clusterAgglomerative(trajectories, MAX_CLUSTERS) 305 | 306 | map.draw_init(Aoi.select(), origin, controls) 307 | 308 | # Computes the number of trajectories per cluster 309 | ntc = [0] * MAX_CLUSTERS 310 | for t in trajectories: 311 | ntc[t.getClusterIdx()] += 1 312 | map.log(txt="Clusters:\n") 313 | for i in range(MAX_CLUSTERS): 314 | if ntc[i] > 0: 315 | perc = float(ntc[i]) / float(len(trajectories)) * 100 316 | map.log(txt="- " + '{0:.2f}'.format(perc) + "% " + colors.keys()[i] + " (" + str(ntc[i]) + ")\n") 317 | 318 | 319 | def cluster_trajectories_spectral(event): 320 | map.clear_log() 321 | map.log(txt=">> 5: Clustering (spectral)\n\n") 322 | map.update() 323 | 324 | global cluster_index, ntc, g 325 | cluster_index = 0 326 | 327 | if len(trajectories) == 0: 328 | map.log(txt="Error: No trajectories computed.\n") 329 | else: 330 | map.log(txt="Clustering..\n\n") 331 | map.update() 332 | 333 | # Clustering 334 | if MAX_CLUSTERS_USER_DEFINED: 335 | clusters.clusterSpectral(trajectories, MAX_CLUSTERS) 336 | else: 337 | g = clusters.clusterSpectral(trajectories) 338 | 339 | map.draw_init(Aoi.select(), origin, controls) 340 | 341 | # Computes the number of trajectories per cluster 342 | ntc = [0] * g 343 | for t in trajectories: 344 | ntc[t.getClusterIdx()] += 1 345 | 346 | map.log(txt="Clusters:\n") 347 | 348 | if MAX_CLUSTERS_USER_DEFINED: 349 | for i in range(MAX_CLUSTERS): 350 | if ntc[i] > 0: 351 | perc = float(ntc[i]) / float(len(trajectories)) * 100 352 | map.log(txt="- " + '{0:.2f}'.format(perc) + "% " + colors.keys()[i] + " (" + str(ntc[i]) + ")\n") 353 | else: 354 | for i in range(g): 355 | if ntc[i] > 0: 356 | perc = float(ntc[i]) / float(len(trajectories)) * 100 357 | map.log(txt="- " + '{0:.2f}'.format(perc) + "% " + colors.keys()[i] + " (" + str(ntc[i]) + ")\n") 358 | 359 | 360 | def 
draw_single_cluster(event): 361 | global cluster_index, ntc 362 | 363 | map.clear_log() 364 | map.log(txt='>> 6: Draw single cluster\n\n') 365 | 366 | if len(trajectories) == 0: 367 | map.log(txt="Error: No trajectories computed.\n") 368 | if len(ntc) == 0: 369 | map.log(txt="Error: No cluster computed.\n") 370 | else: 371 | if len(ntc) == 0: 372 | map.log(txt="Error: No cluster computed.\n") 373 | else: 374 | map.draw_init(Aoi.select(), origin, controls) 375 | 376 | for trajectory in trajectories: 377 | if trajectory.getClusterIdx() == cluster_index: 378 | map.draw_trajectory(trajectory, color=colors.values()[cluster_index]) 379 | perc = float(ntc[cluster_index]) / float(len(trajectories)) * 100 380 | 381 | map.log(txt= 382 | "- " + '{0:.2f}'.format(perc) + "% " + colors.keys()[cluster_index] 383 | + " (" + str(ntc[cluster_index]) + ")\n" 384 | ) 385 | 386 | if cluster_index < len(ntc) - 1: 387 | cluster_index += 1 388 | else: 389 | cluster_index = 0 390 | 391 | 392 | def draw_all_clusters(event): 393 | map.clear_log() 394 | map.log(txt='>> 7: Draw all clusters\n\n') 395 | 396 | if len(trajectories) == 0: 397 | map.log(txt="Error: No trajectories computed.\n") 398 | if len(ntc) == 0: 399 | map.log(txt="Error: No cluster computed.\n") 400 | else: 401 | if len(ntc) == 0: 402 | map.log(txt="Error: No cluster computed.\n") 403 | else: 404 | map.draw_init(Aoi.select(), origin, controls) 405 | 406 | for trajectory in trajectories: 407 | map.draw_trajectory(trajectory, colors.values()[trajectory.getClusterIdx()]) 408 | for i in range(len(ntc)): 409 | if ntc[i] > 0: 410 | perc = float(ntc[i]) / float(len(trajectories)) * 100 411 | map.log(txt= 412 | "- " + '{0:.2f}'.format(perc) + "% " 413 | + colors.keys()[i] + " (" + str(ntc[i]) + ")\n" 414 | ) 415 | 416 | 417 | def compute_tracks(event): 418 | map.clear_log() 419 | map.log(txt='>> 8: Compute tracks\n\n') 420 | 421 | if len(trajectories) == 0 or len(ntc) == 0: 422 | map.log(txt="Error: No trajectories or cluster computed.\n") 423 | else: 424 | global tracks_computed 425 | if not tracks_computed: 426 | global track_index, macro_index 427 | track_index = 0 428 | macro_index = 0 429 | print(macro_clusters) 430 | 431 | map.draw_init(Aoi.select(), origin, controls) 432 | 433 | for traj in trajectories: 434 | if len(tracks) == 0: 435 | tracks.append(Track()) 436 | tracks[0].add_trajectory(traj) 437 | else: 438 | if tracks[len(tracks) - 1].id == traj.track: 439 | tracks[len(tracks) - 1].add_trajectory(traj) 440 | else: 441 | tracks.append(Track()) 442 | tracks[len(tracks) - 1].add_trajectory(traj) 443 | tracks_computed = True 444 | map.log(txt="Tracks computed.\n") 445 | 446 | # Macro cluster 447 | for track in tracks: 448 | key = str(track.cluster_code) 449 | macro_clusters[key] = macro_clusters.get(key, 0) + 1 450 | map.log(txt="Macro clusters computed.\n\n") 451 | else: 452 | map.log(txt="Tracks already computed. 
\n\n") 453 | 454 | ord_macroclusters = OrderedDict(sorted(macro_clusters.items(), key=operator.itemgetter(1), reverse=True)) 455 | 456 | map.log(txt="Macro clusters:\t\n") 457 | for macrocluster_code in ord_macroclusters: 458 | color_keys = [] 459 | cluster_codes = list(eval(macrocluster_code)) 460 | for cluster_code in sorted(cluster_codes, reverse=True): 461 | color_keys.append(colors.keys()[cluster_code]) 462 | map.log(txt=str(color_keys) + " " + str(ord_macroclusters[macrocluster_code]) + "\n") 463 | 464 | 465 | def draw_single_track(event): 466 | global track_index 467 | 468 | map.clear_log() 469 | map.log(txt='>> 9: Draw single track\n\n') 470 | 471 | if len(trajectories) == 0 or len(ntc) == 0 or len(tracks) == 0: 472 | map.log(txt="Error: No trajectory, cluster or track computed.\n") 473 | else: 474 | # Canvas refresh 475 | map.draw_init(Aoi.select(), origin, controls) 476 | 477 | for i in tracks[track_index].trajectories: 478 | map.draw_trajectory(i, colors.values()[i.getClusterIdx()]) 479 | 480 | map.log(txt="Cart id: " + tracks[track_index].trajectories[0].run[0].tag_id + "\n") 481 | map.log(txt= 482 | "Start:\t" 483 | + str(tracks[track_index].trajectories[len(tracks[track_index].trajectories) - 1] 484 | .run[len( 485 | tracks[track_index].trajectories[len(tracks[track_index].trajectories) - 1].run) - 1] 486 | .time_stamp) + "\n" 487 | ) 488 | map.log(txt= 489 | "End:\t" 490 | + str(tracks[track_index].trajectories[0].run[0].time_stamp) + "\n" 491 | ) 492 | 493 | if track_index < len(tracks) - 1: 494 | track_index += 1 495 | else: 496 | track_index = 0 497 | 498 | 499 | def draw_macro_cluster(event): 500 | global macro_index 501 | 502 | map.clear_log() 503 | map.log(txt='>> 0: Draw macro-clusters\n\n') 504 | 505 | if len(trajectories) == 0 or len(ntc) == 0 or len(tracks) == 0: 506 | map.log(txt="Error: No trajectory, cluster or track computed.\n") 507 | else: 508 | map.draw_init(Aoi.select(), origin, controls) 509 | 510 | ord_macro_clusters = OrderedDict( 511 | sorted(macro_clusters.items(), key=operator.itemgetter(1), reverse=True)) 512 | for track in tracks: 513 | if str(track.cluster_code) == ord_macro_clusters.keys()[macro_index]: 514 | for traj in track.trajectories: 515 | map.draw_trajectory(traj, color=colors.values()[traj.getClusterIdx()]) 516 | 517 | map.log(txt="N. 
of Tracks in each Macro Cluster:\t" + str(ord_macro_clusters.values()[macro_index]) \
518 |                 + "\n")
519 | 
520 |         if macro_index < len(ord_macro_clusters) - 1:
521 |             macro_index += 1
522 |         else:
523 |             macro_index = 0
524 | 
525 | 
526 | def legend(event):
527 |     show_legend()
528 |     pass
529 | 
530 | 
531 | # Command line parsing
532 | if (len(sys.argv) == 2):
533 |     MAX_CLUSTERS = int(sys.argv[1])
534 |     MAX_CLUSTERS_USER_DEFINED = True
535 | 
536 | ########################################################################################################################
537 | 
538 | tkmaster.bind("1", compute_trajectories)
539 | tkmaster.bind("2", draw_single_trajectory)
540 | tkmaster.bind("3", draw_all_trajectories)
541 | tkmaster.bind("4", cluster_trajectories_agglomerative)
542 | tkmaster.bind("5", cluster_trajectories_spectral)
543 | tkmaster.bind("6", draw_single_cluster)
544 | tkmaster.bind("7", draw_all_clusters)
545 | tkmaster.bind("8", compute_tracks)
546 | tkmaster.bind("9", draw_single_track)
547 | tkmaster.bind("0", draw_macro_cluster)
548 | tkmaster.bind("l", legend)
549 | 
550 | mainloop()
551 | 
--------------------------------------------------------------------------------
/drawing.py:
--------------------------------------------------------------------------------
1 | from Tkinter import *
2 | import datetime
3 | 
4 | class Map(Canvas):
5 |     def __init__(self, master, scale=1, **kw):
6 |         Canvas.__init__(self, master, **kw)
7 |         self.scale = scale
8 |         # Right-side log panel
9 |         self.T = Text(self, height=35, width=45)
10 |         self.T.place(x=830, y=30)
11 |         self.T.config(state="disabled")
12 | 
13 |     def create_circle(self, x, y, r, color):
14 |         "Draws a circle on the canvas"
15 |         return self.create_oval(x - r, y - r, x + r, y + r, fill=color)
16 | 
17 |     def draw_aoi(self, aoi, color, text=""):
18 |         "Draws the rectangle of the aoi region on the canvas"
19 | 
20 |         self.create_rectangle(aoi.x_min * self.scale, aoi.y_max * self.scale,
21 |                               aoi.x_max * self.scale, aoi.y_min * self.scale,
22 |                               fill=color)
23 |         self.create_text((aoi.x_min + aoi.x_max) * self.scale / 2,
24 |                          (aoi.y_max + aoi.y_min) * self.scale / 2,
25 |                          text=text)
26 | 
27 |     def generate_eps(self):
28 |         self.postscript(file=datetime.datetime.now().isoformat() + " - screenshot.eps")
29 | 
30 |     def draw_trajectory(self, trajectory, color):
31 |         "Draws a trajectory on the map"
32 |         xlast, ylast = None, None
33 |         for p in trajectory.points:
34 |             # Draw a point
35 |             self.create_circle(p[0] * self.scale, p[1] * self.scale, 3, color)
36 |             # Draw a segment
37 |             if xlast is not None and ylast is not None:
38 |                 self.create_line(xlast * self.scale, ylast * self.scale, p[0] * self.scale,
39 |                                  p[1] * self.scale, smooth=True)
40 |             xlast = p[0]
41 |             ylast = p[1]
42 | 
43 |     def draw_init(self, aois, origin, controls):
44 |         "Initializes the drawing of the areas of interest, the origin and the control areas"
45 | 
46 |         # Reset the map
47 |         self.delete("all")
48 | 
49 |         # Draw the areas of interest
50 |         for aoi in aois:
51 |             self.draw_aoi(aoi, color="#D3D3D3", text=aoi.id)
52 | 
53 |         # Draw the origin area
54 |         self.draw_aoi(origin, color="peachpuff", text="ORIGIN")
55 | 
56 |         # Draw the control areas
57 |         for c in controls:
58 |             self.draw_aoi(controls[c], color="peachpuff", text=c)
59 | 
60 |     def log(self, txt):
61 |         self.T.config(state="normal")
62 |         self.T.insert(END, txt)
63 |         self.T.config(state="disabled")
64 | 
65 |     def clear_log(self):
66 |         self.T.config(state="normal")
67 |         self.T.delete(1.0, END)
68 |         self.T.config(state="disabled")
69 | 
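# --- Illustrative usage sketch (not part of drawing.py) ----------------------
# A minimal, hypothetical example of driving the Map canvas above on its own.
# The FakeTrajectory stub and the sample points below are invented for
# illustration; only methods actually defined in drawing.py are called.
from Tkinter import Tk

from drawing import Map


class FakeTrajectory(object):
    "Hypothetical stand-in exposing the .points list that draw_trajectory expects"
    def __init__(self, points):
        self.points = points


if __name__ == "__main__":
    root = Tk(className="MapSketch")
    canvas = Map(root, scale=18, width=1200, height=640, bg="#FFFFFF")
    canvas.pack(expand=True, fill="both", side="right")

    # Draw a short dummy path in red and log a message in the side panel
    canvas.draw_trajectory(FakeTrajectory([(2.0, 30.0), (10.0, 20.0), (20.0, 15.0)]), color="#FF0000")
    canvas.log(txt="Dummy trajectory drawn\n")

    root.mainloop()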
-------------------------------------------------------------------------------- /peewee_models.py: -------------------------------------------------------------------------------- 1 | from peewee import * 2 | 3 | # Data Base File (SQLite) 4 | db = SqliteDatabase('database/sqlite/testset.db') 5 | 6 | 7 | class BaseModel(Model): 8 | class Meta: 9 | database = db 10 | 11 | 12 | class Aoi(BaseModel): 13 | "Regioni di interesse che costituiscono la mappa" 14 | id = IntegerField(primary_key=True) 15 | x_min = FloatField() 16 | x_max = FloatField() 17 | y_min = FloatField() 18 | y_max = FloatField() 19 | 20 | 21 | class Cart(BaseModel): 22 | "Istanze che descrivono la posizione di un carrello" 23 | id = IntegerField(primary_key=True) 24 | tag_id = CharField() 25 | time_stamp = DateTimeField() 26 | x = FloatField() 27 | y = FloatField() 28 | 29 | def inside(self, aoi): 30 | return self.x > aoi.x_min and self.x < aoi.x_max and self.y > aoi.y_min and self.y < aoi.y_max 31 | 32 | def multinside(self, aois): 33 | for aoi in aois.values(): 34 | if self.inside(aoi): 35 | return True 36 | return False 37 | -------------------------------------------------------------------------------- /track.py: -------------------------------------------------------------------------------- 1 | class Track: 2 | id = -1 3 | 4 | def __init__(self): 5 | Track.id += 1 6 | self.trajectories = [] 7 | self.cluster_code = [] 8 | 9 | def add_trajectory(self, trajectory): 10 | self.trajectories.append(trajectory) 11 | self.cluster_code.append(trajectory.getClusterIdx()) 12 | -------------------------------------------------------------------------------- /trajectory.py: -------------------------------------------------------------------------------- 1 | from math import * 2 | import numpy as np 3 | from filterpy.kalman import KalmanFilter 4 | from filterpy.common import Q_discrete_white_noise 5 | 6 | 7 | class Trajectory(): 8 | globID = 0 9 | 10 | def __init__(self, run): 11 | self.id = Trajectory.globID 12 | Trajectory.globID += 1 13 | 14 | self.run = list(run) 15 | self.points = [] 16 | self.ci = -1 17 | self.prefixSum = [0.0] 18 | self.track = -1 19 | 20 | self.build() 21 | 22 | def clean(self, param=0.9, dist=3.): 23 | "Filtra la corsa eliminando i punti che sono troppo vicini tra di loro" 24 | run = self.getPoints() 25 | i = 0 26 | while i < len(run) - 1: 27 | ii = i + 1 28 | while ii < len(run) - 1: 29 | dist_i_ii = euclidean_distance(run[i], run[ii]) 30 | if dist_i_ii < param: 31 | self.points.remove(run[ii]) 32 | ii += 1 33 | i += 1 34 | # Aggiorna la lunghezza della corsa 35 | self.setPrefixSum() 36 | 37 | def filter(self): 38 | "Filtra la corsa attraverso un filtro di Kalman" 39 | # Crea il filtro di Kalman 40 | f = KalmanFilter(dim_x=2, dim_z=2) 41 | # Inizializza il filtro 42 | f.x = np.array(self.points[0]) 43 | index = 0 44 | while index < len(self.points) - 1: 45 | f.F = np.array([[1, 0], [0, 1]]) # state transition matrix 46 | f.H = np.array([[1, 0], [0, 1]]) # Measurement function 47 | f.P *= 1.5 # covariance matrix 48 | f.R = np.array([[1, 0], [0, 1]]) # state uncertainty 49 | f.Q = Q_discrete_white_noise(2, 1., 1.) 
# process uncertainty 50 | f.predict() 51 | f.update(self.points[index + 1]) 52 | self.points[index + 1] = f.x 53 | index += 1 54 | 55 | def build(self): 56 | "Costruisce i punti della traiettoria" 57 | for cart in self.run: 58 | self.addPoint((cart.x, cart.y)) 59 | # Aggiorna la lunghezza della corsa 60 | self.setPrefixSum() 61 | 62 | def setPrefixSum(self): 63 | "Calcola e aggiorna la lunghezza della traiettoria" 64 | self.prefixSum = [0.0] 65 | if len(self.points) > 0: 66 | index = 0 67 | while index < len(self.points) - 1: 68 | self.prefixSum.append(self.prefixSum[len(self.prefixSum) - 1] + 69 | euclidean_distance(self.points[index + 1], self.points[index])) 70 | index += 1 71 | 72 | def getPrefixSum(self): 73 | return self.prefixSum 74 | 75 | def addPoint(self, p): 76 | self.points.append(p) 77 | 78 | def getPoints(self): 79 | return self.points 80 | 81 | def getClusterIdx(self): 82 | return self.ci 83 | 84 | def setClusterIdx(self, ci): 85 | self.ci = ci 86 | 87 | def length(self): 88 | return self.prefixSum[len(self.prefixSum) - 1] 89 | 90 | @staticmethod 91 | def decGlobID(): 92 | Trajectory.globID -= 1 93 | 94 | @staticmethod 95 | def resetGlobID(): 96 | Trajectory.globID = 0 97 | 98 | def __str__(self): 99 | str = "=== Trajectory ===\n" 100 | str += "cluster: %d\n" % self.ci 101 | for p in self.points: 102 | str += repr(p) + ", " 103 | str += "\n" 104 | return str 105 | 106 | def __len__(self): 107 | return len(self.points) 108 | 109 | 110 | # calcola la distanza euclidea tra due punti p1 e p2 111 | def euclidean_distance(p1, p2): 112 | assert (len(p1) == len(p2)) 113 | return sqrt(sum([((p1[i] - p2[i])) ** 2 for i in range(len(p1))])) 114 | -------------------------------------------------------------------------------- /trajectory_tracking.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python2 2 | # -*- coding: utf-8 -*- 3 | import matplotlib 4 | matplotlib.use('TkAgg') 5 | 6 | from Tkinter import * 7 | 8 | import operator 9 | from collections import OrderedDict 10 | 11 | from clustering import Clustering 12 | from drawing import Map 13 | from peewee_models import Cart, Aoi 14 | from track import Track 15 | from trajectory import Trajectory 16 | 17 | ########################### 18 | # 1) Global variables # 19 | ########################### 20 | 21 | MAX_CLUSTERS = 8 # MAX_CLUSTERS <= 10 22 | MAX_CLUSTERS_USER_DEFINED = False 23 | 24 | colors = {"purple": "#A020F0", 25 | "orange": "#FF8C00", 26 | "red": "#FF0000", 27 | "yellow": "#FFFF00", 28 | "green": "#228B22", 29 | "lime": "#7FFF00", 30 | "cyan": "#00FFFF", 31 | "blue": "#4169E1", 32 | "pink": "#FF69B4", 33 | "gray": "#2F4F4F"} 34 | 35 | COLOR_BLACK = "#000000" 36 | 37 | # Origin area 38 | origin = Aoi(x_min=0.1, x_max=14., y_min=28.5, y_max=35.18) 39 | # Extended origin area 40 | extended_origin = Aoi(x_min=0.1, x_max=17., y_min=26.5, y_max=35.18) 41 | 42 | # Control areas 43 | controls = { 44 | "c1": Aoi(x_min=41.18, x_max=44.23, y_min=19.53, y_max=21.49), 45 | # "c2": Aoi(x_min=31.13, x_max=34.28, y_min=19.53, y_max=21.49), 46 | "c3": Aoi(x_min=31.13, x_max=34.24, y_min=9.55, y_max=12.43), 47 | # "c4": Aoi(x_min=41.26, x_max=44.22, y_min=9.55, y_max=12.43), 48 | # "c5": Aoi(x_min=0.74, x_max=4.4, y_min=18.74, y_max=22.00), 49 | # "c6": Aoi(x_min=8.1, x_max=11.15, y_min=18.63, y_max=22.00), 50 | "c7": Aoi(x_min=8.1, x_max=11.88, y_min=9.08, y_max=12.03), 51 | # "c8": Aoi(x_min=19.08, x_max=22.12, y_min=9.08, y_max=11.35) 52 | } 53 | 54 | # List of all trajectories 55 
| trajectories = [] 56 | # Index of the current trajectory to draw 57 | trajectory_index = 0 58 | # List of clusters 59 | clusters = Clustering() 60 | # Index of the current cluster to draw 61 | cluster_index = 0 62 | # Number of Trajectories per Cluster 63 | ntc = [] 64 | # List of tracks 65 | tracks = [] 66 | # Index of the current track to draw 67 | track_index = 0 68 | # Flag to prevent tracks from being computed again 69 | tracks_computed = False 70 | # List of macro-clusters 71 | macro_clusters = {} 72 | # Index of the current macro-clusters to draw 73 | macro_index = 0 74 | 75 | ########################## 76 | # 2) Drawing the map # 77 | ########################## 78 | 79 | # Inizializza la mappa 80 | tkmaster = Tk(className="TrajectoryTracking") 81 | map = Map(tkmaster, scale=18, width=1200, height=640, bg="#FFFFFF") 82 | map.pack(expand=True, fill="both", side="right") 83 | 84 | # Disegna la mappa 85 | map.draw_init(Aoi.select(), origin, controls) 86 | 87 | ########################## 88 | # 3) Selecting carts # 89 | ########################## 90 | 91 | # Preleva la lista dei singoli carrelli (len(carts_id) = 16) 92 | carts = Cart.select().group_by(Cart.tag_id) 93 | 94 | 95 | ######################################################################################################################## 96 | # LEGEND 97 | ######################################################################################################################## 98 | def show_legend(): 99 | map.clear_log() 100 | map.log(txt=">> Legend (keys)\n\n") 101 | map.log(txt="1: Compute trajectories\n") 102 | map.log(txt="2: Draw single trajectory\n") 103 | map.log(txt="3: Draw all trajectories\n") 104 | map.log(txt="4: Clustering (agglomerative)\n") 105 | map.log(txt="5: Clustering (spectral)\n") 106 | map.log(txt="6: Draw single cluster\n") 107 | map.log(txt="7: Draw all clusters\n") 108 | map.log(txt="8: Compute tracks\n") 109 | map.log(txt="9: Draw single track\n") 110 | map.log(txt="0: Draw macro cluster\n") 111 | map.log(txt="E: Take screenshot\n") 112 | map.log(txt="L: Show legend\n") 113 | 114 | 115 | show_legend() 116 | 117 | 118 | ######################################################################################################################## 119 | 120 | ######################################################################################################################## 121 | # FUNCTIONS # 122 | ######################################################################################################################## 123 | 124 | def compute_trajectories(event): 125 | map.clear_log() 126 | map.log(txt=">> 1: Compute trajectories\n\n") 127 | 128 | global trajectory_index, ntc 129 | trajectory_index = 0 130 | ntc = [] 131 | progress_carts = 0 132 | trajectories[:] = [] 133 | 134 | # For each cart: 135 | for cart in carts: 136 | # ProgressBar 137 | map.log(txt="Progress:\t" + '{0:.3g}'.format(100 * (float(progress_carts) / float(carts.count()))) + "%\n") 138 | map.update() 139 | 140 | # Get all the cart's instances ordered by time, 141 | # deleting the one out-of-bounds: 142 | instances = list( 143 | Cart.select() 144 | .order_by(Cart.time_stamp.desc()) 145 | .where(Cart.tag_id == cart.tag_id) 146 | .where(Cart.x > 0.).where(Cart.y > 0.) 147 | ) 148 | 149 | # Divide all the instances in trajectories which are origin2origin or origin2control or 150 | # control2control, and build the array of trajectories. 151 | # NB: if the last run does not reach a control or the origin, it is not taken. 
152 | 153 | # Minimum length di of an origin2origin trajectory 154 | complete_min_run_length = 25 155 | # Minimum length di of an origin2control trajectory 156 | middle_min_run_length = 15 157 | # Maximum length di of a trajectory 158 | max_run_length = 350 159 | 160 | # Index of the beginning run instance 161 | begin = 0 162 | # Index of the current run instance 163 | i = 0 164 | # Flag: run has started 165 | has_run_started = False 166 | 167 | # For each instance: 168 | for instance in instances: 169 | # If the started run has not reached the origin or a control 170 | # or if the non-started run is inside the origin or a control: 171 | if (not instance.inside(origin) and not instance.multinside(controls) and has_run_started) \ 172 | or (instance.inside(origin) and not has_run_started) \ 173 | or (instance.multinside(controls) and not has_run_started): 174 | pass 175 | else: 176 | # If it needs to start the run: 177 | if not instance.inside(origin) and not instance.multinside(controls) and not has_run_started: 178 | # Start the run 179 | has_run_started = True 180 | # Save the begin index (exception check: run starts outside the origin/control) 181 | if i > 0: 182 | begin = i - 1 183 | else: 184 | begin = 0 185 | # If it needs to stop the run: 186 | else: 187 | # Stops the run 188 | has_run_started = False 189 | # Save the run interval 190 | run = instances[begin:i] 191 | 192 | # If the run is an endingrun (inside the origin): 193 | if instance.inside(origin): 194 | trajectory = Trajectory(run) 195 | # If the trajcetory is between complete_min_run_length and max_run_length 196 | if (complete_min_run_length < trajectory.prefixSum[len(trajectory.prefixSum) - 1] < \ 197 | max_run_length) and \ 198 | ((str(instances[begin].time_stamp - instances[i].time_stamp)) < str(3)): 199 | # Pulisce la traiettoria 200 | trajectory.clean() 201 | # Filtra la traiettoria attraverso un filtro di Kalman 202 | trajectory.filter() 203 | # Aggiunge la traiettoria alla lista 204 | trajectories.append(trajectory) 205 | # If the run is a middlerun (inside a control): 206 | else: 207 | trajectory = Trajectory(run) 208 | # If the trajcetory is between middle_min_run_length and max_run_length 209 | if (middle_min_run_length < trajectory.prefixSum[len(trajectory.prefixSum) - 1] < \ 210 | max_run_length) and \ 211 | ((str(instances[begin].time_stamp - instances[i].time_stamp)) < str(3)): 212 | # Pulisce la traiettoria 213 | trajectory.clean() 214 | # Filtra la traiettoria attraverso un filtro di Kalman 215 | trajectory.filter() 216 | # Aggiunge la traiettoria alla lista 217 | trajectories.append(trajectory) 218 | i += 1 219 | progress_carts += 1 220 | 221 | map.log(txt="Progress: 100%\n\n") 222 | map.log(txt="Number of trajectories:\t" + str(len(trajectories)) + "\n") 223 | map.log(txt="\nComputing tracks..\n") 224 | 225 | trajectory_index = len(trajectories) - 1 226 | 227 | # Set the track attribute to each trajectory to find the complete macro-trajectories 228 | n_track = -1 229 | flag = False 230 | for trajectory in trajectories: 231 | # Descendent order 232 | stop = trajectory.run[0].inside(extended_origin) 233 | start = trajectory.run[len(trajectory.run) - 1].inside(extended_origin) 234 | if start: 235 | trajectory.track = n_track 236 | n_track += 1 237 | flag = True 238 | else: 239 | if stop: 240 | if flag: 241 | flag = False 242 | else: 243 | n_track += 1 244 | trajectory.track = n_track 245 | 246 | map.log(txt="Tracks computed!\n") 247 | 248 | 249 | def draw_single_trajectory(event): 250 | global 
trajectory_index 251 | 252 | map.clear_log() 253 | map.log(txt=">> 2: Draw single trajectory\n\n") 254 | 255 | map.draw_init(Aoi.select(), origin, controls) 256 | 257 | if len(trajectories) > 0: 258 | map.draw_trajectory(trajectories[trajectory_index], color="red") 259 | 260 | map.log(txt="Cart id:\t" + str(trajectories[trajectory_index].run[0].tag_id) + "\n") 261 | map.log(txt= 262 | "Start:\t" 263 | + str(trajectories[trajectory_index].run[len(trajectories[trajectory_index].run) - 1].time_stamp) 264 | + "\n" 265 | ) 266 | map.log(txt="End:\t" + str(trajectories[trajectory_index].run[0].time_stamp) + "\n") 267 | 268 | if trajectory_index >= 0: 269 | trajectory_index -= 1 270 | else: 271 | trajectory_index = len(trajectories) - 1 272 | else: 273 | map.log(txt="Error: No trajectories computed.\n") 274 | 275 | 276 | def exportToEPS(event): 277 | map.generate_eps() 278 | 279 | def draw_all_trajectories(event): 280 | map.clear_log() 281 | map.log(txt=">> 3: Draw all trajectories\n\n") 282 | 283 | map.draw_init(Aoi.select(), origin, controls) 284 | 285 | if len(trajectories) == 0: 286 | map.log(txt="Error: No trajectories computed.\n") 287 | else: 288 | for trajectory in trajectories: 289 | map.draw_trajectory(trajectory, color="red") 290 | 291 | map.log(txt="N. of trajectories: " + str(len(trajectories)) + "\n") 292 | 293 | 294 | def cluster_trajectories_agglomerative(event): 295 | map.clear_log() 296 | map.log(txt=">> 4: Clustering (agglomerative)\n\n") 297 | map.update() 298 | 299 | global cluster_index, ntc 300 | cluster_index = 0 301 | 302 | if len(trajectories) == 0: 303 | map.log(txt="Error: No trajectories computed.\n") 304 | else: 305 | map.log(txt="Clustering..\n\n") 306 | map.update() 307 | 308 | # Clustering 309 | clusters.clusterAgglomerative(trajectories, MAX_CLUSTERS) 310 | 311 | map.draw_init(Aoi.select(), origin, controls) 312 | 313 | # Computes the number of trajectories per cluster 314 | ntc = [0] * MAX_CLUSTERS 315 | for t in trajectories: 316 | ntc[t.getClusterIdx()] += 1 317 | map.log(txt="Clusters:\n") 318 | for i in range(MAX_CLUSTERS): 319 | if ntc[i] > 0: 320 | perc = float(ntc[i]) / float(len(trajectories)) * 100 321 | map.log(txt="- " + '{0:.2f}'.format(perc) + "% " + colors.keys()[i] + " (" + str(ntc[i]) + ")\n") 322 | 323 | 324 | def cluster_trajectories_spectral(event): 325 | map.clear_log() 326 | map.log(txt=">> 5: Clustering (spectral)\n\n") 327 | map.update() 328 | 329 | global cluster_index, ntc, g 330 | cluster_index = 0 331 | 332 | if len(trajectories) == 0: 333 | map.log(txt="Error: No trajectories computed.\n") 334 | else: 335 | map.log(txt="Clustering..\n\n") 336 | map.update() 337 | 338 | # Clustering 339 | if MAX_CLUSTERS_USER_DEFINED: 340 | clusters.clusterSpectral(trajectories, MAX_CLUSTERS) 341 | else: 342 | g = clusters.clusterSpectral(trajectories) 343 | 344 | map.draw_init(Aoi.select(), origin, controls) 345 | 346 | # Computes the number of trajectories per cluster 347 | ntc = [0] * g 348 | for t in trajectories: 349 | ntc[t.getClusterIdx()] += 1 350 | 351 | map.log(txt="Clusters:\n") 352 | 353 | if MAX_CLUSTERS_USER_DEFINED: 354 | for i in range(MAX_CLUSTERS): 355 | if ntc[i] > 0: 356 | perc = float(ntc[i]) / float(len(trajectories)) * 100 357 | map.log(txt="- " + '{0:.2f}'.format(perc) + "% " + colors.keys()[i] + " (" + str(ntc[i]) + ")\n") 358 | else: 359 | for i in range(g): 360 | if ntc[i] > 0: 361 | perc = float(ntc[i]) / float(len(trajectories)) * 100 362 | map.log(txt="- " + '{0:.2f}'.format(perc) + "% " + colors.keys()[i] + " (" + 
str(ntc[i]) + ")\n") 363 | 364 | 365 | def draw_single_cluster(event): 366 | global cluster_index, ntc 367 | 368 | map.clear_log() 369 | map.log(txt='>> 6: Draw single cluster\n\n') 370 | 371 | if len(trajectories) == 0: 372 | map.log(txt="Error: No trajectories computed.\n") 373 | if len(ntc) == 0: 374 | map.log(txt="Error: No cluster computed.\n") 375 | else: 376 | if len(ntc) == 0: 377 | map.log(txt="Error: No cluster computed.\n") 378 | else: 379 | map.draw_init(Aoi.select(), origin, controls) 380 | 381 | for trajectory in trajectories: 382 | if trajectory.getClusterIdx() == cluster_index: 383 | map.draw_trajectory(trajectory, color=colors.values()[cluster_index]) 384 | perc = float(ntc[cluster_index]) / float(len(trajectories)) * 100 385 | 386 | map.log(txt= 387 | "- " + '{0:.2f}'.format(perc) + "% " + colors.keys()[cluster_index] 388 | + " (" + str(ntc[cluster_index]) + ")\n" 389 | ) 390 | 391 | if cluster_index < len(ntc) - 1: 392 | cluster_index += 1 393 | else: 394 | cluster_index = 0 395 | 396 | 397 | def draw_all_clusters(event): 398 | map.clear_log() 399 | map.log(txt='>> 7: Draw all clusters\n\n') 400 | 401 | if len(trajectories) == 0: 402 | map.log(txt="Error: No trajectories computed.\n") 403 | if len(ntc) == 0: 404 | map.log(txt="Error: No cluster computed.\n") 405 | else: 406 | if len(ntc) == 0: 407 | map.log(txt="Error: No cluster computed.\n") 408 | else: 409 | map.draw_init(Aoi.select(), origin, controls) 410 | 411 | for trajectory in trajectories: 412 | map.draw_trajectory(trajectory, colors.values()[trajectory.getClusterIdx()]) 413 | for i in range(len(ntc)): 414 | if ntc[i] > 0: 415 | perc = float(ntc[i]) / float(len(trajectories)) * 100 416 | map.log(txt= 417 | "- " + '{0:.2f}'.format(perc) + "% " 418 | + colors.keys()[i] + " (" + str(ntc[i]) + ")\n" 419 | ) 420 | 421 | 422 | def compute_tracks(event): 423 | map.clear_log() 424 | map.log(txt='>> 8: Compute tracks\n\n') 425 | 426 | if len(trajectories) == 0 or len(ntc) == 0: 427 | map.log(txt="Error: No trajectories or cluster computed.\n") 428 | else: 429 | global tracks_computed 430 | if not tracks_computed: 431 | global track_index, macro_index 432 | track_index = 0 433 | macro_index = 0 434 | print(macro_clusters) 435 | 436 | map.draw_init(Aoi.select(), origin, controls) 437 | 438 | for traj in trajectories: 439 | if len(tracks) == 0: 440 | tracks.append(Track()) 441 | tracks[0].add_trajectory(traj) 442 | else: 443 | if tracks[len(tracks) - 1].id == traj.track: 444 | tracks[len(tracks) - 1].add_trajectory(traj) 445 | else: 446 | tracks.append(Track()) 447 | tracks[len(tracks) - 1].add_trajectory(traj) 448 | tracks_computed = True 449 | map.log(txt="Tracks computed.\n") 450 | 451 | # Macro cluster 452 | for track in tracks: 453 | key = str(track.cluster_code) 454 | macro_clusters[key] = macro_clusters.get(key, 0) + 1 455 | map.log(txt="Macro clusters computed.\n\n") 456 | else: 457 | map.log(txt="Tracks already computed. 
\n\n") 458 | 459 | ord_macroclusters = OrderedDict(sorted(macro_clusters.items(), key=operator.itemgetter(1), reverse=True)) 460 | 461 | map.log(txt="Macro clusters: \n") 462 | for macrocluster_code in ord_macroclusters: 463 | color_keys = [] 464 | cluster_codes = list(eval(macrocluster_code)) 465 | for cluster_code in sorted(cluster_codes, reverse=True): 466 | color_keys.append(colors.keys()[cluster_code]) 467 | map.log(txt=str(color_keys) + " " + str(ord_macroclusters[macrocluster_code]) + "\n") 468 | 469 | 470 | def draw_single_track(event): 471 | global track_index 472 | 473 | map.clear_log() 474 | map.log(txt='>> 9: Draw single track\n\n') 475 | 476 | if len(trajectories) == 0 or len(ntc) == 0 or len(tracks) == 0: 477 | map.log(txt="Error: No trajectory, cluster or track computed.\n") 478 | else: 479 | # Canvas refresh 480 | map.draw_init(Aoi.select(), origin, controls) 481 | 482 | for i in tracks[track_index].trajectories: 483 | map.draw_trajectory(i, colors.values()[i.getClusterIdx()]) 484 | 485 | map.log(txt="Cart id:\t" + tracks[track_index].trajectories[0].run[0].tag_id + "\n") 486 | map.log(txt= 487 | "Start:\t" 488 | + str(tracks[track_index].trajectories[len(tracks[track_index].trajectories) - 1] 489 | .run[len( 490 | tracks[track_index].trajectories[len(tracks[track_index].trajectories) - 1].run) - 1] 491 | .time_stamp) + "\n" 492 | ) 493 | map.log(txt= 494 | "End:\t" 495 | + str(tracks[track_index].trajectories[0].run[0].time_stamp) + "\n" 496 | ) 497 | 498 | if track_index < len(tracks) - 1: 499 | track_index += 1 500 | else: 501 | track_index = 0 502 | 503 | 504 | def draw_macro_cluster(event): 505 | global macro_index 506 | 507 | map.clear_log() 508 | map.log(txt='>> 0: Draw macro-clusters\n\n') 509 | 510 | if len(trajectories) == 0 or len(ntc) == 0 or len(tracks) == 0: 511 | map.log(txt="Error: No trajectory, cluster or track computed.\n") 512 | else: 513 | map.draw_init(Aoi.select(), origin, controls) 514 | 515 | ord_macro_clusters = OrderedDict( 516 | sorted(macro_clusters.items(), key=operator.itemgetter(1), reverse=True)) 517 | for track in tracks: 518 | if str(track.cluster_code) == ord_macro_clusters.keys()[macro_index]: 519 | for traj in track.trajectories: 520 | map.draw_trajectory(traj, color=colors.values()[traj.getClusterIdx()]) 521 | 522 | map.log(txt="N. 
of Tracks in each Macro Cluster:\t" + str(ord_macro_clusters.values()[macro_index]) \ 523 | + "\n") 524 | 525 | if macro_index < len(ord_macro_clusters) - 1: 526 | macro_index += 1 527 | else: 528 | macro_index = 0 529 | 530 | 531 | def legend(event): 532 | show_legend() 533 | pass 534 | 535 | 536 | # Command line parsing 537 | if (len(sys.argv) == 2): 538 | MAX_CLUSTERS = int(sys.argv[1]) 539 | MAX_CLUSTERS_USER_DEFINED = True 540 | 541 | ######################################################################################################################## 542 | 543 | tkmaster.bind("1", compute_trajectories) 544 | tkmaster.bind("2", draw_single_trajectory) 545 | tkmaster.bind("3", draw_all_trajectories) 546 | tkmaster.bind("4", cluster_trajectories_agglomerative) 547 | tkmaster.bind("5", cluster_trajectories_spectral) 548 | tkmaster.bind("6", draw_single_cluster) 549 | tkmaster.bind("7", draw_all_clusters) 550 | tkmaster.bind("8", compute_tracks) 551 | tkmaster.bind("9", draw_single_track) 552 | tkmaster.bind("0", draw_macro_cluster) 553 | tkmaster.bind("e", exportToEPS) 554 | tkmaster.bind("l", legend) 555 | 556 | mainloop() 557 | --------------------------------------------------------------------------------