├── london.png
├── casa_black.png
├── londontiles
│   ├── london_tiles.dbf
│   ├── london_tiles.shp
│   └── london_tiles.shx
├── LICENSE.txt
├── samplerulefile.cga
├── README.md
└── cityengine_twitter.py

/london.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/urschrei/CityEngine-Twitter/HEAD/london.png
--------------------------------------------------------------------------------
/casa_black.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/urschrei/CityEngine-Twitter/HEAD/casa_black.png
--------------------------------------------------------------------------------
/londontiles/london_tiles.dbf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/urschrei/CityEngine-Twitter/HEAD/londontiles/london_tiles.dbf
--------------------------------------------------------------------------------
/londontiles/london_tiles.shp:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/urschrei/CityEngine-Twitter/HEAD/londontiles/london_tiles.shp
--------------------------------------------------------------------------------
/londontiles/london_tiles.shx:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/urschrei/CityEngine-Twitter/HEAD/londontiles/london_tiles.shx
--------------------------------------------------------------------------------
/LICENSE.txt:
--------------------------------------------------------------------------------
1 | The MIT License (MIT)
2 | 
3 | Copyright (c) 2013 Stephan Hügel, Flora Roumpani
4 | 
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 | 
12 | The above copyright notice and this permission notice shall be included in
13 | all copies or substantial portions of the Software.
14 | 
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
21 | THE SOFTWARE.
--------------------------------------------------------------------------------
/samplerulefile.cga:
--------------------------------------------------------------------------------
1 | /**
2 |  * File: ruleheights.cga
3 |  * Created: 10 Jun 2013 13:42:55 GMT
4 |  * Author: Flora Roumpani
5 | 
6 | This is a sample cga rule file, used simply to extrude the volumes and
7 | control colors. In this case we are using the Twitter palette.
8 | The user can modify this rule to create different building shapes
9 | instead of simple extrusions.
10 | 
11 | */
12 | 
13 | version "2011.1"
14 | 
15 | attr HGT = 0
16 | attr opacitytwit = 1
17 | attr opacityshape = 0.6
18 | 
19 | 
20 | 
21 | ## To use for a color ramp:
22 | # attr maxHGT = 1000
23 | # attr min = 0
24 | # @Range(0,1)
25 | # attr colorValue = 1
26 | # attr x_norm = 1 / (maxHGT - min) * (maxHGT - min)
27 | 
28 | 
29 | Lot -->
30 | 
31 | 
32 | ## Use colorRamp:
33 | # extrude (HGT)
34 | # color (colorRamp("brownToBlue", x_norm))
35 | # set (material.opacity, opacity)
36 | # Volume.
37 | 
38 | 
39 | 	case HGT < 5 :
40 | 		extrude (30) color ("#CACACA")
41 | 		set (material.opacity, opacityshape)
42 | 		Volume.
43 | 
44 | 	case HGT < 100 :
45 | 		extrude (HGT) color ("#A3E0FF") # twitter palette
46 | 		set (material.opacity, opacitytwit)
47 | 		Volume.
48 | 
49 | 	case HGT < 200 :
50 | 		extrude (HGT) color ("#66CCFF") # twitter palette
51 | 		set (material.opacity, opacitytwit)
52 | 		Volume.
53 | 
54 | 	case HGT < 400 :
55 | 		extrude (HGT) color ("#52A3CC") # twitter palette
56 | 		set (material.opacity, opacitytwit)
57 | 		Volume.
58 | 
59 | 	case HGT < 600 :
60 | 		extrude (HGT) color ("#478FB2") # twitter palette
61 | 		set (material.opacity, opacitytwit)
62 | 		Volume.
63 | 
64 | 	case HGT < 800 :
65 | 		extrude (HGT) color ("#3D7A99") # twitter palette
66 | 		set (material.opacity, opacitytwit)
67 | 		Volume.
68 | 
69 | 	case HGT >= 800 :
70 | 		extrude (HGT) color ("#295266") # twitter palette
71 | 		set (material.opacity, opacitytwit)
72 | 		Volume.
73 | 
74 | 	else :
75 | 		set (material.opacity, opacityshape)
76 | 		Volume.
77 | 
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | 
2 | [![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.9795.svg)](https://doi.org/10.5281/zenodo.9795)
3 | 
4 | # Mapping London Buildings in Real Time using Twitter Data
5 | 
6 | ![London](london.png "London")
7 | 
8 | Esri's [CityEngine](http://www.esri.com/software/cityengine) is a 3D GIS tool which allows city models to be generated procedurally (that is, models are generated using a set of rules). This approach allows us to define simple conditions which govern the appearance of a city model.
9 | 
10 | Twitter provides a 'firehose' of real-time tweets, including their location, if this functionality has been enabled by the user.
11 | 
12 | Using CityEngine's Python scripting functionality, we've created a model of London in which buildings grow upwards according to the frequency of tweets sent from (or in close proximity to) them.
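The growth rule we apply to the buildings (described in detail under Method, below) can be sketched as a small Python function; the names here are illustrative, not those used in the script:

``` python
def damped_height(previous, max_height, step=100.0):
    """Return a shape's new height after one nearby tweet.

    The increment shrinks linearly as the height approaches
    max_height, so tweet-heavy locations don't dominate the model.
    """
    return previous + ((max_height - previous) / max_height) * step
```

With `max_height = 18000`, successive tweets yield heights of 100, 199.44, 298.34, and so on.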
13 | 
14 | ## Details
15 | ### Installation ###
16 | This script requires the [Tweepy](http://tweepy.github.io) library. You should first install the [pip](http://www.pip-installer.org/en/latest/) tool for your Python installation. Because CityEngine uses its own Jython installation, it's easiest to install Tweepy into a specific target directory external to that installation:
17 | `mkdir /path/to/directory`
18 | `pip install -t /path/to/directory tweepy`
19 | This directory should be added to your Python path using `sys.path.append("C:/path/to/directory")` **before** you import Tweepy in your script. See our example [here](cityengine_twitter.py#L14-L15).
20 | 
21 | ### The CityEngine Ruleset ###
22 | 
23 | You may use any spatial basemap you like. In this script, the co-ordinates cover the area of wider London; they can, however, be altered in the script.
24 | 
25 | In order to run the demo:
26 | 
27 | 1. Import the shapefile `londontiles/london_tiles.shp`, `cityengine_twitter.py`, and `samplerulefile.cga` into CityEngine
28 | 2. Assign the rule file `samplerulefile.cga` to `london_tiles`, in case it is not already assigned
29 | 3. Run the Python script
30 | 
31 | If you wish to use your own basemap, the steps are as follows:
32 | 
33 | 1. Import the shapefile (or any format supported by CityEngine, such as `dxf` &c.), `cityengine_twitter.py`, and `samplerulefile.cga` into CityEngine (the script was created for the area of London, so make sure the coordinate system is "British National Grid"). If you are visualising a city outside the UK, this is not necessary, but you will then have to modify the Python script to skip the conversion step to BNG in the [on_status](cityengine_twitter.py#L178) method
34 | 2. In the CityEngine scene, select the shapes to which you wish to apply the visualisation, and change the name to `buildings_tq_small` (this corresponds to the variable value [here](cityengine_twitter.py#L22))
35 | 3. Assign the rule file `samplerulefile.cga` to the CityEngine layer in which you wish to collect the tweets. In the rule options of the inspector, use `Lot` as the Start Rule. You should now see three attributes under `samplerulefile`:
36 |     1. `HGT` (controls the height of shapes) – object defined
37 |     2. `opacityshape` (controls the opacity of shapes without tweets) – user controlled
38 |     3. `opacitytwit` (controls the opacity of shapes with tweets) – user controlled
39 | 4. Under the **Object Attributes** field in the inspector, right-click and add an Object Attribute.
40 | You will have to repeat this process five times, as explained below, to add the five attributes which allow you to control the catchment areas for the tweets and the height visualisation (the names must be identical):
41 |     1. Add Object Attribute
42 |         - Attribute name: `HGT`
43 |         - Type: float
44 |         - Value: 0 – (adds height to the shape) – script controlled
45 |     2. Add Object Attribute
46 |         - Attribute name: `count_t`
47 |         - Type: float
48 |         - Value: 0 – (counts the total number of tweets on each shape) – script controlled
49 |     3. Add Object Attribute
50 |         - Attribute name: `maxHGT`
51 |         - Type: float
52 |         - Value: user defined – (maximum height of the generated shapes)
53 |     4. Add Object Attribute
54 |         - Attribute name: `twitHGT`
55 |         - Type: float
56 |         - Value: user defined – (original growth step per tweet)
57 |     5. Add Object Attribute
58 |         - Attribute name: `mxdstnc`
59 |         - Type: float
60 |         - Value: user defined – (the maximum catchment area for a tweet, i.e. the maximum radius that one of our shapes may cover)
61 | 5. Back in the Rule Attributes, go to `HGT`, select a source for attribute `HGT`, and set it to the Object Attribute named `HGT`.
62 | 6. Run the Python script
63 | 7. If you wish to restart the visualisation – resetting all building heights to zero – select all the shapes and, in the object attributes of the CityEngine inspector, set the `HGT` and `count_t` variables to `0`
64 | 
65 | ### Method ###
66 | Using Tweepy, we have defined a bounding box around Greater London. When a tweet is sent to our script by the Twitter streaming API, we determine whether it contains GPS data, and discard it if not.
67 | 
68 | The basemap we're using is a map of London building footprints, supplied by Ordnance Survey, which can be obtained from [Digimap](http://digimap.edina.ac.uk/digimap/home). In order to place a tweet on our map, we must first convert its coordinates from [WGS84](http://en.wikipedia.org/wiki/WGS84) latitude and longitude points to [Ordnance Survey National Grid coordinates](http://en.wikipedia.org/wiki/British_National_Grid).
69 | 
70 | Following the conversion, we determine whether the tweet falls within the boundary of a building on our basemap. This calculation is performed in two steps:
71 | 
72 | 1. We create a list of the coordinates of the centroid of each object (in this case, shape) on our map. We then calculate which shapes fall within a user-defined radius (in this case, 100 metres)
73 | 2. For each shape obtained in step 1, we create a list of the `x` and `y` coordinates of its vertices, and apply the [even-odd](http://en.wikipedia.org/wiki/Even–odd_rule) rule to determine whether the point falls within the surface defined by the vertices. If it does, we extrude the shape by a given factor, set its colour, and define certain characteristics. In our case, we are mapping one particular attribute – location – to the shape in the form of height. However, any attribute of the tweet could be used to alter a shape's attributes.
For instance, we could set the heights of the buildings to their true heights, but add a window to a building each time a tweet is received
74 | 
75 | Because we're mapping tweet frequency to building height, we have had to implement a method of slowing the growth of building heights, so that locations which generate a lot of tweets (e.g. the British Museum) do not grow too quickly:
76 | ![equation](http://latex.codecogs.com/png.latex?%5Cfn_phv%20h%20%3D%20h_%7Bprev%7D%20+%20%5Cleft%20%28%5Cfrac%7Bh_%7Bmax%7D%20-%20h_%7Bprev%7D%7D%7Bh_%7Bmax%7D%7D%20%5Cright%29%20*%20100)
77 | This equation scales the shape's height, `h`, by an amount which decreases linearly as it grows towards the maximum height.
78 | An example generator which yields these values:
79 | 
80 | ``` python
81 | def height():
82 |     """
83 |     Yield successive heights, whose increments decrease linearly from 100
84 | 
85 |     """
86 |     maxheight = 18000.00
87 |     previous = 0.00
88 |     while True:
89 |         newheight = previous + (((maxheight - previous) / maxheight) * 100.00)
90 |         previous = newheight
91 |         yield newheight
92 | 
93 | scaled_height = height()
94 | for _ in xrange(5):
95 |     print(scaled_height.next())
96 | # 100.0, 199.44, 298.34, 396.68, 494.48
97 | ```
98 | 
99 | ## Problems and Caveats
100 | 
101 | - As [Ed Manley](http://urbanmovements.co.uk) and [James Cheshire](http://spatial.ly) have pointed out, different Twitter clients report their location with differing levels of spatial precision, and some clients introduce a rounding error into their reported GPS co-ordinates, which leads to 'striping' when they are visualised. In addition, it is not possible to determine whether a tweet was sent from a moving vehicle or train, or simply as someone was walking close to a building.
Additional uncertainty is introduced by the conversion from WGS84 to BNG, which is only accurate to ~5m
102 | - The Twitter Streaming API only delivers filtered (in our case, by location) messages up to the "streaming cap", and there is no way of determining whether the sample that we receive is "representative"
103 | - Current estimates suggest that only [~1% of tweets](http://www.quora.com/What-percentage-of-tweets-are-geotagged-What-percentage-of-geotagged-tweets-are-ascribed-to-a-venue#) are geotagged. Visualising these data thus cannot represent 'actual' Twitter usage in a given place
104 | 
105 | ## Citation
106 | 
107 | If you make use of this work, please cite it using the following DOI:
108 | [![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.9795.svg)](https://doi.org/10.5281/zenodo.9795)
109 | 
110 | 
111 | ---
112 | [![CASA](casa_black.png)](http://www.bartlett.ucl.ac.uk/casa)
113 | 
--------------------------------------------------------------------------------
/cityengine_twitter.py:
--------------------------------------------------------------------------------
1 | # -*- coding: utf-8 -*-
2 | """
3 | Created on 2013-06-04
4 | 
5 | @author: Flora Roumpani (flora.roumpani.11@ucl.ac.uk)
6 | @author: Stephan Hügel (stephan.hugel.12@ucl.ac.uk)
7 | http://www.bartlett.ucl.ac.uk/casa
8 | 
9 | """
10 | 
11 | import sys
12 | import datetime
13 | import traceback
14 | import time
15 | from threading import Thread
16 | import math
17 | sys.path.append("C:/Users/flora/pythonpackages")
18 | import tweepy
19 | from credentials import con_key, con_secret, acc_key, acc_secret
20 | from scripting import *
21 | 
22 | # get a CityEngine instance
23 | ce = CE()
24 | 
25 | 
26 | class Screenshot(Thread):
27 |     """ Write a screenshot of the first viewport every 30 seconds """
28 |     def __init__(self):
29 |         self.stopped = False
30 |         Thread.__init__(self)
31 |         self.counter = 0
32 | 
33 |     def run(self):
34 |         while not self.stopped:
35 |             time.sleep(30.00)
36 |             self.counter += 1
37 |             filename = "/new_snapshot_%s.png" % self.counter
38 |             views = ce.getObjectsFrom(ce.get3DViews(), ce.isViewport)
39 |             # ce.waitForUIIdle()
40 |             views[0].snapshot(ce.toFSPath('images') + filename, 1920, 1080)
41 | 
42 | 
43 | # Name of the shapes to analyse
44 | name = 'buildings_tq_small'
45 | countalltweets = 0
46 | lotposXa = []
47 | lotposYa = []
48 | shapeslist = []
49 | zipped = []
50 | 
51 | 
52 | def bng(input_lat, input_lon):
53 |     """
54 |     Convert lat and lon to BNG
55 |     Expects two floats as input
56 |     Returns a tuple of Easting, Northing floats
57 |     Accurate to ~5m
58 | 
59 |     For testing purposes:
60 |     input: 51.44533267, -0.32824866
61 |     output: 516275.97337480687, 173142.1442810801
62 | 
63 |     Entirely based on code by Hannah Fry (CASA):
64 |     hannahfry.co.uk/2012/02/01/converting-british-national-grid-to-latitude-and-longitude-ii/
65 | 
66 |     """
67 |     # Simple bounds checking
68 |     if not all([0 <= input_lat <= 90, -180 <= input_lon <= 180]):
69 |         raise Exception(
70 |             "input_lat should be between 0 and 90, input_lon should be between -180 and 180")
71 |     # Convert input from degrees to radians
72 |     lat_1 = input_lat * math.pi / 180
73 |     lon_1 = input_lon * math.pi / 180
74 |     # The GRS80 semi-major and semi-minor axes used for WGS84 (m)
75 |     a_1, b_1 = 6378137.000, 6356752.3141
76 |     # The eccentricity (squared) of the GRS80 ellipsoid
77 |     e2_1 = 1 - (b_1 * b_1) / (a_1 * a_1)
78 |     # Transverse radius of curvature
79 |     nu_1 = a_1 / math.sqrt(1 - e2_1 * math.sin(lat_1) ** 2)
80 |     # Third spherical coordinate is 0, in this case
81 |     H = 0
82 |     x_1 = (nu_1 + H) * math.cos(lat_1) * math.cos(lon_1)
83 |     y_1 = (nu_1 + H) * math.cos(lat_1) * math.sin(lon_1)
84 |     z_1 = ((1 - e2_1) * nu_1 + H) * math.sin(lat_1)
85 | 
86 |     # Perform Helmert transform (to go from GRS80 (_1) to Airy 1830 (_2))
87 |     s = 20.4894 * 10 ** -6
88 |     # The translations along the x, y, z axes respectively
89 |     tx, ty, tz = -446.448, 125.157, -542.060
90 |     # The rotations about the x, y, z axes respectively, in seconds
91 |     rxs, rys, rzs = -0.1502, -0.2470, -0.8421
92 |     # In radians
93 |     rx, ry, rz = \
94 |         rxs * math.pi / (180 * 3600.),\
95 |         rys * math.pi / (180 * 3600.),\
96 |         rzs * math.pi / (180 * 3600.)
97 |     x_2 = tx + (1 + s) * x_1 + (-rz) * y_1 + (ry) * z_1
98 |     y_2 = ty + (rz) * x_1 + (1 + s) * y_1 + (-rx) * z_1
99 |     z_2 = tz + (-ry) * x_1 + (rx) * y_1 + (1 + s) * z_1
100 | 
101 |     # The Airy 1830 semi-major and semi-minor axes (m)
102 |     a, b = 6377563.396, 6356256.909
103 |     # The eccentricity (squared) of the Airy 1830 ellipsoid
104 |     e2 = 1 - (b * b) / (a * a)
105 |     p = math.sqrt(x_2 ** 2 + y_2 ** 2)
106 |     # Initial value
107 |     lat = math.atan2(z_2, (p * (1 - e2)))
108 |     latold = 2 * math.pi
109 | 
110 |     # Latitude is obtained by an iterative procedure
111 |     while abs(lat - latold) > 10 ** -16:
112 |         lat, latold = latold, lat
113 |         nu = a / math.sqrt(1 - e2 * math.sin(latold) ** 2)
114 |         lat = math.atan2(z_2 + e2 * nu * math.sin(latold), p)
115 | 
116 |     lon = math.atan2(y_2, x_2)
117 |     H = p / math.cos(lat) - nu
118 |     # Scale factor on the central meridian
119 |     F0 = 0.9996012717
120 |     # Latitude of true origin (radians)
121 |     lat0 = 49 * math.pi / 180
122 |     # Longitude of true origin and central meridian (radians)
123 |     lon0 = -2 * math.pi / 180
124 |     # Northing & easting of true origin (m)
125 |     N0, E0 = -100000, 400000
126 |     n = (a - b) / (a + b)
127 |     # Meridional radius of curvature
128 |     rho = a * F0 * (1 - e2) * (1 - e2 * math.sin(lat) ** 2) ** (-1.5)
129 |     eta2 = nu * F0 / rho - 1
130 | 
131 |     # NB the fractions below must be floats: Jython 2.x truncates integer division
132 |     M1 = (1 + n + (5. / 4) * n ** 2 + (5. / 4) * n ** 3) \
133 |         * (lat - lat0)
134 |     M2 = (3 * n + 3 * n ** 2 + (21. / 8) * n ** 3) \
135 |         * math.sin(lat - lat0) * math.cos(lat + lat0)
136 |     M3 = ((15. / 8) * n ** 2 + (15. / 8) * n ** 3) \
137 |         * math.sin(2 * (lat - lat0)) * math.cos(2 * (lat + lat0))
138 |     M4 = (35. / 24) * n ** 3 * math.sin(3 * (lat - lat0)) \
139 |         * math.cos(3 * (lat + lat0))
140 | 
141 |     M = b * F0 * (M1 - M2 + M3 - M4)
142 | 
143 |     I = M + N0
144 |     II = nu * F0 * math.sin(lat) * \
145 |         math.cos(lat) / 2
146 |     III = nu * F0 * math.sin(lat) * \
147 |         math.cos(lat) ** 3 * (5 - math.tan(lat) ** 2 + 9 * eta2) / 24
148 |     IIIA = nu * F0 * math.sin(lat) * \
149 |         math.cos(lat) ** 5 * (61 - 58 * math.tan(lat) ** 2 + math.tan(lat) ** 4) / 720
150 |     IV = nu * F0 * math.cos(lat)
151 |     V = nu * F0 * \
152 |         math.cos(lat) ** 3 * (nu / rho - math.tan(lat) ** 2) / 6
153 |     VI = nu * F0 * math.cos(lat) ** 5 * \
154 |         (5 - 18 * math.tan(lat) ** 2 + math.tan(lat) ** 4 + 14 * eta2 - 58 * eta2 * math.tan(lat) ** 2) / 120
155 | 
156 |     N = I + II * (lon - lon0) \
157 |         ** 2 + III * (lon - lon0) ** 4 + IIIA * (lon - lon0) ** 6
158 |     E = E0 + IV * (lon - lon0) + V * (lon - lon0) \
159 |         ** 3 + VI * (lon - lon0) ** 5
160 |     return E, N
161 | 
162 | 
163 | Shapes = ce.getObjectsFrom(ce.scene, ce.withName(name))
164 | # Zone cluster size
165 | mxdstnc = ce.getAttribute(Shapes[0], 'mxdstnc')
166 | # Maximum allowed height
167 | maxHGT = ce.getAttribute(Shapes[0], 'maxHGT')
168 | # Tweet step
169 | twitHGT = ce.getAttribute(Shapes[0], 'twitHGT')
170 | 
171 | for i in range(len(Shapes)):
172 |     # Record the position of each matching shape in the scene
173 |     lotpos = ce.getPosition(Shapes[i])
174 |     lotposX = lotpos[0]
175 |     lotposY = lotpos[2]
176 |     lotposXa.append(lotposX)
177 |     lotposYa.append(lotposY)
178 | 
179 | 
180 | class StreamWatcherListener(tweepy.StreamListener):
181 |     """ Watch the stream of an app, and respond to stream events """
182 | 
183 |     def __init__(self):
184 |         super(StreamWatcherListener, self).__init__()
185 |         self.keywords = set([
186 |             "weather",
187 |             "traffic",
188 |             "tdf",
189 |             "wimbledon",
190 |             "summer",
191 |             "london"
192 |         ])
193 | 
194 |     def on_status(self, status):
195 |         if status.coordinates:
196 |             # naive split of the tweet into unique words
197 |             intersect = list(set(word.lower() for word in status.text.split()) & self.keywords)
198 |             if intersect:
199 |                 # report the matched keywords
200 |                 print "Keyword hits:", ", ".join(intersect)
201 |             point = status.coordinates['coordinates']
202 |             # Reverse the lon, lat points
203 |             point[0], point[1] = point[1], point[0]
204 |             global countalltweets
205 |             countalltweets += 1
206 |             E, N = bng(point[0], point[1])
207 |             # Invert the northing to match the scene's axis orientation
208 |             N = -N
209 |             for i in range(len(Shapes)):
210 |                 # Distance between the tweet and each shape's centroid
211 |                 distance = math.sqrt(
212 |                     (E - lotposXa[i]) *
213 |                     (E - lotposXa[i]) +
214 |                     (N - lotposYa[i]) *
215 |                     (N - lotposYa[i]))
216 |                 if distance < mxdstnc:
217 |                     # Even-odd rule to check whether the point falls within the polygon
218 |                     c = False
219 |                     shapeslist = ce.getVertices(Shapes[i])
220 |                     # Drop the (zero) y coordinates, leaving x, z pairs
221 |                     filtered_shapes = [coord for coord in shapeslist if coord]
222 |                     global zipped
223 |                     zipped = zip(filtered_shapes[0::2], filtered_shapes[1::2])
224 |                     jj = len(zipped) - 1
225 |                     for ii in range(len(zipped)):
226 |                         if (((zipped[ii][1] > N) != (zipped[jj][1] > N)) and
227 |                                 (E < (zipped[jj][0] - zipped[ii][0]) *
228 |                                 (N - (zipped[ii][1])) / ((zipped[jj][1]) -
229 |                                 (zipped[ii][1])) + zipped[ii][0])):
230 |                             c = not c
231 |                         jj = ii
232 |                     if c:
233 |                         count_t1 = ce.getAttribute(Shapes[i], 'count_t')
234 |                         ce.setAttribute(Shapes[i], 'count_t', count_t1 + 1)
235 |                         hgt1 = ce.getAttribute(Shapes[i], 'HGT')
236 |                         if hgt1 < maxHGT:
237 |                             ce.setAttribute(
238 |                                 Shapes[i],
239 |                                 'HGT',
240 |                                 (hgt1 + ((maxHGT - hgt1) / maxHGT) * twitHGT))
241 |                         # Regenerate the model with the updated attributes
242 |                         ce.generateModels(Shapes[i])
243 | 
244 |     def on_error(self, status_code):
245 |         """ Echo any errors """
246 |         print 'An error has occurred! Status code = %s' % status_code
247 |         if status_code == 420:
248 |             print 'Exiting due to rate limiting - we have to respect a 420 error'
249 |             # Returning False disconnects the stream
250 |             return False
251 |         # Keep the stream alive
252 |         return True
253 | 
254 |     def on_timeout(self):
255 |         """ Echo a timeout """
256 |         print 'Snoozing Zzzzzz'
257 | 
258 | 
259 | def main():
260 |     """ Main function """
261 | 
262 |     def authorise(*args):
263 |         """
264 |         Tweepy OAuth dance
265 |         Accepts: consumer key, secret; access key, secret
266 | 
267 |         """
268 |         auth = tweepy.OAuthHandler(
269 |             args[0],
270 |             args[1])
271 |         auth.set_access_token(
272 |             args[2],
273 |             args[3])
274 |         return auth
275 | 
276 |     # capture.start()
277 |     print 'Begin streaming at - %s' % datetime.datetime.now()
278 |     twitter = authorise(con_key, con_secret, acc_key, acc_secret)
279 |     swl = StreamWatcherListener()
280 |     stream = tweepy.streaming.Stream(
281 |         twitter,
282 |         swl,
283 |         timeout=None,
284 |         secure=True)
285 |     # This is the London bounding box
286 |     stream.filter(locations=[-0.5630, 51.2613, 0.2804, 51.6860])
287 | 
288 | 
289 | if __name__ == "__main__":
290 |     # Created here so that the exception handlers below can always stop it
291 |     capture = Screenshot()
292 |     try:
293 |         main()
294 |     except (KeyboardInterrupt, SystemExit):
295 |         capture.stopped = True
296 |         # Actually raise these so it exits cleanly
297 |         raise
298 |     except Exception:
299 |         capture.stopped = True
300 |         # All other exceptions, so display the error
301 |         print "Stack trace:"
302 |         traceback.print_exc(file=sys.stdout)
303 |     finally:
304 |         # Exit cleanly once we've done everything else
305 |         capture.stopped = True
306 |         print 'Total no. of Tweets:', countalltweets
307 |         print 'End streaming at - %s' % datetime.datetime.now()
308 |         sys.exit()
--------------------------------------------------------------------------------