├── BikeSharingNeuralNet
│   ├── DLND Your first neural network.ipynb
│   ├── DLND+Your+first+neural+network.html
│   ├── README.md
│   ├── Readme.txt
│   ├── day.csv
│   ├── dir
│   └── hour.csv
├── DeepLearning_Certificate.pdf
├── GenerateTV_Scripts_RNNs
│   ├── dlnd_tv_script_generation.ipynb
│   ├── dlnd_tv_script_generation_git.html
│   ├── helper.py
│   ├── problem_unittests.py
│   └── readme.md
├── ImageClassification_cifar10
│   ├── README.md
│   ├── helper.py
│   ├── image_classification.html
│   ├── image_classification.ipynb
│   └── problem_unittests.py
└── Image_recognition_MNIST
    └── code.html

/BikeSharingNeuralNet/README.md:
--------------------------------------------------------------------------------
1 | ==========================================
2 | Bike Sharing Dataset
3 | ==========================================
4 | 
5 | Hadi Fanaee-T
6 | 
7 | Laboratory of Artificial Intelligence and Decision Support (LIAAD), University of Porto
8 | INESC Porto, Campus da FEUP
9 | Rua Dr. Roberto Frias, 378
10 | 4200 - 465 Porto, Portugal
11 | 
12 | 
13 | =========================================
14 | Background
15 | =========================================
16 | 
17 | Bike sharing systems are a new generation of traditional bike rentals in which the whole process, from membership to rental and
18 | return, has become automatic. Through these systems, a user is able to easily rent a bike at one position and return
19 | it at another. Currently, there are over 500 bike-sharing programs around the world, composed of
20 | over 500 thousand bicycles. Today, there is great interest in these systems due to their important role in traffic,
21 | environmental and health issues.
22 | 
23 | Apart from interesting real-world applications of bike sharing systems, the characteristics of the data generated by
24 | these systems make them attractive for research. As opposed to other transport services such as bus or subway, the duration
25 | of travel and the departure and arrival positions are explicitly recorded in these systems. This feature turns a bike sharing system into
26 | a virtual sensor network that can be used for sensing mobility in the city. Hence, it is expected that most of the important
27 | events in the city could be detected via monitoring these data.
28 | 
29 | =========================================
30 | Data Set
31 | =========================================
32 | The bike-sharing rental process is highly correlated with environmental and seasonal settings. For instance, weather conditions,
33 | precipitation, day of the week, season, hour of the day, etc. can affect rental behavior. The core data set is the
34 | two-year historical log for the years 2011 and 2012 from the Capital Bikeshare system, Washington D.C., USA, which is
35 | publicly available at http://capitalbikeshare.com/system-data. We aggregated the data on an hourly and a daily basis and then
36 | extracted and added the corresponding weather and seasonal information. Weather information was extracted from http://www.freemeteo.com.
37 | 
38 | =========================================
39 | Associated tasks
40 | =========================================
41 | 
42 | - Regression:
43 | Prediction of the hourly or daily bike rental count based on environmental and seasonal settings.
44 | 
45 | - Event and Anomaly Detection:
46 | The count of rented bikes is also correlated with some events in the town, which are easily traceable via search engines.
47 | For instance, a query like "2012-10-30 washington d.c." in Google returns results related to Hurricane Sandy. Some of the important events are
48 | identified in [1]. Therefore the data can also be used for validation of anomaly or event detection algorithms.
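Both tasks can be prototyped directly from day.csv. Below is a minimal sketch (an editorial illustration, not part of the original dataset README) of the regression task that also flags high-residual days as candidate events; it assumes pandas and scikit-learn are installed, and the feature list and model choice are illustrative only:

```python
# Sketch: predict the daily rental count (cnt) from day.csv, then surface
# the days the model explains worst as candidate events/anomalies.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

df = pd.read_csv("day.csv", parse_dates=["dteday"])

features = ["season", "yr", "mnth", "holiday", "weekday", "workingday",
            "weathersit", "temp", "atemp", "hum", "windspeed"]

# oob_score=True yields out-of-bag predictions, so residuals are not
# trivially small the way in-sample random forest residuals would be.
model = RandomForestRegressor(n_estimators=300, oob_score=True, random_state=0)
model.fit(df[features], df["cnt"])

df["residual"] = (df["cnt"] - model.oob_prediction_).abs()
print(df.nlargest(5, "residual")[["dteday", "cnt", "residual"]])
```

If the model is reasonable, days such as 2012-10-29/30 (Hurricane Sandy, mentioned above) should surface with large residuals, which is the event-detection idea from the readme in miniature.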
49 | 
50 | 
51 | =========================================
52 | Files
53 | =========================================
54 | 
55 | - Readme.txt
56 | - hour.csv : bike sharing counts aggregated on an hourly basis. Records: 17379 hours
57 | - day.csv : bike sharing counts aggregated on a daily basis. Records: 731 days
58 | 
59 | 
60 | =========================================
61 | Dataset characteristics
62 | =========================================
63 | Both hour.csv and day.csv have the following fields, except hr, which is not available in day.csv.
64 | 
65 | - instant: record index
66 | - dteday : date
67 | - season : season (1:spring, 2:summer, 3:fall, 4:winter)
68 | - yr : year (0: 2011, 1: 2012)
69 | - mnth : month (1 to 12)
70 | - hr : hour (0 to 23)
71 | - holiday : whether the day is a holiday or not (extracted from http://dchr.dc.gov/page/holiday-schedule)
72 | - weekday : day of the week
73 | - workingday : 1 if the day is neither a weekend nor a holiday, otherwise 0
74 | + weathersit :
75 | - 1: Clear, Few clouds, Partly cloudy
76 | - 2: Mist + Cloudy, Mist + Broken clouds, Mist + Few clouds, Mist
77 | - 3: Light Snow, Light Rain + Thunderstorm + Scattered clouds, Light Rain + Scattered clouds
78 | - 4: Heavy Rain + Ice Pellets + Thunderstorm + Mist, Snow + Fog
79 | - temp : Normalized temperature in Celsius. The values are divided by 41 (max)
80 | - atemp: Normalized feeling temperature in Celsius. The values are divided by 50 (max)
81 | - hum: Normalized humidity. The values are divided by 100 (max)
82 | - windspeed: Normalized wind speed. The values are divided by 67 (max)
83 | - casual: count of casual users
84 | - registered: count of registered users
85 | - cnt: count of total rental bikes including both casual and registered
86 | 
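As a quick illustration of the encodings above (again editorial, not part of the original README), the normalized columns can be mapped back to physical units with the stated divisors, and the hourly file should aggregate to the daily one; this sketch assumes pandas is installed:

```python
# Sketch: decode day.csv's normalized columns back to physical units
# using the divisors documented above (41, 50, 100, 67).
import pandas as pd

df = pd.read_csv("day.csv")
assert len(df) == 731  # daily records, per the Files section

df["temp_c"]   = df["temp"] * 41       # temperature, Celsius
df["atemp_c"]  = df["atemp"] * 50      # feeling temperature, Celsius
df["hum_pct"]  = df["hum"] * 100       # relative humidity, %
df["wind_raw"] = df["windspeed"] * 67  # wind speed, source units

# Human-readable weather situation labels (paraphrased from the list above)
weather = {1: "clear/partly cloudy", 2: "mist/cloudy",
           3: "light rain/snow", 4: "heavy rain/snow/fog"}
df["weather_label"] = df["weathersit"].map(weather)

# Consistency check: hour.csv totals per day should match day.csv's cnt
hourly = pd.read_csv("hour.csv")
daily_from_hourly = hourly.groupby("dteday")["cnt"].sum()
print((daily_from_hourly.values == df.set_index("dteday")["cnt"].values).all())
```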
87 | =========================================
88 | License
89 | =========================================
90 | Publications that use this dataset must cite the following publication:
91 | 
92 | [1] Fanaee-T, Hadi, and Gama, Joao, "Event labeling combining ensemble detectors and background knowledge", Progress in Artificial Intelligence (2013): pp. 1-15, Springer Berlin Heidelberg, doi:10.1007/s13748-013-0040-3.
93 | 
94 | @article{fanaee2013event,
95 | year={2013},
96 | issn={2192-6352},
97 | journal={Progress in Artificial Intelligence},
98 | doi={10.1007/s13748-013-0040-3},
99 | title={Event labeling combining ensemble detectors and background knowledge},
100 | url={http://dx.doi.org/10.1007/s13748-013-0040-3},
101 | publisher={Springer Berlin Heidelberg},
102 | keywords={Event labeling; Event detection; Ensemble learning; Background knowledge},
103 | author={Fanaee-T, Hadi and Gama, Joao},
104 | pages={1-15}
105 | }
106 | 
107 | =========================================
108 | Contact
109 | =========================================
110 | 
111 | For further information about this dataset, please contact Hadi Fanaee-T (hadi.fanaee@fe.up.pt).
112 | 
--------------------------------------------------------------------------------
/BikeSharingNeuralNet/Readme.txt:
--------------------------------------------------------------------------------
(verbatim copy of /BikeSharingNeuralNet/README.md above)
--------------------------------------------------------------------------------
/BikeSharingNeuralNet/day.csv:
--------------------------------------------------------------------------------
1 | instant,dteday,season,yr,mnth,holiday,weekday,workingday,weathersit,temp,atemp,hum,windspeed,casual,registered,cnt 2 | 1,2011-01-01,1,0,1,0,6,0,2,0.344167,0.363625,0.805833,0.160446,331,654,985 3 | 2,2011-01-02,1,0,1,0,0,0,2,0.363478,0.353739,0.696087,0.248539,131,670,801 4 | 3,2011-01-03,1,0,1,0,1,1,1,0.196364,0.189405,0.437273,0.248309,120,1229,1349 5 | 4,2011-01-04,1,0,1,0,2,1,1,0.2,0.212122,0.590435,0.160296,108,1454,1562 6 | 5,2011-01-05,1,0,1,0,3,1,1,0.226957,0.22927,0.436957,0.1869,82,1518,1600 7 | 6,2011-01-06,1,0,1,0,4,1,1,0.204348,0.233209,0.518261,0.0895652,88,1518,1606 8 | 7,2011-01-07,1,0,1,0,5,1,2,0.196522,0.208839,0.498696,0.168726,148,1362,1510 9 | 8,2011-01-08,1,0,1,0,6,0,2,0.165,0.162254,0.535833,0.266804,68,891,959 10 | 9,2011-01-09,1,0,1,0,0,0,1,0.138333,0.116175,0.434167,0.36195,54,768,822 11 | 10,2011-01-10,1,0,1,0,1,1,1,0.150833,0.150888,0.482917,0.223267,41,1280,1321 12 | 11,2011-01-11,1,0,1,0,2,1,2,0.169091,0.191464,0.686364,0.122132,43,1220,1263 13 | 12,2011-01-12,1,0,1,0,3,1,1,0.172727,0.160473,0.599545,0.304627,25,1137,1162 14 | 13,2011-01-13,1,0,1,0,4,1,1,0.165,0.150883,0.470417,0.301,38,1368,1406 15 | 14,2011-01-14,1,0,1,0,5,1,1,0.16087,0.188413,0.537826,0.126548,54,1367,1421 16 | 15,2011-01-15,1,0,1,0,6,0,2,0.233333,0.248112,0.49875,0.157963,222,1026,1248 17 | 16,2011-01-16,1,0,1,0,0,0,1,0.231667,0.234217,0.48375,0.188433,251,953,1204 18 | 17,2011-01-17,1,0,1,1,1,0,2,0.175833,0.176771,0.5375,0.194017,117,883,1000 19 | 18,2011-01-18,1,0,1,0,2,1,2,0.216667,0.232333,0.861667,0.146775,9,674,683 20 | 19,2011-01-19,1,0,1,0,3,1,2,0.292174,0.298422,0.741739,0.208317,78,1572,1650 21 | 20,2011-01-20,1,0,1,0,4,1,2,0.261667,0.25505,0.538333,0.195904,83,1844,1927 22 | 21,2011-01-21,1,0,1,0,5,1,1,0.1775,0.157833,0.457083,0.353242,75,1468,1543 23 | 22,2011-01-22,1,0,1,0,6,0,1,0.0591304,0.0790696,0.4,0.17197,93,888,981 24 | 23,2011-01-23,1,0,1,0,0,0,1,0.0965217,0.0988391,0.436522,0.2466,150,836,986 25 | 24,2011-01-24,1,0,1,0,1,1,1,0.0973913,0.11793,0.491739,0.15833,86,1330,1416 26 | 25,2011-01-25,1,0,1,0,2,1,2,0.223478,0.234526,0.616957,0.129796,186,1799,1985 27 | 26,2011-01-26,1,0,1,0,3,1,3,0.2175,0.2036,0.8625,0.29385,34,472,506 28 | 27,2011-01-27,1,0,1,0,4,1,1,0.195,0.2197,0.6875,0.113837,15,416,431 29 | 28,2011-01-28,1,0,1,0,5,1,2,0.203478,0.223317,0.793043,0.1233,38,1129,1167 30 | 29,2011-01-29,1,0,1,0,6,0,1,0.196522,0.212126,0.651739,0.145365,123,975,1098 31 | 30,2011-01-30,1,0,1,0,0,0,1,0.216522,0.250322,0.722174,0.0739826,140,956,1096 32 | 31,2011-01-31,1,0,1,0,1,1,2,0.180833,0.18625,0.60375,0.187192,42,1459,1501 33 |
32,2011-02-01,1,0,2,0,2,1,2,0.192174,0.23453,0.829565,0.053213,47,1313,1360 34 | 33,2011-02-02,1,0,2,0,3,1,2,0.26,0.254417,0.775417,0.264308,72,1454,1526 35 | 34,2011-02-03,1,0,2,0,4,1,1,0.186957,0.177878,0.437826,0.277752,61,1489,1550 36 | 35,2011-02-04,1,0,2,0,5,1,2,0.211304,0.228587,0.585217,0.127839,88,1620,1708 37 | 36,2011-02-05,1,0,2,0,6,0,2,0.233333,0.243058,0.929167,0.161079,100,905,1005 38 | 37,2011-02-06,1,0,2,0,0,0,1,0.285833,0.291671,0.568333,0.1418,354,1269,1623 39 | 38,2011-02-07,1,0,2,0,1,1,1,0.271667,0.303658,0.738333,0.0454083,120,1592,1712 40 | 39,2011-02-08,1,0,2,0,2,1,1,0.220833,0.198246,0.537917,0.36195,64,1466,1530 41 | 40,2011-02-09,1,0,2,0,3,1,2,0.134783,0.144283,0.494783,0.188839,53,1552,1605 42 | 41,2011-02-10,1,0,2,0,4,1,1,0.144348,0.149548,0.437391,0.221935,47,1491,1538 43 | 42,2011-02-11,1,0,2,0,5,1,1,0.189091,0.213509,0.506364,0.10855,149,1597,1746 44 | 43,2011-02-12,1,0,2,0,6,0,1,0.2225,0.232954,0.544167,0.203367,288,1184,1472 45 | 44,2011-02-13,1,0,2,0,0,0,1,0.316522,0.324113,0.457391,0.260883,397,1192,1589 46 | 45,2011-02-14,1,0,2,0,1,1,1,0.415,0.39835,0.375833,0.417908,208,1705,1913 47 | 46,2011-02-15,1,0,2,0,2,1,1,0.266087,0.254274,0.314348,0.291374,140,1675,1815 48 | 47,2011-02-16,1,0,2,0,3,1,1,0.318261,0.3162,0.423478,0.251791,218,1897,2115 49 | 48,2011-02-17,1,0,2,0,4,1,1,0.435833,0.428658,0.505,0.230104,259,2216,2475 50 | 49,2011-02-18,1,0,2,0,5,1,1,0.521667,0.511983,0.516667,0.264925,579,2348,2927 51 | 50,2011-02-19,1,0,2,0,6,0,1,0.399167,0.391404,0.187917,0.507463,532,1103,1635 52 | 51,2011-02-20,1,0,2,0,0,0,1,0.285217,0.27733,0.407826,0.223235,639,1173,1812 53 | 52,2011-02-21,1,0,2,1,1,0,2,0.303333,0.284075,0.605,0.307846,195,912,1107 54 | 53,2011-02-22,1,0,2,0,2,1,1,0.182222,0.186033,0.577778,0.195683,74,1376,1450 55 | 54,2011-02-23,1,0,2,0,3,1,1,0.221739,0.245717,0.423043,0.094113,139,1778,1917 56 | 55,2011-02-24,1,0,2,0,4,1,2,0.295652,0.289191,0.697391,0.250496,100,1707,1807 57 | 56,2011-02-25,1,0,2,0,5,1,2,0.364348,0.350461,0.712174,0.346539,120,1341,1461 58 | 57,2011-02-26,1,0,2,0,6,0,1,0.2825,0.282192,0.537917,0.186571,424,1545,1969 59 | 58,2011-02-27,1,0,2,0,0,0,1,0.343478,0.351109,0.68,0.125248,694,1708,2402 60 | 59,2011-02-28,1,0,2,0,1,1,2,0.407273,0.400118,0.876364,0.289686,81,1365,1446 61 | 60,2011-03-01,1,0,3,0,2,1,1,0.266667,0.263879,0.535,0.216425,137,1714,1851 62 | 61,2011-03-02,1,0,3,0,3,1,1,0.335,0.320071,0.449583,0.307833,231,1903,2134 63 | 62,2011-03-03,1,0,3,0,4,1,1,0.198333,0.200133,0.318333,0.225754,123,1562,1685 64 | 63,2011-03-04,1,0,3,0,5,1,2,0.261667,0.255679,0.610417,0.203346,214,1730,1944 65 | 64,2011-03-05,1,0,3,0,6,0,2,0.384167,0.378779,0.789167,0.251871,640,1437,2077 66 | 65,2011-03-06,1,0,3,0,0,0,2,0.376522,0.366252,0.948261,0.343287,114,491,605 67 | 66,2011-03-07,1,0,3,0,1,1,1,0.261739,0.238461,0.551304,0.341352,244,1628,1872 68 | 67,2011-03-08,1,0,3,0,2,1,1,0.2925,0.3024,0.420833,0.12065,316,1817,2133 69 | 68,2011-03-09,1,0,3,0,3,1,2,0.295833,0.286608,0.775417,0.22015,191,1700,1891 70 | 69,2011-03-10,1,0,3,0,4,1,3,0.389091,0.385668,0,0.261877,46,577,623 71 | 70,2011-03-11,1,0,3,0,5,1,2,0.316522,0.305,0.649565,0.23297,247,1730,1977 72 | 71,2011-03-12,1,0,3,0,6,0,1,0.329167,0.32575,0.594583,0.220775,724,1408,2132 73 | 72,2011-03-13,1,0,3,0,0,0,1,0.384348,0.380091,0.527391,0.270604,982,1435,2417 74 | 73,2011-03-14,1,0,3,0,1,1,1,0.325217,0.332,0.496957,0.136926,359,1687,2046 75 | 74,2011-03-15,1,0,3,0,2,1,2,0.317391,0.318178,0.655652,0.184309,289,1767,2056 76 | 
75,2011-03-16,1,0,3,0,3,1,2,0.365217,0.36693,0.776522,0.203117,321,1871,2192 77 | 76,2011-03-17,1,0,3,0,4,1,1,0.415,0.410333,0.602917,0.209579,424,2320,2744 78 | 77,2011-03-18,1,0,3,0,5,1,1,0.54,0.527009,0.525217,0.231017,884,2355,3239 79 | 78,2011-03-19,1,0,3,0,6,0,1,0.4725,0.466525,0.379167,0.368167,1424,1693,3117 80 | 79,2011-03-20,1,0,3,0,0,0,1,0.3325,0.32575,0.47375,0.207721,1047,1424,2471 81 | 80,2011-03-21,2,0,3,0,1,1,2,0.430435,0.409735,0.737391,0.288783,401,1676,2077 82 | 81,2011-03-22,2,0,3,0,2,1,1,0.441667,0.440642,0.624583,0.22575,460,2243,2703 83 | 82,2011-03-23,2,0,3,0,3,1,2,0.346957,0.337939,0.839565,0.234261,203,1918,2121 84 | 83,2011-03-24,2,0,3,0,4,1,2,0.285,0.270833,0.805833,0.243787,166,1699,1865 85 | 84,2011-03-25,2,0,3,0,5,1,1,0.264167,0.256312,0.495,0.230725,300,1910,2210 86 | 85,2011-03-26,2,0,3,0,6,0,1,0.265833,0.257571,0.394167,0.209571,981,1515,2496 87 | 86,2011-03-27,2,0,3,0,0,0,2,0.253043,0.250339,0.493913,0.1843,472,1221,1693 88 | 87,2011-03-28,2,0,3,0,1,1,1,0.264348,0.257574,0.302174,0.212204,222,1806,2028 89 | 88,2011-03-29,2,0,3,0,2,1,1,0.3025,0.292908,0.314167,0.226996,317,2108,2425 90 | 89,2011-03-30,2,0,3,0,3,1,2,0.3,0.29735,0.646667,0.172888,168,1368,1536 91 | 90,2011-03-31,2,0,3,0,4,1,3,0.268333,0.257575,0.918333,0.217646,179,1506,1685 92 | 91,2011-04-01,2,0,4,0,5,1,2,0.3,0.283454,0.68625,0.258708,307,1920,2227 93 | 92,2011-04-02,2,0,4,0,6,0,2,0.315,0.315637,0.65375,0.197146,898,1354,2252 94 | 93,2011-04-03,2,0,4,0,0,0,1,0.378333,0.378767,0.48,0.182213,1651,1598,3249 95 | 94,2011-04-04,2,0,4,0,1,1,1,0.573333,0.542929,0.42625,0.385571,734,2381,3115 96 | 95,2011-04-05,2,0,4,0,2,1,2,0.414167,0.39835,0.642083,0.388067,167,1628,1795 97 | 96,2011-04-06,2,0,4,0,3,1,1,0.390833,0.387608,0.470833,0.263063,413,2395,2808 98 | 97,2011-04-07,2,0,4,0,4,1,1,0.4375,0.433696,0.602917,0.162312,571,2570,3141 99 | 98,2011-04-08,2,0,4,0,5,1,2,0.335833,0.324479,0.83625,0.226992,172,1299,1471 100 | 99,2011-04-09,2,0,4,0,6,0,2,0.3425,0.341529,0.8775,0.133083,879,1576,2455 101 | 100,2011-04-10,2,0,4,0,0,0,2,0.426667,0.426737,0.8575,0.146767,1188,1707,2895 102 | 101,2011-04-11,2,0,4,0,1,1,2,0.595652,0.565217,0.716956,0.324474,855,2493,3348 103 | 102,2011-04-12,2,0,4,0,2,1,2,0.5025,0.493054,0.739167,0.274879,257,1777,2034 104 | 103,2011-04-13,2,0,4,0,3,1,2,0.4125,0.417283,0.819167,0.250617,209,1953,2162 105 | 104,2011-04-14,2,0,4,0,4,1,1,0.4675,0.462742,0.540417,0.1107,529,2738,3267 106 | 105,2011-04-15,2,0,4,1,5,0,1,0.446667,0.441913,0.67125,0.226375,642,2484,3126 107 | 106,2011-04-16,2,0,4,0,6,0,3,0.430833,0.425492,0.888333,0.340808,121,674,795 108 | 107,2011-04-17,2,0,4,0,0,0,1,0.456667,0.445696,0.479583,0.303496,1558,2186,3744 109 | 108,2011-04-18,2,0,4,0,1,1,1,0.5125,0.503146,0.5425,0.163567,669,2760,3429 110 | 109,2011-04-19,2,0,4,0,2,1,2,0.505833,0.489258,0.665833,0.157971,409,2795,3204 111 | 110,2011-04-20,2,0,4,0,3,1,1,0.595,0.564392,0.614167,0.241925,613,3331,3944 112 | 111,2011-04-21,2,0,4,0,4,1,1,0.459167,0.453892,0.407083,0.325258,745,3444,4189 113 | 112,2011-04-22,2,0,4,0,5,1,2,0.336667,0.321954,0.729583,0.219521,177,1506,1683 114 | 113,2011-04-23,2,0,4,0,6,0,2,0.46,0.450121,0.887917,0.230725,1462,2574,4036 115 | 114,2011-04-24,2,0,4,0,0,0,2,0.581667,0.551763,0.810833,0.192175,1710,2481,4191 116 | 115,2011-04-25,2,0,4,0,1,1,1,0.606667,0.5745,0.776667,0.185333,773,3300,4073 117 | 116,2011-04-26,2,0,4,0,2,1,1,0.631667,0.594083,0.729167,0.3265,678,3722,4400 118 | 117,2011-04-27,2,0,4,0,3,1,2,0.62,0.575142,0.835417,0.3122,547,3325,3872 119 | 
118,2011-04-28,2,0,4,0,4,1,2,0.6175,0.578929,0.700833,0.320908,569,3489,4058 120 | 119,2011-04-29,2,0,4,0,5,1,1,0.51,0.497463,0.457083,0.240063,878,3717,4595 121 | 120,2011-04-30,2,0,4,0,6,0,1,0.4725,0.464021,0.503333,0.235075,1965,3347,5312 122 | 121,2011-05-01,2,0,5,0,0,0,2,0.451667,0.448204,0.762083,0.106354,1138,2213,3351 123 | 122,2011-05-02,2,0,5,0,1,1,2,0.549167,0.532833,0.73,0.183454,847,3554,4401 124 | 123,2011-05-03,2,0,5,0,2,1,2,0.616667,0.582079,0.697083,0.342667,603,3848,4451 125 | 124,2011-05-04,2,0,5,0,3,1,2,0.414167,0.40465,0.737083,0.328996,255,2378,2633 126 | 125,2011-05-05,2,0,5,0,4,1,1,0.459167,0.441917,0.444167,0.295392,614,3819,4433 127 | 126,2011-05-06,2,0,5,0,5,1,1,0.479167,0.474117,0.59,0.228246,894,3714,4608 128 | 127,2011-05-07,2,0,5,0,6,0,1,0.52,0.512621,0.54125,0.16045,1612,3102,4714 129 | 128,2011-05-08,2,0,5,0,0,0,1,0.528333,0.518933,0.631667,0.0746375,1401,2932,4333 130 | 129,2011-05-09,2,0,5,0,1,1,1,0.5325,0.525246,0.58875,0.176,664,3698,4362 131 | 130,2011-05-10,2,0,5,0,2,1,1,0.5325,0.522721,0.489167,0.115671,694,4109,4803 132 | 131,2011-05-11,2,0,5,0,3,1,1,0.5425,0.5284,0.632917,0.120642,550,3632,4182 133 | 132,2011-05-12,2,0,5,0,4,1,1,0.535,0.523363,0.7475,0.189667,695,4169,4864 134 | 133,2011-05-13,2,0,5,0,5,1,2,0.5125,0.4943,0.863333,0.179725,692,3413,4105 135 | 134,2011-05-14,2,0,5,0,6,0,2,0.520833,0.500629,0.9225,0.13495,902,2507,3409 136 | 135,2011-05-15,2,0,5,0,0,0,2,0.5625,0.536,0.867083,0.152979,1582,2971,4553 137 | 136,2011-05-16,2,0,5,0,1,1,1,0.5775,0.550512,0.787917,0.126871,773,3185,3958 138 | 137,2011-05-17,2,0,5,0,2,1,2,0.561667,0.538529,0.837917,0.277354,678,3445,4123 139 | 138,2011-05-18,2,0,5,0,3,1,2,0.55,0.527158,0.87,0.201492,536,3319,3855 140 | 139,2011-05-19,2,0,5,0,4,1,2,0.530833,0.510742,0.829583,0.108213,735,3840,4575 141 | 140,2011-05-20,2,0,5,0,5,1,1,0.536667,0.529042,0.719583,0.125013,909,4008,4917 142 | 141,2011-05-21,2,0,5,0,6,0,1,0.6025,0.571975,0.626667,0.12065,2258,3547,5805 143 | 142,2011-05-22,2,0,5,0,0,0,1,0.604167,0.5745,0.749583,0.148008,1576,3084,4660 144 | 143,2011-05-23,2,0,5,0,1,1,2,0.631667,0.590296,0.81,0.233842,836,3438,4274 145 | 144,2011-05-24,2,0,5,0,2,1,2,0.66,0.604813,0.740833,0.207092,659,3833,4492 146 | 145,2011-05-25,2,0,5,0,3,1,1,0.660833,0.615542,0.69625,0.154233,740,4238,4978 147 | 146,2011-05-26,2,0,5,0,4,1,1,0.708333,0.654688,0.6775,0.199642,758,3919,4677 148 | 147,2011-05-27,2,0,5,0,5,1,1,0.681667,0.637008,0.65375,0.240679,871,3808,4679 149 | 148,2011-05-28,2,0,5,0,6,0,1,0.655833,0.612379,0.729583,0.230092,2001,2757,4758 150 | 149,2011-05-29,2,0,5,0,0,0,1,0.6675,0.61555,0.81875,0.213938,2355,2433,4788 151 | 150,2011-05-30,2,0,5,1,1,0,1,0.733333,0.671092,0.685,0.131225,1549,2549,4098 152 | 151,2011-05-31,2,0,5,0,2,1,1,0.775,0.725383,0.636667,0.111329,673,3309,3982 153 | 152,2011-06-01,2,0,6,0,3,1,2,0.764167,0.720967,0.677083,0.207092,513,3461,3974 154 | 153,2011-06-02,2,0,6,0,4,1,1,0.715,0.643942,0.305,0.292287,736,4232,4968 155 | 154,2011-06-03,2,0,6,0,5,1,1,0.62,0.587133,0.354167,0.253121,898,4414,5312 156 | 155,2011-06-04,2,0,6,0,6,0,1,0.635,0.594696,0.45625,0.123142,1869,3473,5342 157 | 156,2011-06-05,2,0,6,0,0,0,2,0.648333,0.616804,0.6525,0.138692,1685,3221,4906 158 | 157,2011-06-06,2,0,6,0,1,1,1,0.678333,0.621858,0.6,0.121896,673,3875,4548 159 | 158,2011-06-07,2,0,6,0,2,1,1,0.7075,0.65595,0.597917,0.187808,763,4070,4833 160 | 159,2011-06-08,2,0,6,0,3,1,1,0.775833,0.727279,0.622083,0.136817,676,3725,4401 161 | 160,2011-06-09,2,0,6,0,4,1,2,0.808333,0.757579,0.568333,0.149883,563,3352,3915 162 | 
161,2011-06-10,2,0,6,0,5,1,1,0.755,0.703292,0.605,0.140554,815,3771,4586 163 | 162,2011-06-11,2,0,6,0,6,0,1,0.725,0.678038,0.654583,0.15485,1729,3237,4966 164 | 163,2011-06-12,2,0,6,0,0,0,1,0.6925,0.643325,0.747917,0.163567,1467,2993,4460 165 | 164,2011-06-13,2,0,6,0,1,1,1,0.635,0.601654,0.494583,0.30535,863,4157,5020 166 | 165,2011-06-14,2,0,6,0,2,1,1,0.604167,0.591546,0.507083,0.269283,727,4164,4891 167 | 166,2011-06-15,2,0,6,0,3,1,1,0.626667,0.587754,0.471667,0.167912,769,4411,5180 168 | 167,2011-06-16,2,0,6,0,4,1,2,0.628333,0.595346,0.688333,0.206471,545,3222,3767 169 | 168,2011-06-17,2,0,6,0,5,1,1,0.649167,0.600383,0.735833,0.143029,863,3981,4844 170 | 169,2011-06-18,2,0,6,0,6,0,1,0.696667,0.643954,0.670417,0.119408,1807,3312,5119 171 | 170,2011-06-19,2,0,6,0,0,0,2,0.699167,0.645846,0.666667,0.102,1639,3105,4744 172 | 171,2011-06-20,2,0,6,0,1,1,2,0.635,0.595346,0.74625,0.155475,699,3311,4010 173 | 172,2011-06-21,3,0,6,0,2,1,2,0.680833,0.637646,0.770417,0.171025,774,4061,4835 174 | 173,2011-06-22,3,0,6,0,3,1,1,0.733333,0.693829,0.7075,0.172262,661,3846,4507 175 | 174,2011-06-23,3,0,6,0,4,1,2,0.728333,0.693833,0.703333,0.238804,746,4044,4790 176 | 175,2011-06-24,3,0,6,0,5,1,1,0.724167,0.656583,0.573333,0.222025,969,4022,4991 177 | 176,2011-06-25,3,0,6,0,6,0,1,0.695,0.643313,0.483333,0.209571,1782,3420,5202 178 | 177,2011-06-26,3,0,6,0,0,0,1,0.68,0.637629,0.513333,0.0945333,1920,3385,5305 179 | 178,2011-06-27,3,0,6,0,1,1,2,0.6825,0.637004,0.658333,0.107588,854,3854,4708 180 | 179,2011-06-28,3,0,6,0,2,1,1,0.744167,0.692558,0.634167,0.144283,732,3916,4648 181 | 180,2011-06-29,3,0,6,0,3,1,1,0.728333,0.654688,0.497917,0.261821,848,4377,5225 182 | 181,2011-06-30,3,0,6,0,4,1,1,0.696667,0.637008,0.434167,0.185312,1027,4488,5515 183 | 182,2011-07-01,3,0,7,0,5,1,1,0.7225,0.652162,0.39625,0.102608,1246,4116,5362 184 | 183,2011-07-02,3,0,7,0,6,0,1,0.738333,0.667308,0.444583,0.115062,2204,2915,5119 185 | 184,2011-07-03,3,0,7,0,0,0,2,0.716667,0.668575,0.6825,0.228858,2282,2367,4649 186 | 185,2011-07-04,3,0,7,1,1,0,2,0.726667,0.665417,0.637917,0.0814792,3065,2978,6043 187 | 186,2011-07-05,3,0,7,0,2,1,1,0.746667,0.696338,0.590417,0.126258,1031,3634,4665 188 | 187,2011-07-06,3,0,7,0,3,1,1,0.72,0.685633,0.743333,0.149883,784,3845,4629 189 | 188,2011-07-07,3,0,7,0,4,1,1,0.75,0.686871,0.65125,0.1592,754,3838,4592 190 | 189,2011-07-08,3,0,7,0,5,1,2,0.709167,0.670483,0.757917,0.225129,692,3348,4040 191 | 190,2011-07-09,3,0,7,0,6,0,1,0.733333,0.664158,0.609167,0.167912,1988,3348,5336 192 | 191,2011-07-10,3,0,7,0,0,0,1,0.7475,0.690025,0.578333,0.183471,1743,3138,4881 193 | 192,2011-07-11,3,0,7,0,1,1,1,0.7625,0.729804,0.635833,0.282337,723,3363,4086 194 | 193,2011-07-12,3,0,7,0,2,1,1,0.794167,0.739275,0.559167,0.200254,662,3596,4258 195 | 194,2011-07-13,3,0,7,0,3,1,1,0.746667,0.689404,0.631667,0.146133,748,3594,4342 196 | 195,2011-07-14,3,0,7,0,4,1,1,0.680833,0.635104,0.47625,0.240667,888,4196,5084 197 | 196,2011-07-15,3,0,7,0,5,1,1,0.663333,0.624371,0.59125,0.182833,1318,4220,5538 198 | 197,2011-07-16,3,0,7,0,6,0,1,0.686667,0.638263,0.585,0.208342,2418,3505,5923 199 | 198,2011-07-17,3,0,7,0,0,0,1,0.719167,0.669833,0.604167,0.245033,2006,3296,5302 200 | 199,2011-07-18,3,0,7,0,1,1,1,0.746667,0.703925,0.65125,0.215804,841,3617,4458 201 | 200,2011-07-19,3,0,7,0,2,1,1,0.776667,0.747479,0.650417,0.1306,752,3789,4541 202 | 201,2011-07-20,3,0,7,0,3,1,1,0.768333,0.74685,0.707083,0.113817,644,3688,4332 203 | 202,2011-07-21,3,0,7,0,4,1,2,0.815,0.826371,0.69125,0.222021,632,3152,3784 204 | 
203,2011-07-22,3,0,7,0,5,1,1,0.848333,0.840896,0.580417,0.1331,562,2825,3387 205 | 204,2011-07-23,3,0,7,0,6,0,1,0.849167,0.804287,0.5,0.131221,987,2298,3285 206 | 205,2011-07-24,3,0,7,0,0,0,1,0.83,0.794829,0.550833,0.169171,1050,2556,3606 207 | 206,2011-07-25,3,0,7,0,1,1,1,0.743333,0.720958,0.757083,0.0908083,568,3272,3840 208 | 207,2011-07-26,3,0,7,0,2,1,1,0.771667,0.696979,0.540833,0.200258,750,3840,4590 209 | 208,2011-07-27,3,0,7,0,3,1,1,0.775,0.690667,0.402917,0.183463,755,3901,4656 210 | 209,2011-07-28,3,0,7,0,4,1,1,0.779167,0.7399,0.583333,0.178479,606,3784,4390 211 | 210,2011-07-29,3,0,7,0,5,1,1,0.838333,0.785967,0.5425,0.174138,670,3176,3846 212 | 211,2011-07-30,3,0,7,0,6,0,1,0.804167,0.728537,0.465833,0.168537,1559,2916,4475 213 | 212,2011-07-31,3,0,7,0,0,0,1,0.805833,0.729796,0.480833,0.164813,1524,2778,4302 214 | 213,2011-08-01,3,0,8,0,1,1,1,0.771667,0.703292,0.550833,0.156717,729,3537,4266 215 | 214,2011-08-02,3,0,8,0,2,1,1,0.783333,0.707071,0.49125,0.20585,801,4044,4845 216 | 215,2011-08-03,3,0,8,0,3,1,2,0.731667,0.679937,0.6575,0.135583,467,3107,3574 217 | 216,2011-08-04,3,0,8,0,4,1,2,0.71,0.664788,0.7575,0.19715,799,3777,4576 218 | 217,2011-08-05,3,0,8,0,5,1,1,0.710833,0.656567,0.630833,0.184696,1023,3843,4866 219 | 218,2011-08-06,3,0,8,0,6,0,2,0.716667,0.676154,0.755,0.22825,1521,2773,4294 220 | 219,2011-08-07,3,0,8,0,0,0,1,0.7425,0.715292,0.752917,0.201487,1298,2487,3785 221 | 220,2011-08-08,3,0,8,0,1,1,1,0.765,0.703283,0.592083,0.192175,846,3480,4326 222 | 221,2011-08-09,3,0,8,0,2,1,1,0.775,0.724121,0.570417,0.151121,907,3695,4602 223 | 222,2011-08-10,3,0,8,0,3,1,1,0.766667,0.684983,0.424167,0.200258,884,3896,4780 224 | 223,2011-08-11,3,0,8,0,4,1,1,0.7175,0.651521,0.42375,0.164796,812,3980,4792 225 | 224,2011-08-12,3,0,8,0,5,1,1,0.708333,0.654042,0.415,0.125621,1051,3854,4905 226 | 225,2011-08-13,3,0,8,0,6,0,2,0.685833,0.645858,0.729583,0.211454,1504,2646,4150 227 | 226,2011-08-14,3,0,8,0,0,0,2,0.676667,0.624388,0.8175,0.222633,1338,2482,3820 228 | 227,2011-08-15,3,0,8,0,1,1,1,0.665833,0.616167,0.712083,0.208954,775,3563,4338 229 | 228,2011-08-16,3,0,8,0,2,1,1,0.700833,0.645837,0.578333,0.236329,721,4004,4725 230 | 229,2011-08-17,3,0,8,0,3,1,1,0.723333,0.666671,0.575417,0.143667,668,4026,4694 231 | 230,2011-08-18,3,0,8,0,4,1,1,0.711667,0.662258,0.654583,0.233208,639,3166,3805 232 | 231,2011-08-19,3,0,8,0,5,1,2,0.685,0.633221,0.722917,0.139308,797,3356,4153 233 | 232,2011-08-20,3,0,8,0,6,0,1,0.6975,0.648996,0.674167,0.104467,1914,3277,5191 234 | 233,2011-08-21,3,0,8,0,0,0,1,0.710833,0.675525,0.77,0.248754,1249,2624,3873 235 | 234,2011-08-22,3,0,8,0,1,1,1,0.691667,0.638254,0.47,0.27675,833,3925,4758 236 | 235,2011-08-23,3,0,8,0,2,1,1,0.640833,0.606067,0.455417,0.146763,1281,4614,5895 237 | 236,2011-08-24,3,0,8,0,3,1,1,0.673333,0.630692,0.605,0.253108,949,4181,5130 238 | 237,2011-08-25,3,0,8,0,4,1,2,0.684167,0.645854,0.771667,0.210833,435,3107,3542 239 | 238,2011-08-26,3,0,8,0,5,1,1,0.7,0.659733,0.76125,0.0839625,768,3893,4661 240 | 239,2011-08-27,3,0,8,0,6,0,2,0.68,0.635556,0.85,0.375617,226,889,1115 241 | 240,2011-08-28,3,0,8,0,0,0,1,0.707059,0.647959,0.561765,0.304659,1415,2919,4334 242 | 241,2011-08-29,3,0,8,0,1,1,1,0.636667,0.607958,0.554583,0.159825,729,3905,4634 243 | 242,2011-08-30,3,0,8,0,2,1,1,0.639167,0.594704,0.548333,0.125008,775,4429,5204 244 | 243,2011-08-31,3,0,8,0,3,1,1,0.656667,0.611121,0.597917,0.0833333,688,4370,5058 245 | 244,2011-09-01,3,0,9,0,4,1,1,0.655,0.614921,0.639167,0.141796,783,4332,5115 246 | 
245,2011-09-02,3,0,9,0,5,1,2,0.643333,0.604808,0.727083,0.139929,875,3852,4727 247 | 246,2011-09-03,3,0,9,0,6,0,1,0.669167,0.633213,0.716667,0.185325,1935,2549,4484 248 | 247,2011-09-04,3,0,9,0,0,0,1,0.709167,0.665429,0.742083,0.206467,2521,2419,4940 249 | 248,2011-09-05,3,0,9,1,1,0,2,0.673333,0.625646,0.790417,0.212696,1236,2115,3351 250 | 249,2011-09-06,3,0,9,0,2,1,3,0.54,0.5152,0.886957,0.343943,204,2506,2710 251 | 250,2011-09-07,3,0,9,0,3,1,3,0.599167,0.544229,0.917083,0.0970208,118,1878,1996 252 | 251,2011-09-08,3,0,9,0,4,1,3,0.633913,0.555361,0.939565,0.192748,153,1689,1842 253 | 252,2011-09-09,3,0,9,0,5,1,2,0.65,0.578946,0.897917,0.124379,417,3127,3544 254 | 253,2011-09-10,3,0,9,0,6,0,1,0.66,0.607962,0.75375,0.153608,1750,3595,5345 255 | 254,2011-09-11,3,0,9,0,0,0,1,0.653333,0.609229,0.71375,0.115054,1633,3413,5046 256 | 255,2011-09-12,3,0,9,0,1,1,1,0.644348,0.60213,0.692174,0.088913,690,4023,4713 257 | 256,2011-09-13,3,0,9,0,2,1,1,0.650833,0.603554,0.7125,0.141804,701,4062,4763 258 | 257,2011-09-14,3,0,9,0,3,1,1,0.673333,0.6269,0.697083,0.1673,647,4138,4785 259 | 258,2011-09-15,3,0,9,0,4,1,2,0.5775,0.553671,0.709167,0.271146,428,3231,3659 260 | 259,2011-09-16,3,0,9,0,5,1,2,0.469167,0.461475,0.590417,0.164183,742,4018,4760 261 | 260,2011-09-17,3,0,9,0,6,0,2,0.491667,0.478512,0.718333,0.189675,1434,3077,4511 262 | 261,2011-09-18,3,0,9,0,0,0,1,0.5075,0.490537,0.695,0.178483,1353,2921,4274 263 | 262,2011-09-19,3,0,9,0,1,1,2,0.549167,0.529675,0.69,0.151742,691,3848,4539 264 | 263,2011-09-20,3,0,9,0,2,1,2,0.561667,0.532217,0.88125,0.134954,438,3203,3641 265 | 264,2011-09-21,3,0,9,0,3,1,2,0.595,0.550533,0.9,0.0964042,539,3813,4352 266 | 265,2011-09-22,3,0,9,0,4,1,2,0.628333,0.554963,0.902083,0.128125,555,4240,4795 267 | 266,2011-09-23,4,0,9,0,5,1,2,0.609167,0.522125,0.9725,0.0783667,258,2137,2395 268 | 267,2011-09-24,4,0,9,0,6,0,2,0.606667,0.564412,0.8625,0.0783833,1776,3647,5423 269 | 268,2011-09-25,4,0,9,0,0,0,2,0.634167,0.572637,0.845,0.0503792,1544,3466,5010 270 | 269,2011-09-26,4,0,9,0,1,1,2,0.649167,0.589042,0.848333,0.1107,684,3946,4630 271 | 270,2011-09-27,4,0,9,0,2,1,2,0.636667,0.574525,0.885417,0.118171,477,3643,4120 272 | 271,2011-09-28,4,0,9,0,3,1,2,0.635,0.575158,0.84875,0.148629,480,3427,3907 273 | 272,2011-09-29,4,0,9,0,4,1,1,0.616667,0.574512,0.699167,0.172883,653,4186,4839 274 | 273,2011-09-30,4,0,9,0,5,1,1,0.564167,0.544829,0.6475,0.206475,830,4372,5202 275 | 274,2011-10-01,4,0,10,0,6,0,2,0.41,0.412863,0.75375,0.292296,480,1949,2429 276 | 275,2011-10-02,4,0,10,0,0,0,2,0.356667,0.345317,0.791667,0.222013,616,2302,2918 277 | 276,2011-10-03,4,0,10,0,1,1,2,0.384167,0.392046,0.760833,0.0833458,330,3240,3570 278 | 277,2011-10-04,4,0,10,0,2,1,1,0.484167,0.472858,0.71,0.205854,486,3970,4456 279 | 278,2011-10-05,4,0,10,0,3,1,1,0.538333,0.527138,0.647917,0.17725,559,4267,4826 280 | 279,2011-10-06,4,0,10,0,4,1,1,0.494167,0.480425,0.620833,0.134954,639,4126,4765 281 | 280,2011-10-07,4,0,10,0,5,1,1,0.510833,0.504404,0.684167,0.0223917,949,4036,4985 282 | 281,2011-10-08,4,0,10,0,6,0,1,0.521667,0.513242,0.70125,0.0454042,2235,3174,5409 283 | 282,2011-10-09,4,0,10,0,0,0,1,0.540833,0.523983,0.7275,0.06345,2397,3114,5511 284 | 283,2011-10-10,4,0,10,1,1,0,1,0.570833,0.542925,0.73375,0.0423042,1514,3603,5117 285 | 284,2011-10-11,4,0,10,0,2,1,2,0.566667,0.546096,0.80875,0.143042,667,3896,4563 286 | 285,2011-10-12,4,0,10,0,3,1,3,0.543333,0.517717,0.90625,0.24815,217,2199,2416 287 | 286,2011-10-13,4,0,10,0,4,1,2,0.589167,0.551804,0.896667,0.141787,290,2623,2913 288 | 
287,2011-10-14,4,0,10,0,5,1,2,0.550833,0.529675,0.71625,0.223883,529,3115,3644 289 | 288,2011-10-15,4,0,10,0,6,0,1,0.506667,0.498725,0.483333,0.258083,1899,3318,5217 290 | 289,2011-10-16,4,0,10,0,0,0,1,0.511667,0.503154,0.486667,0.281717,1748,3293,5041 291 | 290,2011-10-17,4,0,10,0,1,1,1,0.534167,0.510725,0.579583,0.175379,713,3857,4570 292 | 291,2011-10-18,4,0,10,0,2,1,2,0.5325,0.522721,0.701667,0.110087,637,4111,4748 293 | 292,2011-10-19,4,0,10,0,3,1,3,0.541739,0.513848,0.895217,0.243339,254,2170,2424 294 | 293,2011-10-20,4,0,10,0,4,1,1,0.475833,0.466525,0.63625,0.422275,471,3724,4195 295 | 294,2011-10-21,4,0,10,0,5,1,1,0.4275,0.423596,0.574167,0.221396,676,3628,4304 296 | 295,2011-10-22,4,0,10,0,6,0,1,0.4225,0.425492,0.629167,0.0926667,1499,2809,4308 297 | 296,2011-10-23,4,0,10,0,0,0,1,0.421667,0.422333,0.74125,0.0995125,1619,2762,4381 298 | 297,2011-10-24,4,0,10,0,1,1,1,0.463333,0.457067,0.772083,0.118792,699,3488,4187 299 | 298,2011-10-25,4,0,10,0,2,1,1,0.471667,0.463375,0.622917,0.166658,695,3992,4687 300 | 299,2011-10-26,4,0,10,0,3,1,2,0.484167,0.472846,0.720417,0.148642,404,3490,3894 301 | 300,2011-10-27,4,0,10,0,4,1,2,0.47,0.457046,0.812917,0.197763,240,2419,2659 302 | 301,2011-10-28,4,0,10,0,5,1,2,0.330833,0.318812,0.585833,0.229479,456,3291,3747 303 | 302,2011-10-29,4,0,10,0,6,0,3,0.254167,0.227913,0.8825,0.351371,57,570,627 304 | 303,2011-10-30,4,0,10,0,0,0,1,0.319167,0.321329,0.62375,0.176617,885,2446,3331 305 | 304,2011-10-31,4,0,10,0,1,1,1,0.34,0.356063,0.703333,0.10635,362,3307,3669 306 | 305,2011-11-01,4,0,11,0,2,1,1,0.400833,0.397088,0.68375,0.135571,410,3658,4068 307 | 306,2011-11-02,4,0,11,0,3,1,1,0.3775,0.390133,0.71875,0.0820917,370,3816,4186 308 | 307,2011-11-03,4,0,11,0,4,1,1,0.408333,0.405921,0.702083,0.136817,318,3656,3974 309 | 308,2011-11-04,4,0,11,0,5,1,2,0.403333,0.403392,0.6225,0.271779,470,3576,4046 310 | 309,2011-11-05,4,0,11,0,6,0,1,0.326667,0.323854,0.519167,0.189062,1156,2770,3926 311 | 310,2011-11-06,4,0,11,0,0,0,1,0.348333,0.362358,0.734583,0.0920542,952,2697,3649 312 | 311,2011-11-07,4,0,11,0,1,1,1,0.395,0.400871,0.75875,0.057225,373,3662,4035 313 | 312,2011-11-08,4,0,11,0,2,1,1,0.408333,0.412246,0.721667,0.0690375,376,3829,4205 314 | 313,2011-11-09,4,0,11,0,3,1,1,0.4,0.409079,0.758333,0.0621958,305,3804,4109 315 | 314,2011-11-10,4,0,11,0,4,1,2,0.38,0.373721,0.813333,0.189067,190,2743,2933 316 | 315,2011-11-11,4,0,11,1,5,0,1,0.324167,0.306817,0.44625,0.314675,440,2928,3368 317 | 316,2011-11-12,4,0,11,0,6,0,1,0.356667,0.357942,0.552917,0.212062,1275,2792,4067 318 | 317,2011-11-13,4,0,11,0,0,0,1,0.440833,0.43055,0.458333,0.281721,1004,2713,3717 319 | 318,2011-11-14,4,0,11,0,1,1,1,0.53,0.524612,0.587083,0.306596,595,3891,4486 320 | 319,2011-11-15,4,0,11,0,2,1,2,0.53,0.507579,0.68875,0.199633,449,3746,4195 321 | 320,2011-11-16,4,0,11,0,3,1,3,0.456667,0.451988,0.93,0.136829,145,1672,1817 322 | 321,2011-11-17,4,0,11,0,4,1,2,0.341667,0.323221,0.575833,0.305362,139,2914,3053 323 | 322,2011-11-18,4,0,11,0,5,1,1,0.274167,0.272721,0.41,0.168533,245,3147,3392 324 | 323,2011-11-19,4,0,11,0,6,0,1,0.329167,0.324483,0.502083,0.224496,943,2720,3663 325 | 324,2011-11-20,4,0,11,0,0,0,2,0.463333,0.457058,0.684583,0.18595,787,2733,3520 326 | 325,2011-11-21,4,0,11,0,1,1,3,0.4475,0.445062,0.91,0.138054,220,2545,2765 327 | 326,2011-11-22,4,0,11,0,2,1,3,0.416667,0.421696,0.9625,0.118792,69,1538,1607 328 | 327,2011-11-23,4,0,11,0,3,1,2,0.440833,0.430537,0.757917,0.335825,112,2454,2566 329 | 328,2011-11-24,4,0,11,1,4,0,1,0.373333,0.372471,0.549167,0.167304,560,935,1495 330 | 
329,2011-11-25,4,0,11,0,5,1,1,0.375,0.380671,0.64375,0.0988958,1095,1697,2792 331 | 330,2011-11-26,4,0,11,0,6,0,1,0.375833,0.385087,0.681667,0.0684208,1249,1819,3068 332 | 331,2011-11-27,4,0,11,0,0,0,1,0.459167,0.4558,0.698333,0.208954,810,2261,3071 333 | 332,2011-11-28,4,0,11,0,1,1,1,0.503478,0.490122,0.743043,0.142122,253,3614,3867 334 | 333,2011-11-29,4,0,11,0,2,1,2,0.458333,0.451375,0.830833,0.258092,96,2818,2914 335 | 334,2011-11-30,4,0,11,0,3,1,1,0.325,0.311221,0.613333,0.271158,188,3425,3613 336 | 335,2011-12-01,4,0,12,0,4,1,1,0.3125,0.305554,0.524583,0.220158,182,3545,3727 337 | 336,2011-12-02,4,0,12,0,5,1,1,0.314167,0.331433,0.625833,0.100754,268,3672,3940 338 | 337,2011-12-03,4,0,12,0,6,0,1,0.299167,0.310604,0.612917,0.0957833,706,2908,3614 339 | 338,2011-12-04,4,0,12,0,0,0,1,0.330833,0.3491,0.775833,0.0839583,634,2851,3485 340 | 339,2011-12-05,4,0,12,0,1,1,2,0.385833,0.393925,0.827083,0.0622083,233,3578,3811 341 | 340,2011-12-06,4,0,12,0,2,1,3,0.4625,0.4564,0.949583,0.232583,126,2468,2594 342 | 341,2011-12-07,4,0,12,0,3,1,3,0.41,0.400246,0.970417,0.266175,50,655,705 343 | 342,2011-12-08,4,0,12,0,4,1,1,0.265833,0.256938,0.58,0.240058,150,3172,3322 344 | 343,2011-12-09,4,0,12,0,5,1,1,0.290833,0.317542,0.695833,0.0827167,261,3359,3620 345 | 344,2011-12-10,4,0,12,0,6,0,1,0.275,0.266412,0.5075,0.233221,502,2688,3190 346 | 345,2011-12-11,4,0,12,0,0,0,1,0.220833,0.253154,0.49,0.0665417,377,2366,2743 347 | 346,2011-12-12,4,0,12,0,1,1,1,0.238333,0.270196,0.670833,0.06345,143,3167,3310 348 | 347,2011-12-13,4,0,12,0,2,1,1,0.2825,0.301138,0.59,0.14055,155,3368,3523 349 | 348,2011-12-14,4,0,12,0,3,1,2,0.3175,0.338362,0.66375,0.0609583,178,3562,3740 350 | 349,2011-12-15,4,0,12,0,4,1,2,0.4225,0.412237,0.634167,0.268042,181,3528,3709 351 | 350,2011-12-16,4,0,12,0,5,1,2,0.375,0.359825,0.500417,0.260575,178,3399,3577 352 | 351,2011-12-17,4,0,12,0,6,0,2,0.258333,0.249371,0.560833,0.243167,275,2464,2739 353 | 352,2011-12-18,4,0,12,0,0,0,1,0.238333,0.245579,0.58625,0.169779,220,2211,2431 354 | 353,2011-12-19,4,0,12,0,1,1,1,0.276667,0.280933,0.6375,0.172896,260,3143,3403 355 | 354,2011-12-20,4,0,12,0,2,1,2,0.385833,0.396454,0.595417,0.0615708,216,3534,3750 356 | 355,2011-12-21,1,0,12,0,3,1,2,0.428333,0.428017,0.858333,0.2214,107,2553,2660 357 | 356,2011-12-22,1,0,12,0,4,1,2,0.423333,0.426121,0.7575,0.047275,227,2841,3068 358 | 357,2011-12-23,1,0,12,0,5,1,1,0.373333,0.377513,0.68625,0.274246,163,2046,2209 359 | 358,2011-12-24,1,0,12,0,6,0,1,0.3025,0.299242,0.5425,0.190304,155,856,1011 360 | 359,2011-12-25,1,0,12,0,0,0,1,0.274783,0.279961,0.681304,0.155091,303,451,754 361 | 360,2011-12-26,1,0,12,1,1,0,1,0.321739,0.315535,0.506957,0.239465,430,887,1317 362 | 361,2011-12-27,1,0,12,0,2,1,2,0.325,0.327633,0.7625,0.18845,103,1059,1162 363 | 362,2011-12-28,1,0,12,0,3,1,1,0.29913,0.279974,0.503913,0.293961,255,2047,2302 364 | 363,2011-12-29,1,0,12,0,4,1,1,0.248333,0.263892,0.574167,0.119412,254,2169,2423 365 | 364,2011-12-30,1,0,12,0,5,1,1,0.311667,0.318812,0.636667,0.134337,491,2508,2999 366 | 365,2011-12-31,1,0,12,0,6,0,1,0.41,0.414121,0.615833,0.220154,665,1820,2485 367 | 366,2012-01-01,1,1,1,0,0,0,1,0.37,0.375621,0.6925,0.192167,686,1608,2294 368 | 367,2012-01-02,1,1,1,1,1,0,1,0.273043,0.252304,0.381304,0.329665,244,1707,1951 369 | 368,2012-01-03,1,1,1,0,2,1,1,0.15,0.126275,0.44125,0.365671,89,2147,2236 370 | 369,2012-01-04,1,1,1,0,3,1,2,0.1075,0.119337,0.414583,0.1847,95,2273,2368 371 | 370,2012-01-05,1,1,1,0,4,1,1,0.265833,0.278412,0.524167,0.129987,140,3132,3272 372 | 
371,2012-01-06,1,1,1,0,5,1,1,0.334167,0.340267,0.542083,0.167908,307,3791,4098 373 | 372,2012-01-07,1,1,1,0,6,0,1,0.393333,0.390779,0.531667,0.174758,1070,3451,4521 374 | 373,2012-01-08,1,1,1,0,0,0,1,0.3375,0.340258,0.465,0.191542,599,2826,3425 375 | 374,2012-01-09,1,1,1,0,1,1,2,0.224167,0.247479,0.701667,0.0989,106,2270,2376 376 | 375,2012-01-10,1,1,1,0,2,1,1,0.308696,0.318826,0.646522,0.187552,173,3425,3598 377 | 376,2012-01-11,1,1,1,0,3,1,2,0.274167,0.282821,0.8475,0.131221,92,2085,2177 378 | 377,2012-01-12,1,1,1,0,4,1,2,0.3825,0.381938,0.802917,0.180967,269,3828,4097 379 | 378,2012-01-13,1,1,1,0,5,1,1,0.274167,0.249362,0.5075,0.378108,174,3040,3214 380 | 379,2012-01-14,1,1,1,0,6,0,1,0.18,0.183087,0.4575,0.187183,333,2160,2493 381 | 380,2012-01-15,1,1,1,0,0,0,1,0.166667,0.161625,0.419167,0.251258,284,2027,2311 382 | 381,2012-01-16,1,1,1,1,1,0,1,0.19,0.190663,0.5225,0.231358,217,2081,2298 383 | 382,2012-01-17,1,1,1,0,2,1,2,0.373043,0.364278,0.716087,0.34913,127,2808,2935 384 | 383,2012-01-18,1,1,1,0,3,1,1,0.303333,0.275254,0.443333,0.415429,109,3267,3376 385 | 384,2012-01-19,1,1,1,0,4,1,1,0.19,0.190038,0.4975,0.220158,130,3162,3292 386 | 385,2012-01-20,1,1,1,0,5,1,2,0.2175,0.220958,0.45,0.20275,115,3048,3163 387 | 386,2012-01-21,1,1,1,0,6,0,2,0.173333,0.174875,0.83125,0.222642,67,1234,1301 388 | 387,2012-01-22,1,1,1,0,0,0,2,0.1625,0.16225,0.79625,0.199638,196,1781,1977 389 | 388,2012-01-23,1,1,1,0,1,1,2,0.218333,0.243058,0.91125,0.110708,145,2287,2432 390 | 389,2012-01-24,1,1,1,0,2,1,1,0.3425,0.349108,0.835833,0.123767,439,3900,4339 391 | 390,2012-01-25,1,1,1,0,3,1,1,0.294167,0.294821,0.64375,0.161071,467,3803,4270 392 | 391,2012-01-26,1,1,1,0,4,1,2,0.341667,0.35605,0.769583,0.0733958,244,3831,4075 393 | 392,2012-01-27,1,1,1,0,5,1,2,0.425,0.415383,0.74125,0.342667,269,3187,3456 394 | 393,2012-01-28,1,1,1,0,6,0,1,0.315833,0.326379,0.543333,0.210829,775,3248,4023 395 | 394,2012-01-29,1,1,1,0,0,0,1,0.2825,0.272721,0.31125,0.24005,558,2685,3243 396 | 395,2012-01-30,1,1,1,0,1,1,1,0.269167,0.262625,0.400833,0.215792,126,3498,3624 397 | 396,2012-01-31,1,1,1,0,2,1,1,0.39,0.381317,0.416667,0.261817,324,4185,4509 398 | 397,2012-02-01,1,1,2,0,3,1,1,0.469167,0.466538,0.507917,0.189067,304,4275,4579 399 | 398,2012-02-02,1,1,2,0,4,1,2,0.399167,0.398971,0.672917,0.187187,190,3571,3761 400 | 399,2012-02-03,1,1,2,0,5,1,1,0.313333,0.309346,0.526667,0.178496,310,3841,4151 401 | 400,2012-02-04,1,1,2,0,6,0,2,0.264167,0.272725,0.779583,0.121896,384,2448,2832 402 | 401,2012-02-05,1,1,2,0,0,0,2,0.265833,0.264521,0.687917,0.175996,318,2629,2947 403 | 402,2012-02-06,1,1,2,0,1,1,1,0.282609,0.296426,0.622174,0.1538,206,3578,3784 404 | 403,2012-02-07,1,1,2,0,2,1,1,0.354167,0.361104,0.49625,0.147379,199,4176,4375 405 | 404,2012-02-08,1,1,2,0,3,1,2,0.256667,0.266421,0.722917,0.133721,109,2693,2802 406 | 405,2012-02-09,1,1,2,0,4,1,1,0.265,0.261988,0.562083,0.194037,163,3667,3830 407 | 406,2012-02-10,1,1,2,0,5,1,2,0.280833,0.293558,0.54,0.116929,227,3604,3831 408 | 407,2012-02-11,1,1,2,0,6,0,3,0.224167,0.210867,0.73125,0.289796,192,1977,2169 409 | 408,2012-02-12,1,1,2,0,0,0,1,0.1275,0.101658,0.464583,0.409212,73,1456,1529 410 | 409,2012-02-13,1,1,2,0,1,1,1,0.2225,0.227913,0.41125,0.167283,94,3328,3422 411 | 410,2012-02-14,1,1,2,0,2,1,2,0.319167,0.333946,0.50875,0.141179,135,3787,3922 412 | 411,2012-02-15,1,1,2,0,3,1,1,0.348333,0.351629,0.53125,0.1816,141,4028,4169 413 | 412,2012-02-16,1,1,2,0,4,1,2,0.316667,0.330162,0.752917,0.091425,74,2931,3005 414 | 
413,2012-02-17,1,1,2,0,5,1,1,0.343333,0.351629,0.634583,0.205846,349,3805,4154 415 | 414,2012-02-18,1,1,2,0,6,0,1,0.346667,0.355425,0.534583,0.190929,1435,2883,4318 416 | 415,2012-02-19,1,1,2,0,0,0,2,0.28,0.265788,0.515833,0.253112,618,2071,2689 417 | 416,2012-02-20,1,1,2,1,1,0,1,0.28,0.273391,0.507826,0.229083,502,2627,3129 418 | 417,2012-02-21,1,1,2,0,2,1,1,0.287826,0.295113,0.594348,0.205717,163,3614,3777 419 | 418,2012-02-22,1,1,2,0,3,1,1,0.395833,0.392667,0.567917,0.234471,394,4379,4773 420 | 419,2012-02-23,1,1,2,0,4,1,1,0.454167,0.444446,0.554583,0.190913,516,4546,5062 421 | 420,2012-02-24,1,1,2,0,5,1,2,0.4075,0.410971,0.7375,0.237567,246,3241,3487 422 | 421,2012-02-25,1,1,2,0,6,0,1,0.290833,0.255675,0.395833,0.421642,317,2415,2732 423 | 422,2012-02-26,1,1,2,0,0,0,1,0.279167,0.268308,0.41,0.205229,515,2874,3389 424 | 423,2012-02-27,1,1,2,0,1,1,1,0.366667,0.357954,0.490833,0.268033,253,4069,4322 425 | 424,2012-02-28,1,1,2,0,2,1,1,0.359167,0.353525,0.395833,0.193417,229,4134,4363 426 | 425,2012-02-29,1,1,2,0,3,1,2,0.344348,0.34847,0.804783,0.179117,65,1769,1834 427 | 426,2012-03-01,1,1,3,0,4,1,1,0.485833,0.475371,0.615417,0.226987,325,4665,4990 428 | 427,2012-03-02,1,1,3,0,5,1,2,0.353333,0.359842,0.657083,0.144904,246,2948,3194 429 | 428,2012-03-03,1,1,3,0,6,0,2,0.414167,0.413492,0.62125,0.161079,956,3110,4066 430 | 429,2012-03-04,1,1,3,0,0,0,1,0.325833,0.303021,0.403333,0.334571,710,2713,3423 431 | 430,2012-03-05,1,1,3,0,1,1,1,0.243333,0.241171,0.50625,0.228858,203,3130,3333 432 | 431,2012-03-06,1,1,3,0,2,1,1,0.258333,0.255042,0.456667,0.200875,221,3735,3956 433 | 432,2012-03-07,1,1,3,0,3,1,1,0.404167,0.3851,0.513333,0.345779,432,4484,4916 434 | 433,2012-03-08,1,1,3,0,4,1,1,0.5275,0.524604,0.5675,0.441563,486,4896,5382 435 | 434,2012-03-09,1,1,3,0,5,1,2,0.410833,0.397083,0.407083,0.4148,447,4122,4569 436 | 435,2012-03-10,1,1,3,0,6,0,1,0.2875,0.277767,0.350417,0.22575,968,3150,4118 437 | 436,2012-03-11,1,1,3,0,0,0,1,0.361739,0.35967,0.476957,0.222587,1658,3253,4911 438 | 437,2012-03-12,1,1,3,0,1,1,1,0.466667,0.459592,0.489167,0.207713,838,4460,5298 439 | 438,2012-03-13,1,1,3,0,2,1,1,0.565,0.542929,0.6175,0.23695,762,5085,5847 440 | 439,2012-03-14,1,1,3,0,3,1,1,0.5725,0.548617,0.507083,0.115062,997,5315,6312 441 | 440,2012-03-15,1,1,3,0,4,1,1,0.5575,0.532825,0.579583,0.149883,1005,5187,6192 442 | 441,2012-03-16,1,1,3,0,5,1,2,0.435833,0.436229,0.842083,0.113192,548,3830,4378 443 | 442,2012-03-17,1,1,3,0,6,0,2,0.514167,0.505046,0.755833,0.110704,3155,4681,7836 444 | 443,2012-03-18,1,1,3,0,0,0,2,0.4725,0.464,0.81,0.126883,2207,3685,5892 445 | 444,2012-03-19,1,1,3,0,1,1,1,0.545,0.532821,0.72875,0.162317,982,5171,6153 446 | 445,2012-03-20,1,1,3,0,2,1,1,0.560833,0.538533,0.807917,0.121271,1051,5042,6093 447 | 446,2012-03-21,2,1,3,0,3,1,2,0.531667,0.513258,0.82125,0.0895583,1122,5108,6230 448 | 447,2012-03-22,2,1,3,0,4,1,1,0.554167,0.531567,0.83125,0.117562,1334,5537,6871 449 | 448,2012-03-23,2,1,3,0,5,1,2,0.601667,0.570067,0.694167,0.1163,2469,5893,8362 450 | 449,2012-03-24,2,1,3,0,6,0,2,0.5025,0.486733,0.885417,0.192783,1033,2339,3372 451 | 450,2012-03-25,2,1,3,0,0,0,2,0.4375,0.437488,0.880833,0.220775,1532,3464,4996 452 | 451,2012-03-26,2,1,3,0,1,1,1,0.445833,0.43875,0.477917,0.386821,795,4763,5558 453 | 452,2012-03-27,2,1,3,0,2,1,1,0.323333,0.315654,0.29,0.187192,531,4571,5102 454 | 453,2012-03-28,2,1,3,0,3,1,1,0.484167,0.47095,0.48125,0.291671,674,5024,5698 455 | 454,2012-03-29,2,1,3,0,4,1,1,0.494167,0.482304,0.439167,0.31965,834,5299,6133 456 | 
455,2012-03-30,2,1,3,0,5,1,2,0.37,0.375621,0.580833,0.138067,796,4663,5459 457 | 456,2012-03-31,2,1,3,0,6,0,2,0.424167,0.421708,0.738333,0.250617,2301,3934,6235 458 | 457,2012-04-01,2,1,4,0,0,0,2,0.425833,0.417287,0.67625,0.172267,2347,3694,6041 459 | 458,2012-04-02,2,1,4,0,1,1,1,0.433913,0.427513,0.504348,0.312139,1208,4728,5936 460 | 459,2012-04-03,2,1,4,0,2,1,1,0.466667,0.461483,0.396667,0.100133,1348,5424,6772 461 | 460,2012-04-04,2,1,4,0,3,1,1,0.541667,0.53345,0.469583,0.180975,1058,5378,6436 462 | 461,2012-04-05,2,1,4,0,4,1,1,0.435,0.431163,0.374167,0.219529,1192,5265,6457 463 | 462,2012-04-06,2,1,4,0,5,1,1,0.403333,0.390767,0.377083,0.300388,1807,4653,6460 464 | 463,2012-04-07,2,1,4,0,6,0,1,0.4375,0.426129,0.254167,0.274871,3252,3605,6857 465 | 464,2012-04-08,2,1,4,0,0,0,1,0.5,0.492425,0.275833,0.232596,2230,2939,5169 466 | 465,2012-04-09,2,1,4,0,1,1,1,0.489167,0.476638,0.3175,0.358196,905,4680,5585 467 | 466,2012-04-10,2,1,4,0,2,1,1,0.446667,0.436233,0.435,0.249375,819,5099,5918 468 | 467,2012-04-11,2,1,4,0,3,1,1,0.348696,0.337274,0.469565,0.295274,482,4380,4862 469 | 468,2012-04-12,2,1,4,0,4,1,1,0.3975,0.387604,0.46625,0.290429,663,4746,5409 470 | 469,2012-04-13,2,1,4,0,5,1,1,0.4425,0.431808,0.408333,0.155471,1252,5146,6398 471 | 470,2012-04-14,2,1,4,0,6,0,1,0.495,0.487996,0.502917,0.190917,2795,4665,7460 472 | 471,2012-04-15,2,1,4,0,0,0,1,0.606667,0.573875,0.507917,0.225129,2846,4286,7132 473 | 472,2012-04-16,2,1,4,1,1,0,1,0.664167,0.614925,0.561667,0.284829,1198,5172,6370 474 | 473,2012-04-17,2,1,4,0,2,1,1,0.608333,0.598487,0.390417,0.273629,989,5702,6691 475 | 474,2012-04-18,2,1,4,0,3,1,2,0.463333,0.457038,0.569167,0.167912,347,4020,4367 476 | 475,2012-04-19,2,1,4,0,4,1,1,0.498333,0.493046,0.6125,0.0659292,846,5719,6565 477 | 476,2012-04-20,2,1,4,0,5,1,1,0.526667,0.515775,0.694583,0.149871,1340,5950,7290 478 | 477,2012-04-21,2,1,4,0,6,0,1,0.57,0.542921,0.682917,0.283587,2541,4083,6624 479 | 478,2012-04-22,2,1,4,0,0,0,3,0.396667,0.389504,0.835417,0.344546,120,907,1027 480 | 479,2012-04-23,2,1,4,0,1,1,2,0.321667,0.301125,0.766667,0.303496,195,3019,3214 481 | 480,2012-04-24,2,1,4,0,2,1,1,0.413333,0.405283,0.454167,0.249383,518,5115,5633 482 | 481,2012-04-25,2,1,4,0,3,1,1,0.476667,0.470317,0.427917,0.118792,655,5541,6196 483 | 482,2012-04-26,2,1,4,0,4,1,2,0.498333,0.483583,0.756667,0.176625,475,4551,5026 484 | 483,2012-04-27,2,1,4,0,5,1,1,0.4575,0.452637,0.400833,0.347633,1014,5219,6233 485 | 484,2012-04-28,2,1,4,0,6,0,2,0.376667,0.377504,0.489583,0.129975,1120,3100,4220 486 | 485,2012-04-29,2,1,4,0,0,0,1,0.458333,0.450121,0.587083,0.116908,2229,4075,6304 487 | 486,2012-04-30,2,1,4,0,1,1,2,0.464167,0.457696,0.57,0.171638,665,4907,5572 488 | 487,2012-05-01,2,1,5,0,2,1,2,0.613333,0.577021,0.659583,0.156096,653,5087,5740 489 | 488,2012-05-02,2,1,5,0,3,1,1,0.564167,0.537896,0.797083,0.138058,667,5502,6169 490 | 489,2012-05-03,2,1,5,0,4,1,2,0.56,0.537242,0.768333,0.133696,764,5657,6421 491 | 490,2012-05-04,2,1,5,0,5,1,1,0.6275,0.590917,0.735417,0.162938,1069,5227,6296 492 | 491,2012-05-05,2,1,5,0,6,0,2,0.621667,0.584608,0.756667,0.152992,2496,4387,6883 493 | 492,2012-05-06,2,1,5,0,0,0,2,0.5625,0.546737,0.74,0.149879,2135,4224,6359 494 | 493,2012-05-07,2,1,5,0,1,1,2,0.5375,0.527142,0.664167,0.230721,1008,5265,6273 495 | 494,2012-05-08,2,1,5,0,2,1,2,0.581667,0.557471,0.685833,0.296029,738,4990,5728 496 | 495,2012-05-09,2,1,5,0,3,1,2,0.575,0.553025,0.744167,0.216412,620,4097,4717 497 | 496,2012-05-10,2,1,5,0,4,1,1,0.505833,0.491783,0.552083,0.314063,1026,5546,6572 498 | 
497,2012-05-11,2,1,5,0,5,1,1,0.533333,0.520833,0.360417,0.236937,1319,5711,7030 499 | 498,2012-05-12,2,1,5,0,6,0,1,0.564167,0.544817,0.480417,0.123133,2622,4807,7429 500 | 499,2012-05-13,2,1,5,0,0,0,1,0.6125,0.585238,0.57625,0.225117,2172,3946,6118 501 | 500,2012-05-14,2,1,5,0,1,1,2,0.573333,0.5499,0.789583,0.212692,342,2501,2843 502 | 501,2012-05-15,2,1,5,0,2,1,2,0.611667,0.576404,0.794583,0.147392,625,4490,5115 503 | 502,2012-05-16,2,1,5,0,3,1,1,0.636667,0.595975,0.697917,0.122512,991,6433,7424 504 | 503,2012-05-17,2,1,5,0,4,1,1,0.593333,0.572613,0.52,0.229475,1242,6142,7384 505 | 504,2012-05-18,2,1,5,0,5,1,1,0.564167,0.551121,0.523333,0.136817,1521,6118,7639 506 | 505,2012-05-19,2,1,5,0,6,0,1,0.6,0.566908,0.45625,0.083975,3410,4884,8294 507 | 506,2012-05-20,2,1,5,0,0,0,1,0.620833,0.583967,0.530417,0.254367,2704,4425,7129 508 | 507,2012-05-21,2,1,5,0,1,1,2,0.598333,0.565667,0.81125,0.233204,630,3729,4359 509 | 508,2012-05-22,2,1,5,0,2,1,2,0.615,0.580825,0.765833,0.118167,819,5254,6073 510 | 509,2012-05-23,2,1,5,0,3,1,2,0.621667,0.584612,0.774583,0.102,766,4494,5260 511 | 510,2012-05-24,2,1,5,0,4,1,1,0.655,0.6067,0.716667,0.172896,1059,5711,6770 512 | 511,2012-05-25,2,1,5,0,5,1,1,0.68,0.627529,0.747083,0.14055,1417,5317,6734 513 | 512,2012-05-26,2,1,5,0,6,0,1,0.6925,0.642696,0.7325,0.198992,2855,3681,6536 514 | 513,2012-05-27,2,1,5,0,0,0,1,0.69,0.641425,0.697083,0.215171,3283,3308,6591 515 | 514,2012-05-28,2,1,5,1,1,0,1,0.7125,0.6793,0.67625,0.196521,2557,3486,6043 516 | 515,2012-05-29,2,1,5,0,2,1,1,0.7225,0.672992,0.684583,0.2954,880,4863,5743 517 | 516,2012-05-30,2,1,5,0,3,1,2,0.656667,0.611129,0.67,0.134329,745,6110,6855 518 | 517,2012-05-31,2,1,5,0,4,1,1,0.68,0.631329,0.492917,0.195279,1100,6238,7338 519 | 518,2012-06-01,2,1,6,0,5,1,2,0.654167,0.607962,0.755417,0.237563,533,3594,4127 520 | 519,2012-06-02,2,1,6,0,6,0,1,0.583333,0.566288,0.549167,0.186562,2795,5325,8120 521 | 520,2012-06-03,2,1,6,0,0,0,1,0.6025,0.575133,0.493333,0.184087,2494,5147,7641 522 | 521,2012-06-04,2,1,6,0,1,1,1,0.5975,0.578283,0.487083,0.284833,1071,5927,6998 523 | 522,2012-06-05,2,1,6,0,2,1,2,0.540833,0.525892,0.613333,0.209575,968,6033,7001 524 | 523,2012-06-06,2,1,6,0,3,1,1,0.554167,0.542292,0.61125,0.077125,1027,6028,7055 525 | 524,2012-06-07,2,1,6,0,4,1,1,0.6025,0.569442,0.567083,0.15735,1038,6456,7494 526 | 525,2012-06-08,2,1,6,0,5,1,1,0.649167,0.597862,0.467917,0.175383,1488,6248,7736 527 | 526,2012-06-09,2,1,6,0,6,0,1,0.710833,0.648367,0.437083,0.144287,2708,4790,7498 528 | 527,2012-06-10,2,1,6,0,0,0,1,0.726667,0.663517,0.538333,0.133721,2224,4374,6598 529 | 528,2012-06-11,2,1,6,0,1,1,2,0.720833,0.659721,0.587917,0.207713,1017,5647,6664 530 | 529,2012-06-12,2,1,6,0,2,1,2,0.653333,0.597875,0.833333,0.214546,477,4495,4972 531 | 530,2012-06-13,2,1,6,0,3,1,1,0.655833,0.611117,0.582083,0.343279,1173,6248,7421 532 | 531,2012-06-14,2,1,6,0,4,1,1,0.648333,0.624383,0.569583,0.253733,1180,6183,7363 533 | 532,2012-06-15,2,1,6,0,5,1,1,0.639167,0.599754,0.589583,0.176617,1563,6102,7665 534 | 533,2012-06-16,2,1,6,0,6,0,1,0.631667,0.594708,0.504167,0.166667,2963,4739,7702 535 | 534,2012-06-17,2,1,6,0,0,0,1,0.5925,0.571975,0.59875,0.144904,2634,4344,6978 536 | 535,2012-06-18,2,1,6,0,1,1,2,0.568333,0.544842,0.777917,0.174746,653,4446,5099 537 | 536,2012-06-19,2,1,6,0,2,1,1,0.688333,0.654692,0.69,0.148017,968,5857,6825 538 | 537,2012-06-20,2,1,6,0,3,1,1,0.7825,0.720975,0.592083,0.113812,872,5339,6211 539 | 538,2012-06-21,3,1,6,0,4,1,1,0.805833,0.752542,0.567917,0.118787,778,5127,5905 540 | 
539,2012-06-22,3,1,6,0,5,1,1,0.7775,0.724121,0.57375,0.182842,964,4859,5823 541 | 540,2012-06-23,3,1,6,0,6,0,1,0.731667,0.652792,0.534583,0.179721,2657,4801,7458 542 | 541,2012-06-24,3,1,6,0,0,0,1,0.743333,0.674254,0.479167,0.145525,2551,4340,6891 543 | 542,2012-06-25,3,1,6,0,1,1,1,0.715833,0.654042,0.504167,0.300383,1139,5640,6779 544 | 543,2012-06-26,3,1,6,0,2,1,1,0.630833,0.594704,0.373333,0.347642,1077,6365,7442 545 | 544,2012-06-27,3,1,6,0,3,1,1,0.6975,0.640792,0.36,0.271775,1077,6258,7335 546 | 545,2012-06-28,3,1,6,0,4,1,1,0.749167,0.675512,0.4225,0.17165,921,5958,6879 547 | 546,2012-06-29,3,1,6,0,5,1,1,0.834167,0.786613,0.48875,0.165417,829,4634,5463 548 | 547,2012-06-30,3,1,6,0,6,0,1,0.765,0.687508,0.60125,0.161071,1455,4232,5687 549 | 548,2012-07-01,3,1,7,0,0,0,1,0.815833,0.750629,0.51875,0.168529,1421,4110,5531 550 | 549,2012-07-02,3,1,7,0,1,1,1,0.781667,0.702038,0.447083,0.195267,904,5323,6227 551 | 550,2012-07-03,3,1,7,0,2,1,1,0.780833,0.70265,0.492083,0.126237,1052,5608,6660 552 | 551,2012-07-04,3,1,7,1,3,0,1,0.789167,0.732337,0.53875,0.13495,2562,4841,7403 553 | 552,2012-07-05,3,1,7,0,4,1,1,0.8275,0.761367,0.457917,0.194029,1405,4836,6241 554 | 553,2012-07-06,3,1,7,0,5,1,1,0.828333,0.752533,0.450833,0.146142,1366,4841,6207 555 | 554,2012-07-07,3,1,7,0,6,0,1,0.861667,0.804913,0.492083,0.163554,1448,3392,4840 556 | 555,2012-07-08,3,1,7,0,0,0,1,0.8225,0.790396,0.57375,0.125629,1203,3469,4672 557 | 556,2012-07-09,3,1,7,0,1,1,2,0.710833,0.654054,0.683333,0.180975,998,5571,6569 558 | 557,2012-07-10,3,1,7,0,2,1,2,0.720833,0.664796,0.6675,0.151737,954,5336,6290 559 | 558,2012-07-11,3,1,7,0,3,1,1,0.716667,0.650271,0.633333,0.151733,975,6289,7264 560 | 559,2012-07-12,3,1,7,0,4,1,1,0.715833,0.654683,0.529583,0.146775,1032,6414,7446 561 | 560,2012-07-13,3,1,7,0,5,1,2,0.731667,0.667933,0.485833,0.08085,1511,5988,7499 562 | 561,2012-07-14,3,1,7,0,6,0,2,0.703333,0.666042,0.699167,0.143679,2355,4614,6969 563 | 562,2012-07-15,3,1,7,0,0,0,1,0.745833,0.705196,0.717917,0.166667,1920,4111,6031 564 | 563,2012-07-16,3,1,7,0,1,1,1,0.763333,0.724125,0.645,0.164187,1088,5742,6830 565 | 564,2012-07-17,3,1,7,0,2,1,1,0.818333,0.755683,0.505833,0.114429,921,5865,6786 566 | 565,2012-07-18,3,1,7,0,3,1,1,0.793333,0.745583,0.577083,0.137442,799,4914,5713 567 | 566,2012-07-19,3,1,7,0,4,1,1,0.77,0.714642,0.600417,0.165429,888,5703,6591 568 | 567,2012-07-20,3,1,7,0,5,1,2,0.665833,0.613025,0.844167,0.208967,747,5123,5870 569 | 568,2012-07-21,3,1,7,0,6,0,3,0.595833,0.549912,0.865417,0.2133,1264,3195,4459 570 | 569,2012-07-22,3,1,7,0,0,0,2,0.6675,0.623125,0.7625,0.0939208,2544,4866,7410 571 | 570,2012-07-23,3,1,7,0,1,1,1,0.741667,0.690017,0.694167,0.138683,1135,5831,6966 572 | 571,2012-07-24,3,1,7,0,2,1,1,0.750833,0.70645,0.655,0.211454,1140,6452,7592 573 | 572,2012-07-25,3,1,7,0,3,1,1,0.724167,0.654054,0.45,0.1648,1383,6790,8173 574 | 573,2012-07-26,3,1,7,0,4,1,1,0.776667,0.739263,0.596667,0.284813,1036,5825,6861 575 | 574,2012-07-27,3,1,7,0,5,1,1,0.781667,0.734217,0.594583,0.152992,1259,5645,6904 576 | 575,2012-07-28,3,1,7,0,6,0,1,0.755833,0.697604,0.613333,0.15735,2234,4451,6685 577 | 576,2012-07-29,3,1,7,0,0,0,1,0.721667,0.667933,0.62375,0.170396,2153,4444,6597 578 | 577,2012-07-30,3,1,7,0,1,1,1,0.730833,0.684987,0.66875,0.153617,1040,6065,7105 579 | 578,2012-07-31,3,1,7,0,2,1,1,0.713333,0.662896,0.704167,0.165425,968,6248,7216 580 | 579,2012-08-01,3,1,8,0,3,1,1,0.7175,0.667308,0.6775,0.141179,1074,6506,7580 581 | 580,2012-08-02,3,1,8,0,4,1,1,0.7525,0.707088,0.659583,0.129354,983,6278,7261 582 | 
581,2012-08-03,3,1,8,0,5,1,2,0.765833,0.722867,0.6425,0.215792,1328,5847,7175 583 | 582,2012-08-04,3,1,8,0,6,0,1,0.793333,0.751267,0.613333,0.257458,2345,4479,6824 584 | 583,2012-08-05,3,1,8,0,0,0,1,0.769167,0.731079,0.6525,0.290421,1707,3757,5464 585 | 584,2012-08-06,3,1,8,0,1,1,2,0.7525,0.710246,0.654167,0.129354,1233,5780,7013 586 | 585,2012-08-07,3,1,8,0,2,1,2,0.735833,0.697621,0.70375,0.116908,1278,5995,7273 587 | 586,2012-08-08,3,1,8,0,3,1,2,0.75,0.707717,0.672917,0.1107,1263,6271,7534 588 | 587,2012-08-09,3,1,8,0,4,1,1,0.755833,0.699508,0.620417,0.1561,1196,6090,7286 589 | 588,2012-08-10,3,1,8,0,5,1,2,0.715833,0.667942,0.715833,0.238813,1065,4721,5786 590 | 589,2012-08-11,3,1,8,0,6,0,2,0.6925,0.638267,0.732917,0.206479,2247,4052,6299 591 | 590,2012-08-12,3,1,8,0,0,0,1,0.700833,0.644579,0.530417,0.122512,2182,4362,6544 592 | 591,2012-08-13,3,1,8,0,1,1,1,0.720833,0.662254,0.545417,0.136212,1207,5676,6883 593 | 592,2012-08-14,3,1,8,0,2,1,1,0.726667,0.676779,0.686667,0.169158,1128,5656,6784 594 | 593,2012-08-15,3,1,8,0,3,1,1,0.706667,0.654037,0.619583,0.169771,1198,6149,7347 595 | 594,2012-08-16,3,1,8,0,4,1,1,0.719167,0.654688,0.519167,0.141796,1338,6267,7605 596 | 595,2012-08-17,3,1,8,0,5,1,1,0.723333,0.2424,0.570833,0.231354,1483,5665,7148 597 | 596,2012-08-18,3,1,8,0,6,0,1,0.678333,0.618071,0.603333,0.177867,2827,5038,7865 598 | 597,2012-08-19,3,1,8,0,0,0,2,0.635833,0.603554,0.711667,0.08645,1208,3341,4549 599 | 598,2012-08-20,3,1,8,0,1,1,2,0.635833,0.595967,0.734167,0.129979,1026,5504,6530 600 | 599,2012-08-21,3,1,8,0,2,1,1,0.649167,0.601025,0.67375,0.0727708,1081,5925,7006 601 | 600,2012-08-22,3,1,8,0,3,1,1,0.6675,0.621854,0.677083,0.0702833,1094,6281,7375 602 | 601,2012-08-23,3,1,8,0,4,1,1,0.695833,0.637008,0.635833,0.0845958,1363,6402,7765 603 | 602,2012-08-24,3,1,8,0,5,1,2,0.7025,0.6471,0.615,0.0721458,1325,6257,7582 604 | 603,2012-08-25,3,1,8,0,6,0,2,0.661667,0.618696,0.712917,0.244408,1829,4224,6053 605 | 604,2012-08-26,3,1,8,0,0,0,2,0.653333,0.595996,0.845833,0.228858,1483,3772,5255 606 | 605,2012-08-27,3,1,8,0,1,1,1,0.703333,0.654688,0.730417,0.128733,989,5928,6917 607 | 606,2012-08-28,3,1,8,0,2,1,1,0.728333,0.66605,0.62,0.190925,935,6105,7040 608 | 607,2012-08-29,3,1,8,0,3,1,1,0.685,0.635733,0.552083,0.112562,1177,6520,7697 609 | 608,2012-08-30,3,1,8,0,4,1,1,0.706667,0.652779,0.590417,0.0771167,1172,6541,7713 610 | 609,2012-08-31,3,1,8,0,5,1,1,0.764167,0.6894,0.5875,0.168533,1433,5917,7350 611 | 610,2012-09-01,3,1,9,0,6,0,2,0.753333,0.702654,0.638333,0.113187,2352,3788,6140 612 | 611,2012-09-02,3,1,9,0,0,0,2,0.696667,0.649,0.815,0.0640708,2613,3197,5810 613 | 612,2012-09-03,3,1,9,1,1,0,1,0.7075,0.661629,0.790833,0.151121,1965,4069,6034 614 | 613,2012-09-04,3,1,9,0,2,1,1,0.725833,0.686888,0.755,0.236321,867,5997,6864 615 | 614,2012-09-05,3,1,9,0,3,1,1,0.736667,0.708983,0.74125,0.187808,832,6280,7112 616 | 615,2012-09-06,3,1,9,0,4,1,2,0.696667,0.655329,0.810417,0.142421,611,5592,6203 617 | 616,2012-09-07,3,1,9,0,5,1,1,0.703333,0.657204,0.73625,0.171646,1045,6459,7504 618 | 617,2012-09-08,3,1,9,0,6,0,2,0.659167,0.611121,0.799167,0.281104,1557,4419,5976 619 | 618,2012-09-09,3,1,9,0,0,0,1,0.61,0.578925,0.5475,0.224496,2570,5657,8227 620 | 619,2012-09-10,3,1,9,0,1,1,1,0.583333,0.565654,0.50375,0.258713,1118,6407,7525 621 | 620,2012-09-11,3,1,9,0,2,1,1,0.5775,0.554292,0.52,0.0920542,1070,6697,7767 622 | 621,2012-09-12,3,1,9,0,3,1,1,0.599167,0.570075,0.577083,0.131846,1050,6820,7870 623 | 622,2012-09-13,3,1,9,0,4,1,1,0.6125,0.579558,0.637083,0.0827208,1054,6750,7804 624 | 
623,2012-09-14,3,1,9,0,5,1,1,0.633333,0.594083,0.6725,0.103863,1379,6630,8009 625 | 624,2012-09-15,3,1,9,0,6,0,1,0.608333,0.585867,0.501667,0.247521,3160,5554,8714 626 | 625,2012-09-16,3,1,9,0,0,0,1,0.58,0.563125,0.57,0.0901833,2166,5167,7333 627 | 626,2012-09-17,3,1,9,0,1,1,2,0.580833,0.55305,0.734583,0.151742,1022,5847,6869 628 | 627,2012-09-18,3,1,9,0,2,1,2,0.623333,0.565067,0.8725,0.357587,371,3702,4073 629 | 628,2012-09-19,3,1,9,0,3,1,1,0.5525,0.540404,0.536667,0.215175,788,6803,7591 630 | 629,2012-09-20,3,1,9,0,4,1,1,0.546667,0.532192,0.618333,0.118167,939,6781,7720 631 | 630,2012-09-21,3,1,9,0,5,1,1,0.599167,0.571971,0.66875,0.154229,1250,6917,8167 632 | 631,2012-09-22,3,1,9,0,6,0,1,0.65,0.610488,0.646667,0.283583,2512,5883,8395 633 | 632,2012-09-23,4,1,9,0,0,0,1,0.529167,0.518933,0.467083,0.223258,2454,5453,7907 634 | 633,2012-09-24,4,1,9,0,1,1,1,0.514167,0.502513,0.492917,0.142404,1001,6435,7436 635 | 634,2012-09-25,4,1,9,0,2,1,1,0.55,0.544179,0.57,0.236321,845,6693,7538 636 | 635,2012-09-26,4,1,9,0,3,1,1,0.635,0.596613,0.630833,0.2444,787,6946,7733 637 | 636,2012-09-27,4,1,9,0,4,1,2,0.65,0.607975,0.690833,0.134342,751,6642,7393 638 | 637,2012-09-28,4,1,9,0,5,1,2,0.619167,0.585863,0.69,0.164179,1045,6370,7415 639 | 638,2012-09-29,4,1,9,0,6,0,1,0.5425,0.530296,0.542917,0.227604,2589,5966,8555 640 | 639,2012-09-30,4,1,9,0,0,0,1,0.526667,0.517663,0.583333,0.134958,2015,4874,6889 641 | 640,2012-10-01,4,1,10,0,1,1,2,0.520833,0.512,0.649167,0.0908042,763,6015,6778 642 | 641,2012-10-02,4,1,10,0,2,1,3,0.590833,0.542333,0.871667,0.104475,315,4324,4639 643 | 642,2012-10-03,4,1,10,0,3,1,2,0.6575,0.599133,0.79375,0.0665458,728,6844,7572 644 | 643,2012-10-04,4,1,10,0,4,1,2,0.6575,0.607975,0.722917,0.117546,891,6437,7328 645 | 644,2012-10-05,4,1,10,0,5,1,1,0.615,0.580187,0.6275,0.10635,1516,6640,8156 646 | 645,2012-10-06,4,1,10,0,6,0,1,0.554167,0.538521,0.664167,0.268025,3031,4934,7965 647 | 646,2012-10-07,4,1,10,0,0,0,2,0.415833,0.419813,0.708333,0.141162,781,2729,3510 648 | 647,2012-10-08,4,1,10,1,1,0,2,0.383333,0.387608,0.709583,0.189679,874,4604,5478 649 | 648,2012-10-09,4,1,10,0,2,1,2,0.446667,0.438112,0.761667,0.1903,601,5791,6392 650 | 649,2012-10-10,4,1,10,0,3,1,1,0.514167,0.503142,0.630833,0.187821,780,6911,7691 651 | 650,2012-10-11,4,1,10,0,4,1,1,0.435,0.431167,0.463333,0.181596,834,6736,7570 652 | 651,2012-10-12,4,1,10,0,5,1,1,0.4375,0.433071,0.539167,0.235092,1060,6222,7282 653 | 652,2012-10-13,4,1,10,0,6,0,1,0.393333,0.391396,0.494583,0.146142,2252,4857,7109 654 | 653,2012-10-14,4,1,10,0,0,0,1,0.521667,0.508204,0.640417,0.278612,2080,4559,6639 655 | 654,2012-10-15,4,1,10,0,1,1,2,0.561667,0.53915,0.7075,0.296037,760,5115,5875 656 | 655,2012-10-16,4,1,10,0,2,1,1,0.468333,0.460846,0.558333,0.182221,922,6612,7534 657 | 656,2012-10-17,4,1,10,0,3,1,1,0.455833,0.450108,0.692917,0.101371,979,6482,7461 658 | 657,2012-10-18,4,1,10,0,4,1,2,0.5225,0.512625,0.728333,0.236937,1008,6501,7509 659 | 658,2012-10-19,4,1,10,0,5,1,2,0.563333,0.537896,0.815,0.134954,753,4671,5424 660 | 659,2012-10-20,4,1,10,0,6,0,1,0.484167,0.472842,0.572917,0.117537,2806,5284,8090 661 | 660,2012-10-21,4,1,10,0,0,0,1,0.464167,0.456429,0.51,0.166054,2132,4692,6824 662 | 661,2012-10-22,4,1,10,0,1,1,1,0.4875,0.482942,0.568333,0.0814833,830,6228,7058 663 | 662,2012-10-23,4,1,10,0,2,1,1,0.544167,0.530304,0.641667,0.0945458,841,6625,7466 664 | 663,2012-10-24,4,1,10,0,3,1,1,0.5875,0.558721,0.63625,0.0727792,795,6898,7693 665 | 664,2012-10-25,4,1,10,0,4,1,2,0.55,0.529688,0.800417,0.124375,875,6484,7359 666 | 
665,2012-10-26,4,1,10,0,5,1,2,0.545833,0.52275,0.807083,0.132467,1182,6262,7444 667 | 666,2012-10-27,4,1,10,0,6,0,2,0.53,0.515133,0.72,0.235692,2643,5209,7852 668 | 667,2012-10-28,4,1,10,0,0,0,2,0.4775,0.467771,0.694583,0.398008,998,3461,4459 669 | 668,2012-10-29,4,1,10,0,1,1,3,0.44,0.4394,0.88,0.3582,2,20,22 670 | 669,2012-10-30,4,1,10,0,2,1,2,0.318182,0.309909,0.825455,0.213009,87,1009,1096 671 | 670,2012-10-31,4,1,10,0,3,1,2,0.3575,0.3611,0.666667,0.166667,419,5147,5566 672 | 671,2012-11-01,4,1,11,0,4,1,2,0.365833,0.369942,0.581667,0.157346,466,5520,5986 673 | 672,2012-11-02,4,1,11,0,5,1,1,0.355,0.356042,0.522083,0.266175,618,5229,5847 674 | 673,2012-11-03,4,1,11,0,6,0,2,0.343333,0.323846,0.49125,0.270529,1029,4109,5138 675 | 674,2012-11-04,4,1,11,0,0,0,1,0.325833,0.329538,0.532917,0.179108,1201,3906,5107 676 | 675,2012-11-05,4,1,11,0,1,1,1,0.319167,0.308075,0.494167,0.236325,378,4881,5259 677 | 676,2012-11-06,4,1,11,0,2,1,1,0.280833,0.281567,0.567083,0.173513,466,5220,5686 678 | 677,2012-11-07,4,1,11,0,3,1,2,0.295833,0.274621,0.5475,0.304108,326,4709,5035 679 | 678,2012-11-08,4,1,11,0,4,1,1,0.352174,0.341891,0.333478,0.347835,340,4975,5315 680 | 679,2012-11-09,4,1,11,0,5,1,1,0.361667,0.355413,0.540833,0.214558,709,5283,5992 681 | 680,2012-11-10,4,1,11,0,6,0,1,0.389167,0.393937,0.645417,0.0578458,2090,4446,6536 682 | 681,2012-11-11,4,1,11,0,0,0,1,0.420833,0.421713,0.659167,0.1275,2290,4562,6852 683 | 682,2012-11-12,4,1,11,1,1,0,1,0.485,0.475383,0.741667,0.173517,1097,5172,6269 684 | 683,2012-11-13,4,1,11,0,2,1,2,0.343333,0.323225,0.662917,0.342046,327,3767,4094 685 | 684,2012-11-14,4,1,11,0,3,1,1,0.289167,0.281563,0.552083,0.199625,373,5122,5495 686 | 685,2012-11-15,4,1,11,0,4,1,2,0.321667,0.324492,0.620417,0.152987,320,5125,5445 687 | 686,2012-11-16,4,1,11,0,5,1,1,0.345,0.347204,0.524583,0.171025,484,5214,5698 688 | 687,2012-11-17,4,1,11,0,6,0,1,0.325,0.326383,0.545417,0.179729,1313,4316,5629 689 | 688,2012-11-18,4,1,11,0,0,0,1,0.3425,0.337746,0.692917,0.227612,922,3747,4669 690 | 689,2012-11-19,4,1,11,0,1,1,2,0.380833,0.375621,0.623333,0.235067,449,5050,5499 691 | 690,2012-11-20,4,1,11,0,2,1,2,0.374167,0.380667,0.685,0.082725,534,5100,5634 692 | 691,2012-11-21,4,1,11,0,3,1,1,0.353333,0.364892,0.61375,0.103246,615,4531,5146 693 | 692,2012-11-22,4,1,11,1,4,0,1,0.34,0.350371,0.580417,0.0528708,955,1470,2425 694 | 693,2012-11-23,4,1,11,0,5,1,1,0.368333,0.378779,0.56875,0.148021,1603,2307,3910 695 | 694,2012-11-24,4,1,11,0,6,0,1,0.278333,0.248742,0.404583,0.376871,532,1745,2277 696 | 695,2012-11-25,4,1,11,0,0,0,1,0.245833,0.257583,0.468333,0.1505,309,2115,2424 697 | 696,2012-11-26,4,1,11,0,1,1,1,0.313333,0.339004,0.535417,0.04665,337,4750,5087 698 | 697,2012-11-27,4,1,11,0,2,1,2,0.291667,0.281558,0.786667,0.237562,123,3836,3959 699 | 698,2012-11-28,4,1,11,0,3,1,1,0.296667,0.289762,0.50625,0.210821,198,5062,5260 700 | 699,2012-11-29,4,1,11,0,4,1,1,0.28087,0.298422,0.555652,0.115522,243,5080,5323 701 | 700,2012-11-30,4,1,11,0,5,1,1,0.298333,0.323867,0.649583,0.0584708,362,5306,5668 702 | 701,2012-12-01,4,1,12,0,6,0,2,0.298333,0.316904,0.806667,0.0597042,951,4240,5191 703 | 702,2012-12-02,4,1,12,0,0,0,2,0.3475,0.359208,0.823333,0.124379,892,3757,4649 704 | 703,2012-12-03,4,1,12,0,1,1,1,0.4525,0.455796,0.7675,0.0827208,555,5679,6234 705 | 704,2012-12-04,4,1,12,0,2,1,1,0.475833,0.469054,0.73375,0.174129,551,6055,6606 706 | 705,2012-12-05,4,1,12,0,3,1,1,0.438333,0.428012,0.485,0.324021,331,5398,5729 707 | 706,2012-12-06,4,1,12,0,4,1,1,0.255833,0.258204,0.50875,0.174754,340,5035,5375 708 | 
707,2012-12-07,4,1,12,0,5,1,2,0.320833,0.321958,0.764167,0.1306,349,4659,5008 709 | 708,2012-12-08,4,1,12,0,6,0,2,0.381667,0.389508,0.91125,0.101379,1153,4429,5582 710 | 709,2012-12-09,4,1,12,0,0,0,2,0.384167,0.390146,0.905417,0.157975,441,2787,3228 711 | 710,2012-12-10,4,1,12,0,1,1,2,0.435833,0.435575,0.925,0.190308,329,4841,5170 712 | 711,2012-12-11,4,1,12,0,2,1,2,0.353333,0.338363,0.596667,0.296037,282,5219,5501 713 | 712,2012-12-12,4,1,12,0,3,1,2,0.2975,0.297338,0.538333,0.162937,310,5009,5319 714 | 713,2012-12-13,4,1,12,0,4,1,1,0.295833,0.294188,0.485833,0.174129,425,5107,5532 715 | 714,2012-12-14,4,1,12,0,5,1,1,0.281667,0.294192,0.642917,0.131229,429,5182,5611 716 | 715,2012-12-15,4,1,12,0,6,0,1,0.324167,0.338383,0.650417,0.10635,767,4280,5047 717 | 716,2012-12-16,4,1,12,0,0,0,2,0.3625,0.369938,0.83875,0.100742,538,3248,3786 718 | 717,2012-12-17,4,1,12,0,1,1,2,0.393333,0.4015,0.907083,0.0982583,212,4373,4585 719 | 718,2012-12-18,4,1,12,0,2,1,1,0.410833,0.409708,0.66625,0.221404,433,5124,5557 720 | 719,2012-12-19,4,1,12,0,3,1,1,0.3325,0.342162,0.625417,0.184092,333,4934,5267 721 | 720,2012-12-20,4,1,12,0,4,1,2,0.33,0.335217,0.667917,0.132463,314,3814,4128 722 | 721,2012-12-21,1,1,12,0,5,1,2,0.326667,0.301767,0.556667,0.374383,221,3402,3623 723 | 722,2012-12-22,1,1,12,0,6,0,1,0.265833,0.236113,0.44125,0.407346,205,1544,1749 724 | 723,2012-12-23,1,1,12,0,0,0,1,0.245833,0.259471,0.515417,0.133083,408,1379,1787 725 | 724,2012-12-24,1,1,12,0,1,1,2,0.231304,0.2589,0.791304,0.0772304,174,746,920 726 | 725,2012-12-25,1,1,12,1,2,0,2,0.291304,0.294465,0.734783,0.168726,440,573,1013 727 | 726,2012-12-26,1,1,12,0,3,1,3,0.243333,0.220333,0.823333,0.316546,9,432,441 728 | 727,2012-12-27,1,1,12,0,4,1,2,0.254167,0.226642,0.652917,0.350133,247,1867,2114 729 | 728,2012-12-28,1,1,12,0,5,1,2,0.253333,0.255046,0.59,0.155471,644,2451,3095 730 | 729,2012-12-29,1,1,12,0,6,0,2,0.253333,0.2424,0.752917,0.124383,159,1182,1341 731 | 730,2012-12-30,1,1,12,0,0,0,1,0.255833,0.2317,0.483333,0.350754,364,1432,1796 732 | 731,2012-12-31,1,1,12,0,1,1,2,0.215833,0.223487,0.5775,0.154846,439,2290,2729 733 | -------------------------------------------------------------------------------- /BikeSharingNeuralNet/dir: -------------------------------------------------------------------------------- 1 | Parent 2 | -------------------------------------------------------------------------------- /DeepLearning_Certificate.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/rajivgrover009/Deep-Learning/33d13d3d8010a70ff0f4c6859a4e56b43bb1d341/DeepLearning_Certificate.pdf -------------------------------------------------------------------------------- /GenerateTV_Scripts_RNNs/dlnd_tv_script_generation.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "markdown", 5 | "metadata": { 6 | "deletable": true, 7 | "editable": true 8 | }, 9 | "source": [ 10 | "# TV Script Generation\n", 11 | "In this project, we'll generate our own [Simpsons](https://en.wikipedia.org/wiki/The_Simpsons) TV scripts using RNNs. we'll be using part of the [Simpsons dataset](https://www.kaggle.com/wcukierski/the-simpsons-by-the-data) of scripts from 27 seasons. The Neural Network we'll build will generate a new TV script for a scene at [Moe's Tavern](https://simpsonswiki.com/wiki/Moe's_Tavern).\n", 12 | "## Get the Data\n", 13 | "The data is already provided . we'll be using a subset of the original dataset. 
It consists of only the scenes in Moe's Tavern. This doesn't include other versions of the tavern, like \"Moe's Cavern\", \"Flaming Moe's\", \"Uncle Moe's Family Feed-Bag\", etc.." 14 | ] 15 | }, 16 | { 17 | "cell_type": "code", 18 | "execution_count": 1, 19 | "metadata": { 20 | "collapsed": false, 21 | "deletable": true, 22 | "editable": true 23 | }, 24 | "outputs": [], 25 | "source": [ 26 | "\n", 27 | "import helper\n", 28 | "\n", 29 | "data_dir = './data/simpsons/moes_tavern_lines.txt'\n", 30 | "text = helper.load_data(data_dir)\n", 31 | "# Ignore notice, since we don't use it for analysing the data\n", 32 | "text = text[81:]\n", 33 | "#text" 34 | ] 35 | }, 36 | { 37 | "cell_type": "markdown", 38 | "metadata": { 39 | "deletable": true, 40 | "editable": true 41 | }, 42 | "source": [ 43 | "## Explore the Data\n", 44 | "Play around with `view_sentence_range` to view different parts of the data." 45 | ] 46 | }, 47 | { 48 | "cell_type": "code", 49 | "execution_count": 2, 50 | "metadata": { 51 | "collapsed": false, 52 | "deletable": true, 53 | "editable": true 54 | }, 55 | "outputs": [ 56 | { 57 | "name": "stdout", 58 | "output_type": "stream", 59 | "text": [ 60 | "Dataset Stats\n", 61 | "Roughly the number of unique words: 11492\n", 62 | "Number of scenes: 262\n", 63 | "Average number of sentences in each scene: 15.248091603053435\n", 64 | "Number of lines: 4257\n", 65 | "Average number of words in each line: 11.50434578341555\n", 66 | "\n", 67 | "The sentences 0 to 10:\n", 68 | "Moe_Szyslak: (INTO PHONE) Moe's Tavern. Where the elite meet to drink.\n", 69 | "Bart_Simpson: Eh, yeah, hello, is Mike there? Last name, Rotch.\n", 70 | "Moe_Szyslak: (INTO PHONE) Hold on, I'll check. (TO BARFLIES) Mike Rotch. Mike Rotch. Hey, has anybody seen Mike Rotch, lately?\n", 71 | "Moe_Szyslak: (INTO PHONE) Listen you little puke. One of these days I'm gonna catch you, and I'm gonna carve my name on your back with an ice pick.\n", 72 | "Moe_Szyslak: What's the matter Homer? You're not your normal effervescent self.\n", 73 | "Homer_Simpson: I got my problems, Moe. 
Give me another one.\n", 74 | "Moe_Szyslak: Homer, hey, you should not drink to forget your problems.\n", 75 | "Barney_Gumble: Yeah, you should only drink to enhance your social skills.\n", 76 | "\n", 77 | "\n" 78 | ] 79 | } 80 | ], 81 | "source": [ 82 | "view_sentence_range = (0, 10)\n", 83 | "\n", 84 | "\n", 85 | "import numpy as np\n", 86 | "\n", 87 | "print('Dataset Stats')\n", 88 | "print('Roughly the number of unique words: {}'.format(len({word: None for word in text.split()})))\n", 89 | "scenes = text.split('\\n\\n')\n", 90 | "print('Number of scenes: {}'.format(len(scenes)))\n", 91 | "sentence_count_scene = [scene.count('\\n') for scene in scenes]\n", 92 | "print('Average number of sentences in each scene: {}'.format(np.average(sentence_count_scene)))\n", 93 | "\n", 94 | "sentences = [sentence for scene in scenes for sentence in scene.split('\\n')]\n", 95 | "print('Number of lines: {}'.format(len(sentences)))\n", 96 | "word_count_sentence = [len(sentence.split()) for sentence in sentences]\n", 97 | "print('Average number of words in each line: {}'.format(np.average(word_count_sentence)))\n", 98 | "\n", 99 | "print()\n", 100 | "print('The sentences {} to {}:'.format(*view_sentence_range))\n", 101 | "print('\\n'.join(text.split('\\n')[view_sentence_range[0]:view_sentence_range[1]]))" 102 | ] 103 | }, 104 | { 105 | "cell_type": "markdown", 106 | "metadata": { 107 | "deletable": true, 108 | "editable": true 109 | }, 110 | "source": [ 111 | "## Implement Preprocessing Functions\n", 112 | "The first thing to do to any dataset is preprocessing. Implement the following preprocessing functions below:\n", 113 | "- Lookup Table\n", 114 | "- Tokenize Punctuation\n", 115 | "\n", 116 | "### Lookup Table\n", 117 | "To create a word embedding, we first need to transform the words to ids. In this function, we'll create two dictionaries:\n", 118 | "- Dictionary to go from the words to an id, we'll call `vocab_to_int`\n", 119 | "- Dictionary to go from the id to word, we'll call `int_to_vocab`\n", 120 | "\n", 121 | "we'll Return these dictionaries in the following tuple `(vocab_to_int, int_to_vocab)`" 122 | ] 123 | }, 124 | { 125 | "cell_type": "code", 126 | "execution_count": 3, 127 | "metadata": { 128 | "collapsed": false, 129 | "deletable": true, 130 | "editable": true 131 | }, 132 | "outputs": [ 133 | { 134 | "name": "stdout", 135 | "output_type": "stream", 136 | "text": [ 137 | "Tests Passed\n" 138 | ] 139 | } 140 | ], 141 | "source": [ 142 | "import numpy as np\n", 143 | "import problem_unittests as tests\n", 144 | "\n", 145 | "def create_lookup_tables(text):\n", 146 | " \"\"\"\n", 147 | " Create lookup tables for vocabulary\n", 148 | " :param text: The text of tv scripts split into words\n", 149 | " :return: A tuple of dicts (vocab_to_int, int_to_vocab)\n", 150 | " \"\"\"\n", 151 | " # TODO: Implement Function\n", 152 | " vocab=set(text)\n", 153 | " vocab_to_int={word:i for i,word in enumerate(vocab)}\n", 154 | " int_to_vocab=dict(enumerate(vocab))\n", 155 | " \n", 156 | " return vocab_to_int, int_to_vocab\n", 157 | "\n", 158 | "\n", 159 | "\n", 160 | "tests.test_create_lookup_tables(create_lookup_tables)" 161 | ] 162 | }, 163 | { 164 | "cell_type": "markdown", 165 | "metadata": { 166 | "deletable": true, 167 | "editable": true 168 | }, 169 | "source": [ 170 | "### Tokenize Punctuation\n", 171 | "We'll be splitting the script into a word array using spaces as delimiters. 
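A toy illustration (our own snippet, not part of the project code) of what that naive split does to punctuation:

```python
# A plain whitespace split keeps punctuation glued to the words, so "bye."
# and "bye!" end up as vocabulary entries distinct from "bye".
text = "Moe said bye. Homer yelled bye!"
print(text.split())
# ['Moe', 'said', 'bye.', 'Homer', 'yelled', 'bye!']
```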
However, punctuation like periods and exclamation marks makes it hard for the neural network to distinguish between the words \"bye\" and \"bye!\".\n", 172 | "\n", 173 | "Implement the function `token_lookup` to return a dict that will be used to tokenize symbols like \"!\" into \"||Exclamation_Mark||\". Create a dictionary for the following symbols where the symbol is the key and the value is the token:\n", 174 | "- Period ( . )\n", 175 | "- Comma ( , )\n", 176 | "- Quotation Mark ( \" )\n", 177 | "- Semicolon ( ; )\n", 178 | "- Exclamation mark ( ! )\n", 179 | "- Question mark ( ? )\n", 180 | "- Left Parentheses ( ( )\n", 181 | "- Right Parentheses ( ) )\n", 182 | "- Dash ( -- )\n", 183 | "- Return ( \\n )\n", 184 | "\n", 185 | "This dictionary will be used to tokenize the symbols and add the delimiter (space) around them. This separates each symbol into its own word, making it easier for the neural network to predict the next word. Make sure you don't use a token that could be confused with a word. Instead of using the token \"dash\", try using something like \"||dash||\"." 186 | ] 187 | }, 188 | { 189 | "cell_type": "code", 190 | "execution_count": 4, 191 | "metadata": { 192 | "collapsed": false, 193 | "deletable": true, 194 | "editable": true 195 | }, 196 | "outputs": [ 197 | { 198 | "name": "stdout", 199 | "output_type": "stream", 200 | "text": [ 201 | "Tests Passed\n" 202 | ] 203 | } 204 | ], 205 | "source": [ 206 | "def token_lookup():\n", 207 | " \"\"\"\n", 208 | " Generate a dict to turn punctuation into a token.\n", 209 | " :return: Tokenize dictionary where the key is the punctuation and the value is the token\n", 210 | " \"\"\"\n", 211 | " # TODO: Implement Function\n", 212 | " tokenize_punct={\".\":\"||Period||\",\n", 213 | " \",\":\"||Comma||\",\n", 214 | " \"\\\"\":\"||Quotation_Mark||\",\n", 215 | " \";\":\"||SemiColon||\",\n", 216 | " \"!\":\"||Exclamation_Mark||\",\n", 217 | " \"?\":\"||Question_Mark||\",\n", 218 | " \"(\":\"||Left_Parenthesis||\",\n", 219 | " \")\":\"||Right_Parenthesis||\",\n", 220 | " \"--\":\"||Dash||\",\n", 221 | " \"\\n\":\"||Return||\"}\n", 222 | " return tokenize_punct\n", 223 | "\n", 224 | "\n", 225 | "tests.test_tokenize(token_lookup)" 226 | ] 227 | }, 228 | { 229 | "cell_type": "markdown", 230 | "metadata": { 231 | "deletable": true, 232 | "editable": true 233 | }, 234 | "source": [ 235 | "## Preprocess all the data and save it\n", 236 | "Running the code cell below will preprocess all the data and save it to file." 237 | ] 238 | }, 239 | { 240 | "cell_type": "code", 241 | "execution_count": 5, 242 | "metadata": { 243 | "collapsed": false, 244 | "deletable": true, 245 | "editable": true 246 | }, 247 | "outputs": [], 248 | "source": [ 249 | "\n", 250 | "# Preprocess Training, Validation, and Testing Data\n", 251 | "helper.preprocess_and_save_data(data_dir, token_lookup, create_lookup_tables)" 252 | ] 253 | }, 254 | { 255 | "cell_type": "markdown", 256 | "metadata": { 257 | "deletable": true, 258 | "editable": true 259 | }, 260 | "source": [ 261 | "# Check Point\n", 262 | "This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk."
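For reference, here is a toy walk-through of what `helper.preprocess_and_save_data` does internally with the two functions implemented above (token replacement, lowercasing, splitting, and id lookup); the exact ids vary from run to run because the vocabulary is built from a Python set:

```python
sample = "hello, bye!"

# Replace each symbol with its spaced token, exactly as helper.py does.
for symbol, token in token_lookup().items():
    sample = sample.replace(symbol, ' {} '.format(token))

words = sample.lower().split()
# ['hello', '||comma||', 'bye', '||exclamation_mark||']

vocab_to_int, int_to_vocab = create_lookup_tables(words)
int_text = [vocab_to_int[word] for word in words]
print(int_text)                             # e.g. [2, 0, 3, 1] (ids vary)
print([int_to_vocab[i] for i in int_text])  # round-trips back to the tokens
```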
263 | ] 264 | }, 265 | { 266 | "cell_type": "code", 267 | "execution_count": 6, 268 | "metadata": { 269 | "collapsed": false, 270 | "deletable": true, 271 | "editable": true 272 | }, 273 | "outputs": [], 274 | "source": [ 275 | "\n", 276 | "import helper\n", 277 | "import numpy as np\n", 278 | "import problem_unittests as tests\n", 279 | "\n", 280 | "int_text, vocab_to_int, int_to_vocab, token_dict = helper.load_preprocess()" 281 | ] 282 | }, 283 | { 284 | "cell_type": "markdown", 285 | "metadata": { 286 | "deletable": true, 287 | "editable": true 288 | }, 289 | "source": [ 290 | "## Build the Neural Network\n", 291 | "You'll build the components necessary to build a RNN by implementing the following functions below:\n", 292 | "- get_inputs\n", 293 | "- get_init_cell\n", 294 | "- get_embed\n", 295 | "- build_rnn\n", 296 | "- build_nn\n", 297 | "- get_batches\n", 298 | "\n", 299 | "### Check the Version of TensorFlow and Access to GPU" 300 | ] 301 | }, 302 | { 303 | "cell_type": "code", 304 | "execution_count": 7, 305 | "metadata": { 306 | "collapsed": false, 307 | "deletable": true, 308 | "editable": true 309 | }, 310 | "outputs": [ 311 | { 312 | "name": "stdout", 313 | "output_type": "stream", 314 | "text": [ 315 | "TensorFlow Version: 1.0.0\n", 316 | "Default GPU Device: /gpu:0\n" 317 | ] 318 | } 319 | ], 320 | "source": [ 321 | "\n", 322 | "from distutils.version import LooseVersion\n", 323 | "import warnings\n", 324 | "import tensorflow as tf\n", 325 | "\n", 326 | "# Check TensorFlow Version\n", 327 | "assert LooseVersion(tf.__version__) >= LooseVersion('1.0'), 'Please use TensorFlow version 1.0 or newer'\n", 328 | "print('TensorFlow Version: {}'.format(tf.__version__))\n", 329 | "\n", 330 | "# Check for a GPU\n", 331 | "if not tf.test.gpu_device_name():\n", 332 | " warnings.warn('No GPU found. Please use a GPU to train your neural network.')\n", 333 | "else:\n", 334 | " print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))" 335 | ] 336 | }, 337 | { 338 | "cell_type": "markdown", 339 | "metadata": { 340 | "deletable": true, 341 | "editable": true 342 | }, 343 | "source": [ 344 | "### Input\n", 345 | "Implement the `get_inputs()` function to create TF Placeholders for the Neural Network. 
It should create the following placeholders:\n", 346 | "- Input text placeholder named \"input\" using the [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder) `name` parameter.\n", 347 | "- Targets placeholder\n", 348 | "- Learning Rate placeholder\n", 349 | "\n", 350 | "Return the placeholders in the following tuple `(Input, Targets, LearningRate)`" 351 | ] 352 | }, 353 | { 354 | "cell_type": "code", 355 | "execution_count": 8, 356 | "metadata": { 357 | "collapsed": false, 358 | "deletable": true, 359 | "editable": true 360 | }, 361 | "outputs": [ 362 | { 363 | "name": "stdout", 364 | "output_type": "stream", 365 | "text": [ 366 | "Tests Passed\n" 367 | ] 368 | } 369 | ], 370 | "source": [ 371 | "def get_inputs():\n", 372 | " \"\"\"\n", 373 | " Create TF Placeholders for input, targets, and learning rate.\n", 374 | " :return: Tuple (input, targets, learning rate)\n", 375 | " \"\"\"\n", 376 | " # TODO: Implement Function\n", 377 | " inputs=tf.placeholder(tf.int32,[None,None],name=\"input\")\n", 378 | " targets=tf.placeholder(tf.int32,[None,None],name=\"targets\")\n", 379 | " learning_rate=tf.placeholder(tf.float32,None,name=\"learning_rate\")\n", 380 | " return inputs, targets, learning_rate\n", 381 | "\n", 382 | "\n", 383 | "\n", 384 | "tests.test_get_inputs(get_inputs)" 385 | ] 386 | }, 387 | { 388 | "cell_type": "markdown", 389 | "metadata": { 390 | "deletable": true, 391 | "editable": true 392 | }, 393 | "source": [ 394 | "### Build RNN Cell and Initialize\n", 395 | "Stack one or more [`BasicLSTMCells`](https://www.tensorflow.org/api_docs/python/tf/contrib/rnn/BasicLSTMCell) in a [`MultiRNNCell`](https://www.tensorflow.org/api_docs/python/tf/contrib/rnn/MultiRNNCell).\n", 396 | "- The RNN size should be set using `rnn_size`\n", 397 | "- Initialize the cell state using the MultiRNNCell's [`zero_state()`](https://www.tensorflow.org/api_docs/python/tf/contrib/rnn/MultiRNNCell#zero_state) function\n", 398 | " - Apply the name \"initial_state\" to the initial state using [`tf.identity()`](https://www.tensorflow.org/api_docs/python/tf/identity)\n", 399 | "\n", 400 | "Return the cell and initial state in the following tuple `(Cell, InitialState)`" 401 | ] 402 | }, 403 | { 404 | "cell_type": "code", 405 | "execution_count": 9, 406 | "metadata": { 407 | "collapsed": false, 408 | "deletable": true, 409 | "editable": true 410 | }, 411 | "outputs": [ 412 | { 413 | "name": "stdout", 414 | "output_type": "stream", 415 | "text": [ 416 | "Tests Passed\n" 417 | ] 418 | } 419 | ], 420 | "source": [ 421 | "def get_init_cell(batch_size, rnn_size):\n", 422 | " \"\"\"\n", 423 | " Create an RNN Cell and initialize it.\n", 424 | " :param batch_size: Size of batches\n", 425 | " :param rnn_size: Size of RNNs\n", 426 | " :return: Tuple (cell, initial state)\n", 427 | " \"\"\"\n", 428 | " # TODO: Implement Function\n", 429 | " num_layers=2\n", 430 | " lstm_layer=tf.contrib.rnn.BasicLSTMCell(rnn_size)\n", 431 | " cell=tf.contrib.rnn.MultiRNNCell([lstm_layer]*num_layers)\n", 432 | " initial_state = cell.zero_state(batch_size, tf.float32)\n", 433 | " initial_state=tf.identity(initial_state,name=\"initial_state\")\n", 434 | " return cell, initial_state\n", 435 | "\n", 436 | "\n", 437 | "\n", 438 | "tests.test_get_init_cell(get_init_cell)" 439 | ] 440 | }, 441 | { 442 | "cell_type": "markdown", 443 | "metadata": { 444 | "deletable": true, 445 | "editable": true 446 | }, 447 | "source": [ 448 | "### Word Embedding\n", 449 | "Apply embedding to `input_data` using TensorFlow. 
Return the embedded sequence." 450 | ] 451 | }, 452 | { 453 | "cell_type": "code", 454 | "execution_count": 10, 455 | "metadata": { 456 | "collapsed": false, 457 | "deletable": true, 458 | "editable": true 459 | }, 460 | "outputs": [ 461 | { 462 | "name": "stdout", 463 | "output_type": "stream", 464 | "text": [ 465 | "Tests Passed\n" 466 | ] 467 | } 468 | ], 469 | "source": [ 470 | "def get_embed(input_data, vocab_size, embed_dim):\n", 471 | " \"\"\"\n", 472 | " Create embedding for input_data.\n", 473 | " :param input_data: TF placeholder for text input.\n", 474 | " :param vocab_size: Number of words in vocabulary.\n", 475 | " :param embed_dim: Number of embedding dimensions\n", 476 | " :return: Embedded input.\n", 477 | " \"\"\"\n", 478 | " # TODO: Implement Function\n", 479 | " embedding = tf.Variable(tf.random_uniform((vocab_size, embed_dim), -1, 1))\n", 480 | " embed = tf.nn.embedding_lookup(embedding, input_data)\n", 481 | " return embed\n", 482 | "\n", 483 | "\n", 484 | "\n", 485 | "tests.test_get_embed(get_embed)" 486 | ] 487 | }, 488 | { 489 | "cell_type": "markdown", 490 | "metadata": { 491 | "deletable": true, 492 | "editable": true 493 | }, 494 | "source": [ 495 | "### Build RNN\n", 496 | "You created an RNN Cell in the `get_init_cell()` function. Time to use the cell to create an RNN.\n", 497 | "- Build the RNN using the [`tf.nn.dynamic_rnn()`](https://www.tensorflow.org/api_docs/python/tf/nn/dynamic_rnn)\n", 498 | " - Apply the name \"final_state\" to the final state using [`tf.identity()`](https://www.tensorflow.org/api_docs/python/tf/identity)\n", 499 | "\n", 500 | "Return the outputs and final state in the following tuple `(Outputs, FinalState)` " 501 | ] 502 | }, 503 | { 504 | "cell_type": "code", 505 | "execution_count": 11, 506 | "metadata": { 507 | "collapsed": false, 508 | "deletable": true, 509 | "editable": true 510 | }, 511 | "outputs": [ 512 | { 513 | "name": "stdout", 514 | "output_type": "stream", 515 | "text": [ 516 | "Tests Passed\n" 517 | ] 518 | } 519 | ], 520 | "source": [ 521 | "def build_rnn(cell, inputs):\n", 522 | " \"\"\"\n", 523 | " Create an RNN using an RNN Cell\n", 524 | " :param cell: RNN Cell\n", 525 | " :param inputs: Input text data\n", 526 | " :return: Tuple (Outputs, Final State)\n", 527 | " \"\"\"\n", 528 | " # TODO: Implement Function\n", 529 | " outputs,final_state=tf.nn.dynamic_rnn(cell,inputs,dtype=tf.float32)\n", 530 | " final_state=tf.identity(final_state,name=\"final_state\")\n", 531 | " return outputs, final_state\n", 532 | "\n", 533 | "\n", 534 | "\n", 535 | "tests.test_build_rnn(build_rnn)" 536 | ] 537 | }, 538 | { 539 | "cell_type": "markdown", 540 | "metadata": { 541 | "deletable": true, 542 | "editable": true 543 | }, 544 | "source": [ 545 | "### Build the Neural Network\n", 546 | "Apply the functions you implemented above to:\n", 547 | "- Apply embedding to `input_data` using your `get_embed(input_data, vocab_size, embed_dim)` function.\n", 548 | "- Build RNN using `cell` and your `build_rnn(cell, inputs)` function.\n", 549 | "- Apply a fully connected layer with a linear activation and `vocab_size` as the number of outputs.\n", 550 | "\n", 551 | "Return the logits and final state in the following tuple `(Logits, FinalState)` " 552 | ] 553 | }, 554 | { 555 | "cell_type": "code", 556 | "execution_count": 12, 557 | "metadata": { 558 | "collapsed": false, 559 | "deletable": true, 560 | "editable": true 561 | }, 562 | "outputs": [ 563 | { 564 | "name": "stdout", 565 | "output_type": "stream", 566 | "text": [ 567 | "Tests Passed\n" 568 | ]
569 | } 570 | ], 571 | "source": [ 572 | "def build_nn(cell, rnn_size, input_data, vocab_size, embed_dim):\n", 573 | " \"\"\"\n", 574 | " Build part of the neural network\n", 575 | " :param cell: RNN cell\n", 576 | " :param rnn_size: Size of RNNs\n", 577 | " :param input_data: Input data\n", 578 | " :param vocab_size: Vocabulary size\n", 579 | " :param embed_dim: Number of embedding dimensions\n", 580 | " :return: Tuple (Logits, FinalState)\n", 581 | " \"\"\"\n", 582 | " # TODO: Implement Function\n", 583 | " embeddings=get_embed(input_data,vocab_size,embed_dim)\n", 584 | " outputs,final_state=build_rnn(cell,embeddings)\n", 585 | " w_init=tf.truncated_normal_initializer(stddev=0.01)\n", 586 | " b_init=tf.zeros_initializer()\n", 587 | " fully_connected= tf.contrib.layers.fully_connected(outputs,\n", 588 | " vocab_size,\n", 589 | " weights_initializer=w_init,\n", 590 | " biases_initializer=b_init,\n", 591 | " activation_fn=None)\n", 592 | " return fully_connected,final_state\n", 593 | "\n", 594 | "\n", 595 | "\n", 596 | "tests.test_build_nn(build_nn)" 597 | ] 598 | }, 599 | { 600 | "cell_type": "markdown", 601 | "metadata": { 602 | "deletable": true, 603 | "editable": true 604 | }, 605 | "source": [ 606 | "### Batches\n", 607 | "Implement `get_batches` to create batches of input and targets using `int_text`. The batches should be a Numpy array with the shape `(number of batches, 2, batch size, sequence length)`. Each batch contains two elements:\n", 608 | "- The first element is a single batch of **input** with the shape `[batch size, sequence length]`\n", 609 | "- The second element is a single batch of **targets** with the shape `[batch size, sequence length]`\n", 610 | "\n", 611 | "If you can't fill the last batch with enough data, drop the last batch.\n", 612 | "\n", 613 | "For example, `get_batches([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15], 2, 3)` would return a Numpy array of the following:\n", 614 | "```\n", 615 | "[\n", 616 | " # First Batch\n", 617 | " [\n", 618 | " # Batch of Input\n", 619 | " [[ 1 2 3], [ 7 8 9]],\n", 620 | " # Batch of targets\n", 621 | " [[ 2 3 4], [ 8 9 10]]\n", 622 | " ],\n", 623 | " \n", 624 | " # Second Batch\n", 625 | " [\n", 626 | " # Batch of Input\n", 627 | " [[ 4 5 6], [10 11 12]],\n", 628 | " # Batch of targets\n", 629 | " [[ 5 6 7], [11 12 13]]\n", 630 | " ]\n", 631 | "]\n", 632 | "```" 633 | ] 634 | }, 635 | { 636 | "cell_type": "code", 637 | "execution_count": 13, 638 | "metadata": { 639 | "collapsed": false, 640 | "deletable": true, 641 | "editable": true 642 | }, 643 | "outputs": [ 644 | { 645 | "name": "stdout", 646 | "output_type": "stream", 647 | "text": [ 648 | "Tests Passed\n" 649 | ] 650 | } 651 | ], 652 | "source": [ 653 | "def get_batches(int_text, batch_size, seq_length):\n", 654 | " \"\"\"\n", 655 | " Return batches of input and target\n", 656 | " :param int_text: Text with the words replaced by their ids\n", 657 | " :param batch_size: The size of batch\n", 658 | " :param seq_length: The length of sequence\n", 659 | " :return: Batches as a Numpy array\n", 660 | " \"\"\"\n", 661 | " # TODO: Implement Function\n", 662 | " n_batches= int(len(int_text)//(batch_size*seq_length))\n", 663 | " \n", 664 | " x=np.array(int_text[:n_batches*batch_size*seq_length])\n", 665 | " y=np.array(int_text[1:n_batches*batch_size*seq_length+1])\n", 666 | " #print(x.shape)\n", 667 | " #print(y.shape)\n", 668 | "\n", 669 | " y[-1]=x[0]\n", 670 | "\n", 671 | " x_batches=np.array(np.split(x.reshape(batch_size,-1),n_batches,1))\n", 672 | " 
#print(x_batches)\n", 673 | " y_batches=np.array(np.split(y.reshape(batch_size,-1),n_batches,1))\n", 674 | " \n", 675 | " return np.array(list(zip(x_batches,y_batches)))\n", 676 | "\n", 677 | "\n", 678 | "\n", 679 | "tests.test_get_batches(get_batches)" 680 | ] 681 | }, 682 | { 683 | "cell_type": "markdown", 684 | "metadata": { 685 | "deletable": true, 686 | "editable": true 687 | }, 688 | "source": [ 689 | "## Neural Network Training\n", 690 | "### Hyperparameters\n", 691 | "Tune the following parameters:\n", 692 | "\n", 693 | "- Set `num_epochs` to the number of epochs.\n", 694 | "- Set `batch_size` to the batch size.\n", 695 | "- Set `rnn_size` to the size of the RNNs.\n", 696 | "- Set `embed_dim` to the size of the embedding.\n", 697 | "- Set `seq_length` to the length of sequence.\n", 698 | "- Set `learning_rate` to the learning rate.\n", 699 | "- Set `show_every_n_batches` to the number of batches the neural network should print progress." 700 | ] 701 | }, 702 | { 703 | "cell_type": "code", 704 | "execution_count": 14, 705 | "metadata": { 706 | "collapsed": true, 707 | "deletable": true, 708 | "editable": true 709 | }, 710 | "outputs": [], 711 | "source": [ 712 | "# Number of Epochs\n", 713 | "num_epochs = 200\n", 714 | "# Batch Size\n", 715 | "batch_size = 512\n", 716 | "# RNN Size\n", 717 | "rnn_size = 1024\n", 718 | "# Embedding Dimension Size\n", 719 | "embed_dim = 500\n", 720 | "# Sequence Length\n", 721 | "seq_length = 15\n", 722 | "# Learning Rate\n", 723 | "learning_rate = .001\n", 724 | "# Show stats for every n number of batches\n", 725 | "show_every_n_batches = 100\n", 726 | "\n", 727 | "\n", 728 | "save_dir = './save'" 729 | ] 730 | }, 731 | { 732 | "cell_type": "markdown", 733 | "metadata": { 734 | "deletable": true, 735 | "editable": true 736 | }, 737 | "source": [ 738 | "### Build the Graph\n", 739 | "Build the graph using the neural network you implemented." 740 | ] 741 | }, 742 | { 743 | "cell_type": "code", 744 | "execution_count": 15, 745 | "metadata": { 746 | "collapsed": false, 747 | "deletable": true, 748 | "editable": true 749 | }, 750 | "outputs": [], 751 | "source": [ 752 | "\n", 753 | "from tensorflow.contrib import seq2seq\n", 754 | "\n", 755 | "train_graph = tf.Graph()\n", 756 | "with train_graph.as_default():\n", 757 | " vocab_size = len(int_to_vocab)\n", 758 | " input_text, targets, lr = get_inputs()\n", 759 | " input_data_shape = tf.shape(input_text)\n", 760 | " cell, initial_state = get_init_cell(input_data_shape[0], rnn_size)\n", 761 | " logits, final_state = build_nn(cell, rnn_size, input_text, vocab_size, embed_dim)\n", 762 | "\n", 763 | " # Probabilities for generating words\n", 764 | " probs = tf.nn.softmax(logits, name='probs')\n", 765 | "\n", 766 | " # Loss function\n", 767 | " cost = seq2seq.sequence_loss(\n", 768 | " logits,\n", 769 | " targets,\n", 770 | " tf.ones([input_data_shape[0], input_data_shape[1]]))\n", 771 | "\n", 772 | " # Optimizer\n", 773 | " optimizer = tf.train.AdamOptimizer(lr)\n", 774 | "\n", 775 | " # Gradient Clipping\n", 776 | " gradients = optimizer.compute_gradients(cost)\n", 777 | " capped_gradients = [(tf.clip_by_value(grad, -1., 1.), var) for grad, var in gradients]\n", 778 | " train_op = optimizer.apply_gradients(capped_gradients)" 779 | ] 780 | }, 781 | { 782 | "cell_type": "markdown", 783 | "metadata": { 784 | "deletable": true, 785 | "editable": true 786 | }, 787 | "source": [ 788 | "## Train\n", 789 | "Train the neural network on the preprocessed data. 
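One detail of the graph cell above worth calling out is the clip-then-apply pattern around the optimizer: every gradient element is clipped to [-1, 1] before the update, which guards against the exploding gradients RNNs are prone to. A minimal standalone sketch of the same pattern (TF 1.x API, toy variable, separate from the project graph):

```python
import tensorflow as tf

g = tf.Graph()
with g.as_default():
    v = tf.Variable(5.0)
    loss = tf.square(v * 10.0)              # deliberately steep loss surface
    opt = tf.train.AdamOptimizer(0.01)
    grads = opt.compute_gradients(loss)     # list of (gradient, variable) pairs
    capped = [(tf.clip_by_value(grad, -1., 1.), var) for grad, var in grads]
    train_op = opt.apply_gradients(capped)

with tf.Session(graph=g) as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(train_op)                      # one update step with clipped gradients
```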
If you have a hard time getting a good loss, check the [forums](https://discussions.udacity.com/) to see if anyone is having the same problem." 790 | ] 791 | }, 792 | { 793 | "cell_type": "code", 794 | "execution_count": 16, 795 | "metadata": { 796 | "collapsed": false, 797 | "deletable": true, 798 | "editable": true 799 | }, 800 | "outputs": [ 801 | { 802 | "name": "stdout", 803 | "output_type": "stream", 804 | "text": [ 805 | "Epoch 0 Batch 0/8 train_loss = 8.821\n", 806 | "Epoch 12 Batch 4/8 train_loss = 5.590\n", 807 | "Epoch 25 Batch 0/8 train_loss = 4.842\n", 808 | "Epoch 37 Batch 4/8 train_loss = 4.344\n", 809 | "Epoch 50 Batch 0/8 train_loss = 3.856\n", 810 | "Epoch 62 Batch 4/8 train_loss = 3.453\n", 811 | "Epoch 75 Batch 0/8 train_loss = 3.023\n", 812 | "Epoch 87 Batch 4/8 train_loss = 2.550\n", 813 | "Epoch 100 Batch 0/8 train_loss = 2.238\n", 814 | "Epoch 112 Batch 4/8 train_loss = 1.769\n", 815 | "Epoch 125 Batch 0/8 train_loss = 1.245\n", 816 | "Epoch 137 Batch 4/8 train_loss = 0.874\n", 817 | "Epoch 150 Batch 0/8 train_loss = 0.710\n", 818 | "Epoch 162 Batch 4/8 train_loss = 0.430\n", 819 | "Epoch 175 Batch 0/8 train_loss = 0.345\n", 820 | "Epoch 187 Batch 4/8 train_loss = 0.257\n", 821 | "Model Trained and Saved\n" 822 | ] 823 | } 824 | ], 825 | "source": [ 826 | "\"\"\"\n", 827 | "DON'T MODIFY ANYTHING IN THIS CELL\n", 828 | "\"\"\"\n", 829 | "batches = get_batches(int_text, batch_size, seq_length)\n", 830 | "\n", 831 | "with tf.Session(graph=train_graph) as sess:\n", 832 | " sess.run(tf.global_variables_initializer())\n", 833 | "\n", 834 | " for epoch_i in range(num_epochs):\n", 835 | " state = sess.run(initial_state, {input_text: batches[0][0]})\n", 836 | "\n", 837 | " for batch_i, (x, y) in enumerate(batches):\n", 838 | " feed = {\n", 839 | " input_text: x,\n", 840 | " targets: y,\n", 841 | " initial_state: state,\n", 842 | " lr: learning_rate}\n", 843 | " train_loss, state, _ = sess.run([cost, final_state, train_op], feed)\n", 844 | "\n", 845 | " # Show stats every show_every_n_batches batches\n", 846 | " if (epoch_i * len(batches) + batch_i) % show_every_n_batches == 0:\n", 847 | " print('Epoch {:>3} Batch {:>4}/{} train_loss = {:.3f}'.format(\n", 848 | " epoch_i,\n", 849 | " batch_i,\n", 850 | " len(batches),\n", 851 | " train_loss))\n", 852 | "\n", 853 | " # Save Model\n", 854 | " saver = tf.train.Saver()\n", 855 | " saver.save(sess, save_dir)\n", 856 | " print('Model Trained and Saved')" 857 | ] 858 | }, 859 | { 860 | "cell_type": "markdown", 861 | "metadata": { 862 | "deletable": true, 863 | "editable": true 864 | }, 865 | "source": [ 866 | "## Save Parameters\n", 867 | "Save `seq_length` and `save_dir` for generating a new TV script."
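Under the hood this checkpointing is just a pickle round-trip (see `helper.py` further down in this repo); a minimal equivalent sketch:

```python
import pickle

# What helper.save_params / helper.load_params amount to.
pickle.dump((seq_length, save_dir), open('params.p', 'wb'))
seq_length, load_dir = pickle.load(open('params.p', 'rb'))
```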
868 | ] 869 | }, 870 | { 871 | "cell_type": "code", 872 | "execution_count": 17, 873 | "metadata": { 874 | "collapsed": false, 875 | "deletable": true, 876 | "editable": true 877 | }, 878 | "outputs": [], 879 | "source": [ 880 | "\n", 881 | "# Save parameters for checkpoint\n", 882 | "helper.save_params((seq_length, save_dir))" 883 | ] 884 | }, 885 | { 886 | "cell_type": "markdown", 887 | "metadata": { 888 | "deletable": true, 889 | "editable": true 890 | }, 891 | "source": [ 892 | "# Checkpoint" 893 | ] 894 | }, 895 | { 896 | "cell_type": "code", 897 | "execution_count": 18, 898 | "metadata": { 899 | "collapsed": false, 900 | "deletable": true, 901 | "editable": true 902 | }, 903 | "outputs": [], 904 | "source": [ 905 | "\n", 906 | "import tensorflow as tf\n", 907 | "import numpy as np\n", 908 | "import helper\n", 909 | "import problem_unittests as tests\n", 910 | "\n", 911 | "_, vocab_to_int, int_to_vocab, token_dict = helper.load_preprocess()\n", 912 | "seq_length, load_dir = helper.load_params()" 913 | ] 914 | }, 915 | { 916 | "cell_type": "markdown", 917 | "metadata": { 918 | "deletable": true, 919 | "editable": true 920 | }, 921 | "source": [ 922 | "## Implement Generate Functions\n", 923 | "### Get Tensors\n", 924 | "Get tensors from `loaded_graph` using the function [`get_tensor_by_name()`](https://www.tensorflow.org/api_docs/python/tf/Graph#get_tensor_by_name). Get the tensors using the following names:\n", 925 | "- \"input:0\"\n", 926 | "- \"initial_state:0\"\n", 927 | "- \"final_state:0\"\n", 928 | "- \"probs:0\"\n", 929 | "\n", 930 | "Return the tensors in the following tuple `(InputTensor, InitialStateTensor, FinalStateTensor, ProbsTensor)` " 931 | ] 932 | }, 933 | { 934 | "cell_type": "code", 935 | "execution_count": 19, 936 | "metadata": { 937 | "collapsed": false, 938 | "deletable": true, 939 | "editable": true 940 | }, 941 | "outputs": [ 942 | { 943 | "name": "stdout", 944 | "output_type": "stream", 945 | "text": [ 946 | "Tests Passed\n" 947 | ] 948 | } 949 | ], 950 | "source": [ 951 | "def get_tensors(loaded_graph):\n", 952 | " \"\"\"\n", 953 | " Get input, initial state, final state, and probabilities tensor from \n", 954 | " :param loaded_graph: TensorFlow graph loaded from file\n", 955 | " :return: Tuple (InputTensor, InitialStateTensor, FinalStateTensor, ProbsTensor)\n", 956 | " \"\"\"\n", 957 | " # TODO: Implement Function\n", 958 | " InputTensor=loaded_graph.get_tensor_by_name('input:0')\n", 959 | " InitialStateTensor=loaded_graph.get_tensor_by_name('initial_state:0')\n", 960 | " FinalStateTensor=loaded_graph.get_tensor_by_name('final_state:0')\n", 961 | " ProbsTensor=loaded_graph.get_tensor_by_name('probs:0')\n", 962 | " return (InputTensor, InitialStateTensor, FinalStateTensor, ProbsTensor)\n", 963 | "\n", 964 | "\n", 965 | "\n", 966 | "tests.test_get_tensors(get_tensors)" 967 | ] 968 | }, 969 | { 970 | "cell_type": "markdown", 971 | "metadata": { 972 | "deletable": true, 973 | "editable": true 974 | }, 975 | "source": [ 976 | "### Choose Word\n", 977 | "Implement the `pick_word()` function to select the next word using `probabilities`." 
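Two common choices here are greedy decoding (always take the argmax, which tends to get stuck in loops) and sampling from the predicted distribution, which gives more varied scripts; the implementation below samples. A toy comparison with made-up numbers:

```python
import numpy as np

probabilities = np.array([0.1, 0.6, 0.3])        # pretend network output
int_to_vocab = {0: 'moe', 1: 'homer', 2: 'bart'}

greedy = int_to_vocab[int(np.argmax(probabilities))]          # always 'homer'
sampled = int_to_vocab[np.random.choice(len(probabilities),
                                        p=probabilities)]     # varies per run
print(greedy, sampled)
```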
978 | ] 979 | }, 980 | { 981 | "cell_type": "code", 982 | "execution_count": 20, 983 | "metadata": { 984 | "collapsed": false, 985 | "deletable": true, 986 | "editable": true 987 | }, 988 | "outputs": [ 989 | { 990 | "name": "stdout", 991 | "output_type": "stream", 992 | "text": [ 993 | "Tests Passed\n" 994 | ] 995 | } 996 | ], 997 | "source": [ 998 | "def pick_word(probabilities, int_to_vocab):\n", 999 | " \"\"\"\n", 1000 | " Pick the next word in the generated text\n", 1001 | " :param probabilities: Probabilites of the next word\n", 1002 | " :param int_to_vocab: Dictionary of word ids as the keys and words as the values\n", 1003 | " :return: String of the predicted word\n", 1004 | " \"\"\"\n", 1005 | " # TODO: Implement Function\n", 1006 | " word_id=np.random.choice(len(probabilities),p=probabilities)\n", 1007 | " \n", 1008 | " return int_to_vocab[word_id]\n", 1009 | "\n", 1010 | "\n", 1011 | "\n", 1012 | "tests.test_pick_word(pick_word)" 1013 | ] 1014 | }, 1015 | { 1016 | "cell_type": "markdown", 1017 | "metadata": { 1018 | "deletable": true, 1019 | "editable": true 1020 | }, 1021 | "source": [ 1022 | "## Generate TV Script\n", 1023 | "This will generate the TV script for you. Set `gen_length` to the length of TV script you want to generate." 1024 | ] 1025 | }, 1026 | { 1027 | "cell_type": "code", 1028 | "execution_count": 21, 1029 | "metadata": { 1030 | "collapsed": false, 1031 | "deletable": true, 1032 | "editable": true 1033 | }, 1034 | "outputs": [ 1035 | { 1036 | "name": "stdout", 1037 | "output_type": "stream", 1038 | "text": [ 1039 | "moe_szyslak: marge, i loves youse. will youse be mines?\n", 1040 | "lloyd: so, bee fast. and i can't believe him. you know, ya, ma'am and money something. no more down her and thirty same while we'll an arse to be grace and make me to survive a guy.\n", 1041 | "homer_simpson:(slyly) i want to?\n", 1042 | "moe_szyslak: yeah? oh, thanks, moe!\n", 1043 | "marge_simpson:(tipsy) keep the old woman. to the matter with this jar.\n", 1044 | "duffman: homer, i'd get back to the doug: conference dollars.\n", 1045 | "homer_simpson: i was so everybody we're a guy. but i know, here. fifty name. all uh-huh...\n", 1046 | "lenny_leonard:(upset) hey sees moe, grampa! yeah.\n", 1047 | "homer_simpson: it was like our hollowed-out not in the bill.\n", 1048 | "moe_szyslak: six of it.\n", 1049 | "moe_szyslak: aw, in that was, your beloved of make me from me too.\n", 1050 | "moe_szyslak: heh! he floated up banks, liver machine.\n", 1051 | "moe_szyslak: hey, why did i go. 
and me\n" 1052 | ] 1053 | } 1054 | ], 1055 | "source": [ 1056 | "gen_length = 200\n", 1057 | "# homer_simpson, moe_szyslak, or Barney_Gumble\n", 1058 | "prime_word = 'moe_szyslak'\n", 1059 | "\n", 1060 | "\n", 1061 | "loaded_graph = tf.Graph()\n", 1062 | "with tf.Session(graph=loaded_graph) as sess:\n", 1063 | " # Load saved model\n", 1064 | " loader = tf.train.import_meta_graph(load_dir + '.meta')\n", 1065 | " loader.restore(sess, load_dir)\n", 1066 | "\n", 1067 | " # Get Tensors from loaded model\n", 1068 | " input_text, initial_state, final_state, probs = get_tensors(loaded_graph)\n", 1069 | "\n", 1070 | " # Sentences generation setup\n", 1071 | " gen_sentences = [prime_word + ':']\n", 1072 | " prev_state = sess.run(initial_state, {input_text: np.array([[1]])})\n", 1073 | "\n", 1074 | " # Generate sentences\n", 1075 | " for n in range(gen_length):\n", 1076 | " # Dynamic Input\n", 1077 | " dyn_input = [[vocab_to_int[word] for word in gen_sentences[-seq_length:]]]\n", 1078 | " dyn_seq_length = len(dyn_input[0])\n", 1079 | "\n", 1080 | " # Get Prediction\n", 1081 | " probabilities, prev_state = sess.run(\n", 1082 | " [probs, final_state],\n", 1083 | " {input_text: dyn_input, initial_state: prev_state})\n", 1084 | " \n", 1085 | " pred_word = pick_word(probabilities[dyn_seq_length-1], int_to_vocab)\n", 1086 | "\n", 1087 | " gen_sentences.append(pred_word)\n", 1088 | " \n", 1089 | " # Remove tokens\n", 1090 | " tv_script = ' '.join(gen_sentences)\n", 1091 | " for key, token in token_dict.items():\n", 1092 | " ending = ' ' if key in ['\\n', '(', '\"'] else ''\n", 1093 | " tv_script = tv_script.replace(' ' + token.lower(), key)\n", 1094 | " tv_script = tv_script.replace('\\n ', '\\n')\n", 1095 | " tv_script = tv_script.replace('( ', '(')\n", 1096 | " \n", 1097 | " print(tv_script)" 1098 | ] 1099 | }, 1100 | { 1101 | "cell_type": "markdown", 1102 | "metadata": { 1103 | "deletable": true, 1104 | "editable": true 1105 | }, 1106 | "source": [ 1107 | "# The TV Script is Nonsensical\n", 1108 | "It's OK if the TV script doesn't make any sense. We trained on less than a megabyte of text. In order to get good results, you'll have to use a smaller vocabulary or get more data. Luckily there's more data! As we mentioned at the beginning of this project, this is a subset of [another dataset](https://www.kaggle.com/wcukierski/the-simpsons-by-the-data). We didn't have you train on all the data, because that would take too long. However, you are free to train your neural network on all the data. 
After you complete the project, of course.\n" 1109 | ] 1110 | }, 1111 | { 1112 | "cell_type": "code", 1113 | "execution_count": null, 1114 | "metadata": { 1115 | "collapsed": true, 1116 | "deletable": true, 1117 | "editable": true 1118 | }, 1119 | "outputs": [], 1120 | "source": [] 1121 | } 1122 | ], 1123 | "metadata": { 1124 | "anaconda-cloud": {}, 1125 | "kernelspec": { 1126 | "display_name": "Python [default]", 1127 | "language": "python", 1128 | "name": "python3" 1129 | }, 1130 | "language_info": { 1131 | "codemirror_mode": { 1132 | "name": "ipython", 1133 | "version": 3 1134 | }, 1135 | "file_extension": ".py", 1136 | "mimetype": "text/x-python", 1137 | "name": "python", 1138 | "nbconvert_exporter": "python", 1139 | "pygments_lexer": "ipython3", 1140 | "version": "3.5.2" 1141 | } 1142 | }, 1143 | "nbformat": 4, 1144 | "nbformat_minor": 0 1145 | } 1146 | -------------------------------------------------------------------------------- /GenerateTV_Scripts_RNNs/helper.py: -------------------------------------------------------------------------------- 1 | import os 2 | import pickle 3 | 4 | 5 | def load_data(path): 6 | """ 7 | Load Dataset from File 8 | """ 9 | input_file = os.path.join(path) 10 | with open(input_file, "r") as f: 11 | data = f.read() 12 | 13 | return data 14 | 15 | 16 | def preprocess_and_save_data(dataset_path, token_lookup, create_lookup_tables): 17 | """ 18 | Preprocess Text Data 19 | """ 20 | text = load_data(dataset_path) 21 | 22 | # Ignore notice, since we don't use it for analysing the data 23 | text = text[81:] 24 | 25 | token_dict = token_lookup() 26 | for key, token in token_dict.items(): 27 | text = text.replace(key, ' {} '.format(token)) 28 | 29 | text = text.lower() 30 | text = text.split() 31 | 32 | vocab_to_int, int_to_vocab = create_lookup_tables(text) 33 | int_text = [vocab_to_int[word] for word in text] 34 | pickle.dump((int_text, vocab_to_int, int_to_vocab, token_dict), open('preprocess.p', 'wb')) 35 | 36 | 37 | def load_preprocess(): 38 | """ 39 | Load the Preprocessed Training data and return them in batches of or less 40 | """ 41 | return pickle.load(open('preprocess.p', mode='rb')) 42 | 43 | 44 | def save_params(params): 45 | """ 46 | Save parameters to file 47 | """ 48 | pickle.dump(params, open('params.p', 'wb')) 49 | 50 | 51 | def load_params(): 52 | """ 53 | Load parameters from file 54 | """ 55 | return pickle.load(open('params.p', mode='rb')) 56 | -------------------------------------------------------------------------------- /GenerateTV_Scripts_RNNs/problem_unittests.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | import tensorflow as tf 3 | from tensorflow.contrib import rnn 4 | 5 | 6 | def _print_success_message(): 7 | print('Tests Passed') 8 | 9 | 10 | def test_create_lookup_tables(create_lookup_tables): 11 | with tf.Graph().as_default(): 12 | test_text = ''' 13 | Moe_Szyslak Moe's Tavern Where the elite meet to drink 14 | Bart_Simpson Eh yeah hello is Mike there Last name Rotch 15 | Moe_Szyslak Hold on I'll check Mike Rotch Mike Rotch Hey has anybody seen Mike Rotch lately 16 | Moe_Szyslak Listen you little puke One of these days I'm gonna catch you and I'm gonna carve my name on your back with an ice pick 17 | Moe_Szyslak Whats the matter Homer You're not your normal effervescent self 18 | Homer_Simpson I got my problems Moe Give me another one 19 | Moe_Szyslak Homer hey you should not drink to forget your problems 20 | Barney_Gumble Yeah you should only drink to enhance 
/GenerateTV_Scripts_RNNs/problem_unittests.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import tensorflow as tf
3 | from tensorflow.contrib import rnn
4 | 
5 | 
6 | def _print_success_message():
7 |     print('Tests Passed')
8 | 
9 | 
10 | def test_create_lookup_tables(create_lookup_tables):
11 |     with tf.Graph().as_default():
12 |         test_text = '''
13 |         Moe_Szyslak Moe's Tavern Where the elite meet to drink
14 |         Bart_Simpson Eh yeah hello is Mike there Last name Rotch
15 |         Moe_Szyslak Hold on I'll check Mike Rotch Mike Rotch Hey has anybody seen Mike Rotch lately
16 |         Moe_Szyslak Listen you little puke One of these days I'm gonna catch you and I'm gonna carve my name on your back with an ice pick
17 |         Moe_Szyslak Whats the matter Homer You're not your normal effervescent self
18 |         Homer_Simpson I got my problems Moe Give me another one
19 |         Moe_Szyslak Homer hey you should not drink to forget your problems
20 |         Barney_Gumble Yeah you should only drink to enhance your social skills'''
21 | 
22 |         test_text = test_text.lower()
23 |         test_text = test_text.split()
24 | 
25 |         vocab_to_int, int_to_vocab = create_lookup_tables(test_text)
26 | 
27 |         # Check types
28 |         assert isinstance(vocab_to_int, dict),\
29 |             'vocab_to_int is not a dictionary.'
30 |         assert isinstance(int_to_vocab, dict),\
31 |             'int_to_vocab is not a dictionary.'
32 | 
33 |         # Compare lengths of dicts
34 |         assert len(vocab_to_int) == len(int_to_vocab),\
35 |             'Length of vocab_to_int and int_to_vocab don\'t match. ' \
36 |             'vocab_to_int is length {}. int_to_vocab is length {}'.format(len(vocab_to_int), len(int_to_vocab))
37 | 
38 |         # Make sure the dicts have the same words
39 |         vocab_to_int_word_set = set(vocab_to_int.keys())
40 |         int_to_vocab_word_set = set(int_to_vocab.values())
41 | 
42 |         assert not (vocab_to_int_word_set - int_to_vocab_word_set),\
43 |             'vocab_to_int and int_to_vocab don\'t have the same words. ' \
44 |             '{} found in vocab_to_int, but not in int_to_vocab'.format(vocab_to_int_word_set - int_to_vocab_word_set)
45 |         assert not (int_to_vocab_word_set - vocab_to_int_word_set),\
46 |             'vocab_to_int and int_to_vocab don\'t have the same words. ' \
47 |             '{} found in int_to_vocab, but not in vocab_to_int'.format(int_to_vocab_word_set - vocab_to_int_word_set)
48 | 
49 |         # Make sure the dicts have the same word ids
50 |         vocab_to_int_word_id_set = set(vocab_to_int.values())
51 |         int_to_vocab_word_id_set = set(int_to_vocab.keys())
52 | 
53 |         assert not (vocab_to_int_word_id_set - int_to_vocab_word_id_set),\
54 |             'vocab_to_int and int_to_vocab don\'t contain the same word ids. ' \
55 |             '{} found in vocab_to_int, but not in int_to_vocab'.format(vocab_to_int_word_id_set - int_to_vocab_word_id_set)
56 |         assert not (int_to_vocab_word_id_set - vocab_to_int_word_id_set),\
57 |             'vocab_to_int and int_to_vocab don\'t contain the same word ids. ' \
58 |             '{} found in int_to_vocab, but not in vocab_to_int'.format(int_to_vocab_word_id_set - vocab_to_int_word_id_set)
59 | 
60 |         # Make sure the dicts make the same lookup
61 |         mismatches = [(word, id, id, int_to_vocab[id]) for word, id in vocab_to_int.items() if int_to_vocab[id] != word]
62 | 
63 |         assert not mismatches,\
64 |             'Found {} mismatch(es). First mismatch: vocab_to_int[{}] = {} and int_to_vocab[{}] = {}'.format(
65 |                 len(mismatches),
66 |                 *mismatches[0])
67 | 
68 |         assert len(vocab_to_int) > len(set(test_text))/2,\
69 |             'The length of vocab seems too small. Found a length of {}'.format(len(vocab_to_int))
70 | 
71 |         _print_success_message()
72 | 
73 | 
74 | def test_get_batches(get_batches):
75 |     with tf.Graph().as_default():
76 |         test_batch_size = 128
77 |         test_seq_length = 5
78 |         test_int_text = list(range(1000*test_seq_length))
79 |         batches = get_batches(test_int_text, test_batch_size, test_seq_length)
80 | 
81 |         # Check type
82 |         assert isinstance(batches, np.ndarray),\
83 |             'Batches is not a Numpy array'
84 | 
85 |         # Check shape
86 |         assert batches.shape == (7, 2, 128, 5),\
87 |             'Batches returned wrong shape. Found {}'.format(batches.shape)
88 | 
89 |         _print_success_message()
90 | 
91 | 
92 | def test_tokenize(token_lookup):
93 |     with tf.Graph().as_default():
94 |         symbols = set(['.', ',', '"', ';', '!', '?', '(', ')', '--', '\n'])
95 |         token_dict = token_lookup()
96 | 
97 |         # Check type
98 |         assert isinstance(token_dict, dict), \
99 |             'Returned type is {}.'.format(type(token_dict))
100 | 
101 |         # Check symbols
102 |         missing_symbols = symbols - set(token_dict.keys())
103 |         unknown_symbols = set(token_dict.keys()) - symbols
104 | 
105 |         assert not missing_symbols, \
106 |             'Missing symbols: {}'.format(missing_symbols)
107 |         assert not unknown_symbols, \
108 |             'Unknown symbols: {}'.format(unknown_symbols)
109 | 
110 |         # Check values type
111 |         bad_value_type = [type(val) for val in token_dict.values() if not isinstance(val, str)]
112 | 
113 |         assert not bad_value_type,\
114 |             'Found token as {} type.'.format(bad_value_type[0])
115 | 
116 |         # Check for spaces
117 |         key_has_spaces = [k for k in token_dict.keys() if ' ' in k]
118 |         val_has_spaces = [val for val in token_dict.values() if ' ' in val]
119 | 
120 |         assert not key_has_spaces,\
121 |             'The key "{}" includes spaces. Remove spaces from keys and values'.format(key_has_spaces[0])
122 |         assert not val_has_spaces,\
123 |             'The value "{}" includes spaces. Remove spaces from keys and values'.format(val_has_spaces[0])
124 | 
125 |         # Check for symbols in values
126 |         symbol_val = ()
127 |         for symbol in symbols:
128 |             for val in token_dict.values():
129 |                 if symbol in val:
130 |                     symbol_val = (symbol, val)
131 | 
132 |         assert not symbol_val,\
133 |             'Don\'t use a symbol that will be replaced in your tokens. Found the symbol {} in value {}'.format(*symbol_val)
134 | 
135 |         _print_success_message()
136 | 
137 | 
138 | def test_get_inputs(get_inputs):
139 |     with tf.Graph().as_default():
140 |         input_data, targets, lr = get_inputs()
141 | 
142 |         # Check type
143 |         assert input_data.op.type == 'Placeholder',\
144 |             'Input not a Placeholder.'
145 |         assert targets.op.type == 'Placeholder',\
146 |             'Targets not a Placeholder.'
147 |         assert lr.op.type == 'Placeholder',\
148 |             'Learning Rate not a Placeholder.'
149 | 
150 |         # Check name
151 |         assert input_data.name == 'input:0',\
152 |             'Input has bad name. Found name {}'.format(input_data.name)
153 | 
154 |         # Check rank
155 |         input_rank = 0 if input_data.get_shape() is None else len(input_data.get_shape())
156 |         targets_rank = 0 if targets.get_shape() is None else len(targets.get_shape())
157 |         lr_rank = 0 if lr.get_shape() is None else len(lr.get_shape())
158 | 
159 |         assert input_rank == 2,\
160 |             'Input has wrong rank. Rank {} found.'.format(input_rank)
161 |         assert targets_rank == 2,\
162 |             'Targets has wrong rank. Rank {} found.'.format(targets_rank)
163 |         assert lr_rank == 0,\
164 |             'Learning Rate has wrong rank. Rank {} found.'.format(lr_rank)
165 | 
166 |         _print_success_message()
167 | 
168 | 
169 | def test_get_init_cell(get_init_cell):
170 |     with tf.Graph().as_default():
171 |         test_batch_size_ph = tf.placeholder(tf.int32)
172 |         test_rnn_size = 256
173 | 
174 |         cell, init_state = get_init_cell(test_batch_size_ph, test_rnn_size)
175 | 
176 |         # Check type
177 |         assert isinstance(cell, tf.contrib.rnn.MultiRNNCell),\
178 |             'Cell is wrong type. Found {} type'.format(type(cell))
179 | 
180 |         # Check for name attribute
181 |         assert hasattr(init_state, 'name'),\
182 |             'Initial state doesn\'t have the "name" attribute. Try using `tf.identity` to set the name.'
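# (Illustrative note, not part of the original file: in TF 1.x the naming this
# check asks for is typically done with
#     initial_state = tf.identity(cell.zero_state(batch_size, tf.float32), name='initial_state')
# so that the restored graph can find the tensor by name later.)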
183 | 
184 |         # Check name
185 |         assert init_state.name == 'initial_state:0',\
186 |             'Initial state doesn\'t have the correct name. Found the name {}'.format(init_state.name)
187 | 
188 |         _print_success_message()
189 | 
190 | 
191 | def test_get_embed(get_embed):
192 |     with tf.Graph().as_default():
193 |         embed_shape = [50, 5, 256]
194 |         test_input_data = tf.placeholder(tf.int32, embed_shape[:2])
195 |         test_vocab_size = 27
196 |         test_embed_dim = embed_shape[2]
197 | 
198 |         embed = get_embed(test_input_data, test_vocab_size, test_embed_dim)
199 | 
200 |         # Check shape
201 |         assert embed.shape == embed_shape,\
202 |             'Wrong shape. Found shape {}'.format(embed.shape)
203 | 
204 |         _print_success_message()
205 | 
206 | 
207 | def test_build_rnn(build_rnn):
208 |     with tf.Graph().as_default():
209 |         test_rnn_size = 256
210 |         test_rnn_layer_size = 2
211 |         test_cell = rnn.MultiRNNCell([rnn.BasicLSTMCell(test_rnn_size)] * test_rnn_layer_size)
212 | 
213 |         test_inputs = tf.placeholder(tf.float32, [None, None, test_rnn_size])
214 |         outputs, final_state = build_rnn(test_cell, test_inputs)
215 | 
216 |         # Check name
217 |         assert hasattr(final_state, 'name'),\
218 |             'Final state doesn\'t have the "name" attribute. Try using `tf.identity` to set the name.'
219 |         assert final_state.name == 'final_state:0',\
220 |             'Final state doesn\'t have the correct name. Found the name {}'.format(final_state.name)
221 | 
222 |         # Check shape
223 |         assert outputs.get_shape().as_list() == [None, None, test_rnn_size],\
224 |             'Outputs have the wrong shape. Found shape {}'.format(outputs.get_shape())
225 |         assert final_state.get_shape().as_list() == [test_rnn_layer_size, 2, None, test_rnn_size],\
226 |             'Final state has the wrong shape. Found shape {}'.format(final_state.get_shape())
227 | 
228 |         _print_success_message()
229 | 
230 | 
231 | def test_build_nn(build_nn):
232 |     with tf.Graph().as_default():
233 |         test_input_data_shape = [128, 5]
234 |         test_input_data = tf.placeholder(tf.int32, test_input_data_shape)
235 |         test_rnn_size = 256
236 |         test_embed_dim = 300
237 |         test_rnn_layer_size = 2
238 |         test_vocab_size = 27
239 |         test_cell = rnn.MultiRNNCell([rnn.BasicLSTMCell(test_rnn_size)] * test_rnn_layer_size)
240 | 
241 |         logits, final_state = build_nn(test_cell, test_rnn_size, test_input_data, test_vocab_size, test_embed_dim)
242 | 
243 |         # Check name
244 |         assert hasattr(final_state, 'name'), \
245 |             'Final state doesn\'t have the "name" attribute. Are you using build_rnn?'
246 |         assert final_state.name == 'final_state:0', \
247 |             'Final state doesn\'t have the correct name. Found the name {}. Are you using build_rnn?'.format(final_state.name)
248 | 
249 |         # Check Shape
250 |         assert logits.get_shape().as_list() == test_input_data_shape + [test_vocab_size], \
251 |             'Logits have the wrong shape. Found shape {}'.format(logits.get_shape())
252 |         assert final_state.get_shape().as_list() == [test_rnn_layer_size, 2, None, test_rnn_size], \
253 |             'Final state has the wrong shape. Found shape {}'.format(final_state.get_shape())
254 | 
255 |         _print_success_message()
256 | 
257 | 
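# Illustrative sketch (not part of the original file): one implementation shape
# that satisfies test_get_init_cell, test_build_rnn and test_build_nn above,
# using the TF 1.x APIs the tests already import; the layer count is an assumption.
#
#     def get_init_cell(batch_size, rnn_size, num_layers=2):
#         cell = rnn.MultiRNNCell([rnn.BasicLSTMCell(rnn_size) for _ in range(num_layers)])
#         initial_state = tf.identity(cell.zero_state(batch_size, tf.float32), name='initial_state')
#         return cell, initial_state
#
#     def build_rnn(cell, inputs):
#         outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
#         return outputs, tf.identity(final_state, name='final_state')
#
#     def build_nn(cell, rnn_size, input_data, vocab_size, embed_dim):
#         embed = tf.contrib.layers.embed_sequence(input_data, vocab_size, embed_dim)
#         outputs, final_state = build_rnn(cell, embed)
#         logits = tf.contrib.layers.fully_connected(outputs, vocab_size, activation_fn=None)
#         return logits, final_state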
258 | def test_get_tensors(get_tensors):
259 |     test_graph = tf.Graph()
260 |     with test_graph.as_default():
261 |         test_input = tf.placeholder(tf.int32, name='input')
262 |         test_initial_state = tf.placeholder(tf.int32, name='initial_state')
263 |         test_final_state = tf.placeholder(tf.int32, name='final_state')
264 |         test_probs = tf.placeholder(tf.float32, name='probs')
265 | 
266 |     input_text, initial_state, final_state, probs = get_tensors(test_graph)
267 | 
268 |     # Check correct tensor
269 |     assert input_text == test_input,\
270 |         'Test input is wrong tensor'
271 |     assert initial_state == test_initial_state, \
272 |         'Initial state is wrong tensor'
273 |     assert final_state == test_final_state, \
274 |         'Final state is wrong tensor'
275 |     assert probs == test_probs, \
276 |         'Probabilities is wrong tensor'
277 | 
278 |     _print_success_message()
279 | 
280 | 
281 | def test_pick_word(pick_word):
282 |     with tf.Graph().as_default():
283 |         test_probabilities = np.array([0.1, 0.8, 0.05, 0.05])
284 |         test_int_to_vocab = {word_i: word for word_i, word in enumerate(['this', 'is', 'a', 'test'])}
285 | 
286 |         pred_word = pick_word(test_probabilities, test_int_to_vocab)
287 | 
288 |         # Check type
289 |         assert isinstance(pred_word, str),\
290 |             'Predicted word is wrong type. Found {} type.'.format(type(pred_word))
291 | 
292 |         # Check word is from vocab
293 |         assert pred_word in test_int_to_vocab.values(),\
294 |             'Predicted word not found in int_to_vocab.'
295 | 
296 | 
297 |         _print_success_message()
--------------------------------------------------------------------------------
/GenerateTV_Scripts_RNNs/readme.md:
--------------------------------------------------------------------------------
1 | In this project, we'll generate our own Simpsons TV scripts using RNNs. We'll be using part of the Simpsons dataset of scripts from 27 seasons. The neural network we build will generate a new TV script for a scene at Moe's Tavern.
2 | 
--------------------------------------------------------------------------------
/ImageClassification_cifar10/README.md:
--------------------------------------------------------------------------------
1 | ## Image Classification (CIFAR-10) - Using the TensorFlow Library in Python ##
2 | 
--------------------------------------------------------------------------------
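The CIFAR-10 notebook supplies `normalize` and `one_hot_encode` to the helper module below; both are graded exercises. A minimal sketch that passes `test_normalize` and `test_one_hot_encode` in this project's `problem_unittests.py`:

```python
import numpy as np

def normalize(x):
    # Pixel values arrive in [0, 255]; scale them into [0, 1].
    return x / 255.0

def one_hot_encode(x):
    # Row i of np.eye(10) is the one-hot vector for class i, so indexing with
    # the label array encodes a whole batch at once (CIFAR-10 has 10 classes).
    return np.eye(10)[np.array(x)]
```

The notebook then wires them in via `preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode)`, defined below.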
/ImageClassification_cifar10/helper.py:
--------------------------------------------------------------------------------
1 | import pickle
2 | import numpy as np
3 | import matplotlib.pyplot as plt
4 | from sklearn.preprocessing import LabelBinarizer
5 | 
6 | 
7 | def _load_label_names():
8 |     """
9 |     Load the label names from file
10 |     """
11 |     return ['airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck']
12 | 
13 | 
14 | def load_cfar10_batch(cifar10_dataset_folder_path, batch_id):
15 |     """
16 |     Load a batch of the dataset
17 |     """
18 |     with open(cifar10_dataset_folder_path + '/data_batch_' + str(batch_id), mode='rb') as file:
19 |         batch = pickle.load(file, encoding='latin1')
20 | 
21 |     features = batch['data'].reshape((len(batch['data']), 3, 32, 32)).transpose(0, 2, 3, 1)
22 |     labels = batch['labels']
23 | 
24 |     return features, labels
25 | 
26 | 
27 | def display_stats(cifar10_dataset_folder_path, batch_id, sample_id):
28 |     """
29 |     Display stats of the dataset
30 |     """
31 |     batch_ids = list(range(1, 6))
32 | 
33 |     if batch_id not in batch_ids:
34 |         print('Batch Id out of Range. Possible Batch Ids: {}'.format(batch_ids))
35 |         return None
36 | 
37 |     features, labels = load_cfar10_batch(cifar10_dataset_folder_path, batch_id)
38 | 
39 |     if not (0 <= sample_id < len(features)):
40 |         print('{} samples in batch {}. {} is out of range.'.format(len(features), batch_id, sample_id))
41 |         return None
42 | 
43 |     print('\nStats of batch {}:'.format(batch_id))
44 |     print('Samples: {}'.format(len(features)))
45 |     print('Label Counts: {}'.format(dict(zip(*np.unique(labels, return_counts=True)))))
46 |     print('First 20 Labels: {}'.format(labels[:20]))
47 | 
48 |     sample_image = features[sample_id]
49 |     sample_label = labels[sample_id]
50 |     label_names = _load_label_names()
51 | 
52 |     print('\nExample of Image {}:'.format(sample_id))
53 |     print('Image - Min Value: {} Max Value: {}'.format(sample_image.min(), sample_image.max()))
54 |     print('Image - Shape: {}'.format(sample_image.shape))
55 |     print('Label - Label Id: {} Name: {}'.format(sample_label, label_names[sample_label]))
56 |     plt.axis('off')
57 |     plt.imshow(sample_image)
58 | 
59 | 
60 | def _preprocess_and_save(normalize, one_hot_encode, features, labels, filename):
61 |     """
62 |     Preprocess data and save it to file
63 |     """
64 |     features = normalize(features)
65 |     labels = one_hot_encode(labels)
66 | 
67 |     pickle.dump((features, labels), open(filename, 'wb'))
68 | 
69 | 
70 | def preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode):
71 |     """
72 |     Preprocess Training and Validation Data
73 |     """
74 |     n_batches = 5
75 |     valid_features = []
76 |     valid_labels = []
77 | 
78 |     for batch_i in range(1, n_batches + 1):
79 |         features, labels = load_cfar10_batch(cifar10_dataset_folder_path, batch_i)
80 |         validation_count = int(len(features) * 0.1)
81 | 
82 |         # Preprocess and save a batch of training data
83 |         _preprocess_and_save(
84 |             normalize,
85 |             one_hot_encode,
86 |             features[:-validation_count],
87 |             labels[:-validation_count],
88 |             'preprocess_batch_' + str(batch_i) + '.p')
89 | 
90 |         # Use a portion of training batch for validation
91 |         valid_features.extend(features[-validation_count:])
92 |         valid_labels.extend(labels[-validation_count:])
93 | 
94 |     # Preprocess and Save all validation data
95 |     _preprocess_and_save(
96 |         normalize,
97 |         one_hot_encode,
98 |         np.array(valid_features),
99 |         np.array(valid_labels),
100 |         'preprocess_validation.p')
101 | 
102 |     with open(cifar10_dataset_folder_path + '/test_batch', mode='rb') as file:
103 |         batch = pickle.load(file, encoding='latin1')
104 | 
105 |     # Load the test data
106 |     test_features = batch['data'].reshape((len(batch['data']), 3, 32, 32)).transpose(0, 2, 3, 1)
107 |     test_labels = batch['labels']
108 | 
109 |     # Preprocess and Save all test data
110 |     _preprocess_and_save(
111 |         normalize,
112 |         one_hot_encode,
113 |         np.array(test_features),
114 |         np.array(test_labels),
115 |         'preprocess_test.p')
116 | 
117 | 
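# Usage sketch (illustrative, not part of the original file): after the
# preprocessing step above, a training loop can stream one saved batch at a
# time with the generator defined below, e.g.
#     for batch_features, batch_labels in load_preprocess_training_batch(batch_i, batch_size):
#         train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)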
118 | def batch_features_labels(features, labels, batch_size):
119 |     """
120 |     Split features and labels into batches
121 |     """
122 |     for start in range(0, len(features), batch_size):
123 |         end = min(start + batch_size, len(features))
124 |         yield features[start:end], labels[start:end]
125 | 
126 | 
127 | def load_preprocess_training_batch(batch_id, batch_size):
128 |     """
129 |     Load the Preprocessed Training data and return them in batches of <batch_size> or less
130 |     """
131 |     filename = 'preprocess_batch_' + str(batch_id) + '.p'
132 |     features, labels = pickle.load(open(filename, mode='rb'))
133 | 
134 |     # Return the training data in batches of size <batch_size> or less
135 |     return batch_features_labels(features, labels, batch_size)
136 | 
137 | 
138 | def display_image_predictions(features, labels, predictions):
139 |     n_classes = 10
140 |     label_names = _load_label_names()
141 |     label_binarizer = LabelBinarizer()
142 |     label_binarizer.fit(range(n_classes))
143 |     label_ids = label_binarizer.inverse_transform(np.array(labels))
144 | 
145 |     fig, axes = plt.subplots(nrows=4, ncols=2)
146 |     fig.tight_layout()
147 |     fig.suptitle('Softmax Predictions', fontsize=20, y=1.1)
148 | 
149 |     n_predictions = 3
150 |     margin = 0.05
151 |     ind = np.arange(n_predictions)
152 |     width = (1. - 2. * margin) / n_predictions
153 | 
154 |     for image_i, (feature, label_id, pred_indices, pred_values) in enumerate(zip(features, label_ids, predictions.indices, predictions.values)):
155 |         pred_names = [label_names[pred_i] for pred_i in pred_indices]
156 |         correct_name = label_names[label_id]
157 | 
158 |         axes[image_i][0].imshow(feature)
159 |         axes[image_i][0].set_title(correct_name)
160 |         axes[image_i][0].set_axis_off()
161 | 
162 |         axes[image_i][1].barh(ind + margin, pred_values[::-1], width)
163 |         axes[image_i][1].set_yticks(ind + margin)
164 |         axes[image_i][1].set_yticklabels(pred_names[::-1])
165 |         axes[image_i][1].set_xticks([0, 0.5, 1.0])
166 | 
--------------------------------------------------------------------------------
/ImageClassification_cifar10/problem_unittests.py:
--------------------------------------------------------------------------------
1 | import os
2 | import numpy as np
3 | import tensorflow as tf
4 | import random
5 | from unittest.mock import MagicMock
6 | 
7 | 
8 | def _print_success_message():
9 |     print('Tests Passed')
10 | 
11 | 
12 | def test_folder_path(cifar10_dataset_folder_path):
13 |     assert cifar10_dataset_folder_path is not None,\
14 |         'Cifar-10 data folder not set.'
15 |     assert cifar10_dataset_folder_path[-1] != '/',\
16 |         'The "/" shouldn\'t be added to the end of the path.'
17 |     assert os.path.exists(cifar10_dataset_folder_path),\
18 |         'Path not found.'
19 |     assert os.path.isdir(cifar10_dataset_folder_path),\
20 |         '{} is not a folder.'.format(os.path.basename(cifar10_dataset_folder_path))
21 | 
22 |     train_files = [cifar10_dataset_folder_path + '/data_batch_' + str(batch_id) for batch_id in range(1, 6)]
23 |     other_files = [cifar10_dataset_folder_path + '/batches.meta', cifar10_dataset_folder_path + '/test_batch']
24 |     missing_files = [path for path in train_files + other_files if not os.path.exists(path)]
25 | 
26 |     assert not missing_files,\
27 |         'Missing files in directory: {}'.format(missing_files)
28 | 
29 |     print('All files found!')
30 | 
31 | 
32 | def test_normalize(normalize):
33 |     test_shape = (np.random.choice(range(1000)), 32, 32, 3)
34 |     test_numbers = np.random.choice(range(256), test_shape)
35 |     normalize_out = normalize(test_numbers)
36 | 
37 |     assert type(normalize_out).__module__ == np.__name__,\
38 |         'Not Numpy Object'
39 | 
40 |     assert normalize_out.shape == test_shape,\
41 |         'Incorrect Shape. {} shape found'.format(normalize_out.shape)
42 | 
43 |     assert normalize_out.max() <= 1 and normalize_out.min() >= 0,\
44 |         'Incorrect Range. {} to {} found'.format(normalize_out.min(), normalize_out.max())
45 | 
46 |     _print_success_message()
47 | 
48 | 
49 | def test_one_hot_encode(one_hot_encode):
50 |     test_shape = np.random.choice(range(1000))
51 |     test_numbers = np.random.choice(range(10), test_shape)
52 |     one_hot_out = one_hot_encode(test_numbers)
53 | 
54 |     assert type(one_hot_out).__module__ == np.__name__,\
55 |         'Not Numpy Object'
56 | 
57 |     assert one_hot_out.shape == (test_shape, 10),\
58 |         'Incorrect Shape. {} shape found'.format(one_hot_out.shape)
59 | 
60 |     n_encode_tests = 5
61 |     test_pairs = list(zip(test_numbers, one_hot_out))
62 |     test_indices = np.random.choice(len(test_numbers), n_encode_tests)
63 |     labels = [test_pairs[test_i][0] for test_i in test_indices]
64 |     enc_labels = np.array([test_pairs[test_i][1] for test_i in test_indices])
65 |     new_enc_labels = one_hot_encode(labels)
66 | 
67 |     assert np.array_equal(enc_labels, new_enc_labels),\
68 |         'Encodings returned different results for the same numbers.\n' \
69 |         'For the first call it returned:\n' \
70 |         '{}\n' \
71 |         'For the second call it returned\n' \
72 |         '{}\n' \
73 |         'Make sure you save the map of labels to encodings outside of the function.'.format(enc_labels, new_enc_labels)
74 | 
75 |     _print_success_message()
76 | 
77 | 
78 | def test_nn_image_inputs(neural_net_image_input):
79 |     image_shape = (32, 32, 3)
80 |     nn_inputs_out_x = neural_net_image_input(image_shape)
81 | 
82 |     assert nn_inputs_out_x.get_shape().as_list() == [None, image_shape[0], image_shape[1], image_shape[2]],\
83 |         'Incorrect Image Shape. Found {} shape'.format(nn_inputs_out_x.get_shape().as_list())
84 | 
85 |     assert nn_inputs_out_x.op.type == 'Placeholder',\
86 |         'Incorrect Image Type. Found {} type'.format(nn_inputs_out_x.op.type)
87 | 
88 |     assert nn_inputs_out_x.name == 'x:0', \
89 |         'Incorrect Name. Found {}'.format(nn_inputs_out_x.name)
90 | 
91 |     print('Image Input Tests Passed.')
92 | 
93 | 
94 | def test_nn_label_inputs(neural_net_label_input):
95 |     n_classes = 10
96 |     nn_inputs_out_y = neural_net_label_input(n_classes)
97 | 
98 |     assert nn_inputs_out_y.get_shape().as_list() == [None, n_classes],\
99 |         'Incorrect Label Shape. Found {} shape'.format(nn_inputs_out_y.get_shape().as_list())
100 | 
101 |     assert nn_inputs_out_y.op.type == 'Placeholder',\
102 |         'Incorrect Label Type. Found {} type'.format(nn_inputs_out_y.op.type)
103 | 
104 |     assert nn_inputs_out_y.name == 'y:0', \
105 |         'Incorrect Name. Found {}'.format(nn_inputs_out_y.name)
106 | 
107 |     print('Label Input Tests Passed.')
108 | 
109 | 
110 | def test_nn_keep_prob_inputs(neural_net_keep_prob_input):
111 |     nn_inputs_out_k = neural_net_keep_prob_input()
112 | 
113 |     assert nn_inputs_out_k.get_shape().ndims is None,\
114 |         'Too many dimensions found for keep prob. Found {} dimensions. It should be a scalar (0-Dimension Tensor).'.format(nn_inputs_out_k.get_shape().ndims)
115 | 
116 |     assert nn_inputs_out_k.op.type == 'Placeholder',\
117 |         'Incorrect keep prob Type. Found {} type'.format(nn_inputs_out_k.op.type)
118 | 
119 |     assert nn_inputs_out_k.name == 'keep_prob:0', \
120 |         'Incorrect Name. Found {}'.format(nn_inputs_out_k.name)
121 | 
122 |     print('Keep Prob Tests Passed.')
123 | 
124 | 
125 | def test_con_pool(conv2d_maxpool):
126 |     test_x = tf.placeholder(tf.float32, [None, 32, 32, 5])
127 |     test_num_outputs = 10
128 |     test_con_k = (2, 2)
129 |     test_con_s = (4, 4)
130 |     test_pool_k = (2, 2)
131 |     test_pool_s = (2, 2)
132 | 
133 |     conv2d_maxpool_out = conv2d_maxpool(test_x, test_num_outputs, test_con_k, test_con_s, test_pool_k, test_pool_s)
134 | 
135 |     assert conv2d_maxpool_out.get_shape().as_list() == [None, 4, 4, 10],\
136 |         'Incorrect Shape. Found {} shape'.format(conv2d_maxpool_out.get_shape().as_list())
137 | 
138 |     _print_success_message()
139 | 
140 | 
141 | def test_flatten(flatten):
142 |     test_x = tf.placeholder(tf.float32, [None, 10, 30, 6])
143 |     flat_out = flatten(test_x)
144 | 
145 |     assert flat_out.get_shape().as_list() == [None, 10*30*6],\
146 |         'Incorrect Shape. Found {} shape'.format(flat_out.get_shape().as_list())
147 | 
148 |     _print_success_message()
149 | 
150 | 
151 | def test_fully_conn(fully_conn):
152 |     test_x = tf.placeholder(tf.float32, [None, 128])
153 |     test_num_outputs = 40
154 | 
155 |     fc_out = fully_conn(test_x, test_num_outputs)
156 | 
157 |     assert fc_out.get_shape().as_list() == [None, 40],\
158 |         'Incorrect Shape. Found {} shape'.format(fc_out.get_shape().as_list())
159 | 
160 |     _print_success_message()
161 | 
162 | 
163 | def test_output(output):
164 |     test_x = tf.placeholder(tf.float32, [None, 128])
165 |     test_num_outputs = 40
166 | 
167 |     output_out = output(test_x, test_num_outputs)
168 | 
169 |     assert output_out.get_shape().as_list() == [None, 40],\
170 |         'Incorrect Shape. Found {} shape'.format(output_out.get_shape().as_list())
171 | 
172 |     _print_success_message()
173 | 
174 | 
175 | def test_conv_net(conv_net):
176 |     test_x = tf.placeholder(tf.float32, [None, 32, 32, 3])
177 |     test_k = tf.placeholder(tf.float32)
178 | 
179 |     logits_out = conv_net(test_x, test_k)
180 | 
181 |     assert logits_out.get_shape().as_list() == [None, 10],\
182 |         'Incorrect Model Output. Found {}'.format(logits_out.get_shape().as_list())
183 | 
184 |     print('Neural Network Built!')
185 | 
186 | 
187 | def test_train_nn(train_neural_network):
188 |     mock_session = tf.Session()
189 |     test_x = np.random.rand(128, 32, 32, 3)
190 |     test_y = np.random.rand(128, 10)
191 |     test_k = np.random.rand(1)
192 |     test_optimizer = tf.train.AdamOptimizer()
193 | 
194 |     mock_session.run = MagicMock()
195 |     train_neural_network(mock_session, test_optimizer, test_k, test_x, test_y)
196 | 
197 |     assert mock_session.run.called, 'Session not used'
198 | 
199 |     _print_success_message()
200 | 
--------------------------------------------------------------------------------
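As a worked example of the geometry `test_con_pool` above pins down: with 'SAME' padding, a 32x32 input convolved with stride (4, 4) gives 8x8, and max-pooling with stride (2, 2) gives 4x4, matching the asserted `[None, 4, 4, 10]`. A sketch of a `conv2d_maxpool` that satisfies the test (the weight initialization is an assumption, not the notebook's exact code):

```python
import tensorflow as tf

def conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides,
                   pool_ksize, pool_strides):
    # Convolution weights: [filter_height, filter_width, in_depth, out_depth]
    in_depth = x_tensor.get_shape().as_list()[-1]
    weights = tf.Variable(tf.truncated_normal(
        [conv_ksize[0], conv_ksize[1], in_depth, conv_num_outputs], stddev=0.05))
    bias = tf.Variable(tf.zeros(conv_num_outputs))

    conv = tf.nn.conv2d(x_tensor, weights,
                        strides=[1, conv_strides[0], conv_strides[1], 1],
                        padding='SAME')
    conv = tf.nn.relu(tf.nn.bias_add(conv, bias))
    return tf.nn.max_pool(conv,
                          ksize=[1, pool_ksize[0], pool_ksize[1], 1],
                          strides=[1, pool_strides[0], pool_strides[1], 1],
                          padding='SAME')
```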