├── .DS_Store ├── output ├── .DS_Store ├── Screenshot 2023-10-14 at 3.23.44 PM.png ├── Screenshot 2023-10-14 at 3.24.22 PM.png ├── Screenshot 2023-10-14 at 3.24.38 PM.png └── Screenshot 2023-10-14 at 3.24.49 PM.png ├── CODE_OF_CONDUCT.md ├── THIRD-PARTY ├── LICENSE ├── README.md ├── CONTRIBUTING.md ├── cnn_Track05.ipynb ├── carla-manual-2.py └── carla-manual-datacollect.py /.DS_Store: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/aiml-driving-behaviour-cloning-using-cnn/main/.DS_Store -------------------------------------------------------------------------------- /output/.DS_Store: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/aiml-driving-behaviour-cloning-using-cnn/main/output/.DS_Store -------------------------------------------------------------------------------- /output/Screenshot 2023-10-14 at 3.23.44 PM.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/aiml-driving-behaviour-cloning-using-cnn/main/output/Screenshot 2023-10-14 at 3.23.44 PM.png -------------------------------------------------------------------------------- /output/Screenshot 2023-10-14 at 3.24.22 PM.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/aiml-driving-behaviour-cloning-using-cnn/main/output/Screenshot 2023-10-14 at 3.24.22 PM.png -------------------------------------------------------------------------------- /output/Screenshot 2023-10-14 at 3.24.38 PM.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/aiml-driving-behaviour-cloning-using-cnn/main/output/Screenshot 2023-10-14 at 3.24.38 PM.png -------------------------------------------------------------------------------- /output/Screenshot 2023-10-14 at 3.24.49 PM.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/aws-samples/aiml-driving-behaviour-cloning-using-cnn/main/output/Screenshot 2023-10-14 at 3.24.49 PM.png -------------------------------------------------------------------------------- /CODE_OF_CONDUCT.md: -------------------------------------------------------------------------------- 1 | ## Code of Conduct 2 | This project has adopted the [Amazon Open Source Code of Conduct](https://aws.github.io/code-of-conduct). 3 | For more information see the [Code of Conduct FAQ](https://aws.github.io/code-of-conduct-faq) or contact 4 | opensource-codeofconduct@amazon.com with any additional questions or comments. 5 | -------------------------------------------------------------------------------- /THIRD-PARTY: -------------------------------------------------------------------------------- 1 | # Copyright (c) 2019 Computer Vision Center (CVC) at the Universitat Autonoma de 2 | # Barcelona (UAB). 3 | # 4 | # This work is licensed under the terms of the MIT license. 5 | # For a copy, see . 6 | 7 | # Allows controlling a vehicle with a keyboard. For a simpler and more 8 | # documented example, please take a look at tutorial.py. -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. 
2 | 3 | Permission is hereby granted, free of charge, to any person obtaining a copy of this 4 | software and associated documentation files (the "Software"), to deal in the Software 5 | without restriction, including without limitation the rights to use, copy, modify, 6 | merge, publish, distribute, sublicense, and/or sell copies of the Software, and to 7 | permit persons to whom the Software is furnished to do so. 8 | 9 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, 10 | INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A 11 | PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT 12 | HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION 13 | OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE 14 | SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # aiml-driving-behaviour-cloning-using-cnn 2 | 3 | 4 | ## Name 5 | Training and driving a simulated vehicle using image data and a Convolutional Neural Network. 6 | 7 | ## Getting started 8 | 9 | Building self-driving models can require many steps, starting with gathering data (images, video streams, and readings from cameras 10 | and other sensors). Collecting this data from real-world physical sources can take a lot of time and effort, 11 | which in turn increases the time needed to test different versions of the model. 12 | 13 | In this sample, we build a simple Convolutional Neural Network trained only on image data and the steering input recorded during manual driving. After training, the model can be tested by real-time inference on the simulator with other tracks, parameters, etc. 14 | We also close the loop by letting the model run inference against the simulator directly, so that we can watch it drive the car. 15 | 16 | This sample shows the basic flow of this closed-loop simulation pattern and lets builders focus on whichever AWS services can be used to scale it. For example, this sample runs on a GPU-backed EC2 instance for all of the data collection, model training, and inference against the simulator. In production or at scale, these steps can be delegated to other services such as Amazon S3, Amazon SageMaker, a containerized simulator running on Amazon EKS, etc. 17 | 18 | ## Dependencies to replicate the samples 19 | The sample has 3 steps: 20 | 1. Ingestion (carla-manual-2.py) 21 | I have used open-source CARLA as the synthetic simulator; however, you can use any simulator of your choosing. The objective is to collect the images and steering values that serve as input data to the CNN. 22 | CARLA 0.9.13 23 | CARLA client (for 0.9.13) 24 | 25 | 2. CNN Model (cnn_Track05.ipynb) 26 | TensorFlow 2 (preferably on Python 3.9) 27 | 28 | 3. Real-time inferencing on the simulator (cnn_control.py) 29 | CARLA 0.9.13 30 | CARLA client (for 0.9.13) 31 | TensorFlow 2 (preferably on Python 3.9) 32 | 33 | 34 | 35 | ## License 36 | MIT-0 License; please see the LICENSE file 37 | 38 | -------------------------------------------------------------------------------- /CONTRIBUTING.md: -------------------------------------------------------------------------------- 1 | # Contributing Guidelines 2 | 3 | Thank you for your interest in contributing to our project.
Whether it's a bug report, new feature, correction, or additional 4 | documentation, we greatly value feedback and contributions from our community. 5 | 6 | Please read through this document before submitting any issues or pull requests to ensure we have all the necessary 7 | information to effectively respond to your bug report or contribution. 8 | 9 | 10 | ## Reporting Bugs/Feature Requests 11 | 12 | We welcome you to use the GitHub issue tracker to report bugs or suggest features. 13 | 14 | When filing an issue, please check existing open, or recently closed, issues to make sure somebody else hasn't already 15 | reported the issue. Please try to include as much information as you can. Details like these are incredibly useful: 16 | 17 | * A reproducible test case or series of steps 18 | * The version of our code being used 19 | * Any modifications you've made relevant to the bug 20 | * Anything unusual about your environment or deployment 21 | 22 | 23 | ## Contributing via Pull Requests 24 | Contributions via pull requests are much appreciated. Before sending us a pull request, please ensure that: 25 | 26 | 1. You are working against the latest source on the *main* branch. 27 | 2. You check existing open, and recently merged, pull requests to make sure someone else hasn't addressed the problem already. 28 | 3. You open an issue to discuss any significant work - we would hate for your time to be wasted. 29 | 30 | To send us a pull request, please: 31 | 32 | 1. Fork the repository. 33 | 2. Modify the source; please focus on the specific change you are contributing. If you also reformat all the code, it will be hard for us to focus on your change. 34 | 3. Ensure local tests pass. 35 | 4. Commit to your fork using clear commit messages. 36 | 5. Send us a pull request, answering any default questions in the pull request interface. 37 | 6. Pay attention to any automated CI failures reported in the pull request, and stay involved in the conversation. 38 | 39 | GitHub provides additional documentation on [forking a repository](https://help.github.com/articles/fork-a-repo/) and 40 | [creating a pull request](https://help.github.com/articles/creating-a-pull-request/). 41 | 42 | 43 | ## Finding contributions to work on 44 | Looking at the existing issues is a great way to find something to contribute to. As our projects use the default GitHub issue labels (enhancement/bug/duplicate/help wanted/invalid/question/wontfix), looking at any 'help wanted' issues is a great place to start. 45 | 46 | 47 | ## Code of Conduct 48 | This project has adopted the [Amazon Open Source Code of Conduct](https://aws.github.io/code-of-conduct). 49 | For more information see the [Code of Conduct FAQ](https://aws.github.io/code-of-conduct-faq) or contact 50 | opensource-codeofconduct@amazon.com with any additional questions or comments. 51 | 52 | 53 | ## Security issue notifications 54 | If you discover a potential security issue in this project, we ask that you notify AWS/Amazon Security via our [vulnerability reporting page](http://aws.amazon.com/security/vulnerability-reporting/). Please do **not** create a public GitHub issue. 55 | 56 | 57 | ## Licensing 58 | 59 | See the [LICENSE](LICENSE) file for our project's licensing. We will ask you to confirm the licensing of your contribution.
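The notebook that follows (cnn_Track05.ipynb) trains and saves the steering model; the README's step 3 (cnn_control.py) then loads it for real-time inference on the simulator, but that script is not part of this snapshot. Below is a minimal sketch of that loading step only: the model path mirrors where the notebook saves bhv_clone_track05.h5, while the helper name `predict_steer` and the final comment about the control loop are illustrative assumptions, not the actual cnn_control.py.

```python
import numpy as np
import tensorflow as tf

# Assumed location: cnn_Track05.ipynb saves the trained model as
# bhv_clone_track05.h5 under /opt/carla-output/models.
model = tf.keras.models.load_model("/opt/carla-output/models/bhv_clone_track05.h5")

def predict_steer(frame_bgr: np.ndarray) -> float:
    """Map one 480x640x3 BGR camera frame to a steering value (hypothetical helper)."""
    x = frame_bgr.astype("float32") / 255.0   # same normalisation as training
    x = np.expand_dims(x, axis=0)             # add the batch dimension
    return float(model.predict(x, verbose=0)[0, 0])

# In a CARLA control loop, the predicted value would feed carla.VehicleControl(steer=...).
```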
60 | -------------------------------------------------------------------------------- /cnn_Track05.ipynb: -------------------------------------------------------------------------------- 1 | { 2 | "cells": [ 3 | { 4 | "cell_type": "code", 5 | "execution_count": 1, 6 | "id": "b2697758", 7 | "metadata": {}, 8 | "outputs": [], 9 | "source": [ 10 | "import numpy as np\n", 11 | "import cv2\n", 12 | "import os\n", 13 | "from pathlib import Path\n", 14 | "import json\n", 15 | "from sklearn.model_selection import train_test_split\n", 16 | "import tensorflow as tf\n", 17 | "from tensorflow import keras\n", 18 | "from datetime import datetime\n", 19 | "from keras import backend as k\n", 20 | "#from PIL import Image\n", 21 | "#from io import BytesIO\n", 22 | "#import base64" 23 | ] 24 | }, 25 | { 26 | "cell_type": "code", 27 | "execution_count": 5, 28 | "id": "a92c3fb9", 29 | "metadata": {}, 30 | "outputs": [], 31 | "source": [ 32 | "base_path = Path(\"/opt/carla-output\")\n", 33 | "base_data_path = Path(f'{base_path}/dataset')\n", 34 | "base_image_path = Path(f'{base_path}/images')\n", 35 | "base_model_path = Path(f'{base_path}/models')" 36 | ] 37 | }, 38 | { 39 | "cell_type": "code", 40 | "execution_count": 4, 41 | "id": "5ada88fb", 42 | "metadata": {}, 43 | "outputs": [ 44 | { 45 | "name": "stdout", 46 | "output_type": "stream", 47 | "text": [ 48 | "[PosixPath('/opt/carla-output/dataset/12092023-1456'), PosixPath('/opt/carla-output/dataset/08092023-0828'), PosixPath('/opt/carla-output/dataset/12092023-1544'), PosixPath('/opt/carla-output/dataset/06092023-0447'), PosixPath('/opt/carla-output/dataset/11092023-1909'), PosixPath('/opt/carla-output/dataset/12092023-1551'), PosixPath('/opt/carla-output/dataset/01092023-1933'), PosixPath('/opt/carla-output/dataset/06092023-0457'), PosixPath('/opt/carla-output/dataset/08092023-0832'), PosixPath('/opt/carla-output/dataset/08092023-0831'), PosixPath('/opt/carla-output/dataset/user-demand'), PosixPath('/opt/carla-output/dataset/01092023-1930'), PosixPath('/opt/carla-output/dataset/08092023-0829'), PosixPath('/opt/carla-output/dataset/06092023-0452'), PosixPath('/opt/carla-output/dataset/12092023-1502'), PosixPath('/opt/carla-output/dataset/08092023-0819'), PosixPath('/opt/carla-output/dataset/11092023-1910')]\n" 49 | ] 50 | } 51 | ], 52 | "source": [ 53 | "json_dirs = [x for x in base_data_path.iterdir()]\n", 54 | "print(json_dirs)" 55 | ] 56 | }, 57 | { 58 | "cell_type": "code", 59 | "execution_count": null, 60 | "id": "6a3bf966", 61 | "metadata": {}, 62 | "outputs": [], 63 | "source": [ 64 | "\"\"\"\n", 65 | "def substring_after(s, delim):\n", 66 | " return s.partition(delim)[2]\n", 67 | "\"\"\"" 68 | ] 69 | }, 70 | { 71 | "cell_type": "code", 72 | "execution_count": 6, 73 | "id": "72390d2c", 74 | "metadata": {}, 75 | "outputs": [ 76 | { 77 | "name": "stdout", 78 | "output_type": "stream", 79 | "text": [ 80 | "20230912-1701\n" 81 | ] 82 | } 83 | ], 84 | "source": [ 85 | "\"\"\"\n", 86 | "for a, b,c in os.walk(base_model_path):\n", 87 | " split1= substring_after(str(c),\"_\")\n", 88 | " split2 = substring_after(split1)\n", 89 | "\"\"\"\n", 90 | "now = datetime.now()\n", 91 | "date_string = now.strftime(\"%Y%m%d-%H%M\")\n", 92 | "print(date_string)" 93 | ] 94 | }, 95 | { 96 | "cell_type": "code", 97 | "execution_count": 7, 98 | "id": "25da6b3c", 99 | "metadata": {}, 100 | "outputs": [ 101 | { 102 | "name": "stdout", 103 | "output_type": "stream", 104 | "text": [ 105 | "/opt/carla-output/dataset/12092023-1543\n" 106 | ] 107 | } 108 | ], 109 | "source": [ 110 
| "src_dir = Path('12092023-1543') # Enter the directory name on which to run ML training job\n", 111 | "json_path = Path(f'{base_data_path}/{src_dir}')\n", 112 | "print(json_path)\n", 113 | "json_file_names = []\n", 114 | "#json_file = \"\"\n", 115 | "for root, dir_name,json_file in os.walk(json_path):\n", 116 | " json_file_names = json_file\n", 117 | "\n", 118 | "#print(json_file_names)\n", 119 | "#print(src_dir)" 120 | ] 121 | }, 122 | { 123 | "cell_type": "code", 124 | "execution_count": 8, 125 | "id": "070f8438", 126 | "metadata": {}, 127 | "outputs": [ 128 | { 129 | "name": "stdout", 130 | "output_type": "stream", 131 | "text": [ 132 | "12092023-1543\n" 133 | ] 134 | } 135 | ], 136 | "source": [ 137 | "x=[]\n", 138 | "y=[]\n", 139 | "#dir_name = '01092023-1933'\n", 140 | "print(src_dir)\n", 141 | "for json_file in json_file_names:\n", 142 | " filename = Path(f'{json_path}/{json_file}')\n", 143 | " #print(filename)\n", 144 | " with open(filename) as f:\n", 145 | " payload = json.load(f)\n", 146 | " #print(payload)\n", 147 | " image_name = payload[\"rgb_image\"]\n", 148 | " #print(image_name)\n", 149 | " image_full_path = f'{base_image_path}/{src_dir}/{image_name}'\n", 150 | " #print(image_full_path)\n", 151 | " x.append(cv2.imread(image_full_path))\n", 152 | " #image_decoded = Image.open(BytesIO(base64.b64decode(image_full_path)))\n", 153 | " #x.append(image_decoded)\n", 154 | " y.append(payload[\"angle\"])\n" 155 | ] 156 | }, 157 | { 158 | "cell_type": "code", 159 | "execution_count": 9, 160 | "id": "85daeaf1", 161 | "metadata": {}, 162 | "outputs": [], 163 | "source": [ 164 | "#Convert into numpy array\n", 165 | "x_arr = np.asarray(x)\n", 166 | "x_fl = x_arr.astype(\"float32\")\n", 167 | "x_norm = x_fl/ 255.0\n", 168 | "\n", 169 | "y_arr = np.asarray(y)" 170 | ] 171 | }, 172 | { 173 | "cell_type": "code", 174 | "execution_count": null, 175 | "id": "1679f287", 176 | "metadata": {}, 177 | "outputs": [], 178 | "source": [ 179 | "#print(x_norm.shape)" 180 | ] 181 | }, 182 | { 183 | "cell_type": "code", 184 | "execution_count": 10, 185 | "id": "94789b56", 186 | "metadata": {}, 187 | "outputs": [], 188 | "source": [ 189 | "x_train, x_test, y_train, y_test = train_test_split(x_norm, y_arr, test_size=0.1)" 190 | ] 191 | }, 192 | { 193 | "cell_type": "code", 194 | "execution_count": 11, 195 | "id": "fa16f7f7", 196 | "metadata": {}, 197 | "outputs": [ 198 | { 199 | "name": "stdout", 200 | "output_type": "stream", 201 | "text": [ 202 | "x_train : (3100, 480, 640, 3)\n", 203 | "y_train : (3100,)\n", 204 | "x_test : (345, 480, 640, 3)\n", 205 | "y_test : (345,)\n" 206 | ] 207 | } 208 | ], 209 | "source": [ 210 | "print(f'x_train : {x_train.shape}')\n", 211 | "print(f'y_train : {y_train.shape}')\n", 212 | "print(f'x_test : {x_test.shape}')\n", 213 | "print(f'y_test : {y_test.shape}')" 214 | ] 215 | }, 216 | { 217 | "cell_type": "code", 218 | "execution_count": 12, 219 | "id": "4a4ea9a3", 220 | "metadata": {}, 221 | "outputs": [ 222 | { 223 | "ename": "NameError", 224 | "evalue": "name 'cnn_model' is not defined", 225 | "output_type": "error", 226 | "traceback": [ 227 | "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", 228 | "\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)", 229 | "\u001b[0;32m/tmp/ipykernel_23604/1253923938.py\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 3\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 4\u001b[0m 
\u001b[0mk\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mclear_session\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 5\u001b[0;31m \u001b[0;32mdel\u001b[0m \u001b[0mcnn_model\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 6\u001b[0m \u001b[0;31m#tf.reset_default_graph()\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", 230 | "\u001b[0;31mNameError\u001b[0m: name 'cnn_model' is not defined" 231 | ] 232 | } 233 | ], 234 | "source": [ 235 | "########################################################################\n", 236 | "tf.__version__\n", 237 | "\n", 238 | "k.clear_session()\n", 239 | "del cnn_model\n", 240 | "#tf.reset_default_graph()" 241 | ] 242 | }, 243 | { 244 | "cell_type": "code", 245 | "execution_count": 13, 246 | "id": "8ca818f3", 247 | "metadata": {}, 248 | "outputs": [], 249 | "source": [ 250 | "#Creating the model\n", 251 | "\n", 252 | "cnn_model = tf.keras.models.Sequential()" 253 | ] 254 | }, 255 | { 256 | "cell_type": "code", 257 | "execution_count": 14, 258 | "id": "3547ccf7", 259 | "metadata": {}, 260 | "outputs": [], 261 | "source": [ 262 | "#Cropping image\n", 263 | "cnn_model.add(tf.keras.layers.Cropping2D(cropping=((150,150),(150,150))))\n" 264 | ] 265 | }, 266 | { 267 | "cell_type": "code", 268 | "execution_count": 15, 269 | "id": "c706d776", 270 | "metadata": {}, 271 | "outputs": [], 272 | "source": [ 273 | "# Adding Convolution Layer\n", 274 | "cnn_model.add(tf.keras.layers.Conv2D(filters=64, kernel_size=7, activation=\"relu\", padding=\"SAME\",input_shape=[480,640,3]))\n", 275 | "\n", 276 | "#Addning MaxPool Layer\n", 277 | "cnn_model.add(tf.keras.layers.MaxPool2D(pool_size=2))\n", 278 | "\n", 279 | "#DropOut\n", 280 | "#cnn_model.add(tf.keras.layers.Dropout(0.2))\n" 281 | ] 282 | }, 283 | { 284 | "cell_type": "code", 285 | "execution_count": 16, 286 | "id": "4edf8982", 287 | "metadata": {}, 288 | "outputs": [], 289 | "source": [ 290 | "# Adding Convolution Layer\n", 291 | "cnn_model.add(tf.keras.layers.Conv2D(filters=128, kernel_size=3, activation=\"relu\", padding=\"SAME\"))\n", 292 | "cnn_model.add(tf.keras.layers.Conv2D(filters=128, kernel_size=3, activation=\"relu\", padding=\"SAME\"))\n", 293 | "\n", 294 | "#Addning MaxPool Layer\n", 295 | "cnn_model.add(tf.keras.layers.MaxPool2D(pool_size=2))\n", 296 | "\n", 297 | "#DropOut\n", 298 | "#cnn_model.add(tf.keras.layers.Dropout(0.2))" 299 | ] 300 | }, 301 | { 302 | "cell_type": "code", 303 | "execution_count": 17, 304 | "id": "b14bad8d", 305 | "metadata": {}, 306 | "outputs": [], 307 | "source": [ 308 | "#Adding Other Conv and Pooling layers\n", 309 | "\n", 310 | "cnn_model.add(tf.keras.layers.Conv2D(filters=256, kernel_size=3, activation=\"relu\", padding=\"SAME\"))\n", 311 | "cnn_model.add(tf.keras.layers.Conv2D(filters=256, kernel_size=3, activation=\"relu\", padding=\"SAME\"))\n", 312 | "#cnn_model.add(tf.keras.layers.Conv2D(filters=128, kernel_size=3, activation=\"relu\", padding=\"VALID\"))\n", 313 | "\n", 314 | "# MaxPool\n", 315 | "cnn_model.add(tf.keras.layers.MaxPool2D(pool_size=2))\n" 316 | ] 317 | }, 318 | { 319 | "cell_type": "code", 320 | "execution_count": 18, 321 | "id": "0e938c48", 322 | "metadata": {}, 323 | "outputs": [], 324 | "source": [ 325 | "#Adding Other Conv and Pooling layers\n", 326 | "\n", 327 | "#cnn_model.add(tf.keras.layers.Conv2D(filters=256, kernel_size=3, activation=\"relu\", padding=\"VALID\"))\n", 328 | "#cnn_model.add(tf.keras.layers.Conv2D(filters=256, 
kernel_size=3, activation=\"relu\", padding=\"VALID\"))\n", 329 | "\n", 330 | "#cnn_model.add(tf.keras.layers.MaxPool2D(pool_size=2))" 331 | ] 332 | }, 333 | { 334 | "cell_type": "code", 335 | "execution_count": 19, 336 | "id": "f5de85c2", 337 | "metadata": {}, 338 | "outputs": [], 339 | "source": [ 340 | "#Adding Flattening\n", 341 | "\n", 342 | "cnn_model.add(tf.keras.layers.Flatten())" 343 | ] 344 | }, 345 | { 346 | "cell_type": "code", 347 | "execution_count": 20, 348 | "id": "1058d85b", 349 | "metadata": {}, 350 | "outputs": [], 351 | "source": [ 352 | "#Adding Full Conection and dropouts\n", 353 | "\n", 354 | "cnn_model.add(tf.keras.layers.Dense(units=128, activation=\"relu\"))\n", 355 | "cnn_model.add(tf.keras.layers.Dropout(0.2))\n", 356 | "\n", 357 | "cnn_model.add(tf.keras.layers.Dense(units=64, activation=\"relu\"))\n", 358 | "cnn_model.add(tf.keras.layers.Dropout(0.2))\n", 359 | "\n" 360 | ] 361 | }, 362 | { 363 | "cell_type": "code", 364 | "execution_count": 21, 365 | "id": "9afbb237", 366 | "metadata": {}, 367 | "outputs": [], 368 | "source": [ 369 | "#Output Layer\n", 370 | "cnn_model.add(tf.keras.layers.Dense(units=1))" 371 | ] 372 | }, 373 | { 374 | "cell_type": "code", 375 | "execution_count": 22, 376 | "id": "0312d97e", 377 | "metadata": {}, 378 | "outputs": [], 379 | "source": [ 380 | "##################################################################\n", 381 | "# Model training\n", 382 | "\n", 383 | "cnn_model.compile(optimizer=\"SGD\", loss=\"mean_squared_error\")\n", 384 | "\n" 385 | ] 386 | }, 387 | { 388 | "cell_type": "code", 389 | "execution_count": 23, 390 | "id": "aeb65489", 391 | "metadata": { 392 | "scrolled": true 393 | }, 394 | "outputs": [ 395 | { 396 | "name": "stdout", 397 | "output_type": "stream", 398 | "text": [ 399 | "Epoch 1/50\n", 400 | "97/97 [==============================] - 124s 498ms/step - loss: 0.0096\n", 401 | "Epoch 2/50\n", 402 | "97/97 [==============================] - 36s 376ms/step - loss: 0.0088\n", 403 | "Epoch 3/50\n", 404 | "97/97 [==============================] - 35s 363ms/step - loss: 0.0085\n", 405 | "Epoch 4/50\n", 406 | "97/97 [==============================] - 35s 364ms/step - loss: 0.0081\n", 407 | "Epoch 5/50\n", 408 | "97/97 [==============================] - 35s 357ms/step - loss: 0.0076\n", 409 | "Epoch 6/50\n", 410 | "97/97 [==============================] - 35s 360ms/step - loss: 0.0073\n", 411 | "Epoch 7/50\n", 412 | "97/97 [==============================] - 35s 356ms/step - loss: 0.0070\n", 413 | "Epoch 8/50\n", 414 | "97/97 [==============================] - 35s 356ms/step - loss: 0.0068\n", 415 | "Epoch 9/50\n", 416 | "97/97 [==============================] - 35s 359ms/step - loss: 0.0064\n", 417 | "Epoch 10/50\n", 418 | "97/97 [==============================] - 35s 359ms/step - loss: 0.0063\n", 419 | "Epoch 11/50\n", 420 | "97/97 [==============================] - 35s 362ms/step - loss: 0.0062\n", 421 | "Epoch 12/50\n", 422 | "97/97 [==============================] - 35s 362ms/step - loss: 0.0061\n", 423 | "Epoch 13/50\n", 424 | "97/97 [==============================] - 35s 366ms/step - loss: 0.0060\n", 425 | "Epoch 14/50\n", 426 | "97/97 [==============================] - 34s 355ms/step - loss: 0.0058\n", 427 | "Epoch 15/50\n", 428 | "97/97 [==============================] - 33s 339ms/step - loss: 0.0055\n", 429 | "Epoch 16/50\n", 430 | "97/97 [==============================] - 33s 336ms/step - loss: 0.0056\n", 431 | "Epoch 17/50\n", 432 | "97/97 [==============================] - 33s 335ms/step - loss: 
0.0054\n", 433 | "Epoch 18/50\n", 434 | "97/97 [==============================] - 33s 335ms/step - loss: 0.0052\n", 435 | "Epoch 19/50\n", 436 | "97/97 [==============================] - 33s 337ms/step - loss: 0.0053\n", 437 | "Epoch 20/50\n", 438 | "97/97 [==============================] - 33s 339ms/step - loss: 0.0051\n", 439 | "Epoch 21/50\n", 440 | "97/97 [==============================] - 33s 344ms/step - loss: 0.0051\n", 441 | "Epoch 22/50\n", 442 | "97/97 [==============================] - 34s 347ms/step - loss: 0.0049\n", 443 | "Epoch 23/50\n", 444 | "97/97 [==============================] - 34s 353ms/step - loss: 0.0049\n", 445 | "Epoch 24/50\n", 446 | "97/97 [==============================] - 35s 357ms/step - loss: 0.0049\n", 447 | "Epoch 25/50\n", 448 | "97/97 [==============================] - 35s 358ms/step - loss: 0.0049\n", 449 | "Epoch 26/50\n", 450 | "97/97 [==============================] - 35s 357ms/step - loss: 0.0047\n", 451 | "Epoch 27/50\n", 452 | "97/97 [==============================] - 35s 359ms/step - loss: 0.0049\n", 453 | "Epoch 28/50\n", 454 | "97/97 [==============================] - 35s 360ms/step - loss: 0.0048\n", 455 | "Epoch 29/50\n", 456 | "97/97 [==============================] - 35s 361ms/step - loss: 0.0047\n", 457 | "Epoch 30/50\n", 458 | "97/97 [==============================] - 35s 363ms/step - loss: 0.0047\n", 459 | "Epoch 31/50\n", 460 | "97/97 [==============================] - 35s 362ms/step - loss: 0.0046\n", 461 | "Epoch 32/50\n", 462 | "97/97 [==============================] - 35s 364ms/step - loss: 0.0046\n", 463 | "Epoch 33/50\n", 464 | "97/97 [==============================] - 35s 365ms/step - loss: 0.0044\n", 465 | "Epoch 34/50\n", 466 | "97/97 [==============================] - 35s 365ms/step - loss: 0.0045\n", 467 | "Epoch 35/50\n", 468 | "97/97 [==============================] - 35s 365ms/step - loss: 0.0045\n", 469 | "Epoch 36/50\n", 470 | "97/97 [==============================] - 35s 365ms/step - loss: 0.0046\n", 471 | "Epoch 37/50\n", 472 | "97/97 [==============================] - 35s 365ms/step - loss: 0.0046\n", 473 | "Epoch 38/50\n", 474 | "97/97 [==============================] - 35s 365ms/step - loss: 0.0046\n", 475 | "Epoch 39/50\n", 476 | "97/97 [==============================] - 35s 365ms/step - loss: 0.0045\n", 477 | "Epoch 40/50\n", 478 | "97/97 [==============================] - 35s 365ms/step - loss: 0.0042\n", 479 | "Epoch 41/50\n", 480 | "97/97 [==============================] - 35s 365ms/step - loss: 0.0044\n", 481 | "Epoch 42/50\n", 482 | "97/97 [==============================] - 35s 366ms/step - loss: 0.0044\n", 483 | "Epoch 43/50\n", 484 | "97/97 [==============================] - 35s 365ms/step - loss: 0.0043\n", 485 | "Epoch 44/50\n", 486 | "97/97 [==============================] - 36s 368ms/step - loss: 0.0045\n", 487 | "Epoch 45/50\n", 488 | "97/97 [==============================] - 36s 367ms/step - loss: 0.0042\n", 489 | "Epoch 46/50\n", 490 | "97/97 [==============================] - 36s 367ms/step - loss: 0.0043\n", 491 | "Epoch 47/50\n", 492 | "97/97 [==============================] - 36s 367ms/step - loss: 0.0043\n", 493 | "Epoch 48/50\n", 494 | "97/97 [==============================] - 35s 366ms/step - loss: 0.0043\n", 495 | "Epoch 49/50\n", 496 | "97/97 [==============================] - 35s 366ms/step - loss: 0.0043\n", 497 | "Epoch 50/50\n", 498 | "97/97 [==============================] - 35s 365ms/step - loss: 0.0042\n" 499 | ] 500 | }, 501 | { 502 | "data": { 503 | "text/plain": [ 504 | "" 505 | ] 
506 | }, 507 | "execution_count": 23, 508 | "metadata": {}, 509 | "output_type": "execute_result" 510 | } 511 | ], 512 | "source": [ 513 | "cnn_model.fit(x_train, y_train, epochs=50, batch_size=32)" 514 | ] 515 | }, 516 | { 517 | "cell_type": "code", 518 | "execution_count": 25, 519 | "id": "8bbdc4b7", 520 | "metadata": { 521 | "scrolled": true 522 | }, 523 | "outputs": [ 524 | { 525 | "name": "stdout", 526 | "output_type": "stream", 527 | "text": [ 528 | "0 Actual is 0.0 Predicted is 0.000542917288839817\n", 529 | "1 Actual is 0.0 Predicted is 0.017370518296957016\n", 530 | "2 Actual is 0.0 Predicted is -0.0034478073939681053\n", 531 | "3 Actual is 0.10000000149011612 Predicted is 0.045138128101825714\n", 532 | "4 Actual is 0.0 Predicted is 0.07943874597549438\n", 533 | "5 Actual is 0.0 Predicted is -0.02459646761417389\n", 534 | "6 Actual is -0.0 Predicted is -0.026436101645231247\n", 535 | "7 Actual is 0.0 Predicted is -0.01580851897597313\n", 536 | "8 Actual is 0.0 Predicted is 0.0017873570322990417\n", 537 | "9 Actual is 0.0 Predicted is -0.01000578235834837\n", 538 | "10 Actual is 0.0 Predicted is 0.0926177054643631\n", 539 | "11 Actual is 0.0 Predicted is -0.0015672780573368073\n", 540 | "12 Actual is 0.10000000149011612 Predicted is 0.05133948102593422\n", 541 | "13 Actual is 0.0 Predicted is 0.11641823500394821\n", 542 | "14 Actual is 0.0 Predicted is 0.021058183163404465\n", 543 | "15 Actual is 0.0 Predicted is -0.00165476743131876\n", 544 | "16 Actual is 0.0 Predicted is 0.11414044350385666\n", 545 | "17 Actual is 0.0 Predicted is 0.05883939936757088\n", 546 | "18 Actual is 0.0 Predicted is 0.03085688129067421\n", 547 | "19 Actual is 0.0 Predicted is -0.005441597662866116\n", 548 | "20 Actual is 0.0 Predicted is 0.04370174929499626\n", 549 | "21 Actual is -0.10000000149011612 Predicted is -0.015441487543284893\n", 550 | "22 Actual is 0.0 Predicted is 0.008865839801728725\n", 551 | "23 Actual is 0.0 Predicted is 0.0009031482040882111\n", 552 | "24 Actual is 0.0 Predicted is -0.006404527463018894\n", 553 | "25 Actual is 0.0 Predicted is -0.010571296326816082\n", 554 | "26 Actual is 0.0 Predicted is -0.001963903196156025\n", 555 | "27 Actual is 0.0 Predicted is 0.0022609811276197433\n", 556 | "28 Actual is 0.0 Predicted is 0.04618454724550247\n", 557 | "29 Actual is 0.0 Predicted is -0.012046207673847675\n", 558 | "30 Actual is 0.0 Predicted is 3.6162324249744415e-05\n", 559 | "31 Actual is 0.0 Predicted is -0.001507934182882309\n", 560 | "32 Actual is 0.30000001192092896 Predicted is 0.16136384010314941\n", 561 | "33 Actual is 0.10000000149011612 Predicted is 0.10123085230588913\n", 562 | "34 Actual is -0.30000001192092896 Predicted is -0.3183859884738922\n", 563 | "35 Actual is 0.0 Predicted is 0.002829441800713539\n", 564 | "36 Actual is 0.0 Predicted is 0.022453242912888527\n", 565 | "37 Actual is 0.0 Predicted is 0.03491409122943878\n", 566 | "38 Actual is 0.0 Predicted is 0.057763539254665375\n", 567 | "39 Actual is 0.0 Predicted is -0.0018511833623051643\n", 568 | "40 Actual is 0.0 Predicted is -0.02761724963784218\n", 569 | "41 Actual is 0.0 Predicted is 0.0010898774489760399\n", 570 | "42 Actual is 0.0 Predicted is 0.12348547577857971\n", 571 | "43 Actual is 0.0 Predicted is -0.020859450101852417\n", 572 | "44 Actual is 0.0 Predicted is -0.0033071329817175865\n", 573 | "45 Actual is -0.0 Predicted is -0.030530381947755814\n", 574 | "46 Actual is 0.0 Predicted is 7.486715912818909e-05\n", 575 | "47 Actual is 0.0 Predicted is 0.040552906692028046\n", 576 | "48 Actual is 0.0 
Predicted is 0.011118590831756592\n", 577 | "49 Actual is 0.0 Predicted is 0.0016499832272529602\n", 578 | "50 Actual is 0.0 Predicted is 0.05820120498538017\n", 579 | "51 Actual is 0.10000000149011612 Predicted is 0.07079432904720306\n", 580 | "52 Actual is 0.0 Predicted is -0.01458993460983038\n", 581 | "53 Actual is 0.0 Predicted is 0.08425989747047424\n", 582 | "54 Actual is 0.0 Predicted is 0.015728270635008812\n", 583 | "55 Actual is 0.4000000059604645 Predicted is 0.14949113130569458\n", 584 | "56 Actual is 0.0 Predicted is 0.09787686169147491\n", 585 | "57 Actual is 0.0 Predicted is -0.04030746966600418\n", 586 | "58 Actual is -0.0 Predicted is 0.04523877426981926\n", 587 | "59 Actual is 0.10000000149011612 Predicted is 0.0653441771864891\n", 588 | "60 Actual is 0.0 Predicted is -0.000124477781355381\n", 589 | "61 Actual is 0.0 Predicted is 0.013594873249530792\n", 590 | "62 Actual is 0.0 Predicted is 0.11479488015174866\n", 591 | "63 Actual is 0.0 Predicted is 0.0508849062025547\n", 592 | "64 Actual is -0.20000000298023224 Predicted is -0.026689562946558\n", 593 | "65 Actual is 0.0 Predicted is -0.027830619364976883\n", 594 | "66 Actual is 0.10000000149011612 Predicted is 0.019390862435102463\n", 595 | "67 Actual is -0.0 Predicted is -0.02818499505519867\n", 596 | "68 Actual is 0.0 Predicted is 0.009137948043644428\n", 597 | "69 Actual is -0.0 Predicted is -0.044421203434467316\n", 598 | "70 Actual is 0.0 Predicted is 0.0013082819059491158\n", 599 | "71 Actual is 0.0 Predicted is 0.08228235691785812\n", 600 | "72 Actual is -0.30000001192092896 Predicted is -0.25786569714546204\n", 601 | "73 Actual is 0.0 Predicted is -0.0012344969436526299\n", 602 | "74 Actual is 0.0 Predicted is 0.008969858288764954\n", 603 | "75 Actual is 0.0 Predicted is 0.00023500528186559677\n", 604 | "76 Actual is 0.0 Predicted is 0.056180424988269806\n", 605 | "77 Actual is 0.0 Predicted is 0.0435514971613884\n", 606 | "78 Actual is 0.0 Predicted is -0.05251925811171532\n", 607 | "79 Actual is 0.0 Predicted is -0.004485351033508778\n", 608 | "80 Actual is 0.0 Predicted is -0.01363623421639204\n", 609 | "81 Actual is 0.0 Predicted is -0.05665750429034233\n", 610 | "82 Actual is 0.20000000298023224 Predicted is 0.08339611440896988\n", 611 | "83 Actual is 0.0 Predicted is -0.0017958851531147957\n", 612 | "84 Actual is 0.0 Predicted is 0.061502523720264435\n", 613 | "85 Actual is 0.0 Predicted is 0.08274541795253754\n", 614 | "86 Actual is 0.0 Predicted is -0.034039970487356186\n", 615 | "87 Actual is 0.0 Predicted is -0.01631811261177063\n", 616 | "88 Actual is 0.0 Predicted is 0.0004651155322790146\n", 617 | "89 Actual is 0.10000000149011612 Predicted is 0.009030052460730076\n", 618 | "90 Actual is 0.20000000298023224 Predicted is 0.12387151271104813\n", 619 | "91 Actual is 0.0 Predicted is 0.026891201734542847\n", 620 | "92 Actual is 0.0 Predicted is 0.05867356061935425\n", 621 | "93 Actual is 0.4000000059604645 Predicted is 0.19299139082431793\n", 622 | "94 Actual is 0.0 Predicted is -0.013289428316056728\n", 623 | "95 Actual is 0.0 Predicted is 0.009898342192173004\n", 624 | "96 Actual is 0.0 Predicted is -0.0015759114176034927\n", 625 | "97 Actual is 0.0 Predicted is -0.012775459326803684\n", 626 | "98 Actual is -0.4000000059604645 Predicted is -0.3823990821838379\n", 627 | "99 Actual is 0.0 Predicted is -0.0048654573038220406\n", 628 | "100 Actual is 0.10000000149011612 Predicted is -0.01633390411734581\n", 629 | "101 Actual is -0.10000000149011612 Predicted is 0.043586697429418564\n", 630 | "102 Actual 
is 0.0 Predicted is 0.00018670223653316498\n", 631 | "103 Actual is 0.0 Predicted is -0.0020033782348036766\n", 632 | "104 Actual is 0.0 Predicted is 0.0012533888220787048\n", 633 | "105 Actual is 0.10000000149011612 Predicted is 0.04421307519078255\n", 634 | "106 Actual is 0.0 Predicted is 0.04248255118727684\n", 635 | "107 Actual is 0.0 Predicted is -0.001357019878923893\n", 636 | "108 Actual is 0.0 Predicted is 0.044624898582696915\n", 637 | "109 Actual is 0.10000000149011612 Predicted is 0.05008161813020706\n", 638 | "110 Actual is 0.0 Predicted is -0.001951306127011776\n", 639 | "111 Actual is 0.10000000149011612 Predicted is 0.009935498237609863\n", 640 | "112 Actual is 0.0 Predicted is -0.010607502423226833\n", 641 | "113 Actual is 0.0 Predicted is -0.017638448625802994\n", 642 | "114 Actual is 0.0 Predicted is -0.033164750784635544\n", 643 | "115 Actual is 0.0 Predicted is 0.005357005633413792\n", 644 | "116 Actual is 0.0 Predicted is 0.09604188054800034\n", 645 | "117 Actual is 0.0 Predicted is 0.038494765758514404\n", 646 | "118 Actual is 0.0 Predicted is 0.04353250563144684\n", 647 | "119 Actual is 0.0 Predicted is -0.007809543050825596\n", 648 | "120 Actual is 0.0 Predicted is 0.008294579572975636\n", 649 | "121 Actual is 0.0 Predicted is 0.002306324429810047\n", 650 | "122 Actual is 0.0 Predicted is 0.0034837452694773674\n", 651 | "123 Actual is 0.0 Predicted is 0.035772185772657394\n", 652 | "124 Actual is 0.0 Predicted is 0.0004687374457716942\n", 653 | "125 Actual is 0.0 Predicted is 0.010375840589404106\n", 654 | "126 Actual is 0.0 Predicted is -0.010564873926341534\n", 655 | "127 Actual is 0.10000000149011612 Predicted is 0.1056014895439148\n", 656 | "128 Actual is 0.0 Predicted is 0.002209634520113468\n", 657 | "129 Actual is 0.0 Predicted is 0.022158939391374588\n", 658 | "130 Actual is 0.0 Predicted is -0.015293707139790058\n", 659 | "131 Actual is 0.0 Predicted is -0.006885346956551075\n", 660 | "132 Actual is 0.0 Predicted is 0.04144423082470894\n", 661 | "133 Actual is 0.0 Predicted is -0.007447072304785252\n", 662 | "134 Actual is 0.0 Predicted is 0.06647276133298874\n", 663 | "135 Actual is 0.10000000149011612 Predicted is 0.13522720336914062\n", 664 | "136 Actual is 0.0 Predicted is 0.004374640993773937\n", 665 | "137 Actual is 0.20000000298023224 Predicted is 0.017415951937437057\n", 666 | "138 Actual is -0.0 Predicted is -0.045062195509672165\n", 667 | "139 Actual is 0.0 Predicted is -0.004337170161306858\n", 668 | "140 Actual is -0.20000000298023224 Predicted is -0.039026279002428055\n", 669 | "141 Actual is 0.0 Predicted is -0.014618045650422573\n", 670 | "142 Actual is 0.0 Predicted is -0.01004165131598711\n", 671 | "143 Actual is 0.0 Predicted is 0.07660803943872452\n", 672 | "144 Actual is 0.0 Predicted is -0.006123933009803295\n", 673 | "145 Actual is 0.0 Predicted is 0.0043402668088674545\n", 674 | "146 Actual is 0.0 Predicted is 0.12156045436859131\n", 675 | "147 Actual is 0.0 Predicted is -0.008739537559449673\n", 676 | "148 Actual is 0.0 Predicted is 0.0580684095621109\n", 677 | "149 Actual is 0.10000000149011612 Predicted is 0.11173925548791885\n", 678 | "150 Actual is 0.0 Predicted is -0.0066262660548090935\n", 679 | "151 Actual is 0.20000000298023224 Predicted is 0.09704112261533737\n", 680 | "152 Actual is 0.0 Predicted is 0.0007799556478857994\n", 681 | "153 Actual is 0.0 Predicted is -0.0026354780420660973\n", 682 | "154 Actual is 0.0 Predicted is -0.009422174654901028\n" 683 | ] 684 | }, 685 | { 686 | "name": "stdout", 687 | "output_type": 
"stream", 688 | "text": [ 689 | "155 Actual is 0.0 Predicted is 0.01290092058479786\n", 690 | "156 Actual is 0.0 Predicted is 0.09283173829317093\n", 691 | "157 Actual is 0.0 Predicted is 0.0009877420961856842\n", 692 | "158 Actual is 0.0 Predicted is 0.10575376451015472\n", 693 | "159 Actual is 0.0 Predicted is -0.039171990007162094\n", 694 | "160 Actual is 0.0 Predicted is -0.03694339841604233\n", 695 | "161 Actual is -0.0 Predicted is -0.03464614227414131\n", 696 | "162 Actual is 0.0 Predicted is 0.0017509488388895988\n", 697 | "163 Actual is 0.10000000149011612 Predicted is 0.049878232181072235\n", 698 | "164 Actual is 0.0 Predicted is 0.006982305087149143\n", 699 | "165 Actual is 0.0 Predicted is -0.0066297343000769615\n", 700 | "166 Actual is 0.0 Predicted is 0.006025201641023159\n", 701 | "167 Actual is 0.10000000149011612 Predicted is 0.04012560471892357\n", 702 | "168 Actual is 0.10000000149011612 Predicted is 0.048791274428367615\n", 703 | "169 Actual is 0.10000000149011612 Predicted is 0.02233763597905636\n", 704 | "170 Actual is 0.0 Predicted is 0.01026880368590355\n", 705 | "171 Actual is -0.20000000298023224 Predicted is -0.03076765686273575\n", 706 | "172 Actual is 0.0 Predicted is -0.01391774695366621\n", 707 | "173 Actual is 0.0 Predicted is -0.008080489002168179\n", 708 | "174 Actual is 0.0 Predicted is 0.06617707759141922\n", 709 | "175 Actual is 0.0 Predicted is -0.012529459781944752\n", 710 | "176 Actual is 0.0 Predicted is -0.009083707816898823\n", 711 | "177 Actual is 0.20000000298023224 Predicted is 0.05390579625964165\n", 712 | "178 Actual is 0.10000000149011612 Predicted is 0.05115659162402153\n", 713 | "179 Actual is 0.0 Predicted is 0.0065129948779940605\n", 714 | "180 Actual is 0.20000000298023224 Predicted is 0.05659196898341179\n", 715 | "181 Actual is 0.0 Predicted is 0.11879976093769073\n", 716 | "182 Actual is 0.0 Predicted is -0.005136619322001934\n", 717 | "183 Actual is 0.10000000149011612 Predicted is 0.06451443582773209\n", 718 | "184 Actual is 0.10000000149011612 Predicted is 0.0011583603918552399\n", 719 | "185 Actual is 0.20000000298023224 Predicted is 0.05314048379659653\n", 720 | "186 Actual is 0.0 Predicted is -0.0003696577623486519\n", 721 | "187 Actual is 0.0 Predicted is -0.0018942756578326225\n", 722 | "188 Actual is 0.20000000298023224 Predicted is 0.13924825191497803\n", 723 | "189 Actual is 0.0 Predicted is -0.010043677873909473\n", 724 | "190 Actual is 0.10000000149011612 Predicted is -0.015667937695980072\n", 725 | "191 Actual is -0.10000000149011612 Predicted is -0.015159686096012592\n", 726 | "192 Actual is 0.0 Predicted is -0.006469369865953922\n", 727 | "193 Actual is 0.0 Predicted is 0.05451202765107155\n", 728 | "194 Actual is 0.10000000149011612 Predicted is 0.058373354375362396\n", 729 | "195 Actual is 0.0 Predicted is 0.008348571136593819\n", 730 | "196 Actual is 0.0 Predicted is 0.07019883394241333\n", 731 | "197 Actual is 0.0 Predicted is -0.002207336015999317\n", 732 | "198 Actual is 0.0 Predicted is 0.0014321580529212952\n", 733 | "199 Actual is 0.0 Predicted is -0.0004643350839614868\n", 734 | "200 Actual is 0.10000000149011612 Predicted is 0.0029646465554833412\n", 735 | "201 Actual is 0.0 Predicted is 0.0037428531795740128\n", 736 | "202 Actual is 0.0 Predicted is 0.06021037697792053\n", 737 | "203 Actual is 0.0 Predicted is -0.011607757769525051\n", 738 | "204 Actual is 0.0 Predicted is 0.009476153180003166\n", 739 | "205 Actual is 0.0 Predicted is 0.006844765972346067\n", 740 | "206 Actual is 0.0 Predicted is 
0.07254748791456223\n", 741 | "207 Actual is 0.0 Predicted is -0.00025962013751268387\n", 742 | "208 Actual is 0.0 Predicted is 0.003187408670783043\n", 743 | "209 Actual is 0.0 Predicted is 0.009220614098012447\n", 744 | "210 Actual is 0.0 Predicted is -0.0013515092432498932\n", 745 | "211 Actual is 0.0 Predicted is -0.00829590205103159\n", 746 | "212 Actual is 0.0 Predicted is 0.009030038490891457\n", 747 | "213 Actual is -0.0 Predicted is 0.007049269508570433\n", 748 | "214 Actual is 0.0 Predicted is 0.04180893301963806\n", 749 | "215 Actual is 0.0 Predicted is -0.017598439007997513\n", 750 | "216 Actual is 0.0 Predicted is 0.06016399338841438\n", 751 | "217 Actual is 0.0 Predicted is -0.06429453939199448\n", 752 | "218 Actual is 0.0 Predicted is -0.005369179882109165\n", 753 | "219 Actual is -0.10000000149011612 Predicted is -0.03489100560545921\n", 754 | "220 Actual is 0.0 Predicted is -0.007152878679335117\n", 755 | "221 Actual is 0.0 Predicted is -0.006989338435232639\n", 756 | "222 Actual is -0.10000000149011612 Predicted is -0.015317740850150585\n", 757 | "223 Actual is 0.0 Predicted is -0.0013790316879749298\n", 758 | "224 Actual is -0.10000000149011612 Predicted is -0.08069723099470139\n", 759 | "225 Actual is 0.0 Predicted is -0.005718681029975414\n", 760 | "226 Actual is 0.0 Predicted is 0.043164536356925964\n", 761 | "227 Actual is -0.30000001192092896 Predicted is -0.27065378427505493\n", 762 | "228 Actual is -0.0 Predicted is -0.03556692600250244\n", 763 | "229 Actual is -0.20000000298023224 Predicted is -0.049846332520246506\n", 764 | "230 Actual is -0.10000000149011612 Predicted is -0.017329182475805283\n", 765 | "231 Actual is 0.0 Predicted is 0.0015233512967824936\n", 766 | "232 Actual is 0.0 Predicted is 0.0490235798060894\n", 767 | "233 Actual is 0.0 Predicted is 0.003312469460070133\n", 768 | "234 Actual is 0.0 Predicted is -0.000903264619410038\n", 769 | "235 Actual is 0.0 Predicted is 0.006104886531829834\n", 770 | "236 Actual is 0.0 Predicted is -0.006150399334728718\n", 771 | "237 Actual is 0.0 Predicted is -0.0069642262533307076\n", 772 | "238 Actual is 0.0 Predicted is 0.0020038122311234474\n", 773 | "239 Actual is 0.0 Predicted is -0.001770869828760624\n", 774 | "240 Actual is 0.10000000149011612 Predicted is 0.009164388291537762\n", 775 | "241 Actual is 0.0 Predicted is -0.012653646059334278\n", 776 | "242 Actual is -0.4000000059604645 Predicted is -0.3525174856185913\n", 777 | "243 Actual is -0.20000000298023224 Predicted is -0.1676555573940277\n", 778 | "244 Actual is 0.0 Predicted is -0.002432522363960743\n", 779 | "245 Actual is 0.0 Predicted is -0.006655837409198284\n", 780 | "246 Actual is 0.0 Predicted is 0.0359085313975811\n", 781 | "247 Actual is 0.0 Predicted is 0.0008681174367666245\n", 782 | "248 Actual is 0.0 Predicted is -0.0013364292681217194\n", 783 | "249 Actual is 0.0 Predicted is -0.0013954974710941315\n", 784 | "250 Actual is -0.10000000149011612 Predicted is -0.11612123996019363\n", 785 | "251 Actual is -0.6000000238418579 Predicted is -0.5139971375465393\n", 786 | "252 Actual is 0.0 Predicted is 0.08473000675439835\n", 787 | "253 Actual is 0.0 Predicted is 0.00800853781402111\n", 788 | "254 Actual is 0.0 Predicted is -0.00683684553951025\n", 789 | "255 Actual is 0.0 Predicted is 0.05421731248497963\n", 790 | "256 Actual is 0.10000000149011612 Predicted is 0.0540255643427372\n", 791 | "257 Actual is 0.0 Predicted is 0.009886389598250389\n", 792 | "258 Actual is 0.0 Predicted is 0.012131638824939728\n", 793 | "259 Actual is 0.0 Predicted 
is -0.0054847681894898415\n", 794 | "260 Actual is 0.0 Predicted is -0.0076282331719994545\n", 795 | "261 Actual is 0.0 Predicted is -0.007318257354199886\n", 796 | "262 Actual is 0.0 Predicted is 0.05397219955921173\n", 797 | "263 Actual is 0.0 Predicted is -0.0029107751324772835\n", 798 | "264 Actual is 0.10000000149011612 Predicted is 0.010407901369035244\n", 799 | "265 Actual is 0.0 Predicted is -0.006339116953313351\n", 800 | "266 Actual is 0.10000000149011612 Predicted is 0.03846561908721924\n", 801 | "267 Actual is 0.0 Predicted is -0.003107578493654728\n", 802 | "268 Actual is 0.0 Predicted is 0.001960507594048977\n", 803 | "269 Actual is 0.0 Predicted is 0.00046048033982515335\n", 804 | "270 Actual is 0.30000001192092896 Predicted is 0.1300784796476364\n", 805 | "271 Actual is 0.4000000059604645 Predicted is 0.11924195289611816\n", 806 | "272 Actual is 0.0 Predicted is -0.006238427944481373\n", 807 | "273 Actual is 0.0 Predicted is -0.014773366041481495\n", 808 | "274 Actual is 0.0 Predicted is 0.004062281921505928\n", 809 | "275 Actual is 0.0 Predicted is 0.09695994853973389\n", 810 | "276 Actual is 0.0 Predicted is -0.015261677093803883\n", 811 | "277 Actual is 0.0 Predicted is 0.006153343245387077\n", 812 | "278 Actual is 0.0 Predicted is 0.0295424647629261\n", 813 | "279 Actual is 0.0 Predicted is -0.006586830131709576\n", 814 | "280 Actual is 0.0 Predicted is 0.0010337764397263527\n", 815 | "281 Actual is 0.0 Predicted is 0.0058025214821100235\n", 816 | "282 Actual is 0.20000000298023224 Predicted is 0.10279428958892822\n", 817 | "283 Actual is 0.0 Predicted is 0.011310147121548653\n", 818 | "284 Actual is 0.0 Predicted is -0.0014562355354428291\n", 819 | "285 Actual is 0.0 Predicted is 0.08999401330947876\n", 820 | "286 Actual is -0.0 Predicted is 0.0572943314909935\n", 821 | "287 Actual is 0.0 Predicted is -0.04146258905529976\n", 822 | "288 Actual is 0.10000000149011612 Predicted is 0.0587439127266407\n", 823 | "289 Actual is 0.0 Predicted is 0.008710162714123726\n", 824 | "290 Actual is 0.0 Predicted is -0.013455147854983807\n", 825 | "291 Actual is 0.0 Predicted is 0.003786301240324974\n", 826 | "292 Actual is 0.0 Predicted is -0.004057559184730053\n", 827 | "293 Actual is 0.0 Predicted is -0.004013598896563053\n", 828 | "294 Actual is 0.10000000149011612 Predicted is 0.027920396998524666\n", 829 | "295 Actual is 0.0 Predicted is 0.0017354236915707588\n", 830 | "296 Actual is 0.0 Predicted is -0.00305879395455122\n", 831 | "297 Actual is 0.0 Predicted is 0.008892882615327835\n", 832 | "298 Actual is 0.20000000298023224 Predicted is 0.09225491434335709\n", 833 | "299 Actual is 0.0 Predicted is -0.004457329399883747\n", 834 | "300 Actual is 0.0 Predicted is 0.0009183846414089203\n", 835 | "301 Actual is 0.0 Predicted is 0.0018535749986767769\n" 836 | ] 837 | }, 838 | { 839 | "name": "stdout", 840 | "output_type": "stream", 841 | "text": [ 842 | "302 Actual is -0.0 Predicted is -0.014611332677304745\n", 843 | "303 Actual is 0.0 Predicted is 0.04340910539031029\n", 844 | "304 Actual is 0.0 Predicted is -0.005273583345115185\n", 845 | "305 Actual is 0.0 Predicted is -0.002295847050845623\n", 846 | "306 Actual is 0.0 Predicted is 0.003981822170317173\n", 847 | "307 Actual is 0.0 Predicted is 0.05032149702310562\n", 848 | "308 Actual is 0.0 Predicted is 0.07286893576383591\n", 849 | "309 Actual is 0.0 Predicted is -0.035832084715366364\n", 850 | "310 Actual is 0.0 Predicted is -0.0569748692214489\n", 851 | "311 Actual is 0.0 Predicted is 0.00900491513311863\n", 852 | "312 
Actual is 0.20000000298023224 Predicted is 0.1438910812139511\n", 853 | "313 Actual is 0.0 Predicted is -0.049331579357385635\n", 854 | "314 Actual is 0.0 Predicted is -0.011645757593214512\n", 855 | "315 Actual is -0.0 Predicted is -0.013606696389615536\n", 856 | "316 Actual is -0.10000000149011612 Predicted is -0.050740230828523636\n", 857 | "317 Actual is 0.0 Predicted is -0.013507676310837269\n", 858 | "318 Actual is -0.0 Predicted is -0.10805417597293854\n", 859 | "319 Actual is 0.0 Predicted is -0.04210088774561882\n", 860 | "320 Actual is 0.0 Predicted is -0.007050936110317707\n", 861 | "321 Actual is 0.0 Predicted is -0.0035707158967852592\n", 862 | "322 Actual is 0.20000000298023224 Predicted is 0.15966251492500305\n", 863 | "323 Actual is 0.0 Predicted is -0.003678080625832081\n", 864 | "324 Actual is 0.0 Predicted is -0.06829240173101425\n", 865 | "325 Actual is 0.0 Predicted is -0.0011577671393752098\n", 866 | "326 Actual is 0.0 Predicted is 0.004093734547495842\n", 867 | "327 Actual is 0.0 Predicted is -0.03496195003390312\n", 868 | "328 Actual is 0.20000000298023224 Predicted is 0.04578986018896103\n", 869 | "329 Actual is 0.0 Predicted is 0.0006079636514186859\n", 870 | "330 Actual is 0.20000000298023224 Predicted is 0.008962647058069706\n", 871 | "331 Actual is 0.0 Predicted is -0.03133491799235344\n", 872 | "332 Actual is -0.10000000149011612 Predicted is -0.025154896080493927\n", 873 | "333 Actual is 0.0 Predicted is -0.0015031397342681885\n", 874 | "334 Actual is 0.0 Predicted is 0.10747271776199341\n", 875 | "335 Actual is 0.30000001192092896 Predicted is 0.16926269233226776\n", 876 | "336 Actual is 0.0 Predicted is 0.008622827008366585\n", 877 | "337 Actual is 0.10000000149011612 Predicted is 0.09464246034622192\n", 878 | "338 Actual is 0.0 Predicted is 0.04812776297330856\n", 879 | "339 Actual is 0.10000000149011612 Predicted is 0.008989646099507809\n", 880 | "340 Actual is -0.10000000149011612 Predicted is 0.020279791206121445\n", 881 | "341 Actual is 0.0 Predicted is -0.04213768243789673\n", 882 | "342 Actual is 0.30000001192092896 Predicted is 0.08681011945009232\n", 883 | "343 Actual is 0.0 Predicted is 0.001065848395228386\n", 884 | "344 Actual is 0.0 Predicted is -0.006603325717151165\n" 885 | ] 886 | } 887 | ], 888 | "source": [ 889 | "# Model evaluation\n", 890 | "#model_eval = cnn_model.evaluate(x_test, y_test)\n", 891 | "i = 0 # Random value from test data set to check model accuracy\n", 892 | "prediction=[]\n", 893 | "for i in range(len(y_test)):\n", 894 | " image_arr = x_test[i]\n", 895 | " #print(image_arr.shape)\n", 896 | " #plt.imshow(image_arr)\n", 897 | " #plt.show()\n", 898 | " #image_fl = image_arr.astype(\"float32\")\n", 899 | " #image_norm = image_fl/ 255.0\n", 900 | " #image_reshape = image_arr.reshape(2, image_arr.shape[0], image_arr.shape[1], image_arr.shape[2])\n", 901 | " image_pred = np.expand_dims(image_arr, axis=0)\n", 902 | " prediction.append(cnn_model.predict(image_pred))\n", 903 | " pred_angle = float(prediction[i])\n", 904 | " angle = y_test[i]\n", 905 | " dev = angle/pred_angle\n", 906 | " print(f'{i} Actual is {angle} Predicted is {pred_angle}')\n", 907 | "\n", 908 | "#print(np.mean(prediction))" 909 | ] 910 | }, 911 | { 912 | "cell_type": "code", 913 | "execution_count": 24, 914 | "id": "c8c9fbb2", 915 | "metadata": {}, 916 | "outputs": [], 917 | "source": [ 918 | "# Save model to disk\n", 919 | "model_name = 'bhv_clone_track05'\n", 920 | "full_model_path = f'{base_model_path}/{model_name}.h5'\n", 921 | "#print(model_name)\n", 922 
| "cnn_model.save(full_model_path)" 923 | ] 924 | }, 925 | { 926 | "cell_type": "code", 927 | "execution_count": null, 928 | "id": "d6f5165d", 929 | "metadata": {}, 930 | "outputs": [], 931 | "source": [ 932 | "print(f'Model {full_model_path} saved')" 933 | ] 934 | }, 935 | { 936 | "cell_type": "code", 937 | "execution_count": null, 938 | "id": "64c2d005", 939 | "metadata": {}, 940 | "outputs": [], 941 | "source": [] 942 | } 943 | ], 944 | "metadata": { 945 | "kernelspec": { 946 | "display_name": "Python 3 (ipykernel)", 947 | "language": "python", 948 | "name": "python3" 949 | }, 950 | "language_info": { 951 | "codemirror_mode": { 952 | "name": "ipython", 953 | "version": 3 954 | }, 955 | "file_extension": ".py", 956 | "mimetype": "text/x-python", 957 | "name": "python", 958 | "nbconvert_exporter": "python", 959 | "pygments_lexer": "ipython3", 960 | "version": "3.9.13" 961 | } 962 | }, 963 | "nbformat": 4, 964 | "nbformat_minor": 5 965 | } 966 | -------------------------------------------------------------------------------- /carla-manual-2.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | 3 | # Copyright (c) 2019 Computer Vision Center (CVC) at the Universitat Autonoma de 4 | # Barcelona (UAB). 5 | # 6 | # This work is licensed under the terms of the MIT license. 7 | # For a copy, see . 8 | 9 | # Allows controlling a vehicle with a keyboard. For a simpler and more 10 | # documented example, please take a look at tutorial.py. 11 | 12 | ######################################################################## 13 | # Author: Neel (itsneel@amazon.com) 14 | # Adding multiple functionalities to capture image and steering data 15 | # from the simulation 16 | ######################################################################## 17 | 18 | 19 | """ 20 | Welcome to CARLA manual control. 21 | 22 | Use ARROWS or WASD keys for control. 23 | 24 | W : throttle 25 | S : brake 26 | A/D : steer left/right 27 | Q : toggle reverse 28 | Space : hand-brake 29 | P : toggle autopilot 30 | M : toggle manual transmission 31 | ,/. : gear up/down 32 | CTRL + W : toggle constant velocity mode at 60 km/h 33 | 34 | L : toggle next light type 35 | SHIFT + L : toggle high beam 36 | Z/X : toggle right/left blinker 37 | I : toggle interior light 38 | 39 | TAB : change sensor position 40 | ` or N : next sensor 41 | [1-9] : change to sensor [1-9] 42 | G : toggle radar visualization 43 | C : change weather (Shift+C reverse) 44 | Backspace : change vehicle 45 | 46 | O : open/close all doors of vehicle 47 | T : toggle vehicle's telemetry 48 | 49 | V : Select next map layer (Shift+V reverse) 50 | B : Load current selected map layer (Shift+B to unload) 51 | 52 | R : toggle recording images to disk 53 | 54 | CTRL + R : toggle recording of simulation (replacing any previous) 55 | CTRL + P : start replaying last recorded simulation 56 | CTRL + + : increments the start time of the replay by 1 second (+SHIFT = 10 seconds) 57 | CTRL + - : decrements the start time of the replay by 1 second (+SHIFT = 10 seconds) 58 | 59 | F1 : toggle HUD 60 | H/? 
: toggle help 61 | ESC : quit 62 | """ 63 | 64 | from __future__ import print_function 65 | 66 | 67 | # ============================================================================== 68 | # -- find carla module --------------------------------------------------------- 69 | # ============================================================================== 70 | 71 | 72 | import glob 73 | import os 74 | import sys 75 | 76 | try: 77 | sys.path.append(glob.glob('../carla/dist/carla-*%d.%d-%s.egg' % ( 78 | sys.version_info.major, 79 | sys.version_info.minor, 80 | 'win-amd64' if os.name == 'nt' else 'linux-x86_64'))[0]) 81 | except IndexError: 82 | pass 83 | 84 | 85 | # ============================================================================== 86 | # -- imports ------------------------------------------------------------------- 87 | # ============================================================================== 88 | 89 | 90 | import carla 91 | 92 | from carla import ColorConverter as cc 93 | 94 | import argparse 95 | import collections 96 | import logging 97 | import math 98 | import random 99 | import re 100 | import weakref 101 | import json 102 | from datetime import datetime as dt 103 | import datetime 104 | from pathlib import Path 105 | 106 | try: 107 | import pygame 108 | from pygame.locals import KMOD_CTRL 109 | from pygame.locals import KMOD_SHIFT 110 | from pygame.locals import K_0 111 | from pygame.locals import K_9 112 | from pygame.locals import K_BACKQUOTE 113 | from pygame.locals import K_BACKSPACE 114 | from pygame.locals import K_COMMA 115 | from pygame.locals import K_DOWN 116 | from pygame.locals import K_ESCAPE 117 | from pygame.locals import K_F1 118 | from pygame.locals import K_LEFT 119 | from pygame.locals import K_PERIOD 120 | from pygame.locals import K_RIGHT 121 | from pygame.locals import K_SLASH 122 | from pygame.locals import K_SPACE 123 | from pygame.locals import K_TAB 124 | from pygame.locals import K_UP 125 | from pygame.locals import K_a 126 | from pygame.locals import K_b 127 | from pygame.locals import K_c 128 | from pygame.locals import K_d 129 | from pygame.locals import K_g 130 | from pygame.locals import K_h 131 | from pygame.locals import K_i 132 | from pygame.locals import K_l 133 | from pygame.locals import K_m 134 | from pygame.locals import K_n 135 | from pygame.locals import K_o 136 | from pygame.locals import K_p 137 | from pygame.locals import K_q 138 | from pygame.locals import K_r 139 | from pygame.locals import K_s 140 | from pygame.locals import K_t 141 | from pygame.locals import K_v 142 | from pygame.locals import K_w 143 | from pygame.locals import K_x 144 | from pygame.locals import K_z 145 | from pygame.locals import K_MINUS 146 | from pygame.locals import K_EQUALS 147 | except ImportError: 148 | raise RuntimeError('cannot import pygame, make sure pygame package is installed') 149 | 150 | try: 151 | import numpy as np 152 | except ImportError: 153 | raise RuntimeError('cannot import numpy, make sure numpy package is installed') 154 | 155 | 156 | # ============================================================================== 157 | # -- Global functions ---------------------------------------------------------- 158 | # ============================================================================== 159 | 160 | def rgb_callback(a, b): 161 | return None 162 | 163 | def find_weather_presets(): 164 | rgx = re.compile('.+?(?:(?<=[a-z])(?=[A-Z])|(?<=[A-Z])(?=[A-Z][a-z])|$)') 165 | name = lambda x: ' '.join(m.group(0) for m in rgx.finditer(x)) 
166 | presets = [x for x in dir(carla.WeatherParameters) if re.match('[A-Z].+', x)] 167 | return [(getattr(carla.WeatherParameters, x), name(x)) for x in presets] 168 | 169 | 170 | def get_actor_display_name(actor, truncate=250): 171 | name = ' '.join(actor.type_id.replace('_', '.').title().split('.')[1:]) 172 | return (name[:truncate - 1] + u'\u2026') if len(name) > truncate else name 173 | 174 | def get_actor_blueprints(world, filter, generation): 175 | bps = world.get_blueprint_library().filter(filter) 176 | 177 | if generation.lower() == "all": 178 | return bps 179 | 180 | # If the filter returns only one bp, we assume that this one needed 181 | # and therefore, we ignore the generation 182 | if len(bps) == 1: 183 | return bps 184 | 185 | try: 186 | int_generation = int(generation) 187 | # Check if generation is in available generations 188 | if int_generation in [1, 2]: 189 | bps = [x for x in bps if int(x.get_attribute('generation')) == int_generation] 190 | return bps 191 | else: 192 | print(" Warning! Actor Generation is not valid. No actor will be spawned.") 193 | return [] 194 | except: 195 | print(" Warning! Actor Generation is not valid. No actor will be spawned.") 196 | return [] 197 | 198 | now = dt.now() 199 | date_string = now.strftime("%d%m%Y-%H%M") 200 | # ============================================================================== 201 | # -- World --------------------------------------------------------------------- 202 | # ============================================================================== 203 | 204 | 205 | class World(object): 206 | def __init__(self, carla_world, hud, args): 207 | self.world = carla_world 208 | self.sync = args.sync 209 | self.actor_role_name = args.rolename 210 | try: 211 | self.map = self.world.get_map() 212 | except RuntimeError as error: 213 | print('RuntimeError: {}'.format(error)) 214 | print(' The server could not send the OpenDRIVE (.xodr) file:') 215 | print(' Make sure it exists, has the same name of your town, and is correct.') 216 | sys.exit(1) 217 | self.hud = hud 218 | self.player = None 219 | self.collision_sensor = None 220 | self.lane_invasion_sensor = None 221 | self.gnss_sensor = None 222 | self.imu_sensor = None 223 | self.radar_sensor = None 224 | self.camera_manager = None 225 | self._weather_presets = find_weather_presets() 226 | self._weather_index = 0 227 | self._actor_filter = args.filter 228 | self._actor_generation = args.generation 229 | self._gamma = args.gamma 230 | self.restart() 231 | self.world.on_tick(hud.on_world_tick) 232 | self.recording_enabled = False 233 | self.recording_start = 0 234 | self.constant_velocity_enabled = False 235 | self.show_vehicle_telemetry = False 236 | self.doors_are_open = False 237 | self.current_map_layer = 0 238 | self.map_layer_names = [ 239 | carla.MapLayer.NONE, 240 | carla.MapLayer.Buildings, 241 | carla.MapLayer.Decals, 242 | carla.MapLayer.Foliage, 243 | carla.MapLayer.Ground, 244 | carla.MapLayer.ParkedVehicles, 245 | carla.MapLayer.Particles, 246 | carla.MapLayer.Props, 247 | carla.MapLayer.StreetLights, 248 | carla.MapLayer.Walls, 249 | carla.MapLayer.All 250 | ] 251 | 252 | def restart(self): 253 | self.player_max_speed = 1.589 254 | self.player_max_speed_fast = 3.713 255 | # Keep same camera config if the camera manager exists. 256 | cam_index = self.camera_manager.index if self.camera_manager is not None else 0 257 | cam_pos_index = self.camera_manager.transform_index if self.camera_manager is not None else 0 258 | # Get a random blueprint. 
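# Note: the stock random blueprint selection is kept commented out just below; this sample pins
# the 'vehicle.tesla.model3' blueprint, presumably so every data-collection run uses the same vehicle.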
259 | # Neel updated the blueprint library to get just 1 type of vehicle 260 | #blueprint = random.choice(get_actor_blueprints(self.world, self._actor_filter, self._actor_generation)) 261 | blueprint_lib = self.world.get_blueprint_library() 262 | blueprint = blueprint_lib.find('vehicle.tesla.model3') 263 | blueprint.set_attribute('role_name', self.actor_role_name) 264 | if blueprint.has_attribute('color'): 265 | color = random.choice(blueprint.get_attribute('color').recommended_values) 266 | blueprint.set_attribute('color', color) 267 | if blueprint.has_attribute('driver_id'): 268 | driver_id = random.choice(blueprint.get_attribute('driver_id').recommended_values) 269 | blueprint.set_attribute('driver_id', driver_id) 270 | if blueprint.has_attribute('is_invincible'): 271 | blueprint.set_attribute('is_invincible', 'true') 272 | # set the max speed 273 | if blueprint.has_attribute('speed'): 274 | self.player_max_speed = float(blueprint.get_attribute('speed').recommended_values[1]) 275 | self.player_max_speed_fast = float(blueprint.get_attribute('speed').recommended_values[2]) 276 | 277 | # Spawn the player. 278 | if self.player is not None: 279 | spawn_point = self.player.get_transform() 280 | spawn_point.location.z += 2.0 281 | spawn_point.rotation.roll = 0.0 282 | spawn_point.rotation.pitch = 0.0 283 | self.destroy() 284 | self.player = self.world.try_spawn_actor(blueprint, spawn_point) 285 | self.show_vehicle_telemetry = False 286 | self.modify_vehicle_physics(self.player) 287 | while self.player is None: 288 | if not self.map.get_spawn_points(): 289 | print('There are no spawn points available in your map/town.') 290 | print('Please add some Vehicle Spawn Point to your UE4 scene.') 291 | sys.exit(1) 292 | spawn_points = self.map.get_spawn_points() 293 | spawn_point = random.choice(spawn_points) if spawn_points else carla.Transform() 294 | self.player = self.world.try_spawn_actor(blueprint, spawn_point) 295 | self.show_vehicle_telemetry = False 296 | self.modify_vehicle_physics(self.player) 297 | # Set up the sensors. 
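# Each sensor below (collision, lane invasion, GNSS, IMU, camera) is spawned and attached to the
# newly (re)spawned player; the CameraManager re-applies the previous camera index and transform
# so the chosen view survives a restart.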
298 | self.collision_sensor = CollisionSensor(self.player, self.hud) 299 | self.lane_invasion_sensor = LaneInvasionSensor(self.player, self.hud) 300 | self.gnss_sensor = GnssSensor(self.player) 301 | self.imu_sensor = IMUSensor(self.player) 302 | self.camera_manager = CameraManager(self.player, self.hud, self._gamma) 303 | self.camera_manager.transform_index = cam_pos_index 304 | self.camera_manager.set_sensor(cam_index, notify=False) 305 | actor_type = get_actor_display_name(self.player) 306 | self.hud.notification(actor_type) 307 | 308 | if self.sync: 309 | self.world.tick() 310 | else: 311 | self.world.wait_for_tick() 312 | 313 | def next_weather(self, reverse=False): 314 | self._weather_index += -1 if reverse else 1 315 | self._weather_index %= len(self._weather_presets) 316 | preset = self._weather_presets[self._weather_index] 317 | self.hud.notification('Weather: %s' % preset[1]) 318 | self.player.get_world().set_weather(preset[0]) 319 | 320 | def next_map_layer(self, reverse=False): 321 | self.current_map_layer += -1 if reverse else 1 322 | self.current_map_layer %= len(self.map_layer_names) 323 | selected = self.map_layer_names[self.current_map_layer] 324 | self.hud.notification('LayerMap selected: %s' % selected) 325 | 326 | def load_map_layer(self, unload=False): 327 | selected = self.map_layer_names[self.current_map_layer] 328 | if unload: 329 | self.hud.notification('Unloading map layer: %s' % selected) 330 | self.world.unload_map_layer(selected) 331 | else: 332 | self.hud.notification('Loading map layer: %s' % selected) 333 | self.world.load_map_layer(selected) 334 | 335 | def toggle_radar(self): 336 | if self.radar_sensor is None: 337 | self.radar_sensor = RadarSensor(self.player) 338 | elif self.radar_sensor.sensor is not None: 339 | self.radar_sensor.sensor.destroy() 340 | self.radar_sensor = None 341 | 342 | def modify_vehicle_physics(self, actor): 343 | #If actor is not a vehicle, we cannot use the physics control 344 | try: 345 | physics_control = actor.get_physics_control() 346 | physics_control.use_sweep_wheel_collision = True 347 | actor.apply_physics_control(physics_control) 348 | except Exception: 349 | pass 350 | 351 | def tick(self, clock): 352 | self.hud.tick(self, clock) 353 | 354 | def render(self, display): 355 | self.camera_manager.render(display) 356 | self.hud.render(display) 357 | 358 | def destroy_sensors(self): 359 | self.camera_manager.sensor.destroy() 360 | self.camera_manager.sensor = None 361 | self.camera_manager.index = None 362 | 363 | def destroy(self): 364 | if self.radar_sensor is not None: 365 | self.toggle_radar() 366 | sensors = [ 367 | self.camera_manager.sensor, 368 | self.collision_sensor.sensor, 369 | self.lane_invasion_sensor.sensor, 370 | self.gnss_sensor.sensor, 371 | self.imu_sensor.sensor] 372 | for sensor in sensors: 373 | if sensor is not None: 374 | sensor.stop() 375 | sensor.destroy() 376 | if self.player is not None: 377 | self.player.destroy() 378 | 379 | 380 | # ============================================================================== 381 | # -- KeyboardControl ----------------------------------------------------------- 382 | # ============================================================================== 383 | 384 | 385 | class KeyboardControl(object): 386 | """Class that handles keyboard input.""" 387 | def __init__(self, world, start_in_autopilot): 388 | self._autopilot_enabled = start_in_autopilot 389 | if isinstance(world.player, carla.Vehicle): 390 | self._control = carla.VehicleControl() 391 | self._lights = 
carla.VehicleLightState.NONE 392 | world.player.set_autopilot(self._autopilot_enabled) 393 | world.player.set_light_state(self._lights) 394 | elif isinstance(world.player, carla.Walker): 395 | self._control = carla.WalkerControl() 396 | self._autopilot_enabled = False 397 | self._rotation = world.player.get_transform().rotation 398 | else: 399 | raise NotImplementedError("Actor type not supported") 400 | self._steer_cache = 0.0 401 | world.hud.notification("Press 'H' or '?' for help.", seconds=4.0) 402 | 403 | def parse_events(self, client, world, clock, sync_mode): 404 | if isinstance(self._control, carla.VehicleControl): 405 | current_lights = self._lights 406 | for event in pygame.event.get(): 407 | if event.type == pygame.QUIT: 408 | return True 409 | elif event.type == pygame.KEYUP: 410 | if self._is_quit_shortcut(event.key): 411 | return True 412 | elif event.key == K_BACKSPACE: 413 | if self._autopilot_enabled: 414 | world.player.set_autopilot(False) 415 | world.restart() 416 | world.player.set_autopilot(True) 417 | else: 418 | world.restart() 419 | elif event.key == K_F1: 420 | world.hud.toggle_info() 421 | elif event.key == K_v and pygame.key.get_mods() & KMOD_SHIFT: 422 | world.next_map_layer(reverse=True) 423 | elif event.key == K_v: 424 | world.next_map_layer() 425 | elif event.key == K_b and pygame.key.get_mods() & KMOD_SHIFT: 426 | world.load_map_layer(unload=True) 427 | elif event.key == K_b: 428 | world.load_map_layer() 429 | elif event.key == K_h or (event.key == K_SLASH and pygame.key.get_mods() & KMOD_SHIFT): 430 | world.hud.help.toggle() 431 | elif event.key == K_TAB: 432 | world.camera_manager.toggle_camera() 433 | elif event.key == K_c and pygame.key.get_mods() & KMOD_SHIFT: 434 | world.next_weather(reverse=True) 435 | elif event.key == K_c: 436 | world.next_weather() 437 | elif event.key == K_g: 438 | world.toggle_radar() 439 | elif event.key == K_BACKQUOTE: 440 | world.camera_manager.next_sensor() 441 | elif event.key == K_n: 442 | world.camera_manager.next_sensor() 443 | elif event.key == K_w and (pygame.key.get_mods() & KMOD_CTRL): 444 | if world.constant_velocity_enabled: 445 | world.player.disable_constant_velocity() 446 | world.constant_velocity_enabled = False 447 | world.hud.notification("Disabled Constant Velocity Mode") 448 | else: 449 | world.player.enable_constant_velocity(carla.Vector3D(17, 0, 0)) 450 | world.constant_velocity_enabled = True 451 | world.hud.notification("Enabled Constant Velocity Mode at 60 km/h") 452 | elif event.key == K_o: 453 | try: 454 | if world.doors_are_open: 455 | world.hud.notification("Closing Doors") 456 | world.doors_are_open = False 457 | world.player.close_door(carla.VehicleDoor.All) 458 | else: 459 | world.hud.notification("Opening doors") 460 | world.doors_are_open = True 461 | world.player.open_door(carla.VehicleDoor.All) 462 | except Exception: 463 | pass 464 | elif event.key == K_t: 465 | if world.show_vehicle_telemetry: 466 | world.player.show_debug_telemetry(False) 467 | world.show_vehicle_telemetry = False 468 | world.hud.notification("Disabled Vehicle Telemetry") 469 | else: 470 | try: 471 | world.player.show_debug_telemetry(True) 472 | world.show_vehicle_telemetry = True 473 | world.hud.notification("Enabled Vehicle Telemetry") 474 | except Exception: 475 | pass 476 | elif event.key > K_0 and event.key <= K_9: 477 | index_ctrl = 0 478 | if pygame.key.get_mods() & KMOD_CTRL: 479 | index_ctrl = 9 480 | world.camera_manager.set_sensor(event.key - 1 - K_0 + index_ctrl) 481 | elif event.key == K_r and not 
(pygame.key.get_mods() & KMOD_CTRL): 482 | world.camera_manager.toggle_recording() 483 | elif event.key == K_r and (pygame.key.get_mods() & KMOD_CTRL): 484 | if (world.recording_enabled): 485 | client.stop_recorder() 486 | world.recording_enabled = False 487 | world.hud.notification("Recorder is OFF") 488 | else: 489 | client.start_recorder("manual_recording.rec") 490 | world.recording_enabled = True 491 | world.hud.notification("Recorder is ON") 492 | elif event.key == K_p and (pygame.key.get_mods() & KMOD_CTRL): 493 | # stop recorder 494 | client.stop_recorder() 495 | world.recording_enabled = False 496 | # work around to fix camera at start of replaying 497 | current_index = world.camera_manager.index 498 | world.destroy_sensors() 499 | # disable autopilot 500 | self._autopilot_enabled = False 501 | world.player.set_autopilot(self._autopilot_enabled) 502 | world.hud.notification("Replaying file 'manual_recording.rec'") 503 | # replayer 504 | client.replay_file("manual_recording.rec", world.recording_start, 0, 0) 505 | world.camera_manager.set_sensor(current_index) 506 | elif event.key == K_MINUS and (pygame.key.get_mods() & KMOD_CTRL): 507 | if pygame.key.get_mods() & KMOD_SHIFT: 508 | world.recording_start -= 10 509 | else: 510 | world.recording_start -= 1 511 | world.hud.notification("Recording start time is %d" % (world.recording_start)) 512 | elif event.key == K_EQUALS and (pygame.key.get_mods() & KMOD_CTRL): 513 | if pygame.key.get_mods() & KMOD_SHIFT: 514 | world.recording_start += 10 515 | else: 516 | world.recording_start += 1 517 | world.hud.notification("Recording start time is %d" % (world.recording_start)) 518 | if isinstance(self._control, carla.VehicleControl): 519 | if event.key == K_q: 520 | self._control.gear = 1 if self._control.reverse else -1 521 | elif event.key == K_m: 522 | self._control.manual_gear_shift = not self._control.manual_gear_shift 523 | self._control.gear = world.player.get_control().gear 524 | world.hud.notification('%s Transmission' % 525 | ('Manual' if self._control.manual_gear_shift else 'Automatic')) 526 | elif self._control.manual_gear_shift and event.key == K_COMMA: 527 | self._control.gear = max(-1, self._control.gear - 1) 528 | elif self._control.manual_gear_shift and event.key == K_PERIOD: 529 | self._control.gear = self._control.gear + 1 530 | elif event.key == K_p and not pygame.key.get_mods() & KMOD_CTRL: 531 | if not self._autopilot_enabled and not sync_mode: 532 | print("WARNING: You are currently in asynchronous mode and could " 533 | "experience some issues with the traffic simulation") 534 | self._autopilot_enabled = not self._autopilot_enabled 535 | world.player.set_autopilot(self._autopilot_enabled) 536 | world.hud.notification( 537 | 'Autopilot %s' % ('On' if self._autopilot_enabled else 'Off')) 538 | elif event.key == K_l and pygame.key.get_mods() & KMOD_CTRL: 539 | current_lights ^= carla.VehicleLightState.Special1 540 | elif event.key == K_l and pygame.key.get_mods() & KMOD_SHIFT: 541 | current_lights ^= carla.VehicleLightState.HighBeam 542 | elif event.key == K_l: 543 | # Use 'L' key to switch between lights: 544 | # closed -> position -> low beam -> fog 545 | if not self._lights & carla.VehicleLightState.Position: 546 | world.hud.notification("Position lights") 547 | current_lights |= carla.VehicleLightState.Position 548 | else: 549 | world.hud.notification("Low beam lights") 550 | current_lights |= carla.VehicleLightState.LowBeam 551 | if self._lights & carla.VehicleLightState.LowBeam: 552 | world.hud.notification("Fog 
lights") 553 | current_lights |= carla.VehicleLightState.Fog 554 | if self._lights & carla.VehicleLightState.Fog: 555 | world.hud.notification("Lights off") 556 | current_lights ^= carla.VehicleLightState.Position 557 | current_lights ^= carla.VehicleLightState.LowBeam 558 | current_lights ^= carla.VehicleLightState.Fog 559 | elif event.key == K_i: 560 | current_lights ^= carla.VehicleLightState.Interior 561 | elif event.key == K_z: 562 | current_lights ^= carla.VehicleLightState.LeftBlinker 563 | elif event.key == K_x: 564 | current_lights ^= carla.VehicleLightState.RightBlinker 565 | 566 | if not self._autopilot_enabled: 567 | if isinstance(self._control, carla.VehicleControl): 568 | self._parse_vehicle_keys(pygame.key.get_pressed(), clock.get_time()) 569 | self._control.reverse = self._control.gear < 0 570 | # Set automatic control-related vehicle lights 571 | if self._control.brake: 572 | current_lights |= carla.VehicleLightState.Brake 573 | else: # Remove the Brake flag 574 | current_lights &= ~carla.VehicleLightState.Brake 575 | if self._control.reverse: 576 | current_lights |= carla.VehicleLightState.Reverse 577 | else: # Remove the Reverse flag 578 | current_lights &= ~carla.VehicleLightState.Reverse 579 | if current_lights != self._lights: # Change the light state only if necessary 580 | self._lights = current_lights 581 | world.player.set_light_state(carla.VehicleLightState(self._lights)) 582 | elif isinstance(self._control, carla.WalkerControl): 583 | self._parse_walker_keys(pygame.key.get_pressed(), clock.get_time(), world) 584 | world.player.apply_control(self._control) 585 | 586 | def _parse_vehicle_keys(self, keys, milliseconds): 587 | if keys[K_UP] or keys[K_w]: 588 | self._control.throttle = min(self._control.throttle + 0.01, 1.00) 589 | else: 590 | self._control.throttle = 0.0 591 | 592 | if keys[K_DOWN] or keys[K_s]: 593 | self._control.brake = min(self._control.brake + 0.2, 1) 594 | else: 595 | self._control.brake = 0 596 | 597 | steer_increment = 5e-4 * milliseconds 598 | if keys[K_LEFT] or keys[K_a]: 599 | if self._steer_cache > 0: 600 | self._steer_cache = 0 601 | else: 602 | self._steer_cache -= steer_increment 603 | elif keys[K_RIGHT] or keys[K_d]: 604 | if self._steer_cache < 0: 605 | self._steer_cache = 0 606 | else: 607 | self._steer_cache += steer_increment 608 | else: 609 | self._steer_cache = 0.0 610 | self._steer_cache = min(0.7, max(-0.7, self._steer_cache)) 611 | self._control.steer = round(self._steer_cache, 1) 612 | self._control.hand_brake = keys[K_SPACE] 613 | 614 | def _parse_walker_keys(self, keys, milliseconds, world): 615 | self._control.speed = 0.0 616 | if keys[K_DOWN] or keys[K_s]: 617 | self._control.speed = 0.0 618 | if keys[K_LEFT] or keys[K_a]: 619 | self._control.speed = .01 620 | self._rotation.yaw -= 0.08 * milliseconds 621 | if keys[K_RIGHT] or keys[K_d]: 622 | self._control.speed = .01 623 | self._rotation.yaw += 0.08 * milliseconds 624 | if keys[K_UP] or keys[K_w]: 625 | self._control.speed = world.player_max_speed_fast if pygame.key.get_mods() & KMOD_SHIFT else world.player_max_speed 626 | self._control.jump = keys[K_SPACE] 627 | self._rotation.yaw = round(self._rotation.yaw, 1) 628 | self._control.direction = self._rotation.get_forward_vector() 629 | 630 | @staticmethod 631 | def _is_quit_shortcut(key): 632 | return (key == K_ESCAPE) or (key == K_q and pygame.key.get_mods() & KMOD_CTRL) 633 | 634 | 635 | # ============================================================================== 636 | # -- HUD 
----------------------------------------------------------------------- 637 | # ============================================================================== 638 | 639 | 640 | class HUD(object): 641 | def __init__(self, width, height): 642 | self.dim = (width, height) 643 | font = pygame.font.Font(pygame.font.get_default_font(), 20) 644 | font_name = 'courier' if os.name == 'nt' else 'mono' 645 | fonts = [x for x in pygame.font.get_fonts() if font_name in x] 646 | default_font = 'ubuntumono' 647 | mono = default_font if default_font in fonts else fonts[0] 648 | mono = pygame.font.match_font(mono) 649 | self._font_mono = pygame.font.Font(mono, 12 if os.name == 'nt' else 14) 650 | self._notifications = FadingText(font, (width, 40), (0, height - 40)) 651 | self.help = HelpText(pygame.font.Font(mono, 16), width, height) 652 | self.server_fps = 0 653 | self.frame = 0 654 | self.simulation_time = 0 655 | self._show_info = True 656 | self._info_text = [] 657 | self._server_clock = pygame.time.Clock() 658 | 659 | def on_world_tick(self, timestamp): 660 | self._server_clock.tick() 661 | self.server_fps = self._server_clock.get_fps() 662 | self.frame = timestamp.frame 663 | self.simulation_time = timestamp.elapsed_seconds 664 | 665 | def tick(self, world, clock): 666 | self._notifications.tick(world, clock) 667 | if not self._show_info: 668 | return 669 | t = world.player.get_transform() 670 | v = world.player.get_velocity() 671 | c = world.player.get_control() 672 | compass = world.imu_sensor.compass 673 | heading = 'N' if compass > 270.5 or compass < 89.5 else '' 674 | heading += 'S' if 90.5 < compass < 269.5 else '' 675 | heading += 'E' if 0.5 < compass < 179.5 else '' 676 | heading += 'W' if 180.5 < compass < 359.5 else '' 677 | colhist = world.collision_sensor.get_collision_history() 678 | collision = [colhist[x + self.frame - 200] for x in range(0, 200)] 679 | max_col = max(1.0, max(collision)) 680 | collision = [x / max_col for x in collision] 681 | vehicles = world.world.get_actors().filter('vehicle.*') 682 | self._info_text = [ 683 | 'Server: % 16.0f FPS' % self.server_fps, 684 | 'Client: % 16.0f FPS' % clock.get_fps(), 685 | '', 686 | 'Vehicle: % 20s' % get_actor_display_name(world.player, truncate=20), 687 | 'Map: % 20s' % world.map.name.split('/')[-1], 688 | 'Simulation time: % 12s' % datetime.timedelta(seconds=int(self.simulation_time)), 689 | '', 690 | 'Speed: % 15.0f km/h' % (3.6 * math.sqrt(v.x**2 + v.y**2 + v.z**2)), 691 | u'Compass:% 17.0f\N{DEGREE SIGN} % 2s' % (compass, heading), 692 | 'Accelero: (%5.1f,%5.1f,%5.1f)' % (world.imu_sensor.accelerometer), 693 | 'Gyroscop: (%5.1f,%5.1f,%5.1f)' % (world.imu_sensor.gyroscope), 694 | 'Location:% 20s' % ('(% 5.1f, % 5.1f)' % (t.location.x, t.location.y)), 695 | 'GNSS:% 24s' % ('(% 2.6f, % 3.6f)' % (world.gnss_sensor.lat, world.gnss_sensor.lon)), 696 | 'Height: % 18.0f m' % t.location.z, 697 | ''] 698 | if isinstance(c, carla.VehicleControl): 699 | self._info_text += [ 700 | ('Throttle:', c.throttle, 0.0, 1.0), 701 | ('Steer:', c.steer, -1.0, 1.0), 702 | ('Brake:', c.brake, 0.0, 1.0), 703 | ('Reverse:', c.reverse), 704 | ('Hand brake:', c.hand_brake), 705 | ('Manual:', c.manual_gear_shift), 706 | 'Gear: %s' % {-1: 'R', 0: 'N'}.get(c.gear, c.gear)] 707 | elif isinstance(c, carla.WalkerControl): 708 | self._info_text += [ 709 | ('Speed:', c.speed, 0.0, 5.556), 710 | ('Jump:', c.jump)] 711 | self._info_text += [ 712 | '', 713 | 'Collision:', 714 | collision, 715 | '', 716 | 'Number of vehicles: % 8d' % len(vehicles)] 717 | if len(vehicles) 
> 1: 718 | self._info_text += ['Nearby vehicles:'] 719 | distance = lambda l: math.sqrt((l.x - t.location.x)**2 + (l.y - t.location.y)**2 + (l.z - t.location.z)**2) 720 | vehicles = [(distance(x.get_location()), x) for x in vehicles if x.id != world.player.id] 721 | for d, vehicle in sorted(vehicles, key=lambda vehicles: vehicles[0]): 722 | if d > 200.0: 723 | break 724 | vehicle_type = get_actor_display_name(vehicle, truncate=22) 725 | self._info_text.append('% 4dm %s' % (d, vehicle_type)) 726 | 727 | def toggle_info(self): 728 | self._show_info = not self._show_info 729 | 730 | def notification(self, text, seconds=2.0): 731 | self._notifications.set_text(text, seconds=seconds) 732 | 733 | def error(self, text): 734 | self._notifications.set_text('Error: %s' % text, (255, 0, 0)) 735 | 736 | def render(self, display): 737 | if self._show_info: 738 | info_surface = pygame.Surface((220, self.dim[1])) 739 | info_surface.set_alpha(100) 740 | display.blit(info_surface, (0, 0)) 741 | v_offset = 4 742 | bar_h_offset = 100 743 | bar_width = 106 744 | for item in self._info_text: 745 | if v_offset + 18 > self.dim[1]: 746 | break 747 | if isinstance(item, list): 748 | if len(item) > 1: 749 | points = [(x + 8, v_offset + 8 + (1.0 - y) * 30) for x, y in enumerate(item)] 750 | pygame.draw.lines(display, (255, 136, 0), False, points, 2) 751 | item = None 752 | v_offset += 18 753 | elif isinstance(item, tuple): 754 | if isinstance(item[1], bool): 755 | rect = pygame.Rect((bar_h_offset, v_offset + 8), (6, 6)) 756 | pygame.draw.rect(display, (255, 255, 255), rect, 0 if item[1] else 1) 757 | else: 758 | rect_border = pygame.Rect((bar_h_offset, v_offset + 8), (bar_width, 6)) 759 | pygame.draw.rect(display, (255, 255, 255), rect_border, 1) 760 | f = (item[1] - item[2]) / (item[3] - item[2]) 761 | if item[2] < 0.0: 762 | rect = pygame.Rect((bar_h_offset + f * (bar_width - 6), v_offset + 8), (6, 6)) 763 | else: 764 | rect = pygame.Rect((bar_h_offset, v_offset + 8), (f * bar_width, 6)) 765 | pygame.draw.rect(display, (255, 255, 255), rect) 766 | item = item[0] 767 | if item: # At this point has to be a str. 
768 | surface = self._font_mono.render(item, True, (255, 255, 255)) 769 | display.blit(surface, (8, v_offset)) 770 | v_offset += 18 771 | self._notifications.render(display) 772 | self.help.render(display) 773 | 774 | 775 | # ============================================================================== 776 | # -- FadingText ---------------------------------------------------------------- 777 | # ============================================================================== 778 | 779 | 780 | class FadingText(object): 781 | def __init__(self, font, dim, pos): 782 | self.font = font 783 | self.dim = dim 784 | self.pos = pos 785 | self.seconds_left = 0 786 | self.surface = pygame.Surface(self.dim) 787 | 788 | def set_text(self, text, color=(255, 255, 255), seconds=2.0): 789 | text_texture = self.font.render(text, True, color) 790 | self.surface = pygame.Surface(self.dim) 791 | self.seconds_left = seconds 792 | self.surface.fill((0, 0, 0, 0)) 793 | self.surface.blit(text_texture, (10, 11)) 794 | 795 | def tick(self, _, clock): 796 | delta_seconds = 1e-3 * clock.get_time() 797 | self.seconds_left = max(0.0, self.seconds_left - delta_seconds) 798 | self.surface.set_alpha(500.0 * self.seconds_left) 799 | 800 | def render(self, display): 801 | display.blit(self.surface, self.pos) 802 | 803 | 804 | # ============================================================================== 805 | # -- HelpText ------------------------------------------------------------------ 806 | # ============================================================================== 807 | 808 | 809 | class HelpText(object): 810 | """Helper class to handle text output using pygame""" 811 | def __init__(self, font, width, height): 812 | lines = __doc__.split('\n') 813 | self.font = font 814 | self.line_space = 18 815 | self.dim = (780, len(lines) * self.line_space + 12) 816 | self.pos = (0.5 * width - 0.5 * self.dim[0], 0.5 * height - 0.5 * self.dim[1]) 817 | self.seconds_left = 0 818 | self.surface = pygame.Surface(self.dim) 819 | self.surface.fill((0, 0, 0, 0)) 820 | for n, line in enumerate(lines): 821 | text_texture = self.font.render(line, True, (255, 255, 255)) 822 | self.surface.blit(text_texture, (22, n * self.line_space)) 823 | self._render = False 824 | self.surface.set_alpha(220) 825 | 826 | def toggle(self): 827 | self._render = not self._render 828 | 829 | def render(self, display): 830 | if self._render: 831 | display.blit(self.surface, self.pos) 832 | 833 | 834 | # ============================================================================== 835 | # -- CollisionSensor ----------------------------------------------------------- 836 | # ============================================================================== 837 | 838 | 839 | class CollisionSensor(object): 840 | def __init__(self, parent_actor, hud): 841 | self.sensor = None 842 | self.history = [] 843 | self._parent = parent_actor 844 | self.hud = hud 845 | world = self._parent.get_world() 846 | bp = world.get_blueprint_library().find('sensor.other.collision') 847 | self.sensor = world.spawn_actor(bp, carla.Transform(), attach_to=self._parent) 848 | # We need to pass the lambda a weak reference to self to avoid circular 849 | # reference. 
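# The history below is capped at the last 4000 impulse readings and summed per frame in
# get_collision_history(), which the HUD uses to plot a rolling collision-intensity graph.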
850 | weak_self = weakref.ref(self) 851 | self.sensor.listen(lambda event: CollisionSensor._on_collision(weak_self, event)) 852 | 853 | def get_collision_history(self): 854 | history = collections.defaultdict(int) 855 | for frame, intensity in self.history: 856 | history[frame] += intensity 857 | return history 858 | 859 | @staticmethod 860 | def _on_collision(weak_self, event): 861 | self = weak_self() 862 | if not self: 863 | return 864 | actor_type = get_actor_display_name(event.other_actor) 865 | self.hud.notification('Collision with %r' % actor_type) 866 | impulse = event.normal_impulse 867 | intensity = math.sqrt(impulse.x**2 + impulse.y**2 + impulse.z**2) 868 | self.history.append((event.frame, intensity)) 869 | if len(self.history) > 4000: 870 | self.history.pop(0) 871 | 872 | 873 | # ============================================================================== 874 | # -- LaneInvasionSensor -------------------------------------------------------- 875 | # ============================================================================== 876 | 877 | 878 | class LaneInvasionSensor(object): 879 | def __init__(self, parent_actor, hud): 880 | self.sensor = None 881 | 882 | # If the spawn object is not a vehicle, we cannot use the Lane Invasion Sensor 883 | if parent_actor.type_id.startswith("vehicle."): 884 | self._parent = parent_actor 885 | self.hud = hud 886 | world = self._parent.get_world() 887 | bp = world.get_blueprint_library().find('sensor.other.lane_invasion') 888 | self.sensor = world.spawn_actor(bp, carla.Transform(), attach_to=self._parent) 889 | # We need to pass the lambda a weak reference to self to avoid circular 890 | # reference. 891 | weak_self = weakref.ref(self) 892 | self.sensor.listen(lambda event: LaneInvasionSensor._on_invasion(weak_self, event)) 893 | 894 | @staticmethod 895 | def _on_invasion(weak_self, event): 896 | self = weak_self() 897 | if not self: 898 | return 899 | lane_types = set(x.type for x in event.crossed_lane_markings) 900 | text = ['%r' % str(x).split()[-1] for x in lane_types] 901 | self.hud.notification('Crossed line %s' % ' and '.join(text)) 902 | 903 | 904 | # ============================================================================== 905 | # -- GnssSensor ---------------------------------------------------------------- 906 | # ============================================================================== 907 | 908 | 909 | class GnssSensor(object): 910 | def __init__(self, parent_actor): 911 | self.sensor = None 912 | self._parent = parent_actor 913 | self.lat = 0.0 914 | self.lon = 0.0 915 | world = self._parent.get_world() 916 | bp = world.get_blueprint_library().find('sensor.other.gnss') 917 | self.sensor = world.spawn_actor(bp, carla.Transform(carla.Location(x=1.0, z=2.8)), attach_to=self._parent) 918 | # We need to pass the lambda a weak reference to self to avoid circular 919 | # reference. 
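# The GNSS callback only records the latest latitude/longitude; the HUD reads self.lat and
# self.lon on every tick to render the 'GNSS:' line.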
920 | weak_self = weakref.ref(self) 921 | self.sensor.listen(lambda event: GnssSensor._on_gnss_event(weak_self, event)) 922 | 923 | @staticmethod 924 | def _on_gnss_event(weak_self, event): 925 | self = weak_self() 926 | if not self: 927 | return 928 | self.lat = event.latitude 929 | self.lon = event.longitude 930 | 931 | 932 | # ============================================================================== 933 | # -- IMUSensor ----------------------------------------------------------------- 934 | # ============================================================================== 935 | 936 | 937 | class IMUSensor(object): 938 | def __init__(self, parent_actor): 939 | self.sensor = None 940 | self._parent = parent_actor 941 | self.accelerometer = (0.0, 0.0, 0.0) 942 | self.gyroscope = (0.0, 0.0, 0.0) 943 | self.compass = 0.0 944 | world = self._parent.get_world() 945 | bp = world.get_blueprint_library().find('sensor.other.imu') 946 | self.sensor = world.spawn_actor( 947 | bp, carla.Transform(), attach_to=self._parent) 948 | # We need to pass the lambda a weak reference to self to avoid circular 949 | # reference. 950 | weak_self = weakref.ref(self) 951 | self.sensor.listen( 952 | lambda sensor_data: IMUSensor._IMU_callback(weak_self, sensor_data)) 953 | 954 | @staticmethod 955 | def _IMU_callback(weak_self, sensor_data): 956 | self = weak_self() 957 | if not self: 958 | return 959 | limits = (-99.9, 99.9) 960 | self.accelerometer = ( 961 | max(limits[0], min(limits[1], sensor_data.accelerometer.x)), 962 | max(limits[0], min(limits[1], sensor_data.accelerometer.y)), 963 | max(limits[0], min(limits[1], sensor_data.accelerometer.z))) 964 | self.gyroscope = ( 965 | max(limits[0], min(limits[1], math.degrees(sensor_data.gyroscope.x))), 966 | max(limits[0], min(limits[1], math.degrees(sensor_data.gyroscope.y))), 967 | max(limits[0], min(limits[1], math.degrees(sensor_data.gyroscope.z)))) 968 | self.compass = math.degrees(sensor_data.compass) 969 | 970 | 971 | # ============================================================================== 972 | # -- RadarSensor --------------------------------------------------------------- 973 | # ============================================================================== 974 | 975 | 976 | class RadarSensor(object): 977 | def __init__(self, parent_actor): 978 | self.sensor = None 979 | self._parent = parent_actor 980 | bound_x = 0.5 + self._parent.bounding_box.extent.x 981 | bound_y = 0.5 + self._parent.bounding_box.extent.y 982 | bound_z = 0.5 + self._parent.bounding_box.extent.z 983 | 984 | self.velocity_range = 7.5 # m/s 985 | world = self._parent.get_world() 986 | self.debug = world.debug 987 | bp = world.get_blueprint_library().find('sensor.other.radar') 988 | bp.set_attribute('horizontal_fov', str(35)) 989 | bp.set_attribute('vertical_fov', str(20)) 990 | self.sensor = world.spawn_actor( 991 | bp, 992 | carla.Transform( 993 | carla.Location(x=bound_x + 0.05, z=bound_z+0.05), 994 | carla.Rotation(pitch=5)), 995 | attach_to=self._parent) 996 | # We need a weak reference to self to avoid circular reference. 
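# _Radar_callback below draws one short-lived debug point per radar detection, colour-coded by the
# detection's radial velocity normalised against self.velocity_range.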
997 | weak_self = weakref.ref(self) 998 | self.sensor.listen( 999 | lambda radar_data: RadarSensor._Radar_callback(weak_self, radar_data)) 1000 | 1001 | @staticmethod 1002 | def _Radar_callback(weak_self, radar_data): 1003 | self = weak_self() 1004 | if not self: 1005 | return 1006 | # To get a numpy [[vel, altitude, azimuth, depth],...[,,,]]: 1007 | # points = np.frombuffer(radar_data.raw_data, dtype=np.dtype('f4')) 1008 | # points = np.reshape(points, (len(radar_data), 4)) 1009 | 1010 | current_rot = radar_data.transform.rotation 1011 | for detect in radar_data: 1012 | azi = math.degrees(detect.azimuth) 1013 | alt = math.degrees(detect.altitude) 1014 | # The 0.25 adjusts a bit the distance so the dots can 1015 | # be properly seen 1016 | fw_vec = carla.Vector3D(x=detect.depth - 0.25) 1017 | carla.Transform( 1018 | carla.Location(), 1019 | carla.Rotation( 1020 | pitch=current_rot.pitch + alt, 1021 | yaw=current_rot.yaw + azi, 1022 | roll=current_rot.roll)).transform(fw_vec) 1023 | 1024 | def clamp(min_v, max_v, value): 1025 | return max(min_v, min(value, max_v)) 1026 | 1027 | norm_velocity = detect.velocity / self.velocity_range # range [-1, 1] 1028 | r = int(clamp(0.0, 1.0, 1.0 - norm_velocity) * 255.0) 1029 | g = int(clamp(0.0, 1.0, 1.0 - abs(norm_velocity)) * 255.0) 1030 | b = int(abs(clamp(- 1.0, 0.0, - 1.0 - norm_velocity)) * 255.0) 1031 | self.debug.draw_point( 1032 | radar_data.transform.location + fw_vec, 1033 | size=0.075, 1034 | life_time=0.06, 1035 | persistent_lines=False, 1036 | color=carla.Color(r, g, b)) 1037 | 1038 | # ============================================================================== 1039 | # -- CameraManager ------------------------------------------------------------- 1040 | # ============================================================================== 1041 | 1042 | 1043 | class CameraManager(object): 1044 | def __init__(self, parent_actor, hud, gamma_correction): 1045 | self.sensor = None 1046 | self.surface = None 1047 | self._parent = parent_actor 1048 | self.hud = hud 1049 | # Neel modified the self.recording to True from False 1050 | self.recording = True 1051 | bound_x = 0.5 + self._parent.bounding_box.extent.x 1052 | bound_y = 0.5 + self._parent.bounding_box.extent.y 1053 | bound_z = 0.5 + self._parent.bounding_box.extent.z 1054 | Attachment = carla.AttachmentType 1055 | #world = self._parent.get_world() 1056 | #c = world.player.get_control() 1057 | #self.key_control = KeyboardControl(world, False) 1058 | 1059 | if not self._parent.type_id.startswith("walker.pedestrian"): 1060 | self._camera_transforms = [ 1061 | (carla.Transform(carla.Location(x=-2.0*bound_x, y=+0.0*bound_y, z=2.0*bound_z), carla.Rotation(pitch=8.0)), Attachment.SpringArm), 1062 | (carla.Transform(carla.Location(x=+0.8*bound_x, y=+0.0*bound_y, z=1.3*bound_z)), Attachment.Rigid), 1063 | (carla.Transform(carla.Location(x=+1.9*bound_x, y=+1.0*bound_y, z=1.2*bound_z)), Attachment.SpringArm), 1064 | (carla.Transform(carla.Location(x=-2.8*bound_x, y=+0.0*bound_y, z=4.6*bound_z), carla.Rotation(pitch=6.0)), Attachment.SpringArm), 1065 | (carla.Transform(carla.Location(x=-1.0, y=-1.0*bound_y, z=0.4*bound_z)), Attachment.Rigid)] 1066 | else: 1067 | self._camera_transforms = [ 1068 | (carla.Transform(carla.Location(x=-2.5, z=0.0), carla.Rotation(pitch=-8.0)), Attachment.SpringArm), 1069 | (carla.Transform(carla.Location(x=1.6, z=1.7)), Attachment.Rigid), 1070 | (carla.Transform(carla.Location(x=2.5, y=0.5, z=0.0), carla.Rotation(pitch=-8.0)), Attachment.SpringArm), 1071 | 
(carla.Transform(carla.Location(x=-4.0, z=2.0), carla.Rotation(pitch=6.0)), Attachment.SpringArm), 1072 | (carla.Transform(carla.Location(x=0, y=-2.5, z=-0.0), carla.Rotation(yaw=90.0)), Attachment.Rigid)] 1073 | 1074 | self.transform_index = 1 1075 | self.sensors = [ 1076 | ['sensor.camera.rgb', cc.Raw, 'Camera RGB', {}], 1077 | ['sensor.camera.depth', cc.Raw, 'Camera Depth (Raw)', {}], 1078 | ['sensor.camera.depth', cc.Depth, 'Camera Depth (Gray Scale)', {}], 1079 | ['sensor.camera.depth', cc.LogarithmicDepth, 'Camera Depth (Logarithmic Gray Scale)', {}], 1080 | ['sensor.camera.semantic_segmentation', cc.Raw, 'Camera Semantic Segmentation (Raw)', {}], 1081 | ['sensor.camera.semantic_segmentation', cc.CityScapesPalette, 'Camera Semantic Segmentation (CityScapes Palette)', {}], 1082 | ['sensor.camera.instance_segmentation', cc.CityScapesPalette, 'Camera Instance Segmentation (CityScapes Palette)', {}], 1083 | ['sensor.camera.instance_segmentation', cc.Raw, 'Camera Instance Segmentation (Raw)', {}], 1084 | ['sensor.lidar.ray_cast', None, 'Lidar (Ray-Cast)', {'range': '50'}], 1085 | ['sensor.camera.dvs', cc.Raw, 'Dynamic Vision Sensor', {}], 1086 | ['sensor.camera.rgb', cc.Raw, 'Camera RGB Distorted', 1087 | {'lens_circle_multiplier': '3.0', 1088 | 'lens_circle_falloff': '3.0', 1089 | 'chromatic_aberration_intensity': '0.5', 1090 | 'chromatic_aberration_offset': '0'}], 1091 | ['sensor.camera.optical_flow', cc.Raw, 'Optical Flow', {}], 1092 | ] 1093 | world = self._parent.get_world() 1094 | bp_library = world.get_blueprint_library() 1095 | for item in self.sensors: 1096 | bp = bp_library.find(item[0]) 1097 | if item[0].startswith('sensor.camera'): 1098 | bp.set_attribute('image_size_x', str(hud.dim[0])) 1099 | bp.set_attribute('image_size_y', str(hud.dim[1])) 1100 | if bp.has_attribute('gamma'): 1101 | bp.set_attribute('gamma', str(gamma_correction)) 1102 | for attr_name, attr_value in item[3].items(): 1103 | bp.set_attribute(attr_name, attr_value) 1104 | elif item[0].startswith('sensor.lidar'): 1105 | self.lidar_range = 50 1106 | 1107 | for attr_name, attr_value in item[3].items(): 1108 | bp.set_attribute(attr_name, attr_value) 1109 | if attr_name == 'range': 1110 | self.lidar_range = float(attr_value) 1111 | 1112 | item.append(bp) 1113 | self.index = None 1114 | 1115 | def toggle_camera(self): 1116 | self.transform_index = (self.transform_index + 1) % len(self._camera_transforms) 1117 | self.set_sensor(self.index, notify=False, force_respawn=True) 1118 | 1119 | def set_sensor(self, index, notify=True, force_respawn=False): 1120 | index = index % len(self.sensors) 1121 | needs_respawn = True if self.index is None else \ 1122 | (force_respawn or (self.sensors[index][2] != self.sensors[self.index][2])) 1123 | if needs_respawn: 1124 | if self.sensor is not None: 1125 | self.sensor.destroy() 1126 | self.surface = None 1127 | self.sensor = self._parent.get_world().spawn_actor( 1128 | self.sensors[index][-1], 1129 | self._camera_transforms[self.transform_index][0], 1130 | attach_to=self._parent, 1131 | attachment_type=self._camera_transforms[self.transform_index][1]) 1132 | # We need to pass the lambda a weak reference to self to avoid 1133 | # circular reference. 
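# Since self.recording is initialised to True in __init__, every RGB frame handled by _parse_image
# is also passed to the module-level rgb_callback (redefined inside game_loop to dump steering and
# throttle as JSON) and saved to /opt/carla-output/images/<ddmmYYYY-HHMM>/ on disk.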
1134 | weak_self = weakref.ref(self) 1135 | self.sensor.listen(lambda image: CameraManager._parse_image(weak_self, image)) 1136 | # Neel chaged/ added the below 2 lines and commented the default callback line above 1137 | #key_control = carla.VehicleControl() 1138 | #self.sensor.listen(lambda image: CameraManager.rgb_callback(self, image)) 1139 | if notify: 1140 | self.hud.notification(self.sensors[index][2]) 1141 | self.index = index 1142 | 1143 | def next_sensor(self): 1144 | self.set_sensor(self.index + 1) 1145 | 1146 | def toggle_recording(self): 1147 | self.recording = not self.recording 1148 | self.hud.notification('Recording %s' % ('On' if self.recording else 'Off')) 1149 | 1150 | def render(self, display): 1151 | if self.surface is not None: 1152 | display.blit(self.surface, (0, 0)) 1153 | 1154 | 1155 | def _parse_image(weak_self, image): 1156 | self = weak_self() 1157 | if not self: 1158 | return 1159 | if self.sensors[self.index][0].startswith('sensor.lidar'): 1160 | points = np.frombuffer(image.raw_data, dtype=np.dtype('f4')) 1161 | points = np.reshape(points, (int(points.shape[0] / 4), 4)) 1162 | lidar_data = np.array(points[:, :2]) 1163 | lidar_data *= min(self.hud.dim) / (2.0 * self.lidar_range) 1164 | lidar_data += (0.5 * self.hud.dim[0], 0.5 * self.hud.dim[1]) 1165 | lidar_data = np.fabs(lidar_data) # pylint: disable=E1111 1166 | lidar_data = lidar_data.astype(np.int32) 1167 | lidar_data = np.reshape(lidar_data, (-1, 2)) 1168 | lidar_img_size = (self.hud.dim[0], self.hud.dim[1], 3) 1169 | lidar_img = np.zeros((lidar_img_size), dtype=np.uint8) 1170 | lidar_img[tuple(lidar_data.T)] = (255, 255, 255) 1171 | self.surface = pygame.surfarray.make_surface(lidar_img) 1172 | elif self.sensors[self.index][0].startswith('sensor.camera.dvs'): 1173 | # Example of converting the raw_data from a carla.DVSEventArray 1174 | # sensor into a NumPy array and using it as an image 1175 | dvs_events = np.frombuffer(image.raw_data, dtype=np.dtype([ 1176 | ('x', np.uint16), ('y', np.uint16), ('t', np.int64), ('pol', np.bool)])) 1177 | dvs_img = np.zeros((image.height, image.width, 3), dtype=np.uint8) 1178 | # Blue is positive, red is negative 1179 | dvs_img[dvs_events[:]['y'], dvs_events[:]['x'], dvs_events[:]['pol'] * 2] = 255 1180 | self.surface = pygame.surfarray.make_surface(dvs_img.swapaxes(0, 1)) 1181 | elif self.sensors[self.index][0].startswith('sensor.camera.optical_flow'): 1182 | image = image.get_color_coded_flow() 1183 | array = np.frombuffer(image.raw_data, dtype=np.dtype("uint8")) 1184 | array = np.reshape(array, (image.height, image.width, 4)) 1185 | array = array[:, :, :3] 1186 | array = array[:, :, ::-1] 1187 | self.surface = pygame.surfarray.make_surface(array.swapaxes(0, 1)) 1188 | else: 1189 | image.convert(self.sensors[self.index][1]) 1190 | array = np.frombuffer(image.raw_data, dtype=np.dtype("uint8")) 1191 | array = np.reshape(array, (image.height, image.width, 4)) 1192 | array = array[:, :, :3] 1193 | array = array[:, :, ::-1] 1194 | self.surface = pygame.surfarray.make_surface(array.swapaxes(0, 1)) 1195 | if self.recording: 1196 | # Changes by Neel 1197 | rgb_callback(self, image) 1198 | image.save_to_disk(f'/opt/carla-output/images/{date_string}/image_%06d.jpg' % image.frame) 1199 | 1200 | """ 1201 | # Neel added this callback Function 1202 | 1203 | def rgb_callback(self, image): 1204 | #image.save_to_disk('/opt/carla-output/images/image_%06d.jpg' % image.frame) 1205 | self._control = carla.VehicleControl() 1206 | steer = self._control.steer 1207 | throttle = 
self._control.throttle 1208 | sensor_data = {'rgb_image': 'image_%06d.png' % image.frame, 1209 | 'angle' : steer, 1210 | 'throttle' : throttle 1211 | } 1212 | 1213 | print(sensor_data) 1214 | #dataset_path = '/opt/carla-output/dataset/data_%06d.json' 1215 | ##with open(dataset_path % image.frame,'w') as f: 1216 | ## json.dump(sensor_data, f) 1217 | """ 1218 | 1219 | 1220 | # ============================================================================== 1221 | # -- game_loop() --------------------------------------------------------------- 1222 | # ============================================================================== 1223 | 1224 | 1225 | def game_loop(args): 1226 | global rgb_callback 1227 | pygame.init() 1228 | pygame.font.init() 1229 | world = None 1230 | original_settings = None 1231 | 1232 | try: 1233 | client = carla.Client(args.host, args.port) 1234 | client.set_timeout(20.0) 1235 | 1236 | #sim_world = client.get_world() 1237 | sim_world = client.load_world('Town05') 1238 | if args.sync: 1239 | original_settings = sim_world.get_settings() 1240 | settings = sim_world.get_settings() 1241 | if not settings.synchronous_mode: 1242 | settings.synchronous_mode = True 1243 | settings.fixed_delta_seconds = 0.05 1244 | sim_world.apply_settings(settings) 1245 | 1246 | traffic_manager = client.get_trafficmanager() 1247 | traffic_manager.set_synchronous_mode(True) 1248 | 1249 | if args.autopilot and not sim_world.get_settings().synchronous_mode: 1250 | print("WARNING: You are currently in asynchronous mode and could " 1251 | "experience some issues with the traffic simulation") 1252 | 1253 | display = pygame.display.set_mode( 1254 | (args.width, args.height), 1255 | pygame.HWSURFACE | pygame.DOUBLEBUF) 1256 | display.fill((0,0,0)) 1257 | pygame.display.flip() 1258 | 1259 | hud = HUD(args.width, args.height) 1260 | world = World(sim_world, hud, args) 1261 | controller = KeyboardControl(world, args.autopilot) 1262 | #print(controller._control.steer) 1263 | 1264 | if args.sync: 1265 | sim_world.tick() 1266 | else: 1267 | sim_world.wait_for_tick() 1268 | 1269 | clock = pygame.time.Clock() 1270 | while True: 1271 | def rgb_callback(self, image): 1272 | #image.save_to_disk('/opt/carla-output/images/image_%06d.jpg' % image.frame) 1273 | steer = controller._control.steer 1274 | throttle = controller._control.throttle 1275 | sensor_data = {'rgb_image': 'image_%06d.jpg' % image.frame, 'angle' : steer,'throttle' : throttle} 1276 | print(sensor_data) 1277 | dataset_dir = f'/opt/carla-output/dataset/{date_string}' 1278 | dataset_path = dataset_dir + '/data_%06d.json' 1279 | Path(dataset_dir).mkdir(parents=True, exist_ok= True) 1280 | with open(dataset_path % image.frame,'w') as f: 1281 | json.dump(sensor_data, f) 1282 | #globals()['rgb_callback']=rgb_callback 1283 | if args.sync: 1284 | sim_world.tick() 1285 | clock.tick_busy_loop(60) 1286 | if controller.parse_events(client, world, clock, args.sync): 1287 | return 1288 | world.tick(clock) 1289 | world.render(display) 1290 | pygame.display.flip() 1291 | 1292 | finally: 1293 | 1294 | if original_settings: 1295 | sim_world.apply_settings(original_settings) 1296 | 1297 | if (world and world.recording_enabled): 1298 | client.stop_recorder() 1299 | 1300 | if world is not None: 1301 | world.destroy() 1302 | 1303 | pygame.quit() 1304 | 1305 | 1306 | # ============================================================================== 1307 | # -- main() -------------------------------------------------------------------- 1308 | # 
============================================================================== 1309 | 1310 | 1311 | def main(): 1312 | argparser = argparse.ArgumentParser( 1313 | description='CARLA Manual Control Client') 1314 | argparser.add_argument( 1315 | '-v', '--verbose', 1316 | action='store_true', 1317 | dest='debug', 1318 | help='print debug information') 1319 | argparser.add_argument( 1320 | '--host', 1321 | metavar='H', 1322 | default='127.0.0.1', 1323 | help='IP of the host server (default: 127.0.0.1)') 1324 | argparser.add_argument( 1325 | '-p', '--port', 1326 | metavar='P', 1327 | default=2000, 1328 | type=int, 1329 | help='TCP port to listen to (default: 2000)') 1330 | argparser.add_argument( 1331 | '-a', '--autopilot', 1332 | action='store_true', 1333 | help='enable autopilot') 1334 | argparser.add_argument( 1335 | '--res', 1336 | metavar='WIDTHxHEIGHT', 1337 | default='640x480', 1338 | help='window resolution (default: 1280x720)') 1339 | argparser.add_argument( 1340 | '--filter', 1341 | metavar='PATTERN', 1342 | default='vehicle.*', 1343 | help='actor filter (default: "vehicle.*")') 1344 | argparser.add_argument( 1345 | '--generation', 1346 | metavar='G', 1347 | default='2', 1348 | help='restrict to certain actor generation (values: "1","2","All" - default: "2")') 1349 | argparser.add_argument( 1350 | '--rolename', 1351 | metavar='NAME', 1352 | default='hero', 1353 | help='actor role name (default: "hero")') 1354 | argparser.add_argument( 1355 | '--gamma', 1356 | default=2.2, 1357 | type=float, 1358 | help='Gamma correction of the camera (default: 2.2)') 1359 | argparser.add_argument( 1360 | '--sync', 1361 | action='store_true', 1362 | help='Activate synchronous mode execution') 1363 | args = argparser.parse_args() 1364 | 1365 | control = carla.VehicleControl() 1366 | 1367 | args.width, args.height = [int(x) for x in args.res.split('x')] 1368 | print(args.width, args.height) 1369 | 1370 | log_level = logging.DEBUG if args.debug else logging.INFO 1371 | logging.basicConfig(format='%(levelname)s: %(message)s', level=log_level) 1372 | 1373 | logging.info('listening to server %s:%s', args.host, args.port) 1374 | 1375 | print(__doc__) 1376 | 1377 | 1378 | try: 1379 | 1380 | game_loop(args) 1381 | 1382 | except KeyboardInterrupt: 1383 | print('\nCancelled by user. Bye!') 1384 | 1385 | 1386 | if __name__ == '__main__': 1387 | 1388 | main() -------------------------------------------------------------------------------- /carla-manual-datacollect.py: -------------------------------------------------------------------------------- 1 | #!/usr/bin/env python 2 | 3 | # Copyright (c) 2019 Computer Vision Center (CVC) at the Universitat Autonoma de 4 | # Barcelona (UAB). 5 | # 6 | # This work is licensed under the terms of the MIT license. 7 | # For a copy, see . 8 | 9 | # Allows controlling a vehicle with a keyboard. For a simpler and more 10 | # documented example, please take a look at tutorial.py. 11 | 12 | ######################################################################## 13 | # Author: Neel (itsneel@amazon.com) 14 | # Adding multiple functionalities to capture image and steering data 15 | # from the simulation 16 | ######################################################################## 17 | 18 | """ 19 | Welcome to CARLA manual control. 20 | 21 | Use ARROWS or WASD keys for control. 22 | 23 | W : throttle 24 | S : brake 25 | A/D : steer left/right 26 | Q : toggle reverse 27 | Space : hand-brake 28 | P : toggle autopilot 29 | M : toggle manual transmission 30 | ,/. 
: gear up/down 31 | CTRL + W : toggle constant velocity mode at 60 km/h 32 | 33 | L : toggle next light type 34 | SHIFT + L : toggle high beam 35 | Z/X : toggle right/left blinker 36 | I : toggle interior light 37 | 38 | TAB : change sensor position 39 | ` or N : next sensor 40 | [1-9] : change to sensor [1-9] 41 | G : toggle radar visualization 42 | C : change weather (Shift+C reverse) 43 | Backspace : change vehicle 44 | 45 | O : open/close all doors of vehicle 46 | T : toggle vehicle's telemetry 47 | 48 | V : Select next map layer (Shift+V reverse) 49 | B : Load current selected map layer (Shift+B to unload) 50 | 51 | R : toggle recording images to disk 52 | 53 | CTRL + R : toggle recording of simulation (replacing any previous) 54 | CTRL + P : start replaying last recorded simulation 55 | CTRL + + : increments the start time of the replay by 1 second (+SHIFT = 10 seconds) 56 | CTRL + - : decrements the start time of the replay by 1 second (+SHIFT = 10 seconds) 57 | 58 | F1 : toggle HUD 59 | H/? : toggle help 60 | ESC : quit 61 | """ 62 | 63 | from __future__ import print_function 64 | 65 | 66 | # ============================================================================== 67 | # -- find carla module --------------------------------------------------------- 68 | # ============================================================================== 69 | 70 | 71 | import glob 72 | import os 73 | import sys 74 | 75 | try: 76 | sys.path.append(glob.glob('../carla/dist/carla-*%d.%d-%s.egg' % ( 77 | sys.version_info.major, 78 | sys.version_info.minor, 79 | 'win-amd64' if os.name == 'nt' else 'linux-x86_64'))[0]) 80 | except IndexError: 81 | pass 82 | 83 | 84 | # ============================================================================== 85 | # -- imports ------------------------------------------------------------------- 86 | # ============================================================================== 87 | 88 | 89 | import carla 90 | 91 | from carla import ColorConverter as cc 92 | 93 | import argparse 94 | import collections 95 | import logging 96 | import math 97 | import random 98 | import re 99 | import weakref 100 | import json 101 | from datetime import datetime as dt 102 | import datetime 103 | from pathlib import Path 104 | 105 | try: 106 | import pygame 107 | from pygame.locals import KMOD_CTRL 108 | from pygame.locals import KMOD_SHIFT 109 | from pygame.locals import K_0 110 | from pygame.locals import K_9 111 | from pygame.locals import K_BACKQUOTE 112 | from pygame.locals import K_BACKSPACE 113 | from pygame.locals import K_COMMA 114 | from pygame.locals import K_DOWN 115 | from pygame.locals import K_ESCAPE 116 | from pygame.locals import K_F1 117 | from pygame.locals import K_LEFT 118 | from pygame.locals import K_PERIOD 119 | from pygame.locals import K_RIGHT 120 | from pygame.locals import K_SLASH 121 | from pygame.locals import K_SPACE 122 | from pygame.locals import K_TAB 123 | from pygame.locals import K_UP 124 | from pygame.locals import K_a 125 | from pygame.locals import K_b 126 | from pygame.locals import K_c 127 | from pygame.locals import K_d 128 | from pygame.locals import K_g 129 | from pygame.locals import K_h 130 | from pygame.locals import K_i 131 | from pygame.locals import K_l 132 | from pygame.locals import K_m 133 | from pygame.locals import K_n 134 | from pygame.locals import K_o 135 | from pygame.locals import K_p 136 | from pygame.locals import K_q 137 | from pygame.locals import K_r 138 | from pygame.locals import K_s 139 | from pygame.locals import 
K_t 140 | from pygame.locals import K_v 141 | from pygame.locals import K_w 142 | from pygame.locals import K_x 143 | from pygame.locals import K_z 144 | from pygame.locals import K_MINUS 145 | from pygame.locals import K_EQUALS 146 | except ImportError: 147 | raise RuntimeError('cannot import pygame, make sure pygame package is installed') 148 | 149 | try: 150 | import numpy as np 151 | except ImportError: 152 | raise RuntimeError('cannot import numpy, make sure numpy package is installed') 153 | 154 | 155 | # ============================================================================== 156 | # -- Global functions ---------------------------------------------------------- 157 | # ============================================================================== 158 | 159 | def rgb_callback(a, b, c): 160 | return None 161 | 162 | def find_weather_presets(): 163 | rgx = re.compile('.+?(?:(?<=[a-z])(?=[A-Z])|(?<=[A-Z])(?=[A-Z][a-z])|$)') 164 | name = lambda x: ' '.join(m.group(0) for m in rgx.finditer(x)) 165 | presets = [x for x in dir(carla.WeatherParameters) if re.match('[A-Z].+', x)] 166 | return [(getattr(carla.WeatherParameters, x), name(x)) for x in presets] 167 | 168 | 169 | def get_actor_display_name(actor, truncate=250): 170 | name = ' '.join(actor.type_id.replace('_', '.').title().split('.')[1:]) 171 | return (name[:truncate - 1] + u'\u2026') if len(name) > truncate else name 172 | 173 | def get_actor_blueprints(world, filter, generation): 174 | bps = world.get_blueprint_library().filter(filter) 175 | 176 | if generation.lower() == "all": 177 | return bps 178 | 179 | # If the filter returns only one bp, we assume that this one needed 180 | # and therefore, we ignore the generation 181 | if len(bps) == 1: 182 | return bps 183 | 184 | try: 185 | int_generation = int(generation) 186 | # Check if generation is in available generations 187 | if int_generation in [1, 2]: 188 | bps = [x for x in bps if int(x.get_attribute('generation')) == int_generation] 189 | return bps 190 | else: 191 | print(" Warning! Actor Generation is not valid. No actor will be spawned.") 192 | return [] 193 | except: 194 | print(" Warning! Actor Generation is not valid. 
No actor will be spawned.") 195 | return [] 196 | 197 | now = dt.now() 198 | date_string = now.strftime("%d%m%Y-%H%M") 199 | # ============================================================================== 200 | # -- World --------------------------------------------------------------------- 201 | # ============================================================================== 202 | 203 | 204 | class World(object): 205 | def __init__(self, carla_world, hud, args): 206 | self.world = carla_world 207 | self.sync = args.sync 208 | self.actor_role_name = args.rolename 209 | try: 210 | self.map = self.world.get_map() 211 | except RuntimeError as error: 212 | print('RuntimeError: {}'.format(error)) 213 | print(' The server could not send the OpenDRIVE (.xodr) file:') 214 | print(' Make sure it exists, has the same name of your town, and is correct.') 215 | sys.exit(1) 216 | self.hud = hud 217 | self.player = None 218 | self.collision_sensor = None 219 | self.lane_invasion_sensor = None 220 | self.gnss_sensor = None 221 | self.imu_sensor = None 222 | self.radar_sensor = None 223 | self.camera_manager = None 224 | self._weather_presets = find_weather_presets() 225 | self._weather_index = 0 226 | self._actor_filter = args.filter 227 | self._actor_generation = args.generation 228 | self._gamma = args.gamma 229 | self.restart() 230 | self.world.on_tick(hud.on_world_tick) 231 | self.recording_enabled = False 232 | self.recording_start = 0 233 | self.constant_velocity_enabled = False 234 | self.show_vehicle_telemetry = False 235 | self.doors_are_open = False 236 | self.current_map_layer = 0 237 | self.map_layer_names = [ 238 | carla.MapLayer.NONE, 239 | carla.MapLayer.Buildings, 240 | carla.MapLayer.Decals, 241 | carla.MapLayer.Foliage, 242 | carla.MapLayer.Ground, 243 | carla.MapLayer.ParkedVehicles, 244 | carla.MapLayer.Particles, 245 | carla.MapLayer.Props, 246 | carla.MapLayer.StreetLights, 247 | carla.MapLayer.Walls, 248 | carla.MapLayer.All 249 | ] 250 | 251 | def restart(self): 252 | self.player_max_speed = 1.589 253 | self.player_max_speed_fast = 3.713 254 | # Keep same camera config if the camera manager exists. 255 | cam_index = self.camera_manager.index if self.camera_manager is not None else 0 256 | cam_pos_index = self.camera_manager.transform_index if self.camera_manager is not None else 0 257 | # Get a random blueprint. 258 | # Neel updated the blueprint library to get just 1 type of vehicle 259 | #blueprint = random.choice(get_actor_blueprints(self.world, self._actor_filter, self._actor_generation)) 260 | blueprint_lib = self.world.get_blueprint_library() 261 | blueprint = blueprint_lib.find('vehicle.tesla.model3') 262 | blueprint.set_attribute('role_name', self.actor_role_name) 263 | if blueprint.has_attribute('color'): 264 | color = random.choice(blueprint.get_attribute('color').recommended_values) 265 | blueprint.set_attribute('color', color) 266 | if blueprint.has_attribute('driver_id'): 267 | driver_id = random.choice(blueprint.get_attribute('driver_id').recommended_values) 268 | blueprint.set_attribute('driver_id', driver_id) 269 | if blueprint.has_attribute('is_invincible'): 270 | blueprint.set_attribute('is_invincible', 'true') 271 | # set the max speed 272 | if blueprint.has_attribute('speed'): 273 | self.player_max_speed = float(blueprint.get_attribute('speed').recommended_values[1]) 274 | self.player_max_speed_fast = float(blueprint.get_attribute('speed').recommended_values[2]) 275 | 276 | # Spawn the player. 
277 | if self.player is not None: 278 | spawn_point = self.player.get_transform() 279 | spawn_point.location.z += 2.0 280 | spawn_point.rotation.roll = 0.0 281 | spawn_point.rotation.pitch = 0.0 282 | self.destroy() 283 | self.player = self.world.try_spawn_actor(blueprint, spawn_point) 284 | self.show_vehicle_telemetry = False 285 | self.modify_vehicle_physics(self.player) 286 | while self.player is None: 287 | if not self.map.get_spawn_points(): 288 | print('There are no spawn points available in your map/town.') 289 | print('Please add some Vehicle Spawn Point to your UE4 scene.') 290 | sys.exit(1) 291 | spawn_points = self.map.get_spawn_points() 292 | spawn_point = random.choice(spawn_points) if spawn_points else carla.Transform() 293 | self.player = self.world.try_spawn_actor(blueprint, spawn_point) 294 | self.show_vehicle_telemetry = False 295 | self.modify_vehicle_physics(self.player) 296 | # Set up the sensors. 297 | self.collision_sensor = CollisionSensor(self.player, self.hud) 298 | self.lane_invasion_sensor = LaneInvasionSensor(self.player, self.hud) 299 | self.gnss_sensor = GnssSensor(self.player) 300 | self.imu_sensor = IMUSensor(self.player) 301 | self.camera_manager = CameraManager(self.player, self.hud, self._gamma) 302 | self.camera_manager.transform_index = cam_pos_index 303 | self.camera_manager.set_sensor(cam_index, notify=False) 304 | actor_type = get_actor_display_name(self.player) 305 | self.hud.notification(actor_type) 306 | 307 | if self.sync: 308 | self.world.tick() 309 | else: 310 | self.world.wait_for_tick() 311 | 312 | def next_weather(self, reverse=False): 313 | self._weather_index += -1 if reverse else 1 314 | self._weather_index %= len(self._weather_presets) 315 | preset = self._weather_presets[self._weather_index] 316 | self.hud.notification('Weather: %s' % preset[1]) 317 | self.player.get_world().set_weather(preset[0]) 318 | 319 | def next_map_layer(self, reverse=False): 320 | self.current_map_layer += -1 if reverse else 1 321 | self.current_map_layer %= len(self.map_layer_names) 322 | selected = self.map_layer_names[self.current_map_layer] 323 | self.hud.notification('LayerMap selected: %s' % selected) 324 | 325 | def load_map_layer(self, unload=False): 326 | selected = self.map_layer_names[self.current_map_layer] 327 | if unload: 328 | self.hud.notification('Unloading map layer: %s' % selected) 329 | self.world.unload_map_layer(selected) 330 | else: 331 | self.hud.notification('Loading map layer: %s' % selected) 332 | self.world.load_map_layer(selected) 333 | 334 | def toggle_radar(self): 335 | if self.radar_sensor is None: 336 | self.radar_sensor = RadarSensor(self.player) 337 | elif self.radar_sensor.sensor is not None: 338 | self.radar_sensor.sensor.destroy() 339 | self.radar_sensor = None 340 | 341 | def modify_vehicle_physics(self, actor): 342 | #If actor is not a vehicle, we cannot use the physics control 343 | try: 344 | physics_control = actor.get_physics_control() 345 | physics_control.use_sweep_wheel_collision = True 346 | actor.apply_physics_control(physics_control) 347 | except Exception: 348 | pass 349 | 350 | def tick(self, clock): 351 | self.hud.tick(self, clock) 352 | 353 | def render(self, display): 354 | self.camera_manager.render(display) 355 | self.hud.render(display) 356 | 357 | def destroy_sensors(self): 358 | self.camera_manager.sensor.destroy() 359 | self.camera_manager.sensor = None 360 | self.camera_manager.index = None 361 | 362 | def destroy(self): 363 | if self.radar_sensor is not None: 364 | self.toggle_radar() 365 | sensors 
= [ 366 | self.camera_manager.sensor, 367 | self.collision_sensor.sensor, 368 | self.lane_invasion_sensor.sensor, 369 | self.gnss_sensor.sensor, 370 | self.imu_sensor.sensor] 371 | for sensor in sensors: 372 | if sensor is not None: 373 | sensor.stop() 374 | sensor.destroy() 375 | if self.player is not None: 376 | self.player.destroy() 377 | 378 | 379 | # ============================================================================== 380 | # -- KeyboardControl ----------------------------------------------------------- 381 | # ============================================================================== 382 | 383 | 384 | class KeyboardControl(object): 385 | """Class that handles keyboard input.""" 386 | def __init__(self, world, start_in_autopilot): 387 | self._autopilot_enabled = start_in_autopilot 388 | if isinstance(world.player, carla.Vehicle): 389 | self._control = carla.VehicleControl() 390 | self._lights = carla.VehicleLightState.NONE 391 | world.player.set_autopilot(self._autopilot_enabled) 392 | world.player.set_light_state(self._lights) 393 | elif isinstance(world.player, carla.Walker): 394 | self._control = carla.WalkerControl() 395 | self._autopilot_enabled = False 396 | self._rotation = world.player.get_transform().rotation 397 | else: 398 | raise NotImplementedError("Actor type not supported") 399 | self._steer_cache = 0.0 400 | world.hud.notification("Press 'H' or '?' for help.", seconds=4.0) 401 | 402 | def parse_events(self, client, world, clock, sync_mode): 403 | if isinstance(self._control, carla.VehicleControl): 404 | current_lights = self._lights 405 | for event in pygame.event.get(): 406 | if event.type == pygame.QUIT: 407 | return True 408 | elif event.type == pygame.KEYUP: 409 | if self._is_quit_shortcut(event.key): 410 | return True 411 | elif event.key == K_BACKSPACE: 412 | if self._autopilot_enabled: 413 | world.player.set_autopilot(False) 414 | world.restart() 415 | world.player.set_autopilot(True) 416 | else: 417 | world.restart() 418 | elif event.key == K_F1: 419 | world.hud.toggle_info() 420 | elif event.key == K_v and pygame.key.get_mods() & KMOD_SHIFT: 421 | world.next_map_layer(reverse=True) 422 | elif event.key == K_v: 423 | world.next_map_layer() 424 | elif event.key == K_b and pygame.key.get_mods() & KMOD_SHIFT: 425 | world.load_map_layer(unload=True) 426 | elif event.key == K_b: 427 | world.load_map_layer() 428 | elif event.key == K_h or (event.key == K_SLASH and pygame.key.get_mods() & KMOD_SHIFT): 429 | world.hud.help.toggle() 430 | elif event.key == K_TAB: 431 | world.camera_manager.toggle_camera() 432 | elif event.key == K_c and pygame.key.get_mods() & KMOD_SHIFT: 433 | world.next_weather(reverse=True) 434 | elif event.key == K_c: 435 | world.next_weather() 436 | elif event.key == K_g: 437 | world.toggle_radar() 438 | elif event.key == K_BACKQUOTE: 439 | world.camera_manager.next_sensor() 440 | elif event.key == K_n: 441 | world.camera_manager.next_sensor() 442 | elif event.key == K_w and (pygame.key.get_mods() & KMOD_CTRL): 443 | if world.constant_velocity_enabled: 444 | world.player.disable_constant_velocity() 445 | world.constant_velocity_enabled = False 446 | world.hud.notification("Disabled Constant Velocity Mode") 447 | else: 448 | world.player.enable_constant_velocity(carla.Vector3D(17, 0, 0)) 449 | world.constant_velocity_enabled = True 450 | world.hud.notification("Enabled Constant Velocity Mode at 60 km/h") 451 | elif event.key == K_o: 452 | try: 453 | if world.doors_are_open: 454 | world.hud.notification("Closing Doors") 455 | 
world.doors_are_open = False 456 | world.player.close_door(carla.VehicleDoor.All) 457 | else: 458 | world.hud.notification("Opening doors") 459 | world.doors_are_open = True 460 | world.player.open_door(carla.VehicleDoor.All) 461 | except Exception: 462 | pass 463 | elif event.key == K_t: 464 | if world.show_vehicle_telemetry: 465 | world.player.show_debug_telemetry(False) 466 | world.show_vehicle_telemetry = False 467 | world.hud.notification("Disabled Vehicle Telemetry") 468 | else: 469 | try: 470 | world.player.show_debug_telemetry(True) 471 | world.show_vehicle_telemetry = True 472 | world.hud.notification("Enabled Vehicle Telemetry") 473 | except Exception: 474 | pass 475 | elif event.key > K_0 and event.key <= K_9: 476 | index_ctrl = 0 477 | if pygame.key.get_mods() & KMOD_CTRL: 478 | index_ctrl = 9 479 | world.camera_manager.set_sensor(event.key - 1 - K_0 + index_ctrl) 480 | elif event.key == K_r and not (pygame.key.get_mods() & KMOD_CTRL): 481 | world.camera_manager.toggle_recording() 482 | elif event.key == K_r and (pygame.key.get_mods() & KMOD_CTRL): 483 | if (world.recording_enabled): 484 | client.stop_recorder() 485 | world.recording_enabled = False 486 | world.hud.notification("Recorder is OFF") 487 | else: 488 | client.start_recorder("manual_recording.rec") 489 | world.recording_enabled = True 490 | world.hud.notification("Recorder is ON") 491 | elif event.key == K_p and (pygame.key.get_mods() & KMOD_CTRL): 492 | # stop recorder 493 | client.stop_recorder() 494 | world.recording_enabled = False 495 | # work around to fix camera at start of replaying 496 | current_index = world.camera_manager.index 497 | world.destroy_sensors() 498 | # disable autopilot 499 | self._autopilot_enabled = False 500 | world.player.set_autopilot(self._autopilot_enabled) 501 | world.hud.notification("Replaying file 'manual_recording.rec'") 502 | # replayer 503 | client.replay_file("manual_recording.rec", world.recording_start, 0, 0) 504 | world.camera_manager.set_sensor(current_index) 505 | elif event.key == K_MINUS and (pygame.key.get_mods() & KMOD_CTRL): 506 | if pygame.key.get_mods() & KMOD_SHIFT: 507 | world.recording_start -= 10 508 | else: 509 | world.recording_start -= 1 510 | world.hud.notification("Recording start time is %d" % (world.recording_start)) 511 | elif event.key == K_EQUALS and (pygame.key.get_mods() & KMOD_CTRL): 512 | if pygame.key.get_mods() & KMOD_SHIFT: 513 | world.recording_start += 10 514 | else: 515 | world.recording_start += 1 516 | world.hud.notification("Recording start time is %d" % (world.recording_start)) 517 | if isinstance(self._control, carla.VehicleControl): 518 | if event.key == K_q: 519 | self._control.gear = 1 if self._control.reverse else -1 520 | elif event.key == K_m: 521 | self._control.manual_gear_shift = not self._control.manual_gear_shift 522 | self._control.gear = world.player.get_control().gear 523 | world.hud.notification('%s Transmission' % 524 | ('Manual' if self._control.manual_gear_shift else 'Automatic')) 525 | elif self._control.manual_gear_shift and event.key == K_COMMA: 526 | self._control.gear = max(-1, self._control.gear - 1) 527 | elif self._control.manual_gear_shift and event.key == K_PERIOD: 528 | self._control.gear = self._control.gear + 1 529 | elif event.key == K_p and not pygame.key.get_mods() & KMOD_CTRL: 530 | if not self._autopilot_enabled and not sync_mode: 531 | print("WARNING: You are currently in asynchronous mode and could " 532 | "experience some issues with the traffic simulation") 533 | self._autopilot_enabled = not 
self._autopilot_enabled 534 | world.player.set_autopilot(self._autopilot_enabled) 535 | world.hud.notification( 536 | 'Autopilot %s' % ('On' if self._autopilot_enabled else 'Off')) 537 | elif event.key == K_l and pygame.key.get_mods() & KMOD_CTRL: 538 | current_lights ^= carla.VehicleLightState.Special1 539 | elif event.key == K_l and pygame.key.get_mods() & KMOD_SHIFT: 540 | current_lights ^= carla.VehicleLightState.HighBeam 541 | elif event.key == K_l: 542 | # Use 'L' key to switch between lights: 543 | # closed -> position -> low beam -> fog 544 | if not self._lights & carla.VehicleLightState.Position: 545 | world.hud.notification("Position lights") 546 | current_lights |= carla.VehicleLightState.Position 547 | else: 548 | world.hud.notification("Low beam lights") 549 | current_lights |= carla.VehicleLightState.LowBeam 550 | if self._lights & carla.VehicleLightState.LowBeam: 551 | world.hud.notification("Fog lights") 552 | current_lights |= carla.VehicleLightState.Fog 553 | if self._lights & carla.VehicleLightState.Fog: 554 | world.hud.notification("Lights off") 555 | current_lights ^= carla.VehicleLightState.Position 556 | current_lights ^= carla.VehicleLightState.LowBeam 557 | current_lights ^= carla.VehicleLightState.Fog 558 | elif event.key == K_i: 559 | current_lights ^= carla.VehicleLightState.Interior 560 | elif event.key == K_z: 561 | current_lights ^= carla.VehicleLightState.LeftBlinker 562 | elif event.key == K_x: 563 | current_lights ^= carla.VehicleLightState.RightBlinker 564 | 565 | if not self._autopilot_enabled: 566 | if isinstance(self._control, carla.VehicleControl): 567 | self._parse_vehicle_keys(pygame.key.get_pressed(), clock.get_time()) 568 | self._control.reverse = self._control.gear < 0 569 | # Set automatic control-related vehicle lights 570 | if self._control.brake: 571 | current_lights |= carla.VehicleLightState.Brake 572 | else: # Remove the Brake flag 573 | current_lights &= ~carla.VehicleLightState.Brake 574 | if self._control.reverse: 575 | current_lights |= carla.VehicleLightState.Reverse 576 | else: # Remove the Reverse flag 577 | current_lights &= ~carla.VehicleLightState.Reverse 578 | if current_lights != self._lights: # Change the light state only if necessary 579 | self._lights = current_lights 580 | world.player.set_light_state(carla.VehicleLightState(self._lights)) 581 | elif isinstance(self._control, carla.WalkerControl): 582 | self._parse_walker_keys(pygame.key.get_pressed(), clock.get_time(), world) 583 | world.player.apply_control(self._control) 584 | 585 | def _parse_vehicle_keys(self, keys, milliseconds): 586 | if keys[K_UP] or keys[K_w]: 587 | self._control.throttle = min(self._control.throttle + 0.01, 0.50) # Neel lowered the throttle cap from 1.0 to 0.50 588 | else: 589 | self._control.throttle = 0.30 # Neel changed value from 0.0 to 0.3 590 | 591 | if keys[K_DOWN] or keys[K_s]: 592 | self._control.brake = min(self._control.brake + 0.2, 1) 593 | else: 594 | self._control.brake = 0 595 | 596 | steer_increment = 20e-4 * milliseconds # Neel changed value from 5e-4 597 | if keys[K_LEFT] or keys[K_a]: 598 | if self._steer_cache > 0: 599 | self._steer_cache = 0 600 | else: 601 | self._steer_cache -= steer_increment 602 | elif keys[K_RIGHT] or keys[K_d]: 603 | if self._steer_cache < 0: 604 | self._steer_cache = 0 605 | else: 606 | self._steer_cache += steer_increment 607 | else: 608 | self._steer_cache = 0.0 609 | self._steer_cache = min(0.7, max(-0.7, self._steer_cache)) 610 | self._control.steer = round(self._steer_cache, 1) 611 | self._control.hand_brake =
keys[K_SPACE] 612 | 613 | def _parse_walker_keys(self, keys, milliseconds, world): 614 | self._control.speed = 0.0 615 | if keys[K_DOWN] or keys[K_s]: 616 | self._control.speed = 0.0 617 | if keys[K_LEFT] or keys[K_a]: 618 | self._control.speed = .01 619 | self._rotation.yaw -= 0.08 * milliseconds 620 | if keys[K_RIGHT] or keys[K_d]: 621 | self._control.speed = .01 622 | self._rotation.yaw += 0.08 * milliseconds 623 | if keys[K_UP] or keys[K_w]: 624 | self._control.speed = world.player_max_speed_fast if pygame.key.get_mods() & KMOD_SHIFT else world.player_max_speed 625 | self._control.jump = keys[K_SPACE] 626 | self._rotation.yaw = round(self._rotation.yaw, 1) 627 | self._control.direction = self._rotation.get_forward_vector() 628 | 629 | @staticmethod 630 | def _is_quit_shortcut(key): 631 | return (key == K_ESCAPE) or (key == K_q and pygame.key.get_mods() & KMOD_CTRL) 632 | 633 | 634 | # ============================================================================== 635 | # -- HUD ----------------------------------------------------------------------- 636 | # ============================================================================== 637 | 638 | 639 | class HUD(object): 640 | def __init__(self, width, height): 641 | self.dim = (width, height) 642 | font = pygame.font.Font(pygame.font.get_default_font(), 20) 643 | font_name = 'courier' if os.name == 'nt' else 'mono' 644 | fonts = [x for x in pygame.font.get_fonts() if font_name in x] 645 | default_font = 'ubuntumono' 646 | mono = default_font if default_font in fonts else fonts[0] 647 | mono = pygame.font.match_font(mono) 648 | self._font_mono = pygame.font.Font(mono, 12 if os.name == 'nt' else 14) 649 | self._notifications = FadingText(font, (width, 40), (0, height - 40)) 650 | self.help = HelpText(pygame.font.Font(mono, 16), width, height) 651 | self.server_fps = 0 652 | self.frame = 0 653 | self.simulation_time = 0 654 | self._show_info = True 655 | self._info_text = [] 656 | self._server_clock = pygame.time.Clock() 657 | 658 | def on_world_tick(self, timestamp): 659 | self._server_clock.tick() 660 | self.server_fps = self._server_clock.get_fps() 661 | self.frame = timestamp.frame 662 | self.simulation_time = timestamp.elapsed_seconds 663 | 664 | def tick(self, world, clock): 665 | self._notifications.tick(world, clock) 666 | if not self._show_info: 667 | return 668 | t = world.player.get_transform() 669 | v = world.player.get_velocity() 670 | c = world.player.get_control() 671 | compass = world.imu_sensor.compass 672 | heading = 'N' if compass > 270.5 or compass < 89.5 else '' 673 | heading += 'S' if 90.5 < compass < 269.5 else '' 674 | heading += 'E' if 0.5 < compass < 179.5 else '' 675 | heading += 'W' if 180.5 < compass < 359.5 else '' 676 | colhist = world.collision_sensor.get_collision_history() 677 | collision = [colhist[x + self.frame - 200] for x in range(0, 200)] 678 | max_col = max(1.0, max(collision)) 679 | collision = [x / max_col for x in collision] 680 | vehicles = world.world.get_actors().filter('vehicle.*') 681 | self._info_text = [ 682 | 'Server: % 16.0f FPS' % self.server_fps, 683 | 'Client: % 16.0f FPS' % clock.get_fps(), 684 | '', 685 | 'Vehicle: % 20s' % get_actor_display_name(world.player, truncate=20), 686 | 'Map: % 20s' % world.map.name.split('/')[-1], 687 | 'Simulation time: % 12s' % datetime.timedelta(seconds=int(self.simulation_time)), 688 | '', 689 | 'Speed: % 15.0f km/h' % (3.6 * math.sqrt(v.x**2 + v.y**2 + v.z**2)), 690 | u'Compass:% 17.0f\N{DEGREE SIGN} % 2s' % (compass, heading), 691 | 'Accelero: 
(%5.1f,%5.1f,%5.1f)' % (world.imu_sensor.accelerometer), 692 | 'Gyroscop: (%5.1f,%5.1f,%5.1f)' % (world.imu_sensor.gyroscope), 693 | 'Location:% 20s' % ('(% 5.1f, % 5.1f)' % (t.location.x, t.location.y)), 694 | 'GNSS:% 24s' % ('(% 2.6f, % 3.6f)' % (world.gnss_sensor.lat, world.gnss_sensor.lon)), 695 | 'Height: % 18.0f m' % t.location.z, 696 | ''] 697 | if isinstance(c, carla.VehicleControl): 698 | self._info_text += [ 699 | ('Throttle:', c.throttle, 0.0, 1.0), 700 | ('Steer:', c.steer, -1.0, 1.0), 701 | ('Brake:', c.brake, 0.0, 1.0), 702 | ('Reverse:', c.reverse), 703 | ('Hand brake:', c.hand_brake), 704 | ('Manual:', c.manual_gear_shift), 705 | 'Gear: %s' % {-1: 'R', 0: 'N'}.get(c.gear, c.gear)] 706 | elif isinstance(c, carla.WalkerControl): 707 | self._info_text += [ 708 | ('Speed:', c.speed, 0.0, 5.556), 709 | ('Jump:', c.jump)] 710 | self._info_text += [ 711 | '', 712 | 'Collision:', 713 | collision, 714 | '', 715 | 'Number of vehicles: % 8d' % len(vehicles)] 716 | if len(vehicles) > 1: 717 | self._info_text += ['Nearby vehicles:'] 718 | distance = lambda l: math.sqrt((l.x - t.location.x)**2 + (l.y - t.location.y)**2 + (l.z - t.location.z)**2) 719 | vehicles = [(distance(x.get_location()), x) for x in vehicles if x.id != world.player.id] 720 | for d, vehicle in sorted(vehicles, key=lambda vehicles: vehicles[0]): 721 | if d > 200.0: 722 | break 723 | vehicle_type = get_actor_display_name(vehicle, truncate=22) 724 | self._info_text.append('% 4dm %s' % (d, vehicle_type)) 725 | 726 | def toggle_info(self): 727 | self._show_info = not self._show_info 728 | 729 | def notification(self, text, seconds=2.0): 730 | self._notifications.set_text(text, seconds=seconds) 731 | 732 | def error(self, text): 733 | self._notifications.set_text('Error: %s' % text, (255, 0, 0)) 734 | 735 | def render(self, display): 736 | if self._show_info: 737 | info_surface = pygame.Surface((220, self.dim[1])) 738 | info_surface.set_alpha(100) 739 | display.blit(info_surface, (0, 0)) 740 | v_offset = 4 741 | bar_h_offset = 100 742 | bar_width = 106 743 | for item in self._info_text: 744 | if v_offset + 18 > self.dim[1]: 745 | break 746 | if isinstance(item, list): 747 | if len(item) > 1: 748 | points = [(x + 8, v_offset + 8 + (1.0 - y) * 30) for x, y in enumerate(item)] 749 | pygame.draw.lines(display, (255, 136, 0), False, points, 2) 750 | item = None 751 | v_offset += 18 752 | elif isinstance(item, tuple): 753 | if isinstance(item[1], bool): 754 | rect = pygame.Rect((bar_h_offset, v_offset + 8), (6, 6)) 755 | pygame.draw.rect(display, (255, 255, 255), rect, 0 if item[1] else 1) 756 | else: 757 | rect_border = pygame.Rect((bar_h_offset, v_offset + 8), (bar_width, 6)) 758 | pygame.draw.rect(display, (255, 255, 255), rect_border, 1) 759 | f = (item[1] - item[2]) / (item[3] - item[2]) 760 | if item[2] < 0.0: 761 | rect = pygame.Rect((bar_h_offset + f * (bar_width - 6), v_offset + 8), (6, 6)) 762 | else: 763 | rect = pygame.Rect((bar_h_offset, v_offset + 8), (f * bar_width, 6)) 764 | pygame.draw.rect(display, (255, 255, 255), rect) 765 | item = item[0] 766 | if item: # At this point has to be a str. 
767 | surface = self._font_mono.render(item, True, (255, 255, 255)) 768 | display.blit(surface, (8, v_offset)) 769 | v_offset += 18 770 | self._notifications.render(display) 771 | self.help.render(display) 772 | 773 | 774 | # ============================================================================== 775 | # -- FadingText ---------------------------------------------------------------- 776 | # ============================================================================== 777 | 778 | 779 | class FadingText(object): 780 | def __init__(self, font, dim, pos): 781 | self.font = font 782 | self.dim = dim 783 | self.pos = pos 784 | self.seconds_left = 0 785 | self.surface = pygame.Surface(self.dim) 786 | 787 | def set_text(self, text, color=(255, 255, 255), seconds=2.0): 788 | text_texture = self.font.render(text, True, color) 789 | self.surface = pygame.Surface(self.dim) 790 | self.seconds_left = seconds 791 | self.surface.fill((0, 0, 0, 0)) 792 | self.surface.blit(text_texture, (10, 11)) 793 | 794 | def tick(self, _, clock): 795 | delta_seconds = 1e-3 * clock.get_time() 796 | self.seconds_left = max(0.0, self.seconds_left - delta_seconds) 797 | self.surface.set_alpha(500.0 * self.seconds_left) 798 | 799 | def render(self, display): 800 | display.blit(self.surface, self.pos) 801 | 802 | 803 | # ============================================================================== 804 | # -- HelpText ------------------------------------------------------------------ 805 | # ============================================================================== 806 | 807 | 808 | class HelpText(object): 809 | """Helper class to handle text output using pygame""" 810 | def __init__(self, font, width, height): 811 | lines = __doc__.split('\n') 812 | self.font = font 813 | self.line_space = 18 814 | self.dim = (780, len(lines) * self.line_space + 12) 815 | self.pos = (0.5 * width - 0.5 * self.dim[0], 0.5 * height - 0.5 * self.dim[1]) 816 | self.seconds_left = 0 817 | self.surface = pygame.Surface(self.dim) 818 | self.surface.fill((0, 0, 0, 0)) 819 | for n, line in enumerate(lines): 820 | text_texture = self.font.render(line, True, (255, 255, 255)) 821 | self.surface.blit(text_texture, (22, n * self.line_space)) 822 | self._render = False 823 | self.surface.set_alpha(220) 824 | 825 | def toggle(self): 826 | self._render = not self._render 827 | 828 | def render(self, display): 829 | if self._render: 830 | display.blit(self.surface, self.pos) 831 | 832 | 833 | # ============================================================================== 834 | # -- CollisionSensor ----------------------------------------------------------- 835 | # ============================================================================== 836 | 837 | 838 | class CollisionSensor(object): 839 | def __init__(self, parent_actor, hud): 840 | self.sensor = None 841 | self.history = [] 842 | self._parent = parent_actor 843 | self.hud = hud 844 | world = self._parent.get_world() 845 | bp = world.get_blueprint_library().find('sensor.other.collision') 846 | self.sensor = world.spawn_actor(bp, carla.Transform(), attach_to=self._parent) 847 | # We need to pass the lambda a weak reference to self to avoid circular 848 | # reference. 
849 | weak_self = weakref.ref(self) 850 | self.sensor.listen(lambda event: CollisionSensor._on_collision(weak_self, event)) 851 | 852 | def get_collision_history(self): 853 | history = collections.defaultdict(int) 854 | for frame, intensity in self.history: 855 | history[frame] += intensity 856 | return history 857 | 858 | @staticmethod 859 | def _on_collision(weak_self, event): 860 | self = weak_self() 861 | if not self: 862 | return 863 | actor_type = get_actor_display_name(event.other_actor) 864 | self.hud.notification('Collision with %r' % actor_type) 865 | impulse = event.normal_impulse 866 | intensity = math.sqrt(impulse.x**2 + impulse.y**2 + impulse.z**2) 867 | self.history.append((event.frame, intensity)) 868 | if len(self.history) > 4000: 869 | self.history.pop(0) 870 | 871 | 872 | # ============================================================================== 873 | # -- LaneInvasionSensor -------------------------------------------------------- 874 | # ============================================================================== 875 | 876 | 877 | class LaneInvasionSensor(object): 878 | def __init__(self, parent_actor, hud): 879 | self.sensor = None 880 | 881 | # If the spawn object is not a vehicle, we cannot use the Lane Invasion Sensor 882 | if parent_actor.type_id.startswith("vehicle."): 883 | self._parent = parent_actor 884 | self.hud = hud 885 | world = self._parent.get_world() 886 | bp = world.get_blueprint_library().find('sensor.other.lane_invasion') 887 | self.sensor = world.spawn_actor(bp, carla.Transform(), attach_to=self._parent) 888 | # We need to pass the lambda a weak reference to self to avoid circular 889 | # reference. 890 | weak_self = weakref.ref(self) 891 | self.sensor.listen(lambda event: LaneInvasionSensor._on_invasion(weak_self, event)) 892 | 893 | @staticmethod 894 | def _on_invasion(weak_self, event): 895 | self = weak_self() 896 | if not self: 897 | return 898 | lane_types = set(x.type for x in event.crossed_lane_markings) 899 | text = ['%r' % str(x).split()[-1] for x in lane_types] 900 | self.hud.notification('Crossed line %s' % ' and '.join(text)) 901 | 902 | 903 | # ============================================================================== 904 | # -- GnssSensor ---------------------------------------------------------------- 905 | # ============================================================================== 906 | 907 | 908 | class GnssSensor(object): 909 | def __init__(self, parent_actor): 910 | self.sensor = None 911 | self._parent = parent_actor 912 | self.lat = 0.0 913 | self.lon = 0.0 914 | world = self._parent.get_world() 915 | bp = world.get_blueprint_library().find('sensor.other.gnss') 916 | self.sensor = world.spawn_actor(bp, carla.Transform(carla.Location(x=1.0, z=2.8)), attach_to=self._parent) 917 | # We need to pass the lambda a weak reference to self to avoid circular 918 | # reference. 
919 | weak_self = weakref.ref(self) 920 | self.sensor.listen(lambda event: GnssSensor._on_gnss_event(weak_self, event)) 921 | 922 | @staticmethod 923 | def _on_gnss_event(weak_self, event): 924 | self = weak_self() 925 | if not self: 926 | return 927 | self.lat = event.latitude 928 | self.lon = event.longitude 929 | 930 | 931 | # ============================================================================== 932 | # -- IMUSensor ----------------------------------------------------------------- 933 | # ============================================================================== 934 | 935 | 936 | class IMUSensor(object): 937 | def __init__(self, parent_actor): 938 | self.sensor = None 939 | self._parent = parent_actor 940 | self.accelerometer = (0.0, 0.0, 0.0) 941 | self.gyroscope = (0.0, 0.0, 0.0) 942 | self.compass = 0.0 943 | world = self._parent.get_world() 944 | bp = world.get_blueprint_library().find('sensor.other.imu') 945 | self.sensor = world.spawn_actor( 946 | bp, carla.Transform(), attach_to=self._parent) 947 | # We need to pass the lambda a weak reference to self to avoid circular 948 | # reference. 949 | weak_self = weakref.ref(self) 950 | self.sensor.listen( 951 | lambda sensor_data: IMUSensor._IMU_callback(weak_self, sensor_data)) 952 | 953 | @staticmethod 954 | def _IMU_callback(weak_self, sensor_data): 955 | self = weak_self() 956 | if not self: 957 | return 958 | limits = (-99.9, 99.9) 959 | self.accelerometer = ( 960 | max(limits[0], min(limits[1], sensor_data.accelerometer.x)), 961 | max(limits[0], min(limits[1], sensor_data.accelerometer.y)), 962 | max(limits[0], min(limits[1], sensor_data.accelerometer.z))) 963 | self.gyroscope = ( 964 | max(limits[0], min(limits[1], math.degrees(sensor_data.gyroscope.x))), 965 | max(limits[0], min(limits[1], math.degrees(sensor_data.gyroscope.y))), 966 | max(limits[0], min(limits[1], math.degrees(sensor_data.gyroscope.z)))) 967 | self.compass = math.degrees(sensor_data.compass) 968 | 969 | 970 | # ============================================================================== 971 | # -- RadarSensor --------------------------------------------------------------- 972 | # ============================================================================== 973 | 974 | 975 | class RadarSensor(object): 976 | def __init__(self, parent_actor): 977 | self.sensor = None 978 | self._parent = parent_actor 979 | bound_x = 0.5 + self._parent.bounding_box.extent.x 980 | bound_y = 0.5 + self._parent.bounding_box.extent.y 981 | bound_z = 0.5 + self._parent.bounding_box.extent.z 982 | 983 | self.velocity_range = 7.5 # m/s 984 | world = self._parent.get_world() 985 | self.debug = world.debug 986 | bp = world.get_blueprint_library().find('sensor.other.radar') 987 | bp.set_attribute('horizontal_fov', str(35)) 988 | bp.set_attribute('vertical_fov', str(20)) 989 | self.sensor = world.spawn_actor( 990 | bp, 991 | carla.Transform( 992 | carla.Location(x=bound_x + 0.05, z=bound_z+0.05), 993 | carla.Rotation(pitch=5)), 994 | attach_to=self._parent) 995 | # We need a weak reference to self to avoid circular reference. 
996 | weak_self = weakref.ref(self) 997 | self.sensor.listen( 998 | lambda radar_data: RadarSensor._Radar_callback(weak_self, radar_data)) 999 | 1000 | @staticmethod 1001 | def _Radar_callback(weak_self, radar_data): 1002 | self = weak_self() 1003 | if not self: 1004 | return 1005 | # To get a numpy [[vel, altitude, azimuth, depth],...[,,,]]: 1006 | # points = np.frombuffer(radar_data.raw_data, dtype=np.dtype('f4')) 1007 | # points = np.reshape(points, (len(radar_data), 4)) 1008 | 1009 | current_rot = radar_data.transform.rotation 1010 | for detect in radar_data: 1011 | azi = math.degrees(detect.azimuth) 1012 | alt = math.degrees(detect.altitude) 1013 | # The 0.25 adjusts a bit the distance so the dots can 1014 | # be properly seen 1015 | fw_vec = carla.Vector3D(x=detect.depth - 0.25) 1016 | carla.Transform( 1017 | carla.Location(), 1018 | carla.Rotation( 1019 | pitch=current_rot.pitch + alt, 1020 | yaw=current_rot.yaw + azi, 1021 | roll=current_rot.roll)).transform(fw_vec) 1022 | 1023 | def clamp(min_v, max_v, value): 1024 | return max(min_v, min(value, max_v)) 1025 | 1026 | norm_velocity = detect.velocity / self.velocity_range # range [-1, 1] 1027 | r = int(clamp(0.0, 1.0, 1.0 - norm_velocity) * 255.0) 1028 | g = int(clamp(0.0, 1.0, 1.0 - abs(norm_velocity)) * 255.0) 1029 | b = int(abs(clamp(- 1.0, 0.0, - 1.0 - norm_velocity)) * 255.0) 1030 | self.debug.draw_point( 1031 | radar_data.transform.location + fw_vec, 1032 | size=0.075, 1033 | life_time=0.06, 1034 | persistent_lines=False, 1035 | color=carla.Color(r, g, b)) 1036 | 1037 | # ============================================================================== 1038 | # -- CameraManager ------------------------------------------------------------- 1039 | # ============================================================================== 1040 | 1041 | 1042 | class CameraManager(object): 1043 | def __init__(self, parent_actor, hud, gamma_correction): 1044 | self.sensor = None 1045 | self.surface = None 1046 | self._parent = parent_actor 1047 | self.hud = hud 1048 | # Neel modified the self.recording to True from False 1049 | self.recording = False 1050 | bound_x = 0.5 + self._parent.bounding_box.extent.x 1051 | bound_y = 0.5 + self._parent.bounding_box.extent.y 1052 | bound_z = 0.5 + self._parent.bounding_box.extent.z 1053 | Attachment = carla.AttachmentType 1054 | #world = self._parent.get_world() 1055 | #c = world.player.get_control() 1056 | #self.key_control = KeyboardControl(world, False) 1057 | 1058 | if not self._parent.type_id.startswith("walker.pedestrian"): 1059 | self._camera_transforms = [ 1060 | (carla.Transform(carla.Location(x=-2.0*bound_x, y=+0.0*bound_y, z=2.0*bound_z), carla.Rotation(pitch=8.0)), Attachment.Rigid), 1061 | (carla.Transform(carla.Location(x=+0.8*bound_x, y=+0.0*bound_y, z=1.3*bound_z)), Attachment.Rigid), 1062 | (carla.Transform(carla.Location(x=+1.9*bound_x, y=+1.0*bound_y, z=1.2*bound_z)), Attachment.Rigid), 1063 | (carla.Transform(carla.Location(x=-2.8*bound_x, y=+0.0*bound_y, z=4.6*bound_z), carla.Rotation(pitch=6.0)), Attachment.Rigid), 1064 | (carla.Transform(carla.Location(x=-1.0, y=-1.0*bound_y, z=0.4*bound_z)), Attachment.Rigid)] 1065 | else: 1066 | self._camera_transforms = [ 1067 | (carla.Transform(carla.Location(x=-2.5, z=0.0), carla.Rotation(pitch=-8.0)), Attachment.Rigid), 1068 | (carla.Transform(carla.Location(x=1.6, z=1.7)), Attachment.Rigid), 1069 | (carla.Transform(carla.Location(x=2.5, y=0.5, z=0.0), carla.Rotation(pitch=-8.0)), Attachment.Rigid), 1070 | (carla.Transform(carla.Location(x=-4.0, 
z=2.0), carla.Rotation(pitch=6.0)), Attachment.Rigid), 1071 | (carla.Transform(carla.Location(x=0, y=-2.5, z=-0.0), carla.Rotation(yaw=90.0)), Attachment.Rigid)] 1072 | 1073 | self.transform_index = 1 1074 | self.sensors = [ 1075 | ['sensor.camera.rgb', cc.Raw, 'Camera RGB', {}], 1076 | ['sensor.camera.depth', cc.Raw, 'Camera Depth (Raw)', {}], 1077 | ['sensor.camera.depth', cc.Depth, 'Camera Depth (Gray Scale)', {}], 1078 | ['sensor.camera.depth', cc.LogarithmicDepth, 'Camera Depth (Logarithmic Gray Scale)', {}], 1079 | ['sensor.camera.semantic_segmentation', cc.Raw, 'Camera Semantic Segmentation (Raw)', {}], 1080 | ['sensor.camera.semantic_segmentation', cc.CityScapesPalette, 'Camera Semantic Segmentation (CityScapes Palette)', {}], 1081 | ['sensor.camera.instance_segmentation', cc.CityScapesPalette, 'Camera Instance Segmentation (CityScapes Palette)', {}], 1082 | ['sensor.camera.instance_segmentation', cc.Raw, 'Camera Instance Segmentation (Raw)', {}], 1083 | ['sensor.lidar.ray_cast', None, 'Lidar (Ray-Cast)', {'range': '50'}], 1084 | ['sensor.camera.dvs', cc.Raw, 'Dynamic Vision Sensor', {}], 1085 | ['sensor.camera.rgb', cc.Raw, 'Camera RGB Distorted', 1086 | {'lens_circle_multiplier': '3.0', 1087 | 'lens_circle_falloff': '3.0', 1088 | 'chromatic_aberration_intensity': '0.5', 1089 | 'chromatic_aberration_offset': '0'}], 1090 | ['sensor.camera.optical_flow', cc.Raw, 'Optical Flow', {}], 1091 | ] 1092 | world = self._parent.get_world() 1093 | bp_library = world.get_blueprint_library() 1094 | for item in self.sensors: 1095 | bp = bp_library.find(item[0]) 1096 | if item[0].startswith('sensor.camera'): 1097 | bp.set_attribute('image_size_x', str(hud.dim[0])) 1098 | bp.set_attribute('image_size_y', str(hud.dim[1])) 1099 | if bp.has_attribute('gamma'): 1100 | bp.set_attribute('gamma', str(gamma_correction)) 1101 | for attr_name, attr_value in item[3].items(): 1102 | bp.set_attribute(attr_name, attr_value) 1103 | elif item[0].startswith('sensor.lidar'): 1104 | self.lidar_range = 50 1105 | 1106 | for attr_name, attr_value in item[3].items(): 1107 | bp.set_attribute(attr_name, attr_value) 1108 | if attr_name == 'range': 1109 | self.lidar_range = float(attr_value) 1110 | 1111 | item.append(bp) 1112 | self.index = None 1113 | 1114 | def toggle_camera(self): 1115 | self.transform_index = (self.transform_index + 1) % len(self._camera_transforms) 1116 | self.set_sensor(self.index, notify=False, force_respawn=True) 1117 | 1118 | def set_sensor(self, index, notify=True, force_respawn=False): 1119 | index = index % len(self.sensors) 1120 | needs_respawn = True if self.index is None else \ 1121 | (force_respawn or (self.sensors[index][2] != self.sensors[self.index][2])) 1122 | if needs_respawn: 1123 | if self.sensor is not None: 1124 | self.sensor.destroy() 1125 | self.surface = None 1126 | self.sensor = self._parent.get_world().spawn_actor( 1127 | self.sensors[index][-1], 1128 | self._camera_transforms[self.transform_index][0], 1129 | attach_to=self._parent, 1130 | attachment_type=self._camera_transforms[self.transform_index][1]) 1131 | # We need to pass the lambda a weak reference to self to avoid 1132 | # circular reference. 
1133 | weak_self = weakref.ref(self) 1134 | self.sensor.listen(lambda image: CameraManager._parse_image(weak_self, image)) 1135 | # Neel changed/added the two lines below (currently commented out); the default _parse_image callback above remains active 1136 | #key_control = carla.VehicleControl() 1137 | #self.sensor.listen(lambda image: CameraManager.rgb_callback(self, image)) 1138 | if notify: 1139 | self.hud.notification(self.sensors[index][2]) 1140 | self.index = index 1141 | 1142 | def next_sensor(self): 1143 | self.set_sensor(self.index + 1) 1144 | 1145 | def toggle_recording(self): 1146 | self.recording = not self.recording 1147 | self.hud.notification('Recording %s' % ('On' if self.recording else 'Off')) 1148 | 1149 | def render(self, display): 1150 | if self.surface is not None: 1151 | display.blit(self.surface, (0, 0)) 1152 | 1153 | 1154 | def _parse_image(weak_self, image): 1155 | self = weak_self() 1156 | if not self: 1157 | return 1158 | if self.sensors[self.index][0].startswith('sensor.lidar'): 1159 | points = np.frombuffer(image.raw_data, dtype=np.dtype('f4')) 1160 | points = np.reshape(points, (int(points.shape[0] / 4), 4)) 1161 | lidar_data = np.array(points[:, :2]) 1162 | lidar_data *= min(self.hud.dim) / (2.0 * self.lidar_range) 1163 | lidar_data += (0.5 * self.hud.dim[0], 0.5 * self.hud.dim[1]) 1164 | lidar_data = np.fabs(lidar_data) # pylint: disable=E1111 1165 | lidar_data = lidar_data.astype(np.int32) 1166 | lidar_data = np.reshape(lidar_data, (-1, 2)) 1167 | lidar_img_size = (self.hud.dim[0], self.hud.dim[1], 3) 1168 | lidar_img = np.zeros((lidar_img_size), dtype=np.uint8) 1169 | lidar_img[tuple(lidar_data.T)] = (255, 255, 255) 1170 | self.surface = pygame.surfarray.make_surface(lidar_img) 1171 | elif self.sensors[self.index][0].startswith('sensor.camera.dvs'): 1172 | # Example of converting the raw_data from a carla.DVSEventArray 1173 | # sensor into a NumPy array and using it as an image 1174 | dvs_events = np.frombuffer(image.raw_data, dtype=np.dtype([ 1175 | ('x', np.uint16), ('y', np.uint16), ('t', np.int64), ('pol', bool)])) # np.bool is removed in newer NumPy; use the builtin bool 1176 | dvs_img = np.zeros((image.height, image.width, 3), dtype=np.uint8) 1177 | # Blue is positive, red is negative 1178 | dvs_img[dvs_events[:]['y'], dvs_events[:]['x'], dvs_events[:]['pol'] * 2] = 255 1179 | self.surface = pygame.surfarray.make_surface(dvs_img.swapaxes(0, 1)) 1180 | elif self.sensors[self.index][0].startswith('sensor.camera.optical_flow'): 1181 | image = image.get_color_coded_flow() 1182 | array = np.frombuffer(image.raw_data, dtype=np.dtype("uint8")) 1183 | array = np.reshape(array, (image.height, image.width, 4)) 1184 | array = array[:, :, :3] 1185 | array = array[:, :, ::-1] 1186 | self.surface = pygame.surfarray.make_surface(array.swapaxes(0, 1)) 1187 | else: 1188 | image.convert(self.sensors[self.index][1]) 1189 | array = np.frombuffer(image.raw_data, dtype=np.dtype("uint8")) 1190 | array = np.reshape(array, (image.height, image.width, 4)) 1191 | array = array[:, :, :3] 1192 | array = array[:, :, ::-1] 1193 | # Neel added 1194 | rgb_callback(self, array, image) 1195 | self.surface = pygame.surfarray.make_surface(array.swapaxes(0, 1)) 1196 | if self.recording: 1197 | # Changes by Neel 1198 | #rgb_callback(self, image) 1199 | image.save_to_disk(f'/opt/carla-output/images/{date_string}/image_%06d.jpg' % image.frame) 1200 | 1201 | """ 1202 | # Neel added this callback Function 1203 | 1204 | def rgb_callback(self, image): 1205 | #image.save_to_disk('/opt/carla-output/images/image_%06d.jpg' % image.frame) 1206 | self._control = carla.VehicleControl() 1207
| steer = self._control.steer 1208 | throttle = self._control.throttle 1209 | sensor_data = {'rgb_image': 'image_%06d.png' % image.frame, 1210 | 'angle' : steer, 1211 | 'throttle' : throttle 1212 | } 1213 | 1214 | print(sensor_data) 1215 | #dataset_path = '/opt/carla-output/dataset/data_%06d.json' 1216 | ##with open(dataset_path % image.frame,'w') as f: 1217 | ## json.dump(sensor_data, f) 1218 | """ 1219 | 1220 | 1221 | # ============================================================================== 1222 | # -- game_loop() --------------------------------------------------------------- 1223 | # ============================================================================== 1224 | 1225 | 1226 | def game_loop(args): 1227 | global rgb_callback 1228 | pygame.init() 1229 | pygame.font.init() 1230 | world = None 1231 | original_settings = None 1232 | 1233 | try: 1234 | client = carla.Client(args.host, args.port) 1235 | client.set_timeout(20.0) 1236 | 1237 | sim_world = client.get_world() 1238 | if args.sync: 1239 | original_settings = sim_world.get_settings() 1240 | settings = sim_world.get_settings() 1241 | if not settings.synchronous_mode: 1242 | settings.synchronous_mode = True 1243 | settings.fixed_delta_seconds = 0.05 1244 | sim_world.apply_settings(settings) 1245 | 1246 | traffic_manager = client.get_trafficmanager() 1247 | traffic_manager.set_synchronous_mode(True) 1248 | 1249 | if args.autopilot and not sim_world.get_settings().synchronous_mode: 1250 | print("WARNING: You are currently in asynchronous mode and could " 1251 | "experience some issues with the traffic simulation") 1252 | 1253 | display = pygame.display.set_mode( 1254 | (args.width, args.height), 1255 | pygame.HWSURFACE | pygame.DOUBLEBUF) 1256 | display.fill((0,0,0)) 1257 | pygame.display.flip() 1258 | 1259 | hud = HUD(args.width, args.height) 1260 | world = World(sim_world, hud, args) 1261 | controller = KeyboardControl(world, args.autopilot) 1262 | #print(controller._control.steer) 1263 | 1264 | if args.sync: 1265 | sim_world.tick() 1266 | else: 1267 | sim_world.wait_for_tick() 1268 | 1269 | clock = pygame.time.Clock() 1270 | while True: 1271 | def rgb_callback(self, array, image): 1272 | #image.save_to_disk('/opt/carla-output/images/image_%06d.jpg' % image.frame) 1273 | image_dir = f'/opt/carla-output/images/{date_string}/' 1274 | Path(image_dir).mkdir(parents=True, exist_ok= True) 1275 | image_path = image_dir + 'arr_%08d.npy' 1276 | np.save(image_path % image.frame, array) 1277 | steer = controller._control.steer 1278 | throttle = controller._control.throttle 1279 | sensor_data = {'img_array':'arr_%08d.npy' % image.frame, "angle" : steer,'throttle' : throttle} 1280 | print(sensor_data) 1281 | dataset_dir = f'/opt/carla-output/dataset/{date_string}' 1282 | dataset_path = dataset_dir + '/data_%08d.json' 1283 | Path(dataset_dir).mkdir(parents=True, exist_ok= True) 1284 | with open(dataset_path % image.frame,'w') as f: 1285 | json.dump(sensor_data, f) 1286 | #globals()['rgb_callback']=rgb_callback 1287 | if args.sync: 1288 | sim_world.tick() 1289 | clock.tick_busy_loop(60) 1290 | if controller.parse_events(client, world, clock, args.sync): 1291 | return 1292 | world.tick(clock) 1293 | world.render(display) 1294 | pygame.display.flip() 1295 | 1296 | finally: 1297 | 1298 | if original_settings: 1299 | sim_world.apply_settings(original_settings) 1300 | 1301 | if (world and world.recording_enabled): 1302 | client.stop_recorder() 1303 | 1304 | if world is not None: 1305 | world.destroy() 1306 | 1307 | pygame.quit() 1308 | 1309 | 
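# A minimal, hypothetical sketch of how the samples written by the rgb_callback defined in
# game_loop above -- arr_%08d.npy image arrays under /opt/carla-output/images/<run>/ paired
# with data_%08d.json steering/throttle labels under /opt/carla-output/dataset/<run>/ --
# could be read back for CNN training. The helper name and the run_dir default are
# illustrative assumptions, not part of this script's runtime path.
def load_collected_samples(run_dir='/opt/carla-output'):
    """Return (images, angles) arrays paired by frame from a data-collection run."""
    images, angles = [], []
    for json_path in sorted(glob.glob(os.path.join(run_dir, 'dataset', '*', 'data_*.json'))):
        with open(json_path) as f:
            sample = json.load(f)
        # The dataset and image folders share the same date-stamped run name.
        run_name = os.path.basename(os.path.dirname(json_path))
        arr_path = os.path.join(run_dir, 'images', run_name, sample['img_array'])
        if os.path.exists(arr_path):
            images.append(np.load(arr_path))        # RGB array saved by rgb_callback
            angles.append(float(sample['angle']))   # steering label captured at the same frame
    return np.array(images), np.array(angles)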
1310 | # ============================================================================== 1311 | # -- main() -------------------------------------------------------------------- 1312 | # ============================================================================== 1313 | 1314 | 1315 | def main(): 1316 | argparser = argparse.ArgumentParser( 1317 | description='CARLA Manual Control Client') 1318 | argparser.add_argument( 1319 | '-v', '--verbose', 1320 | action='store_true', 1321 | dest='debug', 1322 | help='print debug information') 1323 | argparser.add_argument( 1324 | '--host', 1325 | metavar='H', 1326 | default='127.0.0.1', 1327 | help='IP of the host server (default: 127.0.0.1)') 1328 | argparser.add_argument( 1329 | '-p', '--port', 1330 | metavar='P', 1331 | default=2000, 1332 | type=int, 1333 | help='TCP port to listen to (default: 2000)') 1334 | argparser.add_argument( 1335 | '-a', '--autopilot', 1336 | action='store_true', 1337 | help='enable autopilot') 1338 | argparser.add_argument( 1339 | '--res', 1340 | metavar='WIDTHxHEIGHT', 1341 | default='640x480', 1342 | help='window resolution (default: 640x480)') 1343 | argparser.add_argument( 1344 | '--filter', 1345 | metavar='PATTERN', 1346 | default='vehicle.*', 1347 | help='actor filter (default: "vehicle.*")') 1348 | argparser.add_argument( 1349 | '--generation', 1350 | metavar='G', 1351 | default='2', 1352 | help='restrict to certain actor generation (values: "1","2","All" - default: "2")') 1353 | argparser.add_argument( 1354 | '--rolename', 1355 | metavar='NAME', 1356 | default='hero', 1357 | help='actor role name (default: "hero")') 1358 | argparser.add_argument( 1359 | '--gamma', 1360 | default=2.2, 1361 | type=float, 1362 | help='Gamma correction of the camera (default: 2.2)') 1363 | argparser.add_argument( 1364 | '--sync', 1365 | action='store_true', 1366 | help='Activate synchronous mode execution') 1367 | args = argparser.parse_args() 1368 | 1369 | control = carla.VehicleControl() 1370 | 1371 | args.width, args.height = [int(x) for x in args.res.split('x')] 1372 | print(args.width, args.height) 1373 | 1374 | log_level = logging.DEBUG if args.debug else logging.INFO 1375 | logging.basicConfig(format='%(levelname)s: %(message)s', level=log_level) 1376 | 1377 | logging.info('listening to server %s:%s', args.host, args.port) 1378 | 1379 | print(__doc__) 1380 | 1381 | 1382 | try: 1383 | 1384 | game_loop(args) 1385 | 1386 | except KeyboardInterrupt: 1387 | print('\nCancelled by user. Bye!') 1388 | 1389 | 1390 | if __name__ == '__main__': 1391 | 1392 | main() --------------------------------------------------------------------------------