├── LICENSE
├── README.md
├── SimulationRadar.m
└── media
    ├── image6.png
    └── output.gif

/LICENSE:
--------------------------------------------------------------------------------
1 | MIT License
2 | 
3 | Copyright (c) 2020 Vivek
4 | 
5 | Permission is hereby granted, free of charge, to any person obtaining a copy
6 | of this software and associated documentation files (the "Software"), to deal
7 | in the Software without restriction, including without limitation the rights
8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 | copies of the Software, and to permit persons to whom the Software is
10 | furnished to do so, subject to the following conditions:
11 | 
12 | The above copyright notice and this permission notice shall be included in all
13 | copies or substantial portions of the Software.
14 | 
15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 | SOFTWARE.
22 | 
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Radar Simulation
2 | 
3 | ![Radar simulation output](https://github.com/curio-code/Radar-Simulation/blob/master/media/output.gif)
4 | 
5 | ## Simulation Environment
6 | 1. Defining an empty scenario
7 | ```
8 | scenario = drivingScenario;
9 | scenario.SampleTime = 0.01;
10 | ```
11 | 2. Adding roads into the environment
12 | ```
13 | roadCenters = [0 0; 50 0; 100 0; 250 20; 500 40];
14 | road(scenario, roadCenters, 'lanes',lanespec(2));
15 | ```
16 | 
17 | 3. 
Creating the ego vehicle and various traffic cars on the road
18 | ```
19 | egoCar = vehicle(scenario, 'ClassID', 1);
20 | trajectory(egoCar, roadCenters(2:end,:) - [0 1.8], 25); % On right lane
21 | 
22 | % Add a car in front of the ego vehicle
23 | leadCar = vehicle(scenario, 'ClassID', 1);
24 | trajectory(leadCar, [70 0; roadCenters(3:end,:)] - [0 1.8], 25); % On right lane
25 | 
26 | % Add a car that travels at 35 m/s along the road and passes the ego vehicle
27 | passingCar = vehicle(scenario, 'ClassID', 1);
28 | waypoints = [0 -1.8; 50 1.8; 100 1.8; 250 21.8; 400 32.2; 500 38.2];
29 | trajectory(passingCar, waypoints, 35);
30 | 
31 | % Add a car behind the ego vehicle
32 | chaseCar = vehicle(scenario, 'ClassID', 1);
33 | trajectory(chaseCar, [25 0; roadCenters(2:end,:)] - [0 1.8], 25); % On right lane
34 | ```
35 | 4. Attaching six radar sensors to the ego vehicle (only the front-facing long-range radar is shown here; all six are defined in `SimulationRadar.m`)
36 | ```
37 | sensors{1} = radarDetectionGenerator('SensorIndex', 1, 'Height', 0.2, 'MaxRange', 174, ...
38 |     'SensorLocation', [egoCar.Wheelbase + egoCar.FrontOverhang, 0], 'FieldOfView', [20, 5]);
39 | ```
40 | 
41 | ## Create a Kalman filter for tracking
42 | ```
43 | H = [1 0 0 0; 0 0 1 0; 0 1 0 0; 0 0 0 1];
44 | filter = trackingKF('MotionModel', '2D Constant Velocity', 'State', H'*detection.Measurement, ...
45 |     'MeasurementModel', H, 'StateCovariance', H'*detection.MeasurementNoise*H, ...
46 |     'MeasurementNoise', detection.MeasurementNoise);
47 | ```
48 | ### Cumulative flow of the code
49 | ![Cumulative flow of the code](https://github.com/curio-code/Radar-Simulation/blob/master/media/image6.png)
--------------------------------------------------------------------------------
/SimulationRadar.m:
--------------------------------------------------------------------------------
1 | %% Sensor Fusion Using Synthetic Radar
2 | %% Generating the Scenario
3 | % Scenario generation comprises generating a road network, defining
4 | % vehicles that move on the roads, and moving the vehicles.
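% (Illustrative note, using the values defined below: with SampleTime = 0.01 s,
% each call to advance(scenario) moves the actors by one 10 ms step, so the
% ego vehicle, traveling at 25 m/s over the ~500 m road, is simulated for
% roughly 20 s, i.e., about 2000 scenario steps.)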
5 | %
6 | % Test the ability of the sensor fusion to track a
7 | % vehicle that is passing on the left of the ego vehicle. The scenario
8 | % simulates a highway setting, and additional vehicles are in front of and
9 | % behind the ego vehicle.
10 | 
11 | % Define an empty scenario.
12 | scenario = drivingScenario;
13 | scenario.SampleTime = 0.01;
14 | 
15 | %%
16 | % Create a 500-meter stretch of typical highway road with two lanes. The
17 | % road is defined using a set of points, where each point defines the center of
18 | % the road in 3-D space.
19 | roadCenters = [0 0; 50 0; 100 0; 250 20; 500 40];
20 | road(scenario, roadCenters, 'lanes',lanespec(2));
21 | 
22 | %%
23 | % Create the ego vehicle and three cars around it: one that overtakes the
24 | % ego vehicle and passes it on the left, one that drives right in front of
25 | % the ego vehicle, and one that drives right behind the ego vehicle. All the
26 | % cars follow the trajectory defined by the road waypoints by using the
27 | % |trajectory| driving policy. The passing car starts on the right
28 | % lane, moves to the left lane to pass, and returns to the right lane.
29 | 
30 | % Create the ego vehicle, which travels at 25 m/s along the road. Place the
31 | % vehicle on the right lane by subtracting half a lane width (1.8 m)
32 | % from the centerline of the road.
33 | egoCar = vehicle(scenario, 'ClassID', 1); 34 | trajectory(egoCar, roadCenters(2:end,:) - [0 1.8], 25); % On right lane 35 | 36 | % A car in front of the ego vehicle 37 | leadCar = vehicle(scenario, 'ClassID', 1); 38 | trajectory(leadCar, [70 0; roadCenters(3:end,:)] - [0 1.8], 25); % On right lane 39 | 40 | % A car that travels at 35 m/s along the road and passes the ego vehicle 41 | passingCar = vehicle(scenario, 'ClassID', 1); 42 | waypoints = [0 -1.8; 50 1.8; 100 1.8; 250 21.8; 400 32.2; 500 38.2]; 43 | trajectory(passingCar, waypoints, 35); 44 | 45 | % A car behind the ego vehicle 46 | chaseCar = vehicle(scenario, 'ClassID', 1); 47 | trajectory(chaseCar, [25 0; roadCenters(2:end,:)] - [0 1.8], 25); % On right lane 48 | 49 | %% Define Radar Sensors 50 | % 51 | % The ego vehicle is equipped with a 52 | % long-range radar sensor. Each side of the vehicle has two short-range radar 53 | % sensors, each covering 90 degrees. One sensor on each side covers from 54 | % the middle of the vehicle to the back. The other sensor on each side 55 | % covers from the middle of the vehicle forward. The figure in the next 56 | % section shows the coverage. 57 | 58 | sensors = cell(6,1); 59 | % Front-facing long-range radar sensor at the center of the front bumper of the car. 60 | sensors{1} = radarDetectionGenerator('SensorIndex', 1, 'Height', 0.2, 'MaxRange', 174, ... 61 | 'SensorLocation', [egoCar.Wheelbase + egoCar.FrontOverhang, 0], 'FieldOfView', [20, 5]); 62 | 63 | % Rear-facing long-range radar sensor at the center of the rear bumper of the car. 64 | sensors{2} = radarDetectionGenerator('SensorIndex', 2, 'Height', 0.2, 'Yaw', 180, ... 65 | 'SensorLocation', [-egoCar.RearOverhang, 0], 'MaxRange', 174, 'FieldOfView', [20, 5]); 66 | 67 | % Rear-left-facing short-range radar sensor at the left rear wheel well of the car. 68 | sensors{3} = radarDetectionGenerator('SensorIndex', 3, 'Height', 0.2, 'Yaw', 120, ... 
69 |     'SensorLocation', [0, egoCar.Width/2], 'MaxRange', 30, 'ReferenceRange', 50, ...
70 |     'FieldOfView', [90, 5], 'AzimuthResolution', 10, 'RangeResolution', 1.25);
71 | 
72 | % Rear-right-facing short-range radar sensor at the right rear wheel well of the car.
73 | sensors{4} = radarDetectionGenerator('SensorIndex', 4, 'Height', 0.2, 'Yaw', -120, ...
74 |     'SensorLocation', [0, -egoCar.Width/2], 'MaxRange', 30, 'ReferenceRange', 50, ...
75 |     'FieldOfView', [90, 5], 'AzimuthResolution', 10, 'RangeResolution', 1.25);
76 | 
77 | % Front-left-facing short-range radar sensor at the left front wheel well of the car.
78 | sensors{5} = radarDetectionGenerator('SensorIndex', 5, 'Height', 0.2, 'Yaw', 60, ...
79 |     'SensorLocation', [egoCar.Wheelbase, egoCar.Width/2], 'MaxRange', 30, ...
80 |     'ReferenceRange', 50, 'FieldOfView', [90, 5], 'AzimuthResolution', 10, ...
81 |     'RangeResolution', 1.25);
82 | 
83 | % Front-right-facing short-range radar sensor at the right front wheel well of the car.
84 | sensors{6} = radarDetectionGenerator('SensorIndex', 6, 'Height', 0.2, 'Yaw', -60, ...
85 |     'SensorLocation', [egoCar.Wheelbase, -egoCar.Width/2], 'MaxRange', 30, ...
86 |     'ReferenceRange', 50, 'FieldOfView', [90, 5], 'AzimuthResolution', 10, ...
87 |     'RangeResolution', 1.25);
88 | 
89 | %% Create a Tracker
90 | % Create a |multiObjectTracker| to track
91 | % the vehicles that are close to the ego vehicle. The tracker uses the
92 | % |initSimDemoFilter| supporting function to initialize a constant velocity
93 | % linear Kalman filter that works with position and velocity.
94 | %
95 | % Tracking is done in 2-D. Although the sensors return measurements in 3-D,
96 | % the motion itself is confined to the horizontal plane, so there is no
97 | % need to track the height.
98 | 
99 | 
100 | 
101 | % https://www.mathworks.com/help/driving/ref/multiobjecttracker-system-object.html
102 | 
103 | tracker = multiObjectTracker('FilterInitializationFcn', @initSimDemoFilter, ...
104 | 'AssignmentThreshold', 30, 'ConfirmationParameters', [4 5], 'NumCoastingUpdates', 5); 105 | positionSelector = [1 0 0 0; 0 0 1 0]; % Position selector 106 | velocitySelector = [0 1 0 0; 0 0 0 1]; % Velocity selector 107 | 108 | % Create the display and return a handle to the bird's-eye plot 109 | BEP = createDemoDisplay(egoCar, sensors); 110 | 111 | 112 | 113 | %% Simulate the Scenario 114 | % The following loop moves the vehicles, calls the sensor simulation, and 115 | % performs the tracking. 116 | % 117 | % Note that the scenario generation and sensor simulation can have 118 | % different time steps. Specifying different time steps for the scenario 119 | % and the sensors enables you to decouple the scenario simulation from the 120 | % sensor simulation. This is useful for modeling actor motion with high 121 | % accuracy independently from the sensor's measurement rate. 122 | % 123 | % Another example is when the sensors have different update rates. Suppose 124 | % one sensor provides updates every 20 milliseconds and another sensor 125 | % provides updates every 50 milliseconds. You can specify the scenario with 126 | % an update rate of 10 milliseconds and the sensors will provide their 127 | % updates at the correct time. 128 | % 129 | % In this example, the scenario generation has a time step of 0.01 second, 130 | % while the sensors detect every 0.1 second. The sensors return a logical 131 | % flag, |isValidTime|, that is true if the sensors generated detections. 132 | % This flag is used to call the tracker only when there are detections. 133 | % 134 | % Another important note is that the sensors can simulate multiple 135 | % detections per target, in particular when the targets are very close to 136 | % the radar sensors. Because the tracker assumes a single detection per 137 | % target from each sensor, you must cluster the detections before the 138 | % tracker processes them. This is done by the function |clusterDetections|. 
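% As a hedged, hypothetical illustration of the clustering step: suppose one
% vehicle produces two radar detections whose (x,y) positions are
% [10.0 -1.7] and [11.5 -1.9] (meters, ego coordinates). Their separation,
% norm([1.5 0.2]) = 1.51 m, is smaller than a typical vehicle length
% (e.g., 4.7 m), so |clusterDetections| would merge them into a single
% detection at the centroid [10.75 -1.8]. Two detections separated by more
% than the vehicle length would remain distinct.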
139 | % See the 'Supporting Functions' section.
140 | 
141 | toSnap = true;
142 | while advance(scenario) && ishghandle(BEP.Parent)
143 |     % Get the scenario time
144 |     time = scenario.SimulationTime;
145 | 
146 |     % Get the positions of the other vehicles in ego vehicle coordinates
147 |     ta = targetPoses(egoCar);
148 | 
149 |     % Simulate the sensors
150 |     detections = {};
151 |     isValidTime = false(1,6);
152 |     for i = 1:6
153 |         [sensorDets,numValidDets,isValidTime(i)] = sensors{i}(ta, time);
154 |         if numValidDets
155 |             detections = [detections; sensorDets]; %#ok
156 |         end
157 |     end
158 | 
159 |     % Update the tracker if there are new detections
160 |     if any(isValidTime)
161 |         vehicleLength = sensors{1}.ActorProfiles.Length;
162 |         detectionClusters = clusterDetections(detections, vehicleLength);
163 |         confirmedTracks = updateTracks(tracker, detectionClusters, time);
164 | 
165 |         % Update bird's-eye plot
166 |         updateBEP(BEP, egoCar, detections, confirmedTracks, positionSelector, velocitySelector);
167 |     end
168 | 
169 |     % Snap a figure for the document when the passing car passes the ego vehicle
170 |     if ta(1).Position(1) > 0 && toSnap
171 |         toSnap = false;
172 |         snapnow
173 |     end
174 | end
175 | 
176 | %% Supporting Functions
177 | % *|initSimDemoFilter|*
178 | %
179 | % This function initializes a constant velocity filter based on a detection.
180 | function filter = initSimDemoFilter(detection)
181 | % Use a 2-D constant velocity model to initialize a trackingKF filter.
182 | % The state vector is [x;vx;y;vy]
183 | % The detection measurement vector is [x;y;vx;vy]
184 | % As a result, the measurement model is H = [1 0 0 0; 0 0 1 0; 0 1 0 0; 0 0 0 1]
185 | 
186 | % Initialize the Kalman filter using the |trackingKF| function.
187 | H = [1 0 0 0; 0 0 1 0; 0 1 0 0; 0 0 0 1];
188 | filter = trackingKF('MotionModel', '2D Constant Velocity', 'State', H'*detection.Measurement, 'MeasurementModel', H, 'StateCovariance', H'*detection.MeasurementNoise*H, 'MeasurementNoise', detection.MeasurementNoise);
189 | 
190 | 
191 | end
192 | 
193 | %%%
194 | % *|clusterDetections|*
195 | %
196 | % This function merges multiple detections suspected to be of the same
197 | % vehicle to a single detection. The function looks for detections that are
198 | % closer than the size of a vehicle. Detections that fit this criterion are
199 | % considered a cluster and are merged to a single detection at the centroid
200 | % of the cluster. The measurement noises are modified to represent the
201 | % possibility that each detection can be anywhere on the vehicle.
202 | % Therefore, the noise should have the same size as the vehicle size.
203 | %
204 | % In addition, this function removes the third dimension of the measurement
205 | % (the height) and reduces the measurement vector to [x;y;vx;vy].
206 | function detectionClusters = clusterDetections(detections, vehicleSize)
207 | N = numel(detections);
208 | distances = zeros(N);
209 | for i = 1:N
210 |     for j = i+1:N
211 |         if detections{i}.SensorIndex == detections{j}.SensorIndex
212 |             distances(i,j) = norm(detections{i}.Measurement(1:2) - detections{j}.Measurement(1:2));
213 |         else
214 |             distances(i,j) = inf;
215 |         end
216 |     end
217 | end
218 | leftToCheck = 1:N;
219 | i = 0;
220 | detectionClusters = cell(N,1);
221 | while ~isempty(leftToCheck)
222 |     % Remove the detections that are in the same cluster as the one under
223 |     % consideration
224 | 
225 | 
226 |     underConsideration = leftToCheck(1);
227 |     clusterInds = (distances(underConsideration, leftToCheck) < vehicleSize);
228 |     detInds = leftToCheck(clusterInds);
229 |     clusterDets = [detections{detInds}];
230 |     clusterMeas = [clusterDets.Measurement];
231 |     meas = mean(clusterMeas, 2);
232 |     meas2D = [meas(1:2); meas(4:5)];
233 |     i = i + 1;
234 |     detectionClusters{i} = detections{detInds(1)};
235 |     detectionClusters{i}.Measurement = meas2D;
236 |     leftToCheck(clusterInds) = [];
237 | end
238 | detectionClusters(i+1:end) = [];
239 | 
240 | % Since the detections now represent clusters, modify the noise to represent
241 | % that each detection can be anywhere on the vehicle
242 | for i = 1:numel(detectionClusters)
243 |     measNoise(1:2,1:2) = vehicleSize^2 * eye(2);
244 |     measNoise(3:4,3:4) = eye(2) * 100 * vehicleSize^2;
245 |     detectionClusters{i}.MeasurementNoise = measNoise;
246 | end
247 | end
248 | 
249 | %%%
250 | % *|createDemoDisplay|*
251 | %
252 | % This function creates a three-panel display:
253 | %
254 | % # Top-left: a top view that follows the ego vehicle.
255 | % # Bottom-left: a chase-camera view that follows the ego vehicle.
256 | % # Right half: a bird's-eye plot display.
257 | %
258 | function BEP = createDemoDisplay(egoCar, sensors) 259 | % Make a figure 260 | hFigure = figure('Position', [0, 0, 1200, 640], 'Name', 'Sensor Fusion with Synthetic Data Example'); 261 | movegui(hFigure, [0 -1]); % Moves the figure to the left and a little down from the top 262 | 263 | % Add a car plot that follows the ego vehicle from behind 264 | hCarViewPanel = uipanel(hFigure, 'Position', [0 0 0.5 0.5], 'Title', 'Chase Camera View'); 265 | hCarPlot = axes(hCarViewPanel); 266 | chasePlot(egoCar, 'Parent', hCarPlot); 267 | 268 | % Add a car plot that follows the ego vehicle from a top view 269 | hTopViewPanel = uipanel(hFigure, 'Position', [0 0.5 0.5 0.5], 'Title', 'Top View'); 270 | hCarPlot = axes(hTopViewPanel); 271 | chasePlot(egoCar, 'Parent', hCarPlot, 'ViewHeight', 130, 'ViewLocation', [0 0], 'ViewPitch', 90); 272 | 273 | % Add a panel for a bird's-eye plot 274 | hBEVPanel = uipanel(hFigure, 'Position', [0.5 0 0.5 1], 'Title', 'Bird''s-Eye Plot'); 275 | 276 | % Create bird's-eye plot for the ego car and sensor coverage 277 | hBEVPlot = axes(hBEVPanel); 278 | frontBackLim = 60; 279 | BEP = birdsEyePlot('Parent', hBEVPlot, 'Xlimits', [-frontBackLim frontBackLim], 'Ylimits', [-35 35]); 280 | 281 | % Plot the coverage areas for radars 282 | for i = 1:6 283 | cap = coverageAreaPlotter(BEP,'FaceColor','red','EdgeColor','red'); 284 | plotCoverageArea(cap, sensors{i}.SensorLocation,... 285 | sensors{i}.MaxRange, sensors{i}.Yaw, sensors{i}.FieldOfView(1)); 286 | end 287 | 288 | % Plot the coverage areas for vision sensors 289 | % for i = 7:8 290 | % cap = coverageAreaPlotter(BEP,'FaceColor','blue','EdgeColor','blue'); 291 | % plotCoverageArea(cap, sensors{i}.SensorLocation,... 
292 | %     sensors{i}.MaxRange, sensors{i}.Yaw, 45);
293 | % end
294 | 
295 | % Create a vision detection plotter and put it in a struct for future use
296 | % detectionPlotter(BEP, 'DisplayName','vision', 'MarkerEdgeColor','blue', 'Marker','^');
297 | 
298 | % Combine all radar detections into one entry and store it for later update
299 | detectionPlotter(BEP, 'DisplayName','radar', 'MarkerEdgeColor','red');
300 | 
301 | % Add road borders to plot
302 | laneMarkingPlotter(BEP, 'DisplayName','lane markings');
303 | 
304 | % Add the tracks to the bird's-eye plot. Show last 10 track updates.
305 | trackPlotter(BEP, 'DisplayName','track', 'HistoryDepth',10);
306 | 
307 | axis(BEP.Parent, 'equal');
308 | xlim(BEP.Parent, [-frontBackLim frontBackLim]);
309 | ylim(BEP.Parent, [-40 40]);
310 | 
311 | % Add an outline plotter for ground truth
312 | outlinePlotter(BEP, 'Tag', 'Ground truth');
313 | end
314 | 
315 | %%%
316 | % *|updateBEP|*
317 | %
318 | % This function updates the bird's-eye plot with road boundaries,
319 | % detections, and tracks.
320 | function updateBEP(BEP, egoCar, detections, confirmedTracks, psel, vsel) 321 | % Update road boundaries and their display 322 | [lmv, lmf] = laneMarkingVertices(egoCar); 323 | plotLaneMarking(findPlotter(BEP,'DisplayName','lane markings'),lmv,lmf); 324 | 325 | % update ground truth data 326 | [position, yaw, length, width, originOffset, color] = targetOutlines(egoCar); 327 | plotOutline(findPlotter(BEP,'Tag','Ground truth'), position, yaw, length, width, 'OriginOffset', originOffset, 'Color', color); 328 | 329 | % Prepare and update detections display 330 | N = numel(detections); 331 | detPos = zeros(N,2); 332 | isRadar = true(N,1); 333 | for i = 1:N 334 | detPos(i,:) = detections{i}.Measurement(1:2)'; 335 | if detections{i}.SensorIndex > 6 336 | isRadar(i) = false; 337 | end 338 | end 339 | plotDetection(findPlotter(BEP,'DisplayName','radar'), detPos(isRadar,:)); 340 | 341 | % Prepare and update tracks display 342 | trackIDs = {confirmedTracks.TrackID}; 343 | labels = cellfun(@num2str, trackIDs, 'UniformOutput', false); 344 | [tracksPos, tracksCov] = getTrackPositions(confirmedTracks, psel); 345 | tracksVel = getTrackVelocities(confirmedTracks, vsel); 346 | plotTrack(findPlotter(BEP,'DisplayName','track'), tracksPos, tracksVel, tracksCov, labels); 347 | end 348 | -------------------------------------------------------------------------------- /media/image6.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/curio-code/Radar-Simulation/2196f78b20cc164119265bfbf2876eb28b10ed32/media/image6.png -------------------------------------------------------------------------------- /media/output.gif: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/curio-code/Radar-Simulation/2196f78b20cc164119265bfbf2876eb28b10ed32/media/output.gif --------------------------------------------------------------------------------