├── Example_JD.md
├── MapUpdateCycle.pdf
├── MapUpdateCycle.png
├── README.md
└── Reference
    ├── 1704.02696.pdf
    ├── 219126.pdf
    ├── 241975.pdf
    ├── 252489.pdf
    ├── Grid-Based Multi-Sensor Fusion for On-Road Obstacle Detection- Application to Autonomous Driving.pdf
    ├── Improving absolute position estimates of an automotive vehicle using GPS in sensor fusion.pdf
    ├── Offline Sensor Fusion for Multitarget Tracking using Radar and Camera Detections.pdf
    ├── T4C.1.SS2323_4079F1.pdf
    └── presentations-ch_201710_apollo_maojiming.pdf

/Example_JD.md:
--------------------------------------------------------------------------------
1 | As an Active Safety Map & Cloud System Developer, you will be part of a team working on the map data needed for fused environmental perception, as well as on a solution for probe-sourcing sensor data that keeps a cloud-based map continuously up to date and reliable enough to be used for autonomous driving.
2 | Main responsibilities
3 | 
4 | One of your responsibilities will be to design the system solution as well as to write and release requirements and documentation, supporting development in these areas. In this work, you need to collaborate closely with the Function Owners within Active Safety as well as with other stakeholders within Volvo Cars globally, such as the Infotainment/Navigation area, the team within Volvo Cars IT working on cloud development, and the Sensor Performance team.
5 | 
6 | Further, as a Senior Developer, you will collaborate closely with our map suppliers, as well as our SW partner Zenuity, in all phases of the Active Safety and Autonomous Drive projects. You will also define strategies and new technology steps / advanced engineering within the area, together with the Map & Cloud Team Leader.
--------------------------------------------------------------------------------
/MapUpdateCycle.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/insyncim64/sensor_fusion_hdmap/dc6eeaae045289f32a88f7d1daa0b049b4fd6ef0/MapUpdateCycle.pdf
--------------------------------------------------------------------------------
/MapUpdateCycle.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/insyncim64/sensor_fusion_hdmap/dc6eeaae045289f32a88f7d1daa0b049b4fd6ef0/MapUpdateCycle.png
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | Basic:
2 | What is sensor fusion?
3 | - Many sensors are equipped in a car. Integrating the different cameras and radars helps to obtain an accurate picture of the car's state in its environment.
4 | 
5 | ![](https://static.leiphone.com/uploads/new/article/740_740/201708/59a669c009294.jpg?imageMogr2/format/jpg/quality/90)
6 | - However, every sensor has its pros and cons. Take two sensors in a cellphone: the accelerometer is not very accurate, but its error does not accumulate; the gyroscope measures changes accurately, but its errors accumulate over time.
7 | - The Kalman filter is a classic tool for sensor fusion.
8 | What is a high definition map?
9 | Why is a high definition map needed?
10 | Why is sensor fusion needed?
11 | What is the relationship between sensor fusion and HAD?
12 | What have you done for sensor fusion and HAD?
13 | 
14 | What are the sensors in a car?
15 | - CMOS cameras are blinded in rainy and foggy weather.
16 | - Radars perform well in bad weather conditions, but their resolution is not that good.
17 | - LiDAR is a time-of-flight (ToF) technology using light; based on how the signal is reflected, it characterizes the surface, e.g.
diffuse reflection, retro-reflection
18 | 
19 | One example of sensor fusion in a car?
20 | - Back camera and ultrasonic range finder for parking
21 | - Front camera and multi-module radars for ADAS
22 | 
23 | Sensor fusion systems
24 | - centralized
25 | - distributed
26 | 
27 | Practical experience with Sensor Fusion, Map and Cloud:
28 | - Apollo Cloud platform
29 | - HD Map: OpenDRIVE
30 | - OTA
31 | - Data Platform
32 | 
33 | Companies:
34 | - Civil Maps
35 |   - base map
36 |   - camera view, voxel view
37 |   - get feature points based on the base map (SLAM)
38 |   - a second car can reuse the base map and localize itself
39 |   - 80% - +/-5cm
40 |   - How to use LiDAR
41 | C/C++
42 | 
43 | HD Map
44 | - The importance of HD Maps in Autonomous Driving
45 |   - From the cloud-platform perspective, a simulation system cannot work without an HD map. The goal of a simulation system is to reconstruct real roads, traffic and environments, and it is used for training algorithms. On the other hand, the HD map in the cloud can act as a data source for cars on the road. It supports autonomous driving mainly in four aspects: localization, perception, decision and planning.
46 |   - Localization: it is known that the combination of GPS, IMU and a conventional map works well for today's cars. However, the current localization accuracy cannot meet the requirements of autonomous driving, so different sensors are used to perceive the environment and localize the car. A quick solution is to have a map which contains the crucial information for driving, as detailed as possible; a low-cost autonomous driving solution can then be proposed. A single-lens camera can be used to capture images: detected lanes can be compared with the lane information in the HD map to determine the position laterally, while traffic lights and light poles can be used for longitudinal localization.
47 |   - Perception: First, on-board sensors can perceive the environment out to a limited range, at most about 1 km.
The HD map can provide information from far beyond that range. Secondly, by comparing the information captured actively by the sensors with the HD map, it helps to detect objects such as vehicles and pedestrians. To aggregate the different regions needed by the perception module, the HD map can provide regions of interest (ROI). Moreover, information from the HD map carries semantic meaning. For example, different kinds of traffic-light systems have different numbers of lights; knowing how many lights should be detected helps a lot when designing perception algorithms.
48 |   - Decision and Planning: together with real-time map updates, the HD map helps to plan efficiently.
49 |   - An HD map can help to reduce the number of sensors; a feasible solution can hardly be proposed without one.
50 | - Main procedure to create an HD map
51 |   - Data sourcing
52 |     - Image, Point Cloud, GPS Track
53 |   - Pre-Processing
54 |     - Sensor fusion which combines information from GPS, IMU, LiDAR and camera
55 |     - Deep learning for segmentation and detection
56 |   - Manual Verification
57 |     - human verification is needed to increase accuracy
58 |   - Release
59 | - Usage of HD maps
60 |   - HD Map, ADAS Map, Infotainment Map
61 | - Format of HD maps: OpenDRIVE
62 | - Update
63 |   - An update cycle exists between the map provider's cloud and the car.
Here is the [update cycle](https://mp.weixin.qq.com/s?__biz=MzI1NjkxOTMyNQ==&mid=100000179&idx=1&sn=63bdf976825c2770ef974a7c11ed8f6e&chksm=6a1e13c15d699ad78c84170c8f447e456742b608efadca25cf37001027c6f7ba6a16179dd72b&scene=20&key=ffd6e7826d53df79f8fa0d6d3a605b69eab835c3a7f6e9b417f0e172bb307a57c80de27a35ff2d4a84893b42496a1c745c1526c69788e175961ee4e8a14030b50cf6cf25047043361bedc1b89ae458ba&ascene=0&uin=MTE2NjQzMjMyMA%3D%3D&devicetype=iMac+MacBookPro12%2C1+OSX+OSX+10.12.6+build(16G29)&version=12020110&nettype=WIFI&lang=zh_CN&fontScale=100&pass_ticket=IYK%2FI9o2lXOo4hCjSwjLG0LZNdIURijz4X9oUzAknRcwcU2FAcspk%2BmQy6fFfhp0## "update cycle") of the ApolloAuto HD Map from Baidu.
64 | ![](MapUpdateCycle.png)
65 | - Sensor fusion in the cloud
66 |   Point cloud data from LiDAR are pre-processed in the car, then post-processed and fused in the cloud, creating a continuously updated 3D map for SLAM.
67 | - Basic theories:
68 |   - Coordinate systems: ego-vehicle reference frame, homogeneous coordinates
69 |   - Sensor fusion algorithms:
70 |     - *Bayesian filtering*
71 |     - *Kalman Filter and Extended Kalman Filter*
72 |     - *Particle Filter*
73 |   - Clustering algorithms
74 |     - K-means
75 |     - Single-linkage clustering
76 | - Methods for map extraction from point clouds
77 |   - Dense reconstruction algorithms:
78 |     Dense reconstruction is widely used for 3D printing and face recognition. A low-cost solution (KinectFusion), using an RGB-D camera, was proposed in 2011.
79 |   - Sparse reconstruction algorithms:
80 |   - Procedures for map extraction:
81 | - Compact map representation
82 | - Localization algorithm
83 | 
84 | 
85 | 
86 | 
--------------------------------------------------------------------------------
/Reference/1704.02696.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/insyncim64/sensor_fusion_hdmap/dc6eeaae045289f32a88f7d1daa0b049b4fd6ef0/Reference/1704.02696.pdf
--------------------------------------------------------------------------------
/Reference/219126.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/insyncim64/sensor_fusion_hdmap/dc6eeaae045289f32a88f7d1daa0b049b4fd6ef0/Reference/219126.pdf
--------------------------------------------------------------------------------
/Reference/241975.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/insyncim64/sensor_fusion_hdmap/dc6eeaae045289f32a88f7d1daa0b049b4fd6ef0/Reference/241975.pdf
--------------------------------------------------------------------------------
/Reference/252489.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/insyncim64/sensor_fusion_hdmap/dc6eeaae045289f32a88f7d1daa0b049b4fd6ef0/Reference/252489.pdf
--------------------------------------------------------------------------------
/Reference/Grid-Based Multi-Sensor Fusion for On-Road Obstacle Detection- Application to Autonomous Driving.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/insyncim64/sensor_fusion_hdmap/dc6eeaae045289f32a88f7d1daa0b049b4fd6ef0/Reference/Grid-Based Multi-Sensor Fusion for On-Road Obstacle Detection- Application to Autonomous Driving.pdf
--------------------------------------------------------------------------------
/Reference/Improving absolute position estimates of an automotive vehicle using GPS in sensor fusion.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/insyncim64/sensor_fusion_hdmap/dc6eeaae045289f32a88f7d1daa0b049b4fd6ef0/Reference/Improving absolute position estimates of an automotive vehicle using GPS in sensor fusion.pdf
--------------------------------------------------------------------------------
/Reference/Offline Sensor Fusion for Multitarget Tracking using Radar and Camera Detections.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/insyncim64/sensor_fusion_hdmap/dc6eeaae045289f32a88f7d1daa0b049b4fd6ef0/Reference/Offline Sensor Fusion for Multitarget Tracking using Radar and Camera Detections.pdf
--------------------------------------------------------------------------------
/Reference/T4C.1.SS2323_4079F1.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/insyncim64/sensor_fusion_hdmap/dc6eeaae045289f32a88f7d1daa0b049b4fd6ef0/Reference/T4C.1.SS2323_4079F1.pdf
--------------------------------------------------------------------------------
/Reference/presentations-ch_201710_apollo_maojiming.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/insyncim64/sensor_fusion_hdmap/dc6eeaae045289f32a88f7d1daa0b049b4fd6ef0/Reference/presentations-ch_201710_apollo_maojiming.pdf
--------------------------------------------------------------------------------
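The Kalman filter that the README names as the classic sensor-fusion tool can be sketched in one dimension using the README's cellphone example: a gyro whose integrated rate drifts, fused with an accelerometer-derived angle that is noisy but unbiased. This is a minimal illustrative sketch, not the pipeline described above; the function name and the noise parameters `q` and `r` are assumed values chosen for demonstration.

```python
def kalman_fuse(gyro_rates, accel_angles, dt=0.01, q=0.001, r=0.1):
    """Fuse a drifting rate sensor with a noisy absolute sensor.

    q: process-noise variance (models gyro drift per step, assumed)
    r: measurement-noise variance (accelerometer angle, assumed)
    """
    angle, p = 0.0, 1.0   # state estimate and its variance
    estimates = []
    for rate, measured in zip(gyro_rates, accel_angles):
        # Predict: integrate the gyro rate; uncertainty grows by q.
        angle += rate * dt
        p += q
        # Update: blend in the accelerometer angle via the Kalman gain.
        k = p / (p + r)
        angle += k * (measured - angle)
        p *= (1.0 - k)
        estimates.append(angle)
    return estimates
```

With a stationary gyro and an accelerometer reading a constant angle, the estimate converges toward the accelerometer value while remaining smooth; the same predict/update structure generalizes to the multi-dimensional (E)KF used for GPS/IMU/LiDAR fusion listed under "Basic theories" above.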