├── .gitattributes ├── README.md ├── files ├── alphasense_core_3d.zip ├── example_7s_sensors_dont_use.yaml └── ncamera_settings.yaml ├── images ├── alphasense_core_04mp_bandwidth.png ├── alphasense_core_16mp_bandwidth.png ├── alphasense_ip_setting.png ├── exposure_signal.png ├── nm_connection_editor.png └── viewer.png └── pages ├── calibration.md ├── configuring_the_network.md ├── faq.md ├── getting_started.md ├── gpio_connector.md ├── installation_and_upgrade.md ├── maximize_network_performance.md ├── ros_driver_usage.md ├── sensor_settings.md ├── time_synchronization.md └── usage_examples.md /.gitattributes: -------------------------------------------------------------------------------- 1 | *.zip filter=lfs diff=lfs merge=lfs -text 2 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Core Research manual 2 | 3 | > **Warning** 4 | > This repository applies to **Core Research**, a visual-inertial sensor. It **does not** apply to 5 | > [**Alphasense Position**](https://www.sevensense.ai/product/alphasense-position), our 6 | > Visual SLAM system. Documentation of Alphasense Position is provided directly by the 7 | > Sevensense Sales team. 8 | 9 | This is the manual for the Core Research Development Kit, a state-of-the-art 10 | visual-inertial sensor providing spatial awareness with its 360° view. It is 11 | manufactured and distributed by [Sevensense](https://www.sevensense.ai/). 12 | 13 | ![alphasense](https://uploads-ssl.webflow.com/5e2ed3c886f41759e22ec3e3/5e618820f7d4594c1e0d82a8_image-64-compressor.png) 14 | 15 | > **Note** 16 | > The manual assumes you are using driver and firmware version v2.10 or higher. 
17 | 18 | Here you will find further information on the following topics: 19 | 20 | - [Getting started](pages/getting_started.md) 21 | - [Sensor settings](pages/sensor_settings.md) 22 | - [ROS driver usage](pages/ros_driver_usage.md) 23 | - [Configuring the network](pages/configuring_the_network.md) 24 | - [Time synchronization](pages/time_synchronization.md) 25 | - [Calibration](pages/calibration.md) 26 | - [Usage Examples](pages/usage_examples.md) 27 | - [Maximize network performance](pages/maximize_network_performance.md) 28 | - [Installation and upgrading of driver/firmware](pages/installation_and_upgrade.md) 29 | - [GPIO Connector](pages/gpio_connector.md) 30 | - [Frequently asked questions](pages/faq.md) 31 | - [Datasheet](https://hubs.ly/Q01XVycm0) 32 | - [STEP 3D model file](files/alphasense_core_3d.zip) 33 | 34 | If you have any questions or feedback, please contact us at 35 | . 36 | -------------------------------------------------------------------------------- /files/alphasense_core_3d.zip: -------------------------------------------------------------------------------- 1 | version https://git-lfs.github.com/spec/v1 2 | oid sha256:c879b5c38d20761056ed1e9196c4841933e46bc333e1e5c45e0dc6ce34dc2344 3 | size 2109026 4 | -------------------------------------------------------------------------------- /files/example_7s_sensors_dont_use.yaml: -------------------------------------------------------------------------------- 1 | # This file is an example calibration file, don't use this calibration file 2 | # for your own Core Research. 
3 | ncameras: 4 | - cameras: 5 | - T_B_C: 6 | rows: 4 7 | cols: 4 8 | data: [-0.0089889092, 0.0097254569, 0.9999123037, 0.0520086092, 9 | -0.9999261367, -0.0082678741, -0.0089086176, 0.0488993112, 10 | 0.0081805087, -0.9999185256, 0.0097990577, -0.011757515, 11 | 0.0, 0.0, 0.0, 1.0] 12 | camera: 13 | distortion: 14 | parameters: 15 | rows: 4 16 | cols: 1 17 | data: [-0.0416026702, 0.0022689289, -0.0027567794, 0.000401603] 18 | type: equidistant 19 | id: 839e02677a1f4b5c91466b2bc5fdc33e 20 | image_height: 1080 21 | image_width: 1440 22 | intrinsics: 23 | rows: 4 24 | cols: 1 25 | data: [701.4165958679, 701.480279171, 668.2392112416, 517.9783218077] 26 | label: /alphasense_driver_ros/cam0 27 | line-delay-nanoseconds: 0 28 | type: pinhole 29 | - T_B_C: 30 | rows: 4 31 | cols: 4 32 | data: [0.0125928289, 0.006979524, 0.9998963481, 0.0518868177, 33 | 0.9999006789, 0.0062411982, -0.0126364486, -0.0649104681, 34 | -0.0063287477, 0.9999561659, -0.0069002365, -0.0118702141, 35 | 0.0, 0.0, 0.0, 1.0] 36 | camera: 37 | distortion: 38 | parameters: 39 | rows: 4 40 | cols: 1 41 | data: [-0.0401505666, -0.0043543854, 0.0037690653, -0.0016622367] 42 | type: equidistant 43 | id: e385b3f7d4304bb5a6c4d0bf921bd30d 44 | image_height: 1080 45 | image_width: 1440 46 | intrinsics: 47 | rows: 4 48 | cols: 1 49 | data: [696.6316066498, 696.4060613947, 654.6782697748, 508.3389579961] 50 | label: /alphasense_driver_ros/cam1 51 | line-delay-nanoseconds: 0 52 | type: pinhole 53 | - T_B_C: 54 | rows: 4 55 | cols: 4 56 | data: [-0.9999941366, -0.0024554137, -0.0023870029, 0.0119244293, 57 | -0.0024411744, 0.9999793185, -0.0059500663, -0.0090725585, 58 | 0.0024015634, -0.0059442043, -0.9999794493, -0.0413166987, 59 | 0.0, 0.0, 0.0, 1.0] 60 | camera: 61 | distortion: 62 | parameters: 63 | rows: 4 64 | cols: 1 65 | data: [-0.0391856077, -0.002086056, 0.0011426976, -0.0007623923] 66 | type: equidistant 67 | id: 1e4d1bc29e2949f08644e4b55e3e673d 68 | image_height: 1080 69 | image_width: 1440 70 | 
intrinsics: 71 | rows: 4 72 | cols: 1 73 | data: [702.6178299134, 702.1525588585, 683.0301275644, 465.8701574816] 74 | label: /alphasense_driver_ros/cam2 75 | line-delay-nanoseconds: 0 76 | type: pinhole 77 | - T_B_C: 78 | rows: 4 79 | cols: 4 80 | data: [0.9999636036, 0.0044487401, 0.0072801186, -0.0055397911, 81 | -0.0073021704, 0.0049608861, 0.9999610332, 0.0636896469, 82 | 0.0044124509, -0.9999777989, 0.004993191, -0.012049265, 83 | 0.0, 0.0, 0.0, 1.0] 84 | camera: 85 | distortion: 86 | parameters: 87 | rows: 4 88 | cols: 1 89 | data: [-0.0415514737, 0.0033150215, -0.0031847652, 0.0004627569] 90 | type: equidistant 91 | id: db5f9b2fee5447c9bf95ba68c4f5818e 92 | image_height: 1080 93 | image_width: 1440 94 | intrinsics: 95 | rows: 4 96 | cols: 1 97 | data: [694.1395188892, 693.643887969, 684.6818336991, 518.2240705591] 98 | label: /alphasense_driver_ros/cam3 99 | line-delay-nanoseconds: 0 100 | type: pinhole 101 | - T_B_C: 102 | rows: 4 103 | cols: 4 104 | data: [0.9997863618, 0.0077920857, -0.0191445589, -0.0052163478, 105 | -0.0191440399, -0.0001411904, -0.9998167261, -0.0631751336, 106 | -0.0077933606, 0.9999696313, 8.0117e-06, -0.0106140007, 107 | 0.0, 0.0, 0.0, 1.0] 108 | camera: 109 | distortion: 110 | parameters: 111 | rows: 4 112 | cols: 1 113 | data: [-0.0399233051, -0.0083899158, 0.0071120553, -0.002454966] 114 | type: equidistant 115 | id: e61e50b4906f4ea4b6d16893bdae491c 116 | image_height: 1080 117 | image_width: 1440 118 | intrinsics: 119 | rows: 4 120 | cols: 1 121 | data: [698.2931547887, 698.3666690515, 651.2675661506, 533.2089472599] 122 | label: /alphasense_driver_ros/cam4 123 | line-delay-nanoseconds: 0 124 | type: pinhole 125 | id: 5db04bcbc17b41259617449e73297ed5 126 | label: ncamera 127 | sensors: 128 | - default_biases: 129 | acc_bias: 130 | rows: 3 131 | cols: 1 132 | data: [-0.0321298649, -0.0593762732, 0.1706163708] 133 | gyro_bias: 134 | rows: 3 135 | cols: 1 136 | data: [0.0006150941, 7.78075e-05, 0.000727046] 137 | 
gravity_magnitude_mps2: 9.808083883386614 138 | hardware_id: /alphasense_driver_ros/imu 139 | id: 2da70378a45c4ddb9ed32f72c2fcf3a6 140 | saturation_accel_max_mps2: 150.0 141 | saturation_gyro_max_radps: 7.5 142 | sensor_type: IMU 143 | sigmas: 144 | acc_bias_random_walk_noise_density: 0.0043 145 | acc_noise_density: 0.019 146 | gyro_bias_random_walk_noise_density: 0.000266 147 | gyro_noise_density: 0.019 148 | -------------------------------------------------------------------------------- /files/ncamera_settings.yaml: -------------------------------------------------------------------------------- 1 | camera_config: &standard_camera_config 2 | # Exposure control. 3 | manual_exp_time_us: 10000 # 10 - 65000 4 | auto_exposure_enabled: true # true/false 5 | min_exp_time_us: 50 # 10 - 65000 6 | max_exp_time_us: 10000 # 10 - 65000 7 | autoexp_speed: 10 # 1 - 10 8 | autoexp_target_median: 70 # 0 - 255 9 | 10 | # Gain control. 11 | auto_gain_enabled: false # true/false 12 | autoexp_max_gain : 255 # 0 - 255 13 | manual_gain: 1 # 0 - 255 14 | 15 | # Auto exposure ROI control. 16 | auto_exposure_roi_enabled: false # true/false 17 | autoexp_start_row: 0 # 0 - 539/1079 18 | autoexp_end_row: -1 # -1 - 539/1079 19 | autoexp_start_col: 0 # 0 - 719/1439 20 | autoexp_end_col: -1 # -1 - 719/1439 21 | 22 | # Extra configuration. 23 | rotate_image: false # true/false 24 | 25 | ncamera: 26 | # Measurement rates. 27 | camera_frequency_hz: 10 # 1 - 75 28 | imu_frequency_hz: 200 # 100/200/400 29 | gyro_range_deg_per_sec: 2000 # 125/250/500/1000/2000 30 | 31 | # Network tuning parameters. 32 | pixels_per_packet: 1440 # Needs to be a divisor of the image resolution (height*width). 33 | peak_bandwidth_limit_mbps: 1000.0 # 125 - 1000 34 | image_socket_receive_buffer_size_mb: 10.0 # 1 - 100 35 | 36 | # PTP time synchronization parameters. 37 | expect_ptp_time_synchronization: false 38 | max_ptp_sync_lost_time_s: 60 39 | 40 | # Number of enabled cameras and individual camera configuration. 
41 | num_cams: 5 # 1 - 8 42 | cams: 43 | 0: *standard_camera_config 44 | 1: 45 | # Force exposure parameters of stereo-pairs to be equal. 46 | autoexp_master_camera_index: 0 47 | 2: *standard_camera_config 48 | 3: *standard_camera_config 49 | 4: *standard_camera_config 50 | -------------------------------------------------------------------------------- /images/alphasense_core_04mp_bandwidth.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sevensense-robotics/core_research_manual/6280dd64b7271b8a26a21c1703f38086d636a2b2/images/alphasense_core_04mp_bandwidth.png -------------------------------------------------------------------------------- /images/alphasense_core_16mp_bandwidth.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sevensense-robotics/core_research_manual/6280dd64b7271b8a26a21c1703f38086d636a2b2/images/alphasense_core_16mp_bandwidth.png -------------------------------------------------------------------------------- /images/alphasense_ip_setting.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sevensense-robotics/core_research_manual/6280dd64b7271b8a26a21c1703f38086d636a2b2/images/alphasense_ip_setting.png -------------------------------------------------------------------------------- /images/exposure_signal.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sevensense-robotics/core_research_manual/6280dd64b7271b8a26a21c1703f38086d636a2b2/images/exposure_signal.png -------------------------------------------------------------------------------- /images/nm_connection_editor.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/sevensense-robotics/core_research_manual/6280dd64b7271b8a26a21c1703f38086d636a2b2/images/nm_connection_editor.png -------------------------------------------------------------------------------- /images/viewer.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/sevensense-robotics/core_research_manual/6280dd64b7271b8a26a21c1703f38086d636a2b2/images/viewer.png -------------------------------------------------------------------------------- /pages/calibration.md: -------------------------------------------------------------------------------- 1 | # Calibration 2 | 3 | All Core Research Development Kits come with factory intrinsic and extrinsic calibration. We provide this calibration on our servers in several file formats. 4 | The driver comes with a command to easily obtain the calibration files belonging to your sensor. 5 | 6 | ## Calibration model 7 | 8 | 9 | ### Camera intrinsics 10 | 11 | The individual cameras are calibrated using a pinhole projection model. The distortion of the lens 12 | is modelled with an equidistant model with five terms. See the reference below for more information 13 | on the equidistant model. 14 | 15 | > :information_source: **Info**: The pinhole+equidistant model is not the same 16 | as the `plumb_bob` model that is commonly used in ROS. Because of our 17 | wide-angle lenses the equidistant model is better suited. 18 | 19 | *J. Kannala and S. Brandt (2006). A Generic Camera Model and Calibration Method for Conventional, Wide-Angle, and Fish-Eye Lenses, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 8, pp. 1335-1340* [(pdf link)](http://www.ee.oulu.fi/~jkannala/publications/tpami2006.pdf) 20 | 21 | ### Extrinsics 22 | 23 | For each camera we provide the 6-DOF transformation with respect to the IMU. 
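To make the calibration model above concrete, the following Python sketch projects a 3D point expressed in the IMU (body) frame into pixel coordinates of one camera. This is a minimal illustration under our own naming conventions, not driver code; it assumes the pinhole intrinsics `(fu, fv, cu, cv)`, the equidistant distortion coefficients with `k1` fixed to 1, and the extrinsic matrix `T_B_C`, following the parameter layout described in the File formats section below.

```python
import math

# Hedged sketch: function and argument names are our own, not part of the
# driver API. intrinsics = (fu, fv, cu, cv) in pixels; distortion =
# (k2, k3, k4, k5) with k1 fixed to 1, per the 7s_sensors.yaml layout.
def project_point(p_B, T_B_C, intrinsics, distortion):
    """Project a 3D point given in the IMU/body frame B into pixel coordinates."""
    fu, fv, cu, cv = intrinsics
    k2, k3, k4, k5 = distortion

    # T_B_C is the pose of the camera in the IMU frame; invert the rigid
    # transform [R | t] to map the point into camera frame C: p_C = R^T (p_B - t).
    R = [row[:3] for row in T_B_C[:3]]
    t = [row[3] for row in T_B_C[:3]]
    d = [p_B[i] - t[i] for i in range(3)]
    x, y, z = (sum(R[i][j] * d[i] for i in range(3)) for j in range(3))

    # Equidistant (Kannala-Brandt) distortion: the distorted radius is a
    # polynomial in the angle theta between the ray and the optical axis.
    r = math.hypot(x, y)
    theta = math.atan2(r, z)
    theta_d = theta + k2 * theta**3 + k3 * theta**5 + k4 * theta**7 + k5 * theta**9

    scale = theta_d / r if r > 1e-12 else 1.0
    return fu * x * scale + cu, fv * y * scale + cv
```

Note that `T_B_C` maps camera-frame points into the body frame, which is why the sketch applies its inverse; a point on the optical axis projects onto the principal point `(cu, cv)`.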
24 | 25 | ## File formats 26 | 27 | At the moment we produce two file formats, `7s_sensors.yaml` and `kalibr_calib.yaml`. The main calibration file is `7s_sensors.yaml`. The other is generated to make our demos easy to use. 28 | 29 | ### 7s_sensors.yaml 30 | 31 | All calibrated parameters are available in this file. An example can be found [here](/files/example_7s_sensors_dont_use.yaml). 32 | 33 | For each camera there is a yaml node under `ncameras/cameras`, like the one below. 34 | 35 | ```yaml 36 | - T_B_C: 37 | rows: 4 38 | cols: 4 39 | data: [ 40 | -0.0089889092, 0.0097254569, 0.9999123037, 0.0520086092, 41 | -0.9999261367, -0.0082678741, -0.0089086176, 0.0488993112, 42 | 0.0081805087, -0.9999185256, 0.0097990577, -0.011757515, 43 | 0.0, 0.0, 0.0, 1.0] 44 | camera: 45 | distortion: 46 | parameters: 47 | rows: 4 48 | cols: 1 49 | data: [-0.0416026702, 0.0022689289, -0.0027567794, 0.000401603] 50 | type: equidistant 51 | id: 839e02677a1f4b5c91466b2bc5fdc33e 52 | image_height: 1080 53 | image_width: 1440 54 | intrinsics: 55 | rows: 4 56 | cols: 1 57 | data: [701.4165958679, 701.480279171, 668.2392112416, 517.9783218077] 58 | label: /alphasense_driver_ros/cam0 59 | line-delay-nanoseconds: 0 60 | type: pinhole 61 | ``` 62 | 63 | The `label` node identifies to which camera the calibration belongs. In this 64 | example the calibration is for camera 0. 65 | 66 | ```yaml 67 | label: /alphasense_driver_ros/cam0 68 | ``` 69 | 70 | The parameters for the intrinsic calibration are encoded in the `camera/intrinsics` and `camera/distortion` fields for the projection model and distortion model, respectively. 71 | 72 | **Projection model** 73 | 74 | The four parameters for the pinhole projection model are specified in the order `fu`, `fv`, `cu`, `cv`, 75 | where the pair `fu` `fv` is the focal length in u and v directions on the image plane, and the pair `cu` `cv` is the location of the center of projection in uv coordinates. The unit of all four parameters is pixels.
The u and v directions run across the width and height of the image plane. 76 | 77 | In this example `fu` = 701.4165958679, `fv`= 701.480279171, `cu` = 668.2392112416, and `cv` = 517.9783218077. 78 | 79 | ```yaml 80 | - camera: 81 | intrinsics: 82 | data: [701.4165958679, 701.480279171, 668.2392112416, 517.9783218077] 83 | ``` 84 | 85 | **Distortion model** 86 | 87 | The equidistant distortion model uses five parameters `k1`, `k2`, `k3`, `k4`, `k5` (see equation 6 of the reference mentioned above). We fix `k1` to 1, the other four parameters are encoded in `camera/distortion/parameters` in the same order. 88 | 89 | In this example `k1` = 1, `k2`= -0.0416026702, `k3` = 0.0022689289, `k4` = -0.0027567794, and `k5` = 0.000401603. 90 | 91 | ```yaml 92 | - camera: 93 | distortion: 94 | parameters: 95 | data: [-0.0416026702, 0.0022689289, -0.0027567794, 0.000401603] 96 | ``` 97 | 98 | **Extrinsics** 99 | 100 | Extrinsic calibration is encoded in the `T_B_C/data` node as a homogeneous transformation matrix with the translational part in meters. It expresses the pose of the camera in the frame of the IMU. 101 | 102 | ```yaml 103 | - T_B_C: 104 | data: [ 105 | -0.0089889092, 0.0097254569, 0.9999123037, 0.0520086092, 106 | -0.9999261367, -0.0082678741, -0.0089086176, 0.0488993112, 107 | 0.0081805087, -0.9999185256, 0.0097990577, -0.011757515, 108 | 0.0, 0.0, 0.0, 1.0] 109 | ``` 110 | 111 | 112 | 113 | ### kalibr_calib.yaml 114 | 115 | This file is meant for the [stereo demo](https://github.com/sevensense-robotics/alphasense_stereo_demo) we provide on github. 116 | 117 | ## Obtaining the calibration files 118 | 119 | The above mentioned files can be downloaded with the driver using the following 120 | command. 121 | 122 | ```console 123 | alphasense download_calibration - 7s_sensors.yaml 124 | ``` 125 | 126 | The command will output the file path on STDOUT. 
This can be used in a script 127 | to automatically get the correct calibration even when switching the Alphasense 128 | Core. 129 | 130 | ```bash 131 | CALIBRATION_FILE_PATH=`alphasense download_calibration - 7s_sensors.yaml` 132 | rosrun my_vio_package my_vio_node _calibration_file:=$CALIBRATION_FILE_PATH 133 | ``` 134 | 135 | The calibration can also be downloaded to a specific path on the computer: 136 | 137 | ```console 138 | alphasense download_calibration - 7s_sensors.yaml ~/my-favorite-path/my-calibration-file 139 | ``` 140 | -------------------------------------------------------------------------------- /pages/configuring_the_network.md: -------------------------------------------------------------------------------- 1 | # Configuring the network 2 | 3 | The network settings of the device can be configured using the command line tools that are supplied with the driver. 4 | 5 | ## Setting up the host computer 6 | 7 | The host computer (the computer you connect the sensor to) needs to be assigned 8 | a static IP address on the same subnet as the Core Research. The steps for 9 | setting this up depend on how the network is managed on the host. We provide 10 | steps for a standard desktop Ubuntu setup and a standard server Ubuntu setup. 11 | 12 | **Default Core Research network settings** 13 | 14 | | Setting | Value | 15 | | ------------------- | ------------- | 16 | | Host IP Address | 192.168.77.78 | 17 | | Host Netmask | 255.255.255.0 | 18 | | Sensor IP Address | 192.168.77.77 | 19 | 20 | ### Setting up a static IP on Ubuntu desktop 21 | 22 | Follow the steps in the [Getting started with Core Research](/pages/getting_started.md#setting-up-the-network-configuration) guide to configure 23 | the network through the Network Manager GUI in Ubuntu.
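If you prefer the command line over the GUI, NetworkManager's `nmcli` tool can typically create the same static-IP profile. The interface name `enp3s0` and the profile name `alphasense-profile` below are examples; adapt them to your system.

```console
sudo nmcli connection add type ethernet ifname enp3s0 con-name alphasense-profile ipv4.method manual ipv4.addresses 192.168.77.78/24
sudo nmcli connection up alphasense-profile
```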
24 | 25 | ### Setting up a static IP on Ubuntu server 26 | 27 | > :warning: **Warning**: The instructions below might not work properly when the network is 28 | > being managed by a network manager like "NetworkManager" on Ubuntu desktop. In that case you 29 | > have to consult the manual of the network manager on how to assign a static IP. 30 | 31 | First you need to figure out to which network interface the Core Research is connected. A list of network interfaces can be obtained with 32 | 33 | ```console 34 | ip link 35 | ``` 36 | 37 | ```console 38 | sevensense@7s-workstation:~$ ip link 39 | 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000 40 | link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 41 | 2: enp5s0: mtu 1500 qdisc mq state DOWN mode DEFAULT group default qlen 1000 42 | link/ether 68:05:ca:b1:2a:76 brd ff:ff:ff:ff:ff:ff 43 | 3: eno1: mtu 1500 qdisc fq_codel state UP mode DEFAULT group default qlen 1000 44 | link/ether 04:92:26:5d:04:9c brd ff:ff:ff:ff:ff:ff 45 | 4: enp3s0: mtu 1500 qdisc noop state DOWN mode DEFAULT group default qlen 1000 46 | link/ether 68:05:ca:b8:09:8a brd ff:ff:ff:ff:ff:ff 47 | 5: enx000acd338ed4: mtu 1500 qdisc fq_codel state DOWN mode DEFAULT group default qlen 1000 48 | link/ether 00:0a:cd:33:8e:d4 brd ff:ff:ff:ff:ff:ff 49 | ``` 50 | 51 | When the network interface is known, it needs to be set up and assigned a static IP. 52 | 53 | ```console 54 | sudo ip link set INTERFACE_NAME up 55 | sudo ip address add dev INTERFACE_NAME 192.168.77.78/24 56 | ``` 57 | 58 | For example, to do this for the interface `enp3s0`, execute 59 | 60 | ```console 61 | sevensense@7s-workstation:~$ sudo ip link set enp3s0 up 62 | sevensense@7s-workstation:~$ sudo ip address add dev enp3s0 192.168.77.78/24 63 | ``` 64 | 65 | The assignment is not permanent and will be lost after reboot. 66 | We recommend looking at https://netplan.io/ for setting up a permanent network configuration.
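As a starting point for such a permanent configuration, a minimal netplan file could look like the sketch below. The file path and the interface name `enp3s0` are assumptions that need to be adapted to your system; the configuration is applied with `sudo netplan apply`.

```yaml
# /etc/netplan/99-alphasense.yaml (example path)
network:
  version: 2
  ethernets:
    enp3s0:
      addresses:
        - 192.168.77.78/24
```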
67 | 68 | ## Listing devices attached to the network 69 | 70 | Once the network is set up, a list of all Core Research sensors attached to the network can be obtained with `alphasense list`. 71 | 72 | ``` 73 | sevensense@7s-workstation:~$ alphasense list 74 | - Sevensense AS1 720x540 (sn: 3A1319969034225F i: as1-ethernet l: 192.168.1.128) 75 | ``` 76 | 77 | The Core Research sensors are identified by their serial number, 78 | which for the sensor in the example above is *3A1319969034225F*. 79 | 80 | If your Core Research does not show up in the list, the following should be checked: 81 | 82 | 1. Is the Core Research powered on, and the ethernet cable connected? 83 | 2. Is the LED on the sensor's ethernet port blinking? If not, check the ethernet cable. 84 | 3. Is your network configured correctly? In case of a new sensor, check out the 85 | [Setting up the network configuration](/pages/getting_started.md#setting-up-the-network-configuration) 86 | section. You might have to reactivate the network profile after reconnecting the ethernet cable. You can check if the network interface is up and has an IP assigned by executing `ip address`. 87 | 4. When you are not sure of the network configuration anymore, 88 | see [Recovering a device with unknown network configuration](/pages/configuring_the_network.md#recovering-a-device-with-an-unknown-network-configuration). 89 | 90 | > :information_source: **Info**: The Core Research does not respond to pings. This does not mean the device is broken. 91 | 92 | ## Showing device information 93 | 94 | More information about a device can be viewed using the `alphasense show SERIAL` command. If only one device is attached, 95 | the serial number can be replaced with `-` to get information about the first device found. 96 | 97 | > :warning: **Warning**: Do not run this command while the driver is running. It might interrupt the image and IMU streams.
98 | 99 | ``` 100 | sevensense@7s-workstation:~$ alphasense show 3A1319969034225F 101 | Device: Sevensense AS1 720x540 (sn: 3A1319969034225F i: as1-ethernet l: 192.168.1.128) 102 | 103 | Device persistent configuration: 104 | device_ip 192.168.77.77 105 | gateway_ip 192.168.77.1 106 | host_ip 192.168.77.78 107 | mac_address 70:b3:d5:9c:30:36 108 | subnet_mask 0.0.0.0 109 | 110 | Device information: 111 | baseboard_imu_type BMI085 112 | cam0_id not-connected 113 | cam1_id not-connected 114 | cam2_id not-connected 115 | cam3_id not-connected 116 | cam4_id not-connected 117 | cam5_id not-connected 118 | cam6_id not-connected 119 | cam7_id not-connected 120 | firmware_version 125389894 121 | serial_number 3A1319969034225F 122 | fpga_size 35 123 | fpga_temperature_degrees_c 49 124 | image_sensor_type IMX287 125 | ``` 126 | 127 | ## Modifying persistent configuration 128 | 129 | Persistent configuration options like `device_ip` can be modified using the `alphasense set` command. 130 | For example, to change the sensor's IP to `192.168.77.76` and the subnet mask to `255.255.0.0`, run the command below. 131 | 132 | ``` 133 | alphasense set 3A1319969034225F device_ip 192.168.77.76 subnet_mask 255.255.0.0 gateway_ip 192.168.77.1 134 | ``` 135 | 136 | > :warning: **Warning**: **Network settings become active immediately.** It is therefore important that you change all desired network settings with one `alphasense set ...` call. 137 | 138 | When the sensor's IP is changed, an exception might be shown; this is because the 139 | sensor is no longer reachable under the old IP. 140 | 141 | ``` 142 | sevensense@7s-workstation:~$ alphasense set 3A1319969034225F device_ip 10.0.0.1 143 | terminate called after throwing an instance of 'alphasense::AS1NetworkException' 144 | what(): exchangeSettingsWithFpga maximum number of retries reached. 145 | ``` 146 | 147 | The Core Research sends the image streams to a preconfigured host IP. 148 | The factory default host IP is `192.168.77.78`.
The command below can be used to change 149 | this host IP. In the example the host IP is changed to `192.168.77.177`. 150 | 151 | ``` 152 | alphasense set 3A1319969034225F host_ip 192.168.77.177 153 | ``` 154 | 155 | ## Recovering a device with an unknown network configuration 156 | 157 | When the device's network settings are modified and the sensor ends up outside of the subnet configured on 158 | the host computer, it will no longer be discoverable by the driver. This is because of the reverse 159 | path filtering in the Linux kernel, which blocks broadcasts sent from an IP not within the subnet. 160 | 161 | ``` 162 | sevensense@7s-workstation:~$ alphasense list 163 | No devices found. 164 | ``` 165 | 166 | Disable reverse path filtering by executing the following two commands, replacing `INTERFACE_NAME` with 167 | the name of the interface the device is connected to (for example `eth0` or `enp1s0`): 168 | 169 | ``` 170 | sudo sysctl net.ipv4.conf.all.rp_filter=0 171 | sudo sysctl net.ipv4.conf.INTERFACE_NAME.rp_filter=0 172 | ``` 173 | 174 | With reverse path filtering disabled, the device can be discovered (the discovered IP address 175 | is `10.1.1.77`). 176 | 177 | ``` 178 | sevensense@7s-workstation:~$ alphasense list 179 | - Sevensense AS1 720x540 (sn: 3A1319969034324F i: as1_ethernet l: 10.1.1.77) 180 | ``` 181 | 182 | To be able to communicate with the device, the host computer's network has to be configured with a 183 | subnet that contains the IP of the device (`10.1.1.77`). It is recommended to set the host computer's 184 | IP to either one IP higher or lower than the device, in this case `10.1.1.76` or `10.1.1.78`. This new IP has to be set as host IP for the Core Research using: 185 | 186 | ``` 187 | alphasense set 3A1319969034324F host_ip 10.1.1.76 188 | ``` 189 | 190 | The device 191 | should now be accessible again.
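Putting the recovery steps above together, a typical session looks as follows; replace `INTERFACE_NAME`, the serial number, and the IPs with the values from your setup.

```console
sudo sysctl net.ipv4.conf.all.rp_filter=0
sudo sysctl net.ipv4.conf.INTERFACE_NAME.rp_filter=0
alphasense list
sudo ip address add dev INTERFACE_NAME 10.1.1.76/24
alphasense set 3A1319969034324F host_ip 10.1.1.76
```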
192 | 193 | If you want to reset the factory default network settings, run the following command: 194 | 195 | ``` 196 | alphasense set 3A1319969034324F device_ip 192.168.77.77 host_ip 192.168.77.78 subnet_mask 0.0.0.0 gateway_ip 192.168.77.1 197 | ``` 198 | 199 | After changing the network settings the device can be discovered again with the default network settings. 200 | 201 | ``` 202 | sevensense@7s-workstation:~$ alphasense list 203 | - Sevensense AS1 720x540 (sn: 3A1319969034324F i: as1_ethernet l: 192.168.77.77) 204 | ``` 205 | 206 | Reverse path filtering can now be enabled again: 207 | 208 | ``` 209 | sudo sysctl net.ipv4.conf.all.rp_filter=1 210 | sudo sysctl net.ipv4.conf.INTERFACE_NAME.rp_filter=1 211 | ``` -------------------------------------------------------------------------------- /pages/faq.md: -------------------------------------------------------------------------------- 1 | # Frequently asked questions 2 | 3 | We would like to answer frequently asked questions on this page. If you do not 4 | find an answer here, feel free to contact us at with 5 | your request. 
6 | 7 | - [I would like to mount Core Research on my robot, what should I consider?](#i-would-like-to-mount-core-research-on-my-robot-what-should-i-consider) 8 | - [Are all images on Core Research recorded at the same time?](#are-all-images-on-core-research-recorded-at-the-same-time) 9 | - [I don’t want to use ROS to access the data, what should I do?](#i-dont-want-to-use-ros-to-access-the-data-what-should-i-do) 10 | - [Why is my sensor dropping frames?](#why-is-my-sensor-dropping-frames) 11 | - [Can I use the Core Research driver on an ARM architecture?](#can-i-use-the-core-research-driver-on-an-arm-architecture) 12 | - [How do I change the IP address of the sensor?](#how-do-i-change-the-ip-address-of-the-sensor) 13 | - [Can I synchronize the internal clock of the Core Research to an external system?](#can-i-synchronize-the-internal-clock-of-the-core-research-to-an-external-system) 14 | - [How do I fix "Image receive timed out"?](#how-do-i-fix-image-receive-timed-out) 15 | - [Why is my Core Research not detected?](#why-is-my-core-research-not-detected) 16 | - [How do I find the serial number of my Core Research?](#how-do-i-find-the-serial-number-of-my-core-research) 17 | 18 | 19 | ## I would like to mount Core Research on my robot, what should I consider? 20 | 21 | In general, this depends on the application, but there are some common 22 | guidelines which should be considered: 23 | - First of all, please consult the 24 | [datasheet](https://hubs.ly/Q01XVycm0) 25 | to find all relevant specifications such as the position of the mounting 26 | holes at the bottom, required power supply, or opening angles of the cameras. 27 | - Related to that: Mount the sensor at a position where none of the cameras 28 | sees the hull of the robot. Note that the cameras have a wide field of view.
29 | - If you plan to fuse the wheel odometry with your state estimation from the 30 | IMU/camera data, it is important that there is a non-moving connection 31 | between the Core Research mounting position and the wheel odometry frame. 32 | - Depending on the kind of robot on which you are mounting Alphasense, we 33 | suggest damping the sensor in case you expect high-frequency vibrations, 34 | which can be relevant for a better state estimation output. This can for 35 | example be done with silicone rubber dampers placed between the sensor 36 | frame and the robot. 37 | - The sensor is well suited for outdoor scenarios; in particular, the auto exposure 38 | is designed to work well despite the impact of sunlight. However, it has to 39 | be protected against rain/dust. 40 | 41 | ## Are all images on Core Research recorded at the same time? 42 | 43 | Yes, all image and IMU streams are hardware-synchronized. During auto exposure 44 | operation, different cameras might have different exposure times, and they are 45 | synchronized in the middle of the exposure time. The timestamp of an image 46 | then corresponds to this mid-frame time. 47 | 48 | ## I don’t want to use ROS to access the data, what should I do? 49 | 50 | We are currently working on a ROS-free API, which we will publish soon. Using 51 | this, it will then be possible to use the image/IMU data directly in your C++ 52 | application independently of ROS or the catkin build system. 53 | 54 | ## Why is my sensor dropping frames? 55 | 56 | Please check the following: 57 | - See the [Maximize network performance](/pages/maximize_network_performance.md) page for guidelines on configuring the system and Core Research for maximal performance. 58 | - Are you using a Gigabit Ethernet link? The driver will print a warning if 59 | this is not the case.
Alternatively, this can also be verified manually by 60 | installing the `ethtool` package (`sudo apt install ethtool`) and checking 61 | that the output of `ethtool ETHERNET_INTERFACE_NAME | grep Speed` is 62 | `Speed: 1000Mb/s`. You can find the `ETHERNET_INTERFACE_NAME` with the 63 | following command: 64 | `find /sys/class/net -type l -not -lname '*virtual*' -printf '%f\n'`. A usual 65 | interface name is for example `eth0`. 66 | - Are you using a USB/Ethernet adapter? Note that there can be issues with 67 | some adapters. If available, we suggest using the built-in Ethernet port 68 | instead of a USB/Ethernet adapter. 69 | - Does the configured frame rate fit the Gigabit Ethernet bandwidth? If this is 70 | not the case, try lowering the frame rate as described 71 | [here](/pages/sensor_settings.md#frame-rate-and-imu-frequency). 72 | 73 | ## Can I use the Core Research driver on an [ARM architecture](https://en.wikipedia.org/wiki/ARM_architecture)? 74 | 75 | Yes, this is possible. You can install the driver for the `arm64` architecture, see [Installation](/pages/installation_and_upgrade.md#installation). 76 | 77 | ## How do I change the IP address of the sensor? 78 | 79 | The IP can be changed by following the instructions in 80 | [Modifying persistent configuration](/pages/configuring_the_network.md#modifying-persistent-configuration). 81 | 82 | ## Can I synchronize the internal clock of the Core Research to an external system? 83 | 84 | Yes, the Core Research supports PTP (Precision Time Protocol) to synchronize the internal clock over the 85 | ethernet connection. Follow the instructions in [Synchronized time](/pages/time_synchronization.md#time-synchronization) to set this up. 86 | 87 | ## How do I fix "Image receive timed out"? 88 | 89 | The "Image receive timed out" error means that the driver is not receiving image stream packets from the Core Research. This can happen for many reasons.
Some commonly occurring reasons are: 90 | 91 | * The Core Research is no longer powered or the Ethernet cable is unplugged. 92 | * The network configuration has been changed. 93 | * The Core Research is configured to send data to a different host IP; this can be checked with `alphasense show -`. 94 | * There is another driver running. Only one driver can be running at a time. 95 | * The `pixels_per_packet` setting is set higher than the MTU of the network interface. See [Network interface MTU](/pages/maximize_network_performance.md#network-interface-mtu). 96 | 97 | 98 | ## Why is my Core Research not detected? 99 | 100 | When the Core Research is not detected with `alphasense list`, check the following points: 101 | 102 | * The Core Research is powered: a green LED should be blinking directly on the main PCB behind the Ethernet connector. 103 | * There is an active Ethernet connection: there should be two LEDs on the Ethernet port, one lit constantly and the other blinking. 104 | * The network profile you created is activated. It is easiest to verify this by running `ip address` in the terminal and checking that the interface has the correct IP address assigned. 105 | * The firewall is disabled. 106 | * No network-related errors are printed in `dmesg`. 107 | 108 | If the above points do not solve the problem, a conflicting network configuration is often the cause. Double-check all custom network configuration.
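Related to the "Image receive timed out" causes above: the `pixels_per_packet`-vs-MTU mismatch can be ruled out numerically, since the driver requires the interface MTU to be at least `pixels_per_packet` + 60 (see [Network tuning](/pages/sensor_settings.md#network-tuning)). A small sketch — the `mtu_ok` helper is made up for illustration:

```shell
# Check whether an interface MTU can carry a given pixels_per_packet
# value; the driver needs MTU >= pixels_per_packet + 60.
mtu_ok() {
  pixels_per_packet=$1
  mtu=$2
  if [ "$mtu" -ge "$((pixels_per_packet + 60))" ]; then
    echo "yes"
  else
    echo "no"
  fi
}

mtu_ok 7200 7260   # maximum packet size on a 7260-byte MTU: prints "yes"
mtu_ok 7200 1500   # maximum packet size on the default MTU: prints "no"
```

On a live system, the current MTU of an interface can be read with `ip link show INTERFACE_NAME`.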
109 | 110 | To check that the Core Research is actually sending out packets, `tcpdump` can be used (`INTERFACE_NAME` should be replaced with the name of the interface the Core Research is connected to): 111 | 112 | ```console 113 | sevensense@7s-workstation:~$ sudo tcpdump -i INTERFACE_NAME udp port 5349 -vv -nn -n 114 | tcpdump: listening on enp5s0, link-type EN10MB (Ethernet), capture size 262144 bytes 115 | 11:22:56.792794 IP (tos 0x0, ttl 64, id 0, offset 0, flags [DF], proto UDP (17), length 221) 116 | 192.168.77.77.5349 > 255.255.255.255.5349: [no cksum] UDP, length 193 117 | 11:22:56.912826 IP (tos 0x0, ttl 64, id 0, offset 0, flags [DF], proto UDP (17), length 221) 118 | 192.168.77.77.5349 > 255.255.255.255.5349: [no cksum] UDP, length 193 119 | 11:22:57.032827 IP (tos 0x0, ttl 64, id 0, offset 0, flags [DF], proto UDP (17), length 221) 120 | 192.168.77.77.5349 > 255.255.255.255.5349: [no cksum] UDP, length 193 121 | ``` 122 | 123 | ## How do I find the serial number of my Core Research? 124 | 125 | The serial number of the Core Research can be found by connecting it to a computer and running `alphasense show -`. 126 | 127 | The serial number in the example below is `3A1319969034225F`. 128 | 129 | ``` 130 | sevensense@7s-workstation:~$ alphasense show - 131 | 132 | ...
133 | 134 | Device information: 135 | baseboard_imu_type BMI085 136 | cam0_id not-connected 137 | cam1_id not-connected 138 | cam2_id not-connected 139 | cam3_id not-connected 140 | cam4_id not-connected 141 | cam5_id not-connected 142 | cam6_id not-connected 143 | cam7_id not-connected 144 | firmware_version 125389894 145 | serial_number 3A1319969034225F 146 | fpga_size 35 147 | fpga_temperature_degrees_c 49 148 | image_sensor_type IMX287 149 | ``` 150 | -------------------------------------------------------------------------------- /pages/getting_started.md: -------------------------------------------------------------------------------- 1 | # Getting started with Core Research 2 | 3 | Congratulations on your Core Research 5-Camera Development Kit! Start using 4 | it by following these simple steps: 5 | 6 | ## What's in the box? 7 | 8 | - **The sensor:** a development kit with 5 cameras and an IMU (synchronized) 9 | - **Cable adapter** from the 2.1x5.5 mm Barrel Plug of the AC adapter to the Molex Nano-Fit connector of the sensor 10 | 11 | ## What else is needed? 12 | 13 | - A **Category 6 (Cat6) Ethernet cable** to transfer the data to your device 14 | - A **12V 25W AC wall adapter** with a 2.1x5.5 mm Barrel Plug that mates to the adapter cable (e.g. [CUI SWI25-12-E-P5 with an EU plug](https://www.mouser.ch/ProductDetail/490-SWI25-12-E-P5)) 15 | 16 | ## Installing the driver 17 | 18 | > :information_source: **Info**: Have a look at the 19 | [Installation](/pages/installation_and_upgrade.md) page for a list of supported Ubuntu versions and processor architectures. 20 | 21 | Ubuntu 18.04 or 20.04 is required to install the driver. We also recommend installing ROS Melodic/Noetic `desktop-full` ([ROS 22 | installation](http://wiki.ros.org/melodic/Installation/Ubuntu)). A `ros-base` installation is sufficient, but we recommend 23 | `desktop-full` to use the `rqt` GUI tools mentioned in other parts of 24 | this manual.
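A quick way to confirm that your machine runs one of the supported releases is to check the distribution codename (a sketch; the `supported_release` helper is made up for illustration, and `bionic`/`focal` are the codenames of Ubuntu 18.04 and 20.04):

```shell
# Map the Ubuntu release codename to a support status for the driver:
# Ubuntu 18.04 is "bionic" and Ubuntu 20.04 is "focal".
supported_release() {
  case "$1" in
    bionic|focal) echo "supported" ;;
    *)            echo "unsupported" ;;
  esac
}

# Check the codename of the running system.
supported_release "$(lsb_release -cs)"
```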
25 | 26 | The following instructions describe the steps for adding the Sevensense APT 27 | repository and installing the Alphasense driver contained therein. 28 | 29 | ``` 30 | # Install curl. 31 | sudo apt install curl 32 | 33 | # Add the Sevensense PGP key to make this machine trust Sevensense's packages. 34 | curl -Ls http://deb.7sr.ch/pubkey.gpg | sudo gpg --dearmor -o /usr/share/keyrings/deb-7sr-ch-keyring.gpg 35 | 36 | # Add the Sevensense APT repository to the list of known sources. 37 | echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/deb-7sr-ch-keyring.gpg] http://deb.7sr.ch/alphasense/stable $(lsb_release -cs) main" \ 38 | | sudo tee /etc/apt/sources.list.d/sevensense.list 39 | 40 | # Install the Alphasense driver. 41 | sudo apt update 42 | sudo apt install alphasense-driver-core alphasense-viewer alphasense-firmware ros-melodic-alphasense-driver-ros ros-melodic-alphasense-driver 43 | ``` 44 | 45 | ## Setting up the network configuration 46 | 47 | In order to connect your Core Research device to your computer, a network 48 | configuration with a static IP needs to be set up. 49 | 50 | To set this up, open a terminal, launch the `nm-connection-editor` and add a 51 | new `Ethernet` connection: 52 | 53 | ![nm_connection_editor](/images/nm_connection_editor.png) 54 | 55 | Name the new connection `alphasense`. Under the `IPv4 Settings` tab, change the 56 | method to `Manual` and add the following static IP address: 57 | 58 | ![ip_settings](/images/alphasense_ip_setting.png) 59 | 60 | Save the new configuration. 61 | 62 | By default the sensor is configured with the following network settings. 63 | 64 | | Setting | Value | 65 | | ------------------- | ------------- | 66 | | Host IP Address | 192.168.77.78 | 67 | | Host Netmask | 255.255.255.0 | 68 | | Sensor IP Address | 192.168.77.77 | 69 | 70 | ## Running the driver 71 | 72 | Connect the sensor to the power cable and the Ethernet cable. 
Click on the 73 | Network Manager icon (usually located in the top bar of Ubuntu) and select the 74 | `alphasense` network connection. 75 | 76 | There are two ways to run the Alphasense driver, described in the following two 77 | sections: 78 | 79 | ### 1) Launching the standalone viewer 80 | 81 | First increase the maximum allowed socket buffer size: 82 | 83 | ``` 84 | sudo sysctl -w net.core.rmem_max=11145728 85 | ``` 86 | 87 | This setting can be made permanent by creating a file called 88 | `/etc/sysctl.d/90-increase-network-buffers.conf` containing this option: 89 | 90 | ```console 91 | echo -e "net.core.rmem_max=11145728" | sudo tee /etc/sysctl.d/90-increase-network-buffers.conf 92 | ``` 93 | 94 | After that, you can launch the Alphasense GUI, which will display all available 95 | image streams. 96 | 97 | Launch it with the following command: 98 | 99 | ``` 100 | viewalphasense 101 | ``` 102 | 103 | ![viewalphasense](/images/viewer.png) 104 | 105 | ```console 106 | sevensense@7s-workstation:~$ viewalphasense 107 | Found camera, opening... 108 | [ INFO|2020-07-21 15:53:21.389801] Assuming camera is connected to NIC enp5s0. 109 | Opened! writing settings... 110 | [ INFO|2020-07-21 15:53:21.391719] Calculated an inter packet delay of 1 us. 111 | [ INFO|2020-07-21 15:53:21.593148] Reallocating image stream packet buffers (6990 packets @ 1500 bytes). 112 | [ INFO|2020-07-21 15:53:21.595462] Resizing image socket buffer to 10485760 bytes. 113 | Ready! 114 | 202 IMU measurements received in 1 second. 115 | |- Current IMU measurement timestamp: 12 190998900 116 | |- accl.x: -1.6711 117 | |- accl.y: -0.0843384 118 | |- accl.z: -9.47461 119 | |- gyro.x: 0.000213059 120 | |- gyro.y: 0.0288695 121 | |- gyro.z: -0.00255671 122 | ``` 123 | 124 | ### 2) Launching the ROS driver 125 | 126 | As an alternative to using the standalone viewer and in order to access the 127 | streamed data, our ROS driver can be used.
Note that only one driver at a time 128 | can be running, so `viewalphasense` needs to be stopped before launching this 129 | one. 130 | 131 | 132 | If not already done before, increase the maximum allowed socket buffer size: 133 | 134 | ``` 135 | sudo sysctl -w net.core.rmem_max=11145728 136 | ``` 137 | 138 | To start the ROS driver of Alphasense, run the following commands: 139 | 140 | 141 | ``` 142 | source /opt/ros/melodic/setup.bash 143 | roscore & 144 | rosrun alphasense_driver_ros alphasense_driver_ros 145 | 146 | ``` 147 | 148 | Via ROS messages we offer an easy way to access the images and the IMU data. 149 | See [ROS Interface](/pages/ros_driver_usage.md) for more information on the provided 150 | topics and configuration parameters. 151 | 152 | We recommend tools such as [rqt_image_view](http://wiki.ros.org/rqt_image_view) 153 | and [rqt_plot](http://wiki.ros.org/rqt_plot) to inspect the images and the IMU 154 | data. 155 | 156 | Find out more about the sensor settings and how to permanently set them 157 | [here](/pages/sensor_settings.md). 158 | -------------------------------------------------------------------------------- /pages/gpio_connector.md: -------------------------------------------------------------------------------- 1 | # Core Research GPIO Functions 2 | 3 | The Core Research has a GPIO connector that can be used for several different functions. On a standard Core Research all extra functions are disabled; contact support@sevensense.ch for function activation keys. 4 | 5 | **Currently supported functions** 6 | * [External Exposure Signal](#external-exposure-signal): GPIO1 will go high when the camera with the longest exposure time starts its exposure, and will go low again when that camera finishes its exposure. In other words, the pulse on the GPIO1 pin will cover the exposure of all cameras. 7 | 8 | The voltage for all GPIO pins is 3.3V for a logic 1 and 0V for a logic 0.
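As a worked example of the signal timing: using the 10us lead-in/lead-out margins described in the External Exposure Signal section below, the width of the GPIO1 pulse can be estimated from the longest exposure time (a sketch; the exposure value is an arbitrary example):

```shell
# GPIO1 rises 10 us before the first shutter opens and falls 10 us
# after the last shutter closes, so the pulse spans the longest
# exposure time plus 20 us of margin.
longest_exposure_us=5000   # example: a 5 ms longest exposure
pulse_width_us=$((longest_exposure_us + 10 + 10))
echo "$pulse_width_us"     # prints 5020
```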
9 | 10 | ## The Connector 11 | 12 | Two versions of the connector exist. The older Core Research models have an 8 pin connector, the newer ones have a 10 pin connector. At the moment both support the same extensions. 13 | 14 | ### 10 pin (new version) 15 | 16 | Connector part no: [SM10B-GHS-TB from JST](http://www.jst-mfg.com/product/pdf/eng/eGH.pdf) 17 | Plug part no: [GHR-10V-S from JST](http://www.jst-mfg.com/product/pdf/eng/eGH.pdf) 18 | 19 | **Pin numbering is according to the manufacturer's datasheet.** 20 | 21 | Pin Number | Name 22 | --- | --- 23 | 1 | GPIO7 24 | 2 | GPIO8 25 | 3 | GPIO1 (External Exposure Signal) 26 | 4 | GPIO2 27 | 5 | 3.3V 28 | 6 | GPIO3_CS 29 | 7 | GPIO4_DIN 30 | 8 | GND 31 | 9 | GPIO5_DOUT 32 | 10 | GPIO6_SCLK 33 | 34 | ### 8 pin (old version) 35 | Connector part no: [SM08B-GHS-TB from JST](http://www.jst-mfg.com/product/pdf/eng/eGH.pdf) 36 | Plug part no: [GHR-08V-S from JST](http://www.jst-mfg.com/product/pdf/eng/eGH.pdf) 37 | 38 | **Pin numbering is according to the manufacturer's datasheet.** 39 | 40 | Pin Number | Name 41 | --- | --- 42 | 1 | GPIO1 (External Exposure Signal) 43 | 2 | GPIO2 44 | 3 | 3.3V 45 | 4 | GPIO3_CS 46 | 5 | GPIO4_DIN 47 | 6 | GND 48 | 7 | GPIO5_DOUT 49 | 8 | GPIO6_SCLK 50 | 51 | # External Exposure Signal 52 | 53 | This function can be used to control, for example, an LED in sync with the exposure interval of the cameras. 54 | 55 | GPIO1 will go high 10us before the first camera shutter opens, and will go low 10us after all camera shutters are closed. This is illustrated in the diagram below. 56 | 57 | ![alphasense](../images/exposure_signal.png) 58 | -------------------------------------------------------------------------------- /pages/installation_and_upgrade.md: -------------------------------------------------------------------------------- 1 | # Installation 2 | 3 | The Core Research driver binaries for Ubuntu are available from the Sevensense APT repositories.
4 | The packages are built for Ubuntu 18.04/ROS Melodic and Ubuntu 20.04/ROS Noetic, both for AMD64 (also known as x86_64) and ARM64 architectures. 5 | 6 | Please follow the instructions below for the installation procedure. 7 | 8 | 9 | ## Adding the repository 10 | 11 | First you need to trust the Sevensense key used to sign the packages. 12 | 13 | ```console 14 | sudo apt install curl 15 | curl -Ls http://deb.7sr.ch/pubkey.gpg | sudo gpg --dearmor -o /usr/share/keyrings/deb-7sr-ch-keyring.gpg 16 | ``` 17 | 18 | 19 | > :information_source: **Info**: When using special certificates supplied by Sevensense, make sure that they are up-to-date. Expired certificates will cause authentication errors even though this repository is public. 20 | 21 | Then you need to add the Sevensense repository to your APT configuration. 22 | 23 | ```console 24 | echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/deb-7sr-ch-keyring.gpg] http://deb.7sr.ch/alphasense/stable $(lsb_release -cs) main" \ 25 | | sudo tee /etc/apt/sources.list.d/sevensense.list 26 | sudo apt update 27 | ``` 28 | 29 | ## Installing the driver 30 | 31 | The driver is split into several packages: 32 | 33 | * `alphasense-driver-core`: Core driver package that always needs to be installed. 34 | * `alphasense-viewer`: Viewer application to check the image streams. 35 | * `alphasense-firmware`: Package to distribute firmware upgrades. 36 | * `ros-melodic-alphasense-driver-ros`: ROS interface to the driver. 37 | 38 | To install **all packages** run: 39 | 40 | ```console 41 | sudo apt install alphasense-driver-core alphasense-viewer alphasense-firmware ros-melodic-alphasense-driver-ros 42 | ``` 43 | 44 | To install the driver **without ROS** support: 45 | 46 | ```console 47 | sudo apt install alphasense-driver-core alphasense-viewer alphasense-firmware 48 | ``` 49 | 50 | # Upgrading the firmware 51 | 52 | The newest firmware can be flashed using the flashing tool included in the driver.
Customers will receive a notification email when new firmware is available. 53 | 54 | Follow these steps to upgrade to the newest firmware and driver: 55 | 56 | > :warning: **Warning**: Do not unplug the power to the Core Research while flashing is in progress. This can brick the device. 57 | 58 | 1. Upgrade the driver and firmware packages 59 | ```console 60 | sudo apt install alphasense-driver-core alphasense-viewer alphasense-firmware ros-melodic-alphasense-driver-ros 61 | ``` 62 | 2. Connect **ONLY ONE** Core Research to the host. 63 | 3. Run the upgrade utility, and confirm the upgrade. 64 | 65 | ```console 66 | flashalphasense 67 | ``` 68 | 69 | ```console 70 | sevensense@7s-workstation:~$ flashalphasense 71 | Found device 'Sevensense AS1 720x540 (sn: 3A13199690341453 i: as1_ethernet l: 192.168.77.77)' 72 | Loading firmware from '/usr/lib/alphasense_firmware/data/fw_ethernet_206430361_IMX287_35.mcs.asf'. 73 | 74 | !IMPORTANT! Make sure the device has constant power while flashing is in progress. Interrupting power during flashing can break the device. 75 | Device to be flashed: Sevensense AS1 720x540 (sn: 3A13199690341453 i: as1_ethernet l: 192.168.77.77) 76 | Firmware to be flashed: Alphasense firmware version 206430361, built on 2020-06-24T01:30:47 by Sevensense Robotics AG (206430361) 77 | Proceed with flashing [y/N]? 78 | y 79 | Start of flashing process, DO NOT TURN OFF DEVICE! 80 | Uploading firmware... 81 | 100% 82 | Writing to persistent memory... 83 | Flashing succeeded, please reboot the device! 84 | ``` 85 | 4. When "Flashing succeeded, please reboot the device!" appears the Alphasense 86 | Core can be rebooted by unplugging and then replugging the power. 
87 | -------------------------------------------------------------------------------- /pages/maximize_network_performance.md: -------------------------------------------------------------------------------- 1 | # Maximize network performance 2 | 3 | To maximize the performance of the Core Research some default configuration 4 | options need to be changed. Network packet drop and therefore frame drops 5 | can be eliminated by following the guidelines below. 6 | 7 | The guidelines are split into the following three categories: 8 | 9 | 1. Hardware setup 10 | 2. Kernel and network configuration 11 | 3. Core Research configuration 12 | 13 | ## Hardware setup 14 | 15 | We recommend using a modern PCIe gigabit ethernet card that supports jumbo frames 16 | (for example cards based on the Intel I2xx controllers). 17 | USB to Ethernet adapters are not recommended for a final setup. 18 | They will work but aren't as reliable and might drop frames. 19 | 20 | Some embedded platforms like for example the NVidia Jetson TX2 cannot reliably support 21 | the full gigabit bandwidth on their integrated ethernet ports and will have a reduced maximum frame rate. 22 | 23 | If the Core Research is connected over a switch, make sure that it supports 24 | gigabit ethernet speeds. When multiple devices are connected to the switch and the 25 | Core Research is sharing bandwidth with other sensors, a part of the bandwidth 26 | will need to be reserved for the Core Research. The achievable maximum frame rate will 27 | depend on the amount of bandwidth reserved for the Core Research. Note that this reserved 28 | bandwidth needs to be configured as the peak bandwidth limit, 29 | see [Core Research configuration](/pages/maximize_network_performance.md#alphasense-core-configuration). 30 | 31 | The graphs below show the maximum achievable frame rate for a give number of sensors 32 | and amount of reserved bandwidth. 
33 | 34 | ![alphasense_core_04mp_bandwidth](/images/alphasense_core_04mp_bandwidth.png) 35 | 36 | ![alphasense_core_16mp_bandwidth](/images/alphasense_core_16mp_bandwidth.png) 37 | 38 | Correct cabling that supports gigabit ethernet speeds must be used in the network 39 | between the host and the Core Research. In most cases this means CAT5e or CAT6 cables. 40 | 41 | ## Kernel and network configuration 42 | 43 | ### Network stack parameters 44 | 45 | The Linux kernel has many parameters that can be tweaked for better network performance. We recommend 46 | doubling some of the default settings. 47 | 48 | ```console 49 | sudo sysctl -w net.core.netdev_budget=600 50 | sudo sysctl -w net.core.netdev_max_backlog=2000 51 | ``` 52 | 53 | These changes will reset after reboot. They can be made persistent by creating the following file: 54 | 55 | ```console 56 | echo -e "net.core.netdev_budget=600\nnet.core.netdev_max_backlog=2000" | sudo tee /etc/sysctl.d/90-alphasense-network-parameters.conf 57 | ``` 58 | 59 | ### Network interface ring buffer 60 | 61 | We also recommend increasing the ring buffer of the network card. 62 | First check the maximum supported RX ring size by executing: 63 | 64 | ```console 65 | sudo apt install ethtool 66 | sudo ethtool -g INTERFACE_NAME 67 | ``` 68 | 69 | ```console 70 | sevensense@7s-workstation:~$ sudo ethtool -g eth0 71 | Ring parameters for eth0: 72 | Pre-set maximums: 73 | RX: 4096 74 | RX Mini: 0 75 | RX Jumbo: 0 76 | TX: 4096 77 | Current hardware settings: 78 | RX: 256 79 | RX Mini: 0 80 | RX Jumbo: 0 81 | TX: 256 82 | ``` 83 | 84 | The maximum RX ring size in the above case is 4096. The ring size can be increased using: 85 | 86 | ```console 87 | sudo ethtool -G INTERFACE_NAME rx 4096 88 | ``` 89 | 90 | The change in ring size does not persist across reboots. Some network managers support configuration 91 | of the ring sizes. Otherwise it can be added to a script that runs on boot.
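One way to apply the ring size on boot is a small systemd oneshot unit (a sketch; the unit name, the `ethtool` path, and `INTERFACE_NAME` are placeholders to adapt to your system):

```
# /etc/systemd/system/alphasense-rx-ring.service (hypothetical name)
[Unit]
Description=Increase NIC RX ring buffer for the Core Research link
After=network.target

[Service]
Type=oneshot
ExecStart=/sbin/ethtool -G INTERFACE_NAME rx 4096

[Install]
WantedBy=multi-user.target
```

After creating the file, the unit can be enabled with `sudo systemctl enable alphasense-rx-ring.service`.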
92 | 93 | ### Network interface MTU 94 | 95 | The last thing to configure is the Maximum Transmission Unit (MTU) of the network interface. 96 | This needs to be increased to 7260 to allow for the maximum packet size supported by the Core Research. 97 | 98 | ```console 99 | sudo ip link set INTERFACE_NAME mtu 7260 100 | ``` 101 | 102 | This setting also does not persist across reboots and needs to be configured in the 103 | network manager or a startup script. 104 | 105 | ## Core Research configuration 106 | 107 | Three driver configuration parameters influence the network performance of the Core Research: 108 | `pixels_per_packet`, `image_socket_receive_buffer_size_mb`, and `peak_bandwidth_limit_mbps`. 109 | See [Sensor settings](/pages/sensor_settings.md) for instructions on how to supply the driver 110 | with these parameters. 111 | 112 | `pixels_per_packet` configures the packet size used for the video stream. Bigger packets decrease the network stack overhead, because fewer packets are needed to transfer a frame. This should be set 113 | to the maximum of 7200 when possible; the limit is the maximum MTU of your network interface. 114 | Note that the network card has to be configured to accept bigger packets, 115 | see [Network interface MTU](/pages/maximize_network_performance.md#network-interface-mtu). 116 | 117 | `image_socket_receive_buffer_size_mb` sets the size of the buffer used for the video stream 118 | socket. The default value for this is 10 megabytes, which should be enough for most systems. This cannot be 119 | set higher than the maximum socket buffer size allowed by the kernel. The maximum the kernel allows can 120 | be changed with (the value is in bytes): 121 | 122 | ```console 123 | sudo sysctl -w net.core.rmem_max=11145728 124 | ``` 125 | 126 | `peak_bandwidth_limit_mbps` sets the maximum peak bandwidth in megabits per second of the Core Research.
127 | This should be set to 1000, which means that it transfers the captured frames to the host at full gigabit speed. 128 | For some systems this can be too much and cause buffer overflows in lower levels of the network stack. In that 129 | case it can be tuned to a lower value at the cost of decreased maximum frame rate and higher transfer latency. This parameter also needs to be lowered when the Core Research has to share the bandwidth 130 | with other sensors or devices. 131 | -------------------------------------------------------------------------------- /pages/ros_driver_usage.md: -------------------------------------------------------------------------------- 1 | # ROS driver usage 2 | 3 | The ROS driver can be started with 4 | 5 | ``` 6 | rosrun alphasense_driver_ros alphasense_driver_ros 7 | ``` 8 | 9 | ## ROS API 10 | 11 | ### Published topics 12 | 13 | Images are published using the `image_transport` package. The topics published depend on the `image_transport` plugins installed. We list only the raw image topics here. 14 | 15 | * **/alphasense_driver_ros/cam0** (*sensor_msgs/Image*) 16 | Camera images from cam0 17 | * **/alphasense_driver_ros/cam1** (*sensor_msgs/Image*) 18 | Camera images from cam1 19 | * **/alphasense_driver_ros/cam2** (*sensor_msgs/Image*) 20 | Camera images from cam2 21 | * **/alphasense_driver_ros/cam3** (*sensor_msgs/Image*) 22 | Camera images from cam3 23 | * **/alphasense_driver_ros/cam4** (*sensor_msgs/Image*) 24 | Camera images from cam4 25 | * **/alphasense_driver_ros/cam5** (*sensor_msgs/Image*) 26 | Camera images from cam5 27 | * **/alphasense_driver_ros/cam6** (*sensor_msgs/Image*) 28 | Camera images from cam6 29 | * **/alphasense_driver_ros/cam7** (*sensor_msgs/Image*) 30 | Camera images from cam7 31 | * **/alphasense_driver_ros/imu** (*sensor_msgs/Imu*) 32 | IMU measurements of the internal IMU. 33 | * **/alphasense_driver_ros/aux_imu** (*sensor_msgs/Imu*) 34 | IMU measurements of an optional external IMU. 
35 | * **/alphasense_driver_ros/serial_number** (*std_msgs/String*) 36 | Serial number of the connected Core Research. 37 | 38 | ### Parameters 39 | 40 | * **~ncamera_settings** (*string, default: ""*) 41 | File path to a `ncamera_settings.yaml` configuration file. See [Sensor settings](/pages/sensor_settings.md) for more information about the format of this file. If left empty, `rqt_reconfigure` support will be enabled; the configuration can then be changed dynamically and is read from the ROS parameter server. 42 | * **~device_serial** (*string, default: ""*) 43 | When this is set, the driver will only connect to a specific Core Research. If left empty, the driver will connect to the first one it can find. 44 | * **~translate_device_time** (*bool, default: True*) 45 | Translate the internal Core Research clock to the host clock using [`cuckoo_time_translator`](https://github.com/ethz-asl/cuckoo_time_translator). This should be disabled when the internal Core Research clock is synchronized to the host using PTP (see [Synchronized time](/pages/time_synchronization.md#synchronized-time-ptp)). 46 | -------------------------------------------------------------------------------- /pages/sensor_settings.md: -------------------------------------------------------------------------------- 1 | # Sensor settings 2 | 3 | ## Running the sensor with a settings file 4 | 5 | Instead of starting Alphasense with the standard configuration, the driver can 6 | be given a custom settings file. An example is provided here: 7 | [ncamera_settings.yaml](/files/ncamera_settings.yaml). This can be passed to the 8 | driver with one of the commands shown below.
9 | 10 | Standalone viewer: 11 | 12 | ```console 13 | viewalphasense --ncamera-settings /path/to/ncamera_settings.yaml 14 | ``` 15 | 16 | ROS driver: 17 | 18 | ```console 19 | rosrun alphasense_driver_ros alphasense_driver_ros _ncamera_settings:=/path/to/ncamera_settings.yaml 20 | ``` 21 | 22 | ## Dynamically configuring settings 23 | 24 | When using the ROS driver, it is possible to change the settings dynamically as 25 | an alternative to providing a settings file. 26 | To adjust the configuration of the driver, use 27 | [rqt_reconfigure](http://wiki.ros.org/rqt_reconfigure) after launching the 28 | driver. 29 | 30 | ``` 31 | rosrun rqt_reconfigure rqt_reconfigure 32 | ``` 33 | 34 | This application lists all available settings and allows changing them on the 35 | fly. They are described in more detail in the sections below. 36 | 37 | ## Individual camera parameters 38 | 39 | Each camera can have distinct settings for (auto) exposure and gain. The 40 | exposure intervals of all cameras will be temporally aligned at the center. 41 | 42 | ### Camera exposure time 43 | 44 | #### Fixed exposure time 45 | 46 | The exposure time of a camera can be fixed by setting the 47 | `auto_exposure_enabled` parameter to `false` and setting 48 | `manual_exp_time_us` to the desired exposure time in microseconds. 49 | 50 | #### Auto exposure 51 | 52 | Alphasense is equipped with an intelligent auto exposure algorithm which is 53 | specially designed for computer vision and state estimation applications. It 54 | detects areas in the image that contain high gradients and adjusts the exposure 55 | time accordingly. As a result, it is possible to recognize structure in the 56 | environment even under the influence of sunlight, for example. 57 | 58 | Auto exposure can be enabled by setting the `auto_exposure_enabled` parameter 59 | to `true`. The algorithm adjusts the exposure time until it reaches the overall 60 | image brightness defined in the `autoexp_target_median` setting.
The target 61 | brightness is encoded using 8-bit grayscale: 0 representing black and 255 62 | white. The minimum and maximum exposure times within which the algorithm 63 | can operate can be configured with `min_exp_time_us` and `max_exp_time_us`, 64 | respectively. Both parameters are given in microseconds. If auto gain is 65 | enabled, the gain will be increased when the maximum exposure time is too short 66 | to reach the desired image brightness. The parameter `autoexp_speed` 67 | adjusts how smoothly the exposure time, and consequently the image 68 | brightness, changes. However, when `autoexp_speed` is reduced, convergence is slowed 69 | down. 70 | 71 | #### Stereo pairs 72 | 73 | To make matching across stereo cameras more stable, it helps to equalize the exposure of both cameras. This can be done in the Alphasense by running auto exposure for only one camera in the pair and copying the result to the other camera. 74 | 75 | A camera can be configured to copy the exposure parameters of another camera with the `autoexp_master_camera_index` parameter. 76 | 77 | ```yaml 78 | ncamera: 79 | # ... 80 | cams: 81 | 0: 82 | auto_exposure_enabled: true 83 | # ... 84 | 1: 85 | autoexp_master_camera_index: 0 86 | ``` 87 | 88 | ##### Setting the Region Of Interest (ROI) 89 | 90 | The auto exposure algorithm can be configured to only optimize the brightness of a 91 | certain rectangular patch of the image. 92 | 93 | The boundaries of the patch can be set as follows: 94 | 95 | ```yaml 96 | autoexp_start_row: 0 97 | autoexp_end_row: -1 98 | autoexp_start_col: 0 99 | autoexp_end_col: -1 100 | ``` 101 | 102 | For `autoexp_end_row` and `autoexp_end_col`, -1 equals the maximum height and 103 | width. So in this case, the whole image is being used. 104 | 105 | 106 | ### Camera gain 107 | 108 | The camera gain parameter can be used to adjust the light sensitivity of the 109 | sensor. Increasing the gain will lead to brighter images with shorter exposure 110 | times.
Note that this comes at the cost of increased noise in the image. 111 | Setting `manual_gain` to 0 will result in no amplification and 255 will 112 | increase sensitivity by a factor of 5.6. If auto exposure is enabled, this 113 | parameter has no influence. 114 | 115 | ### Auto gain 116 | 117 | This setting is only relevant if auto exposure is enabled. Enabling this 118 | parameter allows the auto exposure algorithm to increase the camera gain when 119 | the maximum exposure time is reached. This enables operation in very dark 120 | environments or when the maximum exposure time is too short to reach a sufficiently 121 | bright image. In case the desired image brightness can be achieved by adjusting 122 | the exposure time alone, the gain is kept at the lowest level to reduce noise. The 123 | `autoexp_max_gain` parameter allows for manual control over the maximum gain value 124 | the algorithm can apply. 125 | 126 | ### Rotating the image by 180 degrees 127 | 128 | > :warning: **Breaks factory calibration**: Do not rotate the camera images when using the factory-supplied calibration! 129 | 130 | The camera image can be rotated by 180 degrees. This can be done by setting the 131 | `rotate_image` parameter to `true`. The rotation of the images happens 132 | directly on the Alphasense camera main board. 133 | 134 | ## Sensor parameters 135 | 136 | ### Number of enabled cameras 137 | 138 | The number of enabled cameras can be set with the `num_cams` option. Camera 139 | ports up to the `num_cams` value will be enabled. By disabling camera ports, the 140 | maximum allowed frame rate of the enabled cameras can be increased. 141 | 142 | ### Frame rate and IMU frequency 143 | 144 | The camera frame rate can be set with the `camera_frequency_hz` option. The IMU 145 | frequency can be set to 100, 200 or 400 Hz with the `imu_frequency_hz` option. 146 | 147 | The minimum frame rate is 1Hz.
The maximum frame rate depends on the number of cameras, 148 | the maximum/manual exposure time, and network tuning parameters 149 | (`inter_packet_delay_us`/`pixels_per_packet`). The table below gives the absolute 150 | maximum frame rates for each number of cameras with exposure at 10us, `inter_packet_delay_us` at 1.0 us and `pixels_per_packet` at 7200. 151 | 152 | #### Absolute maximum frame rates in Hz/fps 153 | 154 | | Number of Cameras | Core Research (0.4 MP, Mono) | Core Research (1.6 MP, Mono) | 155 | | --- | --- | --- | 156 | | 1 | 75 | 30 | 157 | | 2 | 75 | 30 | 158 | | 3 | 75 | 24 | 159 | | 4 | 72 | 18 | 160 | | 5 | 58 | 14 | 161 | | 6 | 48 | 12 | 162 | | 7 | 41 | 10 | 163 | | 8 | 35 | 8 | 164 | 165 | ### Gyro sensitivity 166 | 167 | The sensitivity of the gyroscope can be set with the `gyro_range_deg_per_sec` parameter. 168 | It sets the maximum measurable rotation rate. Setting this value 169 | higher will decrease the gyro sensitivity. 170 | 171 | ### Network tuning 172 | 173 | The peak bandwidth used by the Core Research can be set with the `peak_bandwidth_limit_mbps` parameter. 174 | 175 | The Core Research packet size can be set with the `pixels_per_packet` option. 176 | It is best to increase `pixels_per_packet` to the highest value allowed by your network setup (the maximum is 7200). 177 | 178 | > :information_source: **Info**: The MTU of the network card to which the Core Research is connected needs to be at least `pixels_per_packet` + 60, otherwise the driver cannot receive the image stream packets. In that case you will get "Image receive timed out." errors. 179 | 180 | See [Maximize network performance](/pages/maximize_network_performance.md) for more information. 181 | 182 | ### PTP sensor stream blocking 183 | 184 | The driver can be configured to not stream IMU and camera frames before the device successfully 185 | synchronizes its clock. This prevents a large timestamp jump in the streams.
186 | To enable this, set `expect_ptp_time_synchronization` to `true`; by default it is set to `false`. 187 | 188 | To prevent timestamps from drifting apart when PTP synchronization is lost, a timeout can be set after which the driver 189 | blocks the streams. This timeout in seconds is configured with the `max_ptp_sync_lost_time_s` parameter. 190 | By default this parameter is set to `60` seconds. The timeout mechanism can be disabled by setting this to `0`. 191 | -------------------------------------------------------------------------------- /pages/time_synchronization.md: -------------------------------------------------------------------------------- 1 | # Time synchronization 2 | 3 | The Core Research stamps all measurements using its internal clock. The camera 4 | frames are stamped at the middle of their exposure intervals. This means that the 5 | IMU and cameras are all within the same time frame. If no other sensors are used, 6 | this can be treated as the global time frame and no further action is required. 7 | However, if measurements from other cameras, LiDARs, sonars, or wheel odometry are 8 | present and are intended to be fused with the measurements of the Core Research, time 9 | synchronization between those sensors is necessary. 10 | 11 | We distinguish three possible modes of setting up the Core Research & driver: 12 | 13 | * **Raw time:** In this mode the ROS messages will contain the raw timestamp of the internal 14 | device clock. This clock starts counting from timestamp 0 on device power-up and has a non-constant drifting offset with respect to ROS time. 15 | * **Translated time:** (default) In this mode the raw timestamps of the internal clock are translated to the host time frame using the [cuckoo_time_translator](https://github.com/ethz-asl/cuckoo_time_translator). A constant offset with respect to ROS time remains.
This offset could be obtained by a suitable calibration procedure (this depends on the sensor setup and is not provided by Sevensense). 16 | * **Synchronized time:** In this mode the internal device clock is synchronized to the host computer using the PTP protocol. This puts the device clock into the same time frame as the host, and the raw timestamps can be used directly for sensor fusion. 17 | 18 | The instructions for configuring each mode are given below. 19 | 20 | ## Raw time 21 | 22 | Raw timestamps can be enabled by disabling the time translation functionality in the ROS driver. This can be done by setting the ROS parameter `translate_device_time` to `False` (see [ROS Parameters](/pages/ros_driver_usage.md#parameters)). 23 | 24 | Make sure that no PTP master is present in the network, because otherwise the internal clock would be adjusted to be 25 | in sync with the master. 26 | 27 | ## Translated time 28 | 29 | This mode can be enabled by setting the ROS parameter `translate_device_time` to `True`. Note that `translate_device_time` is `True` by default. 30 | 31 | ## Synchronized time (PTP) 32 | 33 | > :information_source: **Info**: In the current firmware/driver version (<= 1.7.1) PTP synchronization will trigger an IMU timestamp warning: "IMU measurement period unstable, error: 0.1000ms.". This can be ignored and will be fixed in the next firmware release. 34 | 35 | The Core Research will automatically start the synchronization when a PTP master is connected to the Ethernet network. Only PTP over UDP/IPv4 is supported. See [Running a PTP master on linux](#running-a-ptp-master-on-linux) for how to run a PTP master directly on the host. 36 | 37 | > :warning: **Important**: Time translation needs to be **disabled** when PTP is used. 38 | 39 | Time translation needs to be disabled for the internal clock time to be passed through correctly. See [Raw time](#raw-time) above on how to disable this.
Note that raw and synchronized time are essentially the same from the point of view of the driver. It is the presence of a PTP master in the network that makes the device adjust its internal clock. 40 | 41 | To prevent the sensor streams from starting before the internal clock is synchronized, see [PTP sensor stream blocking](sensor_settings.md#ptp-sensor-stream-blocking). 42 | 43 | ### Running a PTP master on linux 44 | 45 | In this example we use tools from the [linuxptp project](https://sourceforge.net/projects/linuxptp/). This example is just to quickly set up a functioning PTP master; please see the linuxptp documentation for more advanced command-line flags. 46 | 47 | The tools can be installed with: 48 | 49 | ```sudo apt install linuxptp``` 50 | 51 | In all commands below you need to replace `INTERFACE_NAME` with the name of the network interface to which the Core Research is connected. 52 | 53 | We recommend using a network interface that has hardware timestamping support. Most modern PCIe network interfaces support this; USB-to-Ethernet adapters usually do not.
You can use the following command to check if your network card supports hardware timestamping: 54 | 55 | `ethtool -T INTERFACE_NAME` 56 | 57 | The console output should look similar to the one shown below: 58 | 59 | ```console 60 | sevensense@7s-workstation:~$ sudo ethtool -T enp5s0 61 | Time stamping parameters for enp5s0: 62 | Capabilities: 63 | hardware-transmit (SOF_TIMESTAMPING_TX_HARDWARE) 64 | software-transmit (SOF_TIMESTAMPING_TX_SOFTWARE) 65 | hardware-receive (SOF_TIMESTAMPING_RX_HARDWARE) 66 | software-receive (SOF_TIMESTAMPING_RX_SOFTWARE) 67 | software-system-clock (SOF_TIMESTAMPING_SOFTWARE) 68 | hardware-raw-clock (SOF_TIMESTAMPING_RAW_HARDWARE) 69 | PTP Hardware Clock: 0 70 | Hardware Transmit Timestamp Modes: 71 | off (HWTSTAMP_TX_OFF) 72 | on (HWTSTAMP_TX_ON) 73 | Hardware Receive Filter Modes: 74 | none (HWTSTAMP_FILTER_NONE) 75 | all (HWTSTAMP_FILTER_ALL) 76 | 77 | ``` 78 | 79 | Hardware timestamping is possible if the capabilities `SOF_TIMESTAMPING_TX_HARDWARE`, `SOF_TIMESTAMPING_RX_HARDWARE`, and `SOF_TIMESTAMPING_RAW_HARDWARE` are available. 80 | 81 | When hardware timestamping is used, the internal clock of the network card first needs to be synchronized with the system clock. This can be done by running: 82 | 83 | `sudo phc2sys -m -c INTERFACE_NAME -s CLOCK_REALTIME -O 0 -u 10` 84 | 85 | This command needs to be kept running in the background to make sure the clock does not drift away again. The command will print information about the synchronization status every 10 seconds. When software timestamping is used, this command can be skipped. 86 | 87 | The PTP master can be started with the command below (to use software timestamping, replace `-H` with `-S`): 88 | 89 | `sudo ptp4l -m -H -i INTERFACE_NAME` 90 | 91 | This command also needs to be kept running in the background.
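For convenience, the capability check above can be scripted. The sketch below is only an illustration: the `has_hw_timestamping` helper is hypothetical (it is not part of linuxptp) and simply parses `ethtool -T` output from stdin to decide which `ptp4l` timestamping flag applies.

```shell
#!/bin/sh
# Hypothetical helper: reads `ethtool -T` output on stdin and succeeds only
# if all capabilities required for hardware timestamping are present.
has_hw_timestamping() {
  out=$(cat)
  for cap in SOF_TIMESTAMPING_TX_HARDWARE \
             SOF_TIMESTAMPING_RX_HARDWARE \
             SOF_TIMESTAMPING_RAW_HARDWARE; do
    # Fail as soon as one required capability is missing.
    printf '%s\n' "$out" | grep -q "$cap" || return 1
  done
}

# Usage on a real system (replace INTERFACE_NAME as above):
#   if ethtool -T INTERFACE_NAME | has_hw_timestamping; then
#     sudo ptp4l -m -H -i INTERFACE_NAME    # hardware timestamping
#   else
#     sudo ptp4l -m -S -i INTERFACE_NAME    # software timestamping
#   fi
```

Remember that when the hardware path is taken, `phc2sys` still needs to run alongside `ptp4l` as described above.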
92 | -------------------------------------------------------------------------------- /pages/usage_examples.md: -------------------------------------------------------------------------------- 1 | # Usage examples 2 | 3 | We are working on several examples showing how to use the Core 4 | Research in various kinds of applications. Expect more demos soon. 5 | 6 | Note that the sole purpose of the demos is to illustrate usage 7 | scenarios of the Core Research using open-source software, not 8 | to provide product-ready or state-of-the-art results. 9 | 10 | To take full advantage of the Core Research for SLAM, we 11 | recommend the industry-ready positioning system by Sevensense, 12 | [Alphasense Position](https://www.sevensense.ch/products). 13 | 14 | ## Stereo demo 15 | 16 | In this demo we show how to use the front stereo pair to compute a depth 17 | map using block matching. 18 | 19 | The demo can be found [here](https://github.com/sevensense-robotics/alphasense_stereo_demo). 20 | --------------------------------------------------------------------------------