
Sensor fusion: the importance of time

Technology is a useful servant but a dangerous master. (C. L. Lange)

Sampling rates

All sensors sample at independent rates. Stated another way, different sensors generate signals at different times and at different intervals. Some sensors, such as scanners, create a signal of a specific duration. Figure 1 shows conceptually the relationship between the sample times of five different sensors. The signals from the sensors are shown as boxes: the position of a box shows the relative occurrence of the signal, the width of a box shows the relative duration of the signal, and the distance between boxes on the same row shows the relative sampling rate of the sensor.

Figure 1 Sampling rates
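The idea behind figure 1 can be sketched in a few lines of code. Everything here is an illustrative assumption: the sensor names, periods, durations, and the 240-unit horizon are placeholders, not measurements from a real platform.

```python
# Sketch of figure 1: each sensor emits timestamped signals at its own
# independent rate; a scanner-style sensor also has a nonzero duration.
# All periods, durations, and the horizon are illustrative assumptions.

def sample_times(period, duration, horizon):
    """Return (start_time, duration) pairs for one sensor up to `horizon`."""
    return [(t, duration) for t in range(0, horizon, period)]

sensors = {
    "IMU":    sample_times(period=33,  duration=0,   horizon=240),
    "GPS":    sample_times(period=50,  duration=0,   horizon=240),
    "gimbal": sample_times(period=24,  duration=0,   horizon=240),
    "image":  sample_times(period=120, duration=120, horizon=240),
}

for name, samples in sensors.items():
    print(name, samples[:3])
```

Because each stream is generated independently, the sample times line up only by coincidence, which is exactly the situation a fusion format must preserve.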

In legacy sensor systems, the interdependencies between the sensors are hard-coded. Likewise, the differences in the sampling rates of the various sensors are obscured, and the association between readings is nearly arbitrary. Figure 2 shows conceptually how a legacy sensor data format combines sensor data: the format is an image with metadata in its headers, and all time relations have been stripped from the data. In such a legacy format, there is no easy or direct means of interpolating data from the various sensors for higher accuracy.

Figure 2 Combination of sensor data in legacy format

Interpolation example

We will now return to the example sensor platform from our previous blog post, shown in figure 3. The platform consisted of an IMU, a GPS, a rotational encoder and an imaging sensor.

Figure 3 Sample system

To extend this example, we will assume that the imaging sensor is a scanner that samples 120 lines at equal increments. The following table shows the sensor readings with timestamps as they would appear. Note that the image data arrives 120 time increments after the scan began, but it is stamped with the scan's start time. The other sensors are treated as instantaneous.

Time   Sensor
12     gimbal
27     GPS
36     gimbal
37     IMU
60     gimbal
70     IMU
77     GPS
84     gimbal
103    IMU
108    gimbal
0      image
136    IMU
127    GPS
132    gimbal
156    gimbal
169    IMU
177    GPS
180    gimbal
202    IMU
204    gimbal
227    GPS
228    gimbal
235    IMU
120    image

Table 1 Example of sampled data

We wish to calculate the exact position of the 56th line of the second image. The time of that line is the time the second image began (120) plus 55, or 175. For the rest of the calculations, we will use first order interpolation.
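This first-order interpolation can be sketched as follows. The gimbal timestamps 156 and 180, which bracket the line's time, come from table 1; the angle values passed to the interpolation are made-up placeholders for illustration.

```python
def lerp(t, t0, v0, t1, v1):
    """First-order (linear) interpolation of a reading between two samples."""
    w = (t - t0) / (t1 - t0)
    return v0 + w * (v1 - v0)

# Line 56 of the second image: the image began at t = 120, and line n
# falls at t = 120 + (n - 1).
t_line = 120 + (56 - 1)

# Gimbal samples bracket t_line at t = 156 and t = 180 (table 1); the
# angle values 10.0 and 14.0 are placeholders, not real readings.
angle = lerp(t_line, 156, 10.0, 180, 14.0)
print(t_line, angle)
```

The same bracketing-and-weighting step applies to every sensor stream; only the pair of samples surrounding the line's time changes.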

Sample data

The following figures are intended to contrast the legacy data format with the new sensor fusion data format. Table 2 shows the legacy approach, in which the whole image is assigned a single sampling from each of the sensors.

longitude latitude altitude pitch roll heading

Table 2 Conceptual legacy sensor data format, with metadata in header

Table 3 shows the image data interpolated with other sensors, that is, true sensor data fusion.

Time   Sensor reading
2456   latitude, longitude, altitude
2762   pitch, roll, heading
2812   latitude, longitude, altitude
2910   pitch, roll, heading
3015   latitude, longitude, altitude
3147   pitch, roll, heading
3354   latitude, longitude, altitude

Table 3 Conceptual sensor data fusion format
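A minimal sketch of how such fused records could be produced: for each chosen timestamp, every sensor stream is interpolated to that instant. The stream contents and the selected times below are illustrative placeholders, not the values from table 3.

```python
# Build table-3-style fused records by interpolating each timestamped
# sensor stream at a common set of instants. All values are placeholders.

def interpolate(stream, t):
    """Linearly interpolate a time-sorted list of (time, value) pairs at t."""
    for (t0, v0), (t1, v1) in zip(stream, stream[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return v0 + w * (v1 - v0)
    raise ValueError("t outside stream range")

# Hypothetical streams: one GPS quantity (e.g. latitude) and one IMU
# quantity (e.g. pitch), using timestamps in the spirit of table 1.
gps = [(127, 40.0), (177, 40.5), (227, 41.0)]
imu = [(136, 1.0), (169, 1.5), (202, 2.0)]

# Fuse at a few image-line times: each record carries its own timestamp.
fused = [(t, interpolate(gps, t), interpolate(imu, t))
         for t in (150, 175, 200)]
for row in fused:
    print(row)
```

Because every fused record keeps its timestamp, the time relations that the legacy format of table 2 strips away are preserved.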

In our next post, we will discuss the concept of an instantaneous field of influence.