By default, a device captures and sends frames from each sensor independently, so there is no guarantee that both sensors capture a snapshot of the environment at the same moment. In many cases, however, you will want to reduce the delay between frames captured by two different sensors. For example, to retrieve the color of an object recognized in the depth stream, you need to read data from both the depth and color streams, and each stream may have a different capture time. By enabling frame syncing, which is available for the image and depth streams, you can at least reduce this difference in capture time to the lowest possible value.
In this recipe, we are going to show you how to enable frame syncing and how much it can reduce the difference between capture times.
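As a rough sketch of the idea, the following C++ fragment opens a device with OpenNI 2, requests depth/color synchronization with `Device::setDepthColorSyncEnabled()`, and then compares the timestamps of one frame from each stream. This is a minimal illustration, not the full recipe: it assumes the OpenNI 2 headers are available, a compatible device is connected, and it abbreviates error handling.

```cpp
// Sketch: enabling depth/color frame sync with OpenNI 2.
// Assumes OpenNI 2 is installed and a compatible device is connected;
// error handling is abbreviated for clarity.
#include <OpenNI.h>
#include <cstdio>

int main()
{
    using namespace openni;

    if (OpenNI::initialize() != STATUS_OK)
        return 1;

    Device device;
    if (device.open(ANY_DEVICE) != STATUS_OK)
        return 1;

    // Ask the device to align the capture times of the two sensors.
    device.setDepthColorSyncEnabled(true);

    VideoStream depth, color;
    depth.create(device, SENSOR_DEPTH);
    color.create(device, SENSOR_COLOR);
    depth.start();
    color.start();

    VideoFrameRef depthFrame, colorFrame;
    depth.readFrame(&depthFrame);
    color.readFrame(&colorFrame);

    // Frame timestamps are in microseconds; with syncing enabled the
    // difference between the two streams should be close to zero.
    long long diff = (long long)depthFrame.getTimestamp() -
                     (long long)colorFrame.getTimestamp();
    printf("Capture time difference: %lld us\n", diff < 0 ? -diff : diff);

    depth.stop();
    color.stop();
    device.close();
    OpenNI::shutdown();
    return 0;
}
```

You can run the same program with the `setDepthColorSyncEnabled(true)` line commented out to compare the reported timestamp differences with and without syncing.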