The last typical occlusion scenario we would like to explore is one in which both the background and the foreground are moving and cross each other, a situation that occurs frequently among pedestrians during crowd monitoring, for instance. To illustrate how our device copes with this case, we again exposed our proof-of-concept 20-pixel sensor to the same 2 × 2 pixel white square travelling from left to right, this time encountering a 2 × 2 pixel red square moving in the opposite direction and acting as the occluding entity. Figures 6a to 6e are snapshots of this scenario, with both objects moving at the same speed of one step per 50 ms frame (movie 5). Figure 6f depicts the corresponding sensor pixels' responses during the whole event. The occlusion-handling algorithm presented in Figure 5h applies to this situation as well. At the onset of partial overlap (Figure 6b), the sensor triggers the computer to probe pixels 1-8 in order to perform path prediction (steps 2 and 3 in Figure 5h). During complete occlusion (Figure 6c), pixels 9 and 10 must be measured in addition. As Figure 6f illustrates, all pixels show only minor deviations in VOC upon a sudden change to darkness (Table 3), confirming that correct tracking of the occluded object is indeed possible through recollection of its previous trajectory.
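The path-prediction step invoked here (steps 2 and 3 in Figure 5h) can be sketched as a simple linear extrapolation: once the tracked object is occluded, its position is projected forward from the velocity estimated before overlap. The following is a minimal illustrative sketch only, assuming a 1-D pixel row and constant speed; the function name and data layout are our own assumptions, not the authors' implementation.

```python
# Minimal sketch of occlusion-aware tracking by linear extrapolation.
# Assumes a 1-D pixel row and constant object speed (illustrative only).

def predict_path(history, n_frames):
    """Extrapolate future positions from the last two observations.

    history: list of (frame_index, pixel_position) pairs recorded
             before the occlusion began.
    n_frames: number of occluded frames to predict ahead.
    """
    (f0, x0), (f1, x1) = history[-2], history[-1]
    velocity = (x1 - x0) / (f1 - f0)  # pixels per frame
    return [x1 + velocity * (k + 1) for k in range(n_frames)]

# White square observed at pixels 2, 3, 4 before the red square covers it.
observed = [(0, 2), (1, 3), (2, 4)]
predicted = predict_path(observed, 3)
print(predicted)  # → [5.0, 6.0, 7.0]
```

In this toy run, the occluded square is expected at pixels 5, 6, and 7 over the next three frames, which is the kind of trajectory recollection that lets the sensor reacquire the white square once it re-emerges from behind the red one.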