Wednesday, February 7, 2007

Tracking Blobs

Since last week's discussion, I have decided to change my detection algorithm by adding a tracking component. After image subtraction to get the moving blobs and thresholding, I track each blob to determine which direction it is moving in. Since the camera-based traffic light triggering is intended only for traffic approaching the stop, we can then determine whether a blob is moving in the direction of the targeted traffic light.
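A rough sketch of the subtraction and thresholding step is below (assuming OpenCV and NumPy; the update rate, difference threshold, and file name are placeholder values, not the ones from my setup):

# Background subtraction with a sliding average (OpenCV/NumPy assumed;
# parameter values are illustrative placeholders).
import cv2
import numpy as np

ALPHA = 0.05        # sliding-average update rate (assumed value)
DIFF_THRESH = 30    # intensity difference threshold (assumed value)

cap = cv2.VideoCapture("traffic.avi")   # placeholder file name
background = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)

    if background is None:
        background = gray.copy()
    else:
        # Sliding average: the background drifts slowly toward the current frame.
        background = (1 - ALPHA) * background + ALPHA * gray

    # Subtract the background and threshold to get a binary mask of moving pixels.
    diff = cv2.absdiff(gray, background)
    _, moving = cv2.threshold(diff.astype(np.uint8), DIFF_THRESH, 255, cv2.THRESH_BINARY)

cap.release()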

My algorithm for tracking is simple:

For each frame in the video:
    Compute the background with a sliding average.
    Subtract the background image from the current frame.
    Convert the difference image to a binary image and run connected component labeling.
    Threshold on blob area to reduce noise.

    For each blob in the current frame:
        Compute its position (centroid), area, and bounding box.
        Compare the blob's position and area against the global blob set.
        If the nearest global blob is within a distance threshold, append the position to that blob's track.
        Otherwise, add the blob to the global blob set as a new track.

    If a blob's trajectory shows the intended direction of travel within the specified area, turn on the boolean flag for triggering.
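A minimal Python sketch of the per-frame tracking step is below (assuming OpenCV's connectedComponentsWithStats for the labeling; the area and distance thresholds, the trigger region, and the downward-motion direction test are assumptions for illustration, not the actual project settings):

# Per-frame blob tracking against a global blob set (OpenCV/NumPy assumed;
# thresholds, region, and direction test are illustrative placeholders).
import cv2
import numpy as np

MIN_AREA = 150        # area threshold to reduce noise (assumed value)
MAX_MATCH_DIST = 40   # max centroid distance to match an existing track (assumed value)

tracks = []           # global blob set: each entry keeps a centroid history and bounding box

def update_tracks(binary_mask):
    # Label blobs in the binary mask and match each one to the nearest existing track.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary_mask)
    for i in range(1, n):                      # label 0 is the background
        area = stats[i, cv2.CC_STAT_AREA]
        if area < MIN_AREA:                    # threshold on area to reduce noise
            continue
        cx, cy = centroids[i]
        bbox = (stats[i, cv2.CC_STAT_LEFT], stats[i, cv2.CC_STAT_TOP],
                stats[i, cv2.CC_STAT_WIDTH], stats[i, cv2.CC_STAT_HEIGHT])
        best, best_dist = None, float("inf")
        for track in tracks:
            px, py = track["positions"][-1]
            d = np.hypot(cx - px, cy - py)
            if d < best_dist:
                best, best_dist = track, d
        if best is not None and best_dist < MAX_MATCH_DIST:
            best["positions"].append((cx, cy))   # extend the matching global blob
            best["bbox"] = bbox
        else:
            tracks.append({"positions": [(cx, cy)], "bbox": bbox})  # new global blob

def should_trigger(track, region, min_dy=20):
    # Assumed direction test: the blob has moved toward the stop (downward in the
    # image, by at least min_dy pixels) and its latest centroid is inside the region.
    x0, y0, x1, y1 = region
    xs, ys = zip(*track["positions"])
    inside = x0 <= xs[-1] <= x1 and y0 <= ys[-1] <= y1
    return inside and (ys[-1] - ys[0]) > min_dy

Running update_tracks on each frame's binary mask and then checking should_trigger for every track would set the triggering flag once a blob has clearly headed toward the light.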

Results:
Night-time bike:

Car at 5 PM:

Cross traffic does not track as well because it moves faster, whereas the incoming traffic slows toward the stop sign and can be tracked more reliably.

TODO:
- Experiment and collect more training data in the daytime.
--> Expect to run into issues during the day with more cross traffic, pedestrians, and changing lighting conditions.
