Monday, February 26, 2007

Opponent Color Subtraction Result

I've implemented image subtraction using the blue/yellow opponent channel as opposed to plain RGB.
I also had to dilate the image before the connected component labeling so that the daytime bike would show up more clearly.
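A rough Matlab sketch of this step, just to pin down what I mean (the file names, threshold, and structuring element size are placeholders, and this uses one common definition of the blue/yellow channel):

frame = im2double(imread('frame_current.png'));     % placeholder frames
bg    = im2double(imread('frame_background.png'));

% blue/yellow opponent channel (one common definition)
by_frame = (frame(:,:,1) + frame(:,:,2))/2 - frame(:,:,3);
by_bg    = (bg(:,:,1)    + bg(:,:,2))/2    - bg(:,:,3);

d    = abs(by_frame - by_bg);              % subtraction on the B/Y channel only
mask = d > 0.1;                            % hand-tuned threshold
mask = imdilate(mask, strel('disk', 3));   % dilate before labeling
[labels, n] = bwlabel(mask);               % connected component labeling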

This is a video clip of the RGB subtraction; notice that no bike is spotted in the image subtraction.


This is a video clip of the blue/yellow channel subtraction with dilation.


This is the same capture as above, with the original footage shown.


Using the blue/yellow channel image subtraction seems to hurt nighttime bike tracking. The bike is first labeled as 2, and then when the car drives by, it becomes a new blob.


Maybe use RGB at night and blue/yellow during the day?

--
For the classification rules, I'm thinking of something that trains on a set of trajectories known to be bike traffic. Each labeled blob is then compared against this set to check whether the distance between the two trajectories is within a certain bound.
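One way this could look in Matlab; purely a sketch of the idea, where the resampling length, the distance measure, and the threshold are all assumptions at this point:

% traj:     candidate blob trajectory, one [x y] row per frame (>= 2 rows)
% trainSet: cell array of known bike trajectories in the same format
function isBike = classifyTrajectory(traj, trainSet, maxDist)
    K = 20;                                  % resample every trajectory to K points
    c = resampleTraj(traj, K);
    best = inf;
    for i = 1:numel(trainSet)
        t = resampleTraj(trainSet{i}, K);
        d = mean(sqrt(sum((c - t).^2, 2)));  % mean point-to-point distance
        best = min(best, d);
    end
    isBike = best < maxDist;                 % within bounds -> treat as bike traffic

function r = resampleTraj(traj, K)
    s = linspace(1, size(traj,1), K);
    r = [interp1(1:size(traj,1), traj(:,1), s)', ...
         interp1(1:size(traj,1), traj(:,2), s)'];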

Wednesday, February 21, 2007

Opponent Colors Image Subtraction

I did some more experimenting with image subtraction in different color spaces, using two frames from the daytime footage. Serge suggested using opponent colors rather than LAB for better debugging; the opponent color channels have a simpler model.
Below is plain RGB subtraction.


These two show the subtraction using opponent color channels; I ignore the luminance channel. The first one is the green/red channel and the second one is blue/yellow. The blue/yellow seems to bring out the difference more, but it is noisier than RGB.
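For reference, the channels come straight from RGB; the exact scaling is a matter of convention, so this is just how I would write it down in Matlab:

rgb = im2double(imread('day_frame.png'));   % placeholder frame
R = rgb(:,:,1); G = rgb(:,:,2); B = rgb(:,:,3);

rg  = R - G;               % green/red opponent channel
by  = (R + G)/2 - B;       % blue/yellow opponent channel
lum = (R + G + B)/3;       % luminance channel (not used in the subtraction)

% the subtraction images above are then abs(rg1 - rg2) and abs(by1 - by2)
% for the two frames of daytime footage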


Wednesday, February 14, 2007

Daytime Clip and Blob Position Plots

I recorded a clip with a bike at 1 pm. With the same settings used for the Gilman car at 5 pm and the bike at night, the tracker was not able to recognize the bike correctly. Perhaps the image is too saturated with light.

I found an L*A*B* colorspace function for Matlab. Here are the different components for a snapshot of the above video.
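The conversion itself can be done with the Image Processing Toolbox; this may not be the exact function I found, but it is along these lines:

rgb   = im2double(imread('daytime_snapshot.png'));  % placeholder snapshot
cform = makecform('srgb2lab');
lab   = applycform(rgb, cform);

L = lab(:,:,1);   % lightness
a = lab(:,:,2);   % green-red
b = lab(:,:,3);   % blue-yellow

figure; imagesc(L); axis image; colormap gray; title('L*');
figure; imagesc(a); axis image; title('a*');
figure; imagesc(b); axis image; title('b*');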


----
I plotted the positions of different objects in a 3d plot with x,y and frame number as the axes.

This one is of the car at 5 pm on Gilman. Only blobs with more than 10 positions are plotted.

This one is of the motorbike at night on Gilman. Only blobs with more than 5 positions are plotted.
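The plots are generated with plot3, roughly like the sketch below; the fields on the blob struct are placeholders for whatever the tracker actually stores:

% blobs: struct array from the tracker, with one [x y] row per frame in
%        blobs(i).positions and the matching frame numbers in blobs(i).frames
minPositions = 10;     % 10 for the 5 pm clip, 5 for the night clip
figure; hold on;
for i = 1:numel(blobs)
    if size(blobs(i).positions, 1) > minPositions
        plot3(blobs(i).positions(:,1), ...
              blobs(i).positions(:,2), ...
              blobs(i).frames, '.-');
    end
end
xlabel('x'); ylabel('y'); zlabel('frame number');
grid on; view(3);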



TODO Next:
-Fix daytime issue.
-Come up with classification rules.

Wednesday, February 7, 2007

Tracking Blobs

Since last week's discussion, I have decided to change my detection algorithm by adding a tracking component. After image subtraction to get the moving blobs and thresholding, I track the blobs to determine which direction each one is moving in. Since the camera traffic light triggering is intended only for the traffic stop, we can then determine whether a blob is moving in the direction of the targeted traffic light.

My algorithm for tracking is simple (a rough Matlab sketch follows the outline below):

For all the frames in the video:
    Compute the background by a sliding average.
    Subtract the background image from the current image.
    Convert the result to a binary image and run connected component labeling.
    Threshold on area to reduce noise.

    For all the blobs in the current frame:
        Compute position (centroid), area, and bounding box.
        Compare the current blob's position and area against the global blob set:
            if a position is the nearest and within a certain threshold, add the position to the matching global blob;
            if no position is near, add the blob to the global blob set.

    If a blob's trajectory has followed the intended direction through the specified area, turn on the boolean flag for triggering.
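Here is the sketch of the main loop. Frame loading (loadFrame), numFrames, and the threshold values are placeholders, and the trigger check is only marked where it would go:

alpha   = 0.05;    % sliding-average update rate for the background
areaMin = 50;      % area threshold to reduce noise
distMax = 30;      % max centroid distance to match an existing blob
tracks  = struct('pos', {}, 'lastFrame', {});   % global blob set

bg = im2double(loadFrame(1));          % loadFrame stands in for the video reader
for f = 2:numFrames
    frame = im2double(loadFrame(f));
    bg = (1 - alpha)*bg + alpha*frame;            % background by sliding average

    diff = abs(rgb2gray(frame) - rgb2gray(bg));   % image subtraction
    mask = diff > 0.1;                            % binary image
    [labels, n] = bwlabel(mask);                  % connected component labeling
    props = regionprops(labels, 'Centroid', 'Area', 'BoundingBox');

    for k = 1:n
        if props(k).Area < areaMin, continue; end  % threshold on area
        c = props(k).Centroid;

        % compare against the global blob set by nearest position
        best = 0; bestD = inf;
        for t = 1:numel(tracks)
            d = norm(c - tracks(t).pos(end,:));
            if d < bestD, bestD = d; best = t; end
        end
        if best > 0 && bestD < distMax
            tracks(best).pos(end+1,:) = c;        % add to the matching blob
            tracks(best).lastFrame = f;
        else
            tracks(end+1) = struct('pos', c, 'lastFrame', f);   % new blob
        end
    end
    % (the direction/trigger check on each blob's trajectory would go here)
end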

Results:
night time bike:

car at 5pm:


The cross traffic does not track as well because it is moving faster, whereas the incoming traffic is moving slowly towards a stop sign and can be tracked.

TODO:
-Experiment and grab more training data in the daytime.
-->Expect to run into issues during the day with more cross-traffic, people, and lighting conditions.