# MATLAB code (calculating displacement)

Hey guys,
I need help. Say I take a picture of a dot in a frame, followed by another picture in which the dot has moved to another spot in the frame. How do I calculate the distance between these 2 dots? At the end, I would have 2 jpeg images of the dot at 2 different locations. What I'm trying to do is calculate how far the dot has moved. Any help with the code would be very much appreciated. Thanks!!

Jeremy

Mark44
Mentor
You have two things going on: picture coordinates and world coordinates. It's fairly straightforward to get the coordinates of the two positions of the dot in the two images, and from those coordinates you can calculate the distance between the dots, relative to the image.
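For the image-coordinate part, here is a minimal MATLAB sketch of the distance calculation, assuming you have already extracted the two pixel positions (the coordinate values below are made up for illustration):

```
% Hypothetical pixel coordinates of the dot in frame 1 and frame 2
x1 = 120; y1 = 340;
x2 = 410; y2 = 95;

% Euclidean distance in pixels (Pythagoras)
d_pixels = sqrt((x2 - x1)^2 + (y2 - y1)^2);
```

Note this gives a distance in pixels, relative to the image; converting to a real-world distance is the separate step discussed below.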

Then you need to convert the distance in the image to the actual distance between what the dots represent. Are the things that show up as dots the same distance from the observer? The picture is a two-dimensional image, but if the dots are at different depths from the viewer, that complicates things. Is the plane of the image the same as the plane of what's being viewed? If not, you have to take that into consideration also.

Hi Mark,
Sorry for the late reply. Yeah, those dots are at the same distance from the observer. What I'm trying to do is this: I have a model put under a load, and at the side of this model I have dot markers sticking out of it. The displacement of these dots will be captured by a video camera in frames and then processed via MATLAB to calculate the displacement and so on. Also, I was thinking of capturing an image of graph paper so the grids can help with the measurement of displacement. The problem is, I've never done MATLAB before, so any assistance with the code and how to go about this whole thing would be very much appreciated. Thanks!!

The answer to your question depends on how you plan on isolating the dot positions from all the other image data... which to me sounds like a much harder task than finding the distance it has travelled (you will use Pythagoras in 2D, assuming your camera can take a reasonably flat picture without distortion).

From this I'd agree with your calibration method: taking pictures of grids.
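As a sketch of that calibration idea: if a photographed grid shows that some known real length spans a measured number of pixels, the ratio gives a scale factor for converting pixel displacements to millimetres. All the numbers below are made-up assumptions:

```
% Calibration from a grid image: suppose 10 mm of graph paper
% spans 57 pixels in the photo (a made-up measurement)
mm_per_pixel = 10 / 57;

% Convert a pixel displacement to a physical displacement
d_pixels = 123.4;                  % from the dot-tracking step
d_mm     = d_pixels * mm_per_pixel;
```

This assumes the camera looks straight at the plane of the dots, so one scale factor applies across the whole image.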

Hi Mikey,
Assuming the camera takes a perfectly flat picture, how do we go about this? What I'm trying to do is to filter the image after capture so that only the black colour of the dots is read, which means everything else in the picture will be non-existent. From there you will have, say, a dot in frame 1 at the lower left end of the frame. In frame 2, after displacement, say the dot has moved to the bottom right-hand side of the frame. How do I get MATLAB to automatically calculate the displacement? Also, how do I use the grid image to calibrate my data? Thanks in advance!

Jeremy,

I did a program somewhat like this a while ago. I will give you some basics...

The command to read an image in MATLAB is imread(). Use the documentation to figure out the syntax (type help imread).

When you use imread on a jpeg it will return an array that has size x-pixels × y-pixels × 3. I may be wrong on the order of the x and y, but that's not the point. The key point is the ×3: these are the color maps.

The first map is red, second is green, third is blue. Each pixel has a color value that is stored within these maps. The values range from 0-255. A pixel that is pure white has a value of 255 for all three color maps (Black on the other hand is 0 for all).

You can access these color maps using this method:

Code:
arrayname(:,:,1)   % returns the red map
arrayname(:,:,2)   % returns the green map
arrayname(:,:,3)   % returns the blue map

I have written a sample code to demonstrate this, called Sample1.m. It uses the picture found here:
http://images2.layoutsparks.com/1/56685/black-white-squares-layout.jpg [Broken]

You then have to define what you consider 'black'. For example, you may consider black to be wherever the value of all three color maps is less than 30. It's akin to a tolerance.
See my other sample code, Tolerance.m
This code uses the picture found here:
http://www.settasshields.com/black_white_squares_op_600x600.jpg [Broken]
Again, save this under its default name.
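A minimal sketch of the tolerance idea described above (this is my own illustration, not the attached Tolerance.m; the file name 'frame.jpg' is a placeholder):

```
% A pixel counts as 'black' if all three color maps are below the tolerance
rgb  = imread('frame.jpg');        % 'frame.jpg' is a placeholder name
tol  = 30;
mask = rgb(:,:,1) < tol & rgb(:,:,2) < tol & rgb(:,:,3) < tol;

% mask is a logical image: true where the pixel is 'black'
[rows, cols] = find(mask);         % pixel coordinates of all black pixels
```

Averaging `rows` and `cols` would then give a rough centre for a single dark dot.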

Hope this helps.

P.S. More to come later

#### Attachments

• Sample1.m (139 bytes)
• Tolerance.m (1 KB)
Hi guys,
Still can't figure it out. My supervisor has been telling me something about image analysis and image thresholding. What does that mean? All I know is that my MATLAB program has to be robust so there won't be any flaws. As the camera records and takes pictures in grayscale, everything in the picture will just be in shades of grey. Since the dots aren't exactly the only black thing around, I would think I need to write a program to recognise the shape and size of the dots. How do I do that? Really urgent! Need help quick!! Thanks!!!

AIR&SPACE's post looks very good... try to implement it, look up "imread" on MATLAB. My advice if you are stuck is to experiment with the image processing toolbox.

One important point that you just mentioned is that the image is grayscale. This means when you use the I = imread('input.jpg') command, I will be a matrix of your pixel values on a grayscale from 0 to 255, I think.

If you know you are looking for a single black dot, you can simply find the pixel with the minimum value, since the image is now a matrix. I've never quite managed to find a simple way to locate where the minimum value is in MATLAB, so I always end up writing strung-out code to find it, e.g.

Code:
>> I = ones(50);
>> I(32,41) = 0;
>> imwrite(I,'i.jpg');
>> O = imread('i.jpg');    % read the image back in as a matrix
>> size(O)
ans =
    50    50

>> m = min(min(O))         % darkest pixel value ('m' rather than 'min',
m =                        % to avoid shadowing the min() function)
    13

>> for i = 1:50
       for j = 1:50
           if O(i,j) == m
               i
               j
           end
       end
   end
i =
    32
j =
    41
So basically I wrote an image file i.jpg from a matrix which has a single "black dot", i.e. a zero-valued pixel. Then I loaded it into a new matrix O and searched for the minimum value of O, i.e. the darkest pixel. Then I did that loop to find where this darkest pixel was: it is at (32,41). (I guess the darkest pixel reads 13 rather than 0 due to the .jpg compression; it's even worse when you have a white dot on black.) This is a VERY basic point-finding algorithm, and if you have noise (especially salt & pepper) it is likely to find the wrong dot. For example, if you have a 25-pixel dark spot representing your dot but a single pixel of absolute black due to a faulty sensor, that pixel will be chosen instead. You can use filtering to remove some of this noise, though.
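For what it's worth, the nested loop can be replaced with linear indexing: taking the min over the flattened matrix and converting the index back with ind2sub gives the location directly (same idea, just more compact; this assumes O is the grayscale image matrix from the snippet above):

```
% Locate the darkest pixel without an explicit loop
[val, idx] = min(O(:));              % darkest value and its linear index
[row, col] = ind2sub(size(O), idx);  % convert to (row, column) coordinates
```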

Given a series of .jpg images it should be a simple extension to this to record the darkest pixel location values into a matrix as you analyse one image after another. After that you just need to use Pythagoras on the pixel locations to find the distance.
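A rough sketch of that extension (the file names and the use of the darkest-pixel method are assumptions carried over from the example above):

```
files = {'frame1.jpg', 'frame2.jpg', 'frame3.jpg'};   % hypothetical names
pos   = zeros(numel(files), 2);                       % [row, col] per frame

for k = 1:numel(files)
    O = imread(files{k});            % grayscale image matrix
    [~, idx] = min(O(:));            % linear index of the darkest pixel
    [pos(k,1), pos(k,2)] = ind2sub(size(O), idx);
end

% Displacement between consecutive frames, in pixels (Pythagoras)
d = sqrt(sum(diff(pos).^2, 2));
```

Multiplying `d` by a calibration factor from the grid images would then give physical displacements.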

Does this help?