I’ve been meaning to build something cool with the NXTCamV3 for a while now but never got around to it. I tried making the Omniwheel suitable for the NXTCam, but unfortunately the wheels cause way too much vibration when the robot moves. Oh well, in the not-so-distant future I hope to pick up a few of these babies: [LINK].
On Friday night I quickly put together the pan and tilt rig you see in the rather crappy picture to your left. That’s me holding the can, coaxing the camera to track it. I took this picture by accident; I actually meant to hit the record-video button. Still, it turned out OK-ish.
The rig’s motors are controlled by two simple PID controllers that look at the X and Y values of the first (and largest) object returned by the camera. Each controller works independently to keep the camera centered on the can. There is no rotation limit imposed, which can be a real pain in the butt when the camera loses track of the can and goes nuts looking for it.
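The two controllers can be sketched like this. This is a minimal illustration of the idea, not the actual test-rig program; the names (`Pid`, `pidStep`) and the gains are hypothetical:

```c
#include <stdio.h>

/* One PID axis; the rig uses two independent instances, one for pan (X)
   and one for tilt (Y). */
typedef struct {
    float kp, ki, kd;   /* gains */
    float aerr;         /* accumulated error (integral term) */
    float perr;         /* previous error (derivative term) */
} Pid;

/* One control step: 'error' is the target position minus the measured
   blob centre. Returns the motor power to apply. */
float pidStep(Pid *p, float error) {
    p->aerr += error;                  /* integrate */
    float deriv = error - p->perr;     /* differentiate */
    p->perr = error;
    return p->kp * error + p->ki * p->aerr + p->kd * deriv;
}
```

Feeding each axis the offset of the largest blob from the image centre, every camera refresh, is all the “tracking” there is.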
I made another video related to the NXTCamV3 today. This one is a performance test of my firmware (BlobMerge V0.2) versus the standard NXTCam firmware.
For these tests I used the NXTCamV3 with the BlobMerge firmware and an NXTCamV2 with the standard firmware. The data is read via USB by the NXTCamView program and captured by CamStudio. In good indirect natural light both firmwares perform similarly. It’s when you start using incandescent light that you can really tell the difference. The standard firmware returns lots of small objects due to the shadows and bad lighting conditions. The BlobMerge firmware remedies this situation by:
- Merging all objects that are touching or separated by no more than 30 pixels;
- Tracking 12 objects (merging reduces the total number of objects) internally instead of 8 and returning the 8 largest ones;
- Removing all objects smaller than 250 square pixels, as these are most likely artefacts;
- Sorting all the objects according to size, largest one first.
This allows you to track just a single object in your program: the first one, as it is most likely the object you are after.
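The steps above can be sketched as follows. This is an illustration of the post-processing, not the firmware itself (which runs on the camera’s AVR and uses different internal names); `condense`, `gap`, and `minArea` are hypothetical names for the 30-pixel merge distance and the 250-pixel minimum area:

```c
#include <stdlib.h>

typedef struct { int x1, y1, x2, y2; } Blob;   /* bounding box */

static int area(const Blob *b) {
    return (b->x2 - b->x1) * (b->y2 - b->y1);
}

/* Two blobs "collide" if their boxes touch or are within 'gap' pixels. */
static int collide(const Blob *a, const Blob *b, int gap) {
    return a->x1 <= b->x2 + gap && b->x1 <= a->x2 + gap &&
           a->y1 <= b->y2 + gap && b->y1 <= a->y2 + gap;
}

static void mergeInto(Blob *a, const Blob *b) {  /* bounding-box union */
    if (b->x1 < a->x1) a->x1 = b->x1;
    if (b->y1 < a->y1) a->y1 = b->y1;
    if (b->x2 > a->x2) a->x2 = b->x2;
    if (b->y2 > a->y2) a->y2 = b->y2;
}

static int bySizeDesc(const void *pa, const void *pb) {
    return area((const Blob *)pb) - area((const Blob *)pa);
}

/* Merge colliding blobs, drop the tiny ones, sort largest first.
   Returns the new blob count. */
int condense(Blob *blobs, int n, int gap, int minArea) {
    for (int i = 0; i < n; i++) {
        for (int j = i + 1; j < n; ) {
            if (collide(&blobs[i], &blobs[j], gap)) {
                mergeInto(&blobs[i], &blobs[j]);
                blobs[j] = blobs[--n];  /* remove j */
                j = i + 1;              /* re-scan: the box grew */
            } else {
                j++;
            }
        }
    }
    int kept = 0;
    for (int i = 0; i < n; i++)
        if (area(&blobs[i]) >= minArea)
            blobs[kept++] = blobs[i];
    qsort(blobs, kept, sizeof(Blob), bySizeDesc);
    return kept;
}
```

After this, element 0 is the largest merged blob, which is why the NXT-side program only needs to look at the first entry.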
So there you have it, a weekend well spent, I’d say. You can download the source for the test rig program here: [LINK]. You can download and compile the firmware by checking it out via SVN from this location: [LINK].
[…] towards the left is the NXTCamV3 in its pan and tilt rig. It still needs quite a bit of work to prevent it from going nuts when it loses the tracked […]
wow that is the smoothest best tracking of an object I have ever seen…brilliant
Can the firmware be changed on the NXTCamV2? I bought the camera just before the V3 was released and was wondering whether your custom firmware would work. Currently, my object merging is all done on the NXT brick, but I thought it might be better to make it happen in the camera itself.
Eric,
The firmware on the V2 cannot be changed, at least not that I am aware of.
– Xander
Can I get the code for the pan and tilt?
If I can find it, sure 🙂
The link is in the article, actually. http://www.breigh.com/xander/NXTCAM-test3.c
hello
I recently bought an NXTCamV3 and programmed the robot to follow a red ball using the NXTCam, but I don’t know how to calibrate it. I’m using NXTCamView to capture an image, then selecting the colours and uploading them. Is that how you calibrate it? Please assist me here.
That is exactly how you calibrate it, you were on the right track 🙂
Hey. Can you send me a compiled BlobMerge? I’m having trouble compiling it, and with installing the firmware through the Mindsensors fwupgrader. After installing, the camera can’t connect to the PC through USB.
The fwupgrader does not upgrade the firmware on the AVR, only on the PIC bridge chip. You need to use AVR Studio to upgrade the actual camera firmware. The hex file you’re looking for can be downloaded from here: http://nxtcam.svn.sourceforge.net/viewvc/nxtcam/nxtcam_avr_source-mergeobjects/
You should reflash your PIC with the proper PIC firmware. You need to mail the guys at Mindsensors (info_at_mindsensors_dot_com) and they’ll be happy to send you the normal firmware.
Regards,
Xander
I have reflashed from here http://www.rjmcnamara.com/2011/05/restore-mindsensor-nxtcam-firmware/
Can you tell how to do that via AVR Studio?
You can find the guides to do this in the advanced documentation section: http://goo.gl/qBOq
Please keep in mind that this is not a trivial exercise.
Thank you!
Hello, again..
I have a problem connecting the NXTCam to AVR Studio 4. I followed the instructions from Mindsensors. I changed the USB COM port via Control Panel to the range 1-9 (I think I changed it to 2), and I entered boot mode (via the NXT). But when I try to connect, AVR Studio shows the same connection window again.
Thank you for your help
I have made a robot using NXTCam v4 with BlobMerge v0.2. http://www.youtube.com/watch?v=AQQtjT9asA8
Great job, thanks for sharing 🙂
Hello,
I work with the NXTCam v3 and I have installed BlobMerge v0.2.
I want to detect a specific object with four areas of the same colour.
When I use NXTCamView, the NXTCam detects four blobs. But when I use my program (in NXC with BricxCC), the command NXTCam_GetBlobs() returns the number of blobs and their coordinates. The number of blobs detected is 4, but sometimes I get the same coordinates for different blobs. I think the problem is in the I2C transmission.
Without the actual program you created, it is very difficult for me to see what the issue could be. Do you have a program that can reproduce this problem? Make sure it contains only the code necessary to trigger the bug, nothing extra.
It is possible that you are reading the data but not doing it quickly enough, so you are getting data that straddles the previous and next refresh of the camera.
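One common guard against a read that straddles a camera refresh is to fetch the blob list twice and only accept it when both reads agree. A sketch of the idea follows; it is not the actual driver code, and `readBlobsStable`, `ReadFn`, and the `unstableOnce` stub are hypothetical names (the real transaction would be the driver’s I2C read):

```c
#include <string.h>

typedef struct { int x1, y1, x2, y2; } Blob;

/* Stand-in for the real I2C read (e.g. the driver's blob fetch),
   injected so the retry logic can be shown on its own. */
typedef int (*ReadFn)(Blob *out, void *ctx);

/* Read twice per attempt; keep the result only when both reads match,
   discarding any transfer that spans a camera refresh. Returns the blob
   count, or -1 if no two consecutive reads ever agreed. */
int readBlobsStable(ReadFn readFn, void *ctx, Blob *out, int maxTries) {
    Blob a[8], b[8];
    for (int t = 0; t < maxTries; t++) {
        int na = readFn(a, ctx);
        int nb = readFn(b, ctx);
        if (na == nb && memcmp(a, b, (size_t)na * sizeof(Blob)) == 0) {
            memcpy(out, a, (size_t)na * sizeof(Blob));
            return na;
        }
    }
    return -1;
}

/* Example stub: returns different data on the first call, stable after,
   mimicking a read caught mid-refresh. */
static int unstableOnce(Blob *out, void *ctx) {
    int *calls = (int *)ctx;
    Blob b = { (*calls)++ == 0 ? 1 : 7, 0, 10, 10 };
    out[0] = b;
    return 1;
}
```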
– Xander
Please, I need your help. I am a French secondary-school student. I want to track a colour with my NXTCam, but I don’t know how to handle the communication between the camera and the robot; in other words, how does the robot get the information from the camera?
For example, with the sonar sensor we know that SensorValue(sonarSensor) gives only one piece of information: the distance to the obstacle. But the NXTCam gives many pieces of information, like X, Y, length, width, and area.
Please explain how the robot gets all this information.
You should take a look at the mindsensors-nxtcam*.c programs to get an idea how the camera works with the NXT. They should help you get started.
– Xander
I’m ready to start. I made my own program with the NXTCam but it’s not working well. I want to try yours, and I wanted to know where I can find the two libraries common.h and NXTCam-driver.h. At this address, http://nxtcam.svn.sourceforge.net/viewvc/nxtcam/nxtcam_avr_source-mergeobjects/, there are too many files and I don’t know which is the right one.
Please excuse me for my English.
The .hex file ( MergeObjects.hex) is the one you have to upload to your NXTCam and the actual drivers are part of my Driver Suite, which you can find here: http://botbench.com/blog/robotc-driver-suite/. The driver has been renamed and is now called mindsensors-nxtcam.h
Thanks for your help. Do you know where I can find MergeObjects.hex?
Thanks so much. I think I am using version 3.0; I will upgrade it at school tomorrow, it is too late in my country. See you…
I have another problem: when I compile mindsensors-nxtcam-test1 with ROBOTC, I get 2 errors on these lines:
sendI2CMsg(S1, i2cmsg[0], 0);
sendI2CMsg(S1, i2cmsg[0], 0);
I think perhaps the parameter “i2cmsg[0]” has to be passed as “&i2cmsg[0]”. It’s the same problem I have with my own program.
thank for your help
Are you using ROBOTC 3.60? If not, upgrade first and see if you are still having issues.
The hex program can be found in the same directory you pointed me to, the one on sourceforge.
I downloaded 3.60 on my home PC and I have the same problem (2 errors). You can see the two functions that cause the sendI2CMsg errors:
void reset_cam ()
{
ubyte i2cmsg[] = {3, 0x02, 0x41, 'R'};
sendI2CMsg(S1, i2cmsg[0], 0);
PlaySound(soundDownwardTones);
wait1Msec(100);
}
void boot_mode()
{
ubyte i2cmsg[] = {3, 0x02, 0x41, 'b'};
sendI2CMsg(S1, i2cmsg[0], 0);
PlaySound(soundUpwardTones);
wait1Msec(100);
}
What’s the name of this program you’re using to reset the camera? Is it something of mine? I think it will work fine once you change the lines
sendI2CMsg(S1, i2cmsg[0], 0);
to
sendI2CMsg(S1, &i2cmsg[0], 0);
But you still should update to 3.60. 3.0 is very old and doesn’t have a lot of the bug fixes and features that my drivers require.
– Xander
You are right; when I changed it, the problems were solved.
I am glad to hear it!
Hello Mr Xander. Thank you for helping me solve my problem, but now I have another big, big problem. I want to be one of the exhibitors at my school’s free open house. That day I want to present a robot that follows a red ball on the floor. My program worked once after I tried yours, but I tried everything today and I cannot see the problem. I wanted to make my own API based on what Mr Gordon Wyeth did; since my program is not working, my teacher said I made the biggest mistake in the world. He thinks I should build on someone else’s work, but I wanted to present my own work, because I did not want to copy and paste someone else’s work and say it is mine. I still have two weeks. Can you help me understand your program? It is not very far from mine, and I could then adapt it; or if you have already worked on a robot that tracks a ball on the ground, please help me.
The teacher who taught me to program in C is sick; you are my only hope.
Thank you in advance.
Which program are you referring to? If I am your only hope, you’d be better off asking your question on the ROBOTC forums, where others can also help you out. I am currently a little under the weather (sick), so I may not be of much use to you. So post your issue here: [LINK] and describe your problem in as much detail as possible.
– Xander
Thanks for your advice.
Morning Mr Xander, I hope you are fine. I am now using your work to finish my project. I want my robot to follow only one colour, so I use the blob merging.
I modified your mindsensors-nxtcam-test1.c program to display the difference between the centre of the blob and the centre of the camera, but the program does not work. You can see the code below these lines. Note that the program compiles fine, with no errors, but it does not run.
I need your help so I can continue my project on Tuesday at school.
Thanks for your help, and excuse me for my poor English.
#pragma config(Sensor, S1, cam, sensorI2CCustomFastSkipStates)
#include "mindsensors-nxtcam.h"
task main () {
blob_array _blobs;
memset(_blobs, 0, sizeof(_blobs)); // initialise the blobs
// combine all colliding blobs into one
bool _condensed = true;
//blob_array _blobs;
int _l, _t, _r, _b;
int _nblobs=0;
float x_center=0; // to get the center
float error=0; // to get the error
_l=0; _t=0; _r=0; _b=0;
eraseDisplay();
// Initialise the camera
NXTCAMinit(cam);
while(true) {
eraseDisplay();
// Fetch all the blobs, have the driver combine all
// the colliding blobs.
_nblobs = NXTCAMgetBlobs(cam, _blobs, _condensed);
for (int i = 0; i < _nblobs; i++) {
// Keep the coordinates of the last blob
_l = (_blobs[i].x1);
_t = (_blobs[i].y1);
_r = (_blobs[i].x2);
_b = (_blobs[i].y2);
}
x_center = SIDE_CENTER (_l,_r); //get the center of blobs
error = x_center - 88; // get difference between center of blobs and camera
nxtDisplayTextLine(3, "the error is %f", error );
}
}
/*
* $Id: mindsensors-nxtcam-test1.c 133 2013-03-10 15:15:38Z xander $
*/
This will continue via email, this is not a discussion board. I’ve sent you a mail with a program.
Thanks very much.
I come back here because my question is about this project. Thanks to the conversation on the ROBOTC forum (link for anyone else who might be interested: http://www.robotc.net/forums/viewtopic.php?f=1&t=5656), I am beginning to understand how this code runs:
- pwr_x: motor power along x
- err_x: current error along x
- perr_x: previous error along x, for the derivative term
- aerr_x: accumulated error along x (the sum of the errors, for the integral term)
- sp_x: the setpoint along x
- p_x, i_x, d_x: the PID gains (proportional, integral, derivative)
If that is right, I can assure you that I now understand very well what you made; before, I was like a dog watching TV.
So now, please, could you remember what kind of tests you did to get p_x=0.5, i_x=0.05, d_x=1.8?
If I know what kind of tests you did to get those gains, I can adapt them to run my own tests and get my own values. I have 13 hours left at school to finish my project.
I just played with the values until I was happy. If it reacts too slowly, increase P; if it reacts too strongly, reduce it 🙂 It is usually safe to leave the I and D factors out.