My next project with the omniwheel is to have it react to bumping into things it encounters on its journey. I want to use the HiTechnic Accelerometer for this; the Mindsensors one is currently being used in another project.

I’ve been racking my brain for about 2 days now, analysing the data from the runs, trying to come up with a generic way to tell the difference between the collisions and the robot’s wheels hitting the grout. Ideally, I would like to be able to calculate the angle from which the impact came and the amount of force, so that the robot can move away from it at the appropriate speed.

I’ve made a video of two of the test runs I made. One is on the bare tiles, where you can hear the “click” sound when the wheels hit the grout. The other run is on smooth boards on the floor; the only impact there is the baby gate at the end of the hall.

If anyone knows how to definitively tell the difference between the grout and a similar impact from the side, I am all ears. I’ve put the two Excel sheets I’ve been working with here: [LINK]. The filter I’ve used is a high-pass filter, controlled by the “alpha” variable. The offset for each axis is based on the average of the first 10 readings.
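For the curious, the per-axis filter boils down to something like this. This is a rough C sketch of a standard first-order high-pass with a calibration offset; the names and the exact form are my guess at what the sheets do, not code lifted from them:

```c
#define CALIBRATION_SAMPLES 10

/* First-order high-pass filter state for one axis. */
typedef struct {
    double alpha;     /* 0..1; closer to 1 means a lower cutoff */
    double prev_raw;  /* previous (offset-corrected) input */
    double prev_out;  /* previous filter output */
    double offset;    /* per-axis bias, from the first 10 readings */
} HighPass;

/* Estimate the axis offset as the average of the first n readings. */
double calibrate_offset(const double *samples, int n) {
    double sum = 0.0;
    for (int i = 0; i < n; i++) sum += samples[i];
    return sum / n;
}

/* y[k] = alpha * (y[k-1] + x[k] - x[k-1]): steady input decays to zero,
 * sudden changes (bumps) pass through. */
double highpass_step(HighPass *f, double raw) {
    double x = raw - f->offset;
    double y = f->alpha * (f->prev_out + x - f->prev_raw);
    f->prev_raw = x;
    f->prev_out = y;
    return y;
}
```

A constant reading (gravity, sensor bias) washes out of the output, while a sharp spike comes through nearly at full size.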

You can post your suggestions in the comments!

For starters, you should centre the sensors (gyro and accelerometer, and ideally by the position of the actual hardware, not just the case or chip) at the point around which the robot would rotate if it were turning in place. This should minimize acceleration due to changes in direction and make it easier to tell what is happening.

With the present sensor orientation, the Z axis reads both gravity and bumps. Offsetting it by 200 units (the sensor reads roughly 200 counts per g) should make it easier to sense vertical bumps.

If you want to find the angle where X+ is 0°, Y+ is 90°, X- is 180°, and Y- is 270°, you can take the inverse tangent of (Y divided by X). Using a method more like a clock, you can take the inverse tangent of (X divided by Y). At angles where the denominator is zero, a plain inverse tangent divides by zero and the software will throw an error unless you handle that case.

Once you know the angle of the hit, all you have to do is reverse the robot’s motion along that direction.

This is based on a physical science class I took. I have some spreadsheets with the formulas for equations like this, with formatting representing the math, and a manual page where all numbers are replaced with what they represent, if that helps. If you use them, I would appreciate it if you put them up on your website under the most recent terms of the GNU General Public License.

http://www.gnu.org/licenses/gpl.html

The sensors are about as centred as they can be. The problem is not the actual angle calculation; that’s fairly easy, and one of my test programs already does this. I am having more trouble coming up with an algorithmic way to distinguish between a bump from the grout, the constant noise created by the general movement of the robot, and a small impact. The spikes you get from the two wheels hitting the grouting slightly apart are quite hard to tell from a normal impact. The big impacts are not that hard to spot.

Well, you can always use the tachometer in your motors if you can’t get this to work using the accelerometer. There are also many designs for multi-directional bumpers, although those won’t let your robot react like you want it to.

Perhaps Kalman filters can help you; they’re a way to extract relevant data hidden in noise. A good website is http://naba.blogspot.com/2008/07/kalman-filter-for-lego-mindstorms-nxt.html

naba has released an odometry toolkit for the NXT called “libnxter”, which I want to test with the holonomic wheels after I complete my own code for the Tilted Twister Rubik’s Cube solver.

Regards

Benco

I wish I understood Kalman Filters a little better. I have some tutorials but my math skills are a little weak.

Xander,

I think you first have to calculate an expected value for each of the axes at any given measurement. These values can then be subtracted from the actual measurements; the remainder is noise. The information you are looking for is hidden in this noise, and it should be easier to filter just the noise. The noise you measure can come from different sources, each source having its own characteristics (hopefully). I think that noise from:

– the sensors is not correlated across the axes; I mean it does not happen at the same time on all three axes.

– the grout is somehow related to the force of gravity and is rhythmic.

– impact noise is related to the speed (and direction) of the impact.

If you give me a file with expected axis values as well, I’m happy to investigate it further.

Two more things. It seems you have about 5 to 7 sensor readings each time the robot turns around its axis. You might need to decrease the time between readings.

A Kalman filter won’t help you here. A Kalman filter is used to fuse data from multiple noisy sensors; it is not designed to distinguish between sensor noise and noise from different external sources.

Aswin,

The Excel sheets are attached to the article. The robot does not rotate at all; it just moves in one direction. The impact is towards the end of the data in each file.

OK, I didn’t know your robot wasn’t spinning because I can’t use YouTube on the train. But I think I have a solution for your problem. It is based on these steps.

1. Use the data collected from the smooth surface to get some statistics about sensor and movement noise. For each of the axes you need the average and standard deviation.

2. Then use these statistics to identify outliers in the measurements. A measurement is an outlier when it is smaller than the average minus twice the standard deviation, or bigger than the average plus twice the standard deviation.

Up to this point it is just plain statistics; these steps flag any values that are unlikely to occur. In the next steps I take advantage of patterns I saw in the outliers. I noticed a few things. First, grout lines induce very short outliers, one or two measurements long, but they come in pairs of two or three (because the robot has three wheels). A collision, on the other hand, produces a longer series of outliers, and they come alone. The last thing I noticed is that even within the outliers there is noise. This knowledge is the basis for the next steps, where first the noise is reduced and then the duration of the disturbance is calculated.

3. Count the number of outliers in each measurement (add X, Y and Z). This reduces the noise somewhat. You end up with a number between 0 and 3.

4. Calculate a moving average of the result of the previous step. This reduces the noise level even further. I used 5 measurements in the moving average, but you might need to change this value for other speeds.

5. Count the number of consecutive moving averages that exceed a threshold (0.3). This gives the duration of a disturbance.

6. If the duration exceeds another threshold (20), it is unlikely to be a grout line (or any other short disturbance), so it must be a collision.

7 and further: once you have identified a collision, you can calculate the direction it came from by trigonometry using the X and Y values.

Notice that this method does not take the force of the collision into account, only its duration. For this reason I hope it also works for small collisions.
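The steps above could be sketched roughly like this in C. The thresholds (two standard deviations, 0.3 and 20) are the ones from the steps; the structure, names and fixed-size buffers are only my illustration, not the Excel implementation:

```c
#define AXES 3
#define WINDOW 5        /* moving-average length (step 4) */
#define MIN_DURATION 20 /* consecutive windows above 0.3 (step 6) */

typedef struct {
    double mean[AXES], sd[AXES]; /* step 1: stats from the smooth run */
    double window[WINDOW];       /* recent outlier counts */
    int pos, filled, run;        /* run = consecutive averages above 0.3 */
} Detector;

/* Steps 2-3: count axes whose reading lies outside mean +/- 2*sd. */
static int count_outliers(const Detector *d, const double reading[AXES]) {
    int n = 0;
    for (int a = 0; a < AXES; a++) {
        if (reading[a] < d->mean[a] - 2.0 * d->sd[a] ||
            reading[a] > d->mean[a] + 2.0 * d->sd[a])
            n++;
    }
    return n;
}

/* Steps 4-6: returns 1 once a disturbance has lasted long enough
 * to be a collision rather than a grout line. */
int detector_update(Detector *d, const double reading[AXES]) {
    d->window[d->pos] = count_outliers(d, reading);
    d->pos = (d->pos + 1) % WINDOW;
    if (d->filled < WINDOW) d->filled++;

    double avg = 0.0; /* moving average of the 0..3 outlier counts */
    for (int i = 0; i < d->filled; i++) avg += d->window[i];
    avg /= d->filled;

    d->run = (avg > 0.3) ? d->run + 1 : 0;
    return d->run >= MIN_DURATION;
}
```

A grout spike raises the outlier count for only a measurement or two, so the run counter never reaches 20; a collision keeps the average up long enough to trip it.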

I’ll send you an excel sheet that implements these steps by mail.

@Aswin, thanks so much for doing this analysis! This is great. I’ll try to implement this in ROBOTC over the next few days. The Excel sheet you sent me is awesome.

Hey Xander,

I thought you might find this article interesting, since it has to do with the type of wheels you’re using:

http://spectrum.ieee.org/automaton/robotics/industrial-robots/omniwheels

Best,

John

Thanks for the link John! There are pretty cool looking wheels there.

[…] when there is a disturbance somewhere. But how do I detect disturbances? A little work I did for Xanders omniwheel robot pointed me in the right direction. My Kalman filter not only gives predicted values for the […]

I’ve been playing around with a LIS302 accelerometer chip, trying to get similar ‘bump’ detection. (What chip is in your device?) The chip has built-in alert functions with programmable thresholds and durations, but I’ve not had much success. My guess is that, as these chips are aimed at drop and orientation detection in the iPhone and similar consumer apps, the background rumble from a typical moving robot makes it difficult to detect a genuine bump. Also, with the mechanical resonances and linkages in the robot structure, a vertical ‘rumble’, as from the tiling grout, could easily result in vibration along the other axes. An additional problem is that the acceleration profile depends on the surface bumped into – I’ve spent hours with the bot bouncing between a cushion and a wall looking at this! Filtering helps to see what’s happening; I’ve a simple FIR filter implementation if you’re interested.

I’d be very interested in your filter. As for the type of chip in my sensor, I have no idea. It’s a HiTechnic Accelerometer, so it’s probably a small sensor with a microcontroller between it and the NXT to act as an I2C interface for it.

At the moment, my biggest problem is the low sample rate. When I originally did the experiment, I was able to sample every 10ms or so. Now I am sampling maybe every 30ms and I am losing a LOT of resolution. I am contemplating adding another NXT just to handle the sensors and leaving the motor control to the other NXT. Not a lot of time left before Lego World, though and it all has to be working by then.

The idea is simple enough – sort of a shift register where the samples are averaged, reducing the sample rate, and shifted along. The shift register, looking at it as a 1-D array, then gives (I think) samples at 0,-1,-3,-5,-7,-11,-15,-23,… times the sampling rate. It’s not actually a filter yet – for this the filter array needs to be averaged with weighting coefficients corresponding to the required filter response. It’s really useful though for intuitive investigation as you can see a history of 512 samples in 16 values – easy to fit in the debug window.

Another observation I had is that the bump-stop doesn’t give a single peak – there’s usually a bounce effect where the acceleration changes sign.
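That bounce could itself be used as a signature: only call it a collision when a spike is followed by an opposite-sign swing within a few samples. A rough sketch of the idea in C (the threshold and window values are made up, and the names are mine):

```c
#define BOUNCE_WINDOW 8 /* samples in which the rebound must appear */

/* Scan a recorded axis trace for the first spike above `threshold`
 * that is followed, within BOUNCE_WINDOW samples, by a swing below
 * -threshold (the rebound). Returns its index, or -1 if none found. */
int find_bump(const int *trace, int n, int threshold) {
    for (int i = 0; i < n; i++) {
        if (trace[i] > threshold) {
            int end = (i + BOUNCE_WINDOW < n) ? i + BOUNCE_WINDOW : n;
            for (int j = i + 1; j < end; j++) {
                if (trace[j] < -threshold)
                    return i; /* spike plus rebound: likely a real bump */
            }
        }
    }
    return -1;
}
```

A grout rattle that spikes in one direction only would not pass this test.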

Hope this helps

#define FIR_Size 12

typedef struct {int count,StartTime; byte filter[FIR_Size][2];} tFIR_Filter;

void fir_Init(tFIR_Filter &f)
{ ubyte bit;

for (bit=0;bit<FIR_Size;bit++)

{ f.filter[bit][0]=0;

f.filter[bit][1]=0;

}

f.count=0;

f.StartTime=time10[0];

}

void fir_Add_Value(ubyte x,tFIR_Filter &f)

{ ubyte bit;

int mask=1;

int carry;

int temp;

int oldcount;

carry=x;

oldcount=f.count;

f.count=f.count+1;

for (bit=0;bit<FIR_Size;bit++)

{ if (oldcount & mask) // both slots at this stage are full

{ temp=carry;

carry=(f.filter[bit][0]+f.filter[bit][1])>>1;// average two slots to carry to next slot

f.filter[bit][1]=f.filter[bit][0];

f.filter[bit][0]=temp;

mask=mask<<1; // shift mask to next bit position

}

else

{ f.filter[bit][1]=f.filter[bit][0];

f.filter[bit][0]=carry;

break; // no carry left so exit

}

}

}

After watching the video, the robot seems to move slower over the tiles with the grouting? Just an observation!

It’s pure perception; the program in both runs is identical. It probably has to do with me holding the camera in a slightly different way.

This is quite interesting. Perhaps you can take a small average: say, if you take 4 readings, average 1 & 3, then 2 & 4, or something like that; it might cut down on false positives. Also, if you mount the accelerometer on an arm, so that when the bot runs into the wall the accelerometer swings forward (due to momentum), you get a much better reading. (This will also help differentiate between random sensor noise, true Z-bumps, and true X/Y-bumps, if you know what I mean.)

Was that helpful? I fear I might have said a lot of words with little meaning. I often have difficulty succinctly communicating vague/complex ideas. 😛

I tried all of that. In the end it turned out to work best when using statistical analysis and looking at the likelihood of a specific event being a collision by looking at its deviation. There are some tricks you can do with the numbers to check this.

The new CruizCore sensor I received yesterday apparently has collision detection built in, but I haven’t looked at the details just yet.

Even the sensor-on-the-arm idea?

The sensor on the arm wouldn’t work; the robot is allowed to suddenly change direction, which would cause the arm to swing.

but if you use odometry/gyro, you could tell ahead of time that the accelerometer would be detecting that… couldn’t you?

PS where and for how much did you get those wheels? (As in, the ones here: http://mightor.wordpress.com/2011/04/13/new-rotacaster-wheels/)

paying $80 USD doesn’t seem right.

The wheels were given to me by Rotacaster for testing. $80 US is indeed the price for them. These wheels will last longer than your NXT.

I see. well, then, I guess I’ll be spending some serious cash when the newer wheels (the ones you have) come out… 😛

on second thought, have you any idea when they’re coming out? I would kind of like to get some soon, but I have no idea what will happen to the prices of the existing ones when the new ones come out, so I might wanna wait and see…? Am I making sense? I’m sorry.

Do you know approximately when the new wheels will become available to those of us in America?