Subject: [DPRG] RE: How much do our choices effect our results?
From: Randy M. Dumse rmd at newmicros.com
Date: Fri Aug 10 16:15:46 CDT 2007

dan michaels said: Friday, August 10, 2007 3:13 PM
> Hey Randy, I noticed your post on the DPRG forum.
> I'm not a member, so am mailing direct.
> I tried the SRF04 sonar last year, and quickly gave up on it. 
> The unit I have has terrible sidelobe pickup on the Rx 
> transducer side. People on one of the forums suggested 
> mounting it with the transducers oriented vertically, and 
> with the Rx side on top pointed upwards into thin air.
> I ended up using the Parallax Ping sonars, and they worked 
> much better than the SRF04.

Hey dan, Nice to hear from you. Thanks for the reply. Hope you
don't mind my replying this way, so your email goes out as a post
to the DPRG group. I want them to benefit from what you say too.

Several people have tried the MaxBotix sonar and like it. It
seems to perform as well as or better than the Ping, depending on
which characteristic you are really looking for.

I've always liked the SRF04's because you can detect at close
range, to within .1" of the horn. I don't own a Ping so can't
comment on them. The single-horn MaxBotix really is only useful
to about 6", and the Polaroid, something like 11" (but I'm
working from long-past experience with this comment). So that
close-in range of the SRF04's made them my choice.
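As a side note, the close-in limit of any of these sonars follows directly from echo timing: the round-trip echo time sets the minimum pulse width the receiver has to resolve. Here is a minimal sketch of that arithmetic (the speed-of-sound figure is standard; the function name is my own, not from any sensor library):

```python
# Convert an SRF04-style echo pulse width to a one-way range in inches.
# Speed of sound at room temperature is ~343 m/s; the echo time covers
# the round trip, so divide by two.

SPEED_OF_SOUND_IN_PER_US = 343.0 * 39.37 / 1e6  # inches per microsecond

def echo_to_inches(echo_us):
    """Round-trip echo time in microseconds -> one-way distance in inches."""
    return echo_us * SPEED_OF_SOUND_IN_PER_US / 2.0

# A target about 1 inch away echoes back in roughly 148 microseconds.
one_inch_echo = echo_to_inches(148)
```

So a sensor that can time very short echo pulses (on the order of 150 microseconds) can range down to an inch or so, which is what makes close-in detection possible at all.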

I was really rather shocked when I tried them in the arena.
While I could still tell the one and only closest wall or object
with a rotation and scan, I couldn't follow the wall's shape, and
I couldn't tell if there was a can farther out than the closest
wall. Yes, the sonar was mounted horizontally, not vertically.
I've heard that people who mount them vertically on a short robot
like I was using (sonar mounted about 3" or 4" above the floor)
sometimes get false readings from the floor. So you trade one set
of problems for another.

In this application Travis was trying to do CAN CAN. He was, as
usual, on a tight budget, and didn't want to spring for
additional sensors. I'd figured we'd be able to tell where we
were in the arena by the sonar map. Nope. I was able to rotate a
full 360, then rotate back just far enough to stop at the
closest reading. If that was indeed the can, I was able to
approach the can, and usually grab it. (The sonar was mounted
just above the gripper fingers, and the close-in detection
before closing the grippers was critical to even that success.)
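The rotate-and-scan step above boils down to taking a set of (bearing, range) pairs and turning back to the bearing with the minimum valid range. A minimal sketch, with illustrative names and made-up readings (out-of-range returns assumed to come back as None):

```python
# Find the bearing of the nearest echo in a 360-degree sonar scan.
# Each reading is a (bearing_degrees, range_inches) pair; None means
# the sonar saw nothing within its usable range.

def closest_bearing(scan):
    """Return the (bearing, range) pair of the nearest valid echo, or None."""
    valid = [(b, r) for b, r in scan if r is not None]
    if not valid:
        return None
    return min(valid, key=lambda br: br[1])

# Example scan: nearest return at 90 degrees, 12.5 inches out.
scan = [(0, 40.0), (45, None), (90, 12.5), (135, 30.0), (180, None)]
nearest = closest_bearing(scan)
```

Note the limitation described above is built in: the minimum tells you nothing about whether it is a can or merely the nearest wall, and everything beyond the closest return is invisible.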

But without other sensors (inertial, IR, IMU, odometry, or what
not), nothing was very reliable, and sorting out anything but the
closest can (or, by accident, wall) just wasn't happening.

jBot uses its array of sonars to see passages between objects.
It's not perfect. I've seen it think two angled cars were a
closed canyon when there would have been plenty of room to pass
if jBot had pressed its course. Yet if it were using SRF04's, it
probably wouldn't have known there was a passage to begin with,
and if it did, the limited range saying "nothing out there"
would have made the passage look open at the end, until it was
deeply in between the two cars, and only then would it perceive,
possibly, that it had been closed in on from all sides.
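To make the passage-width question concrete: given two obstacle returns at known bearings and ranges, the straight-line gap between them follows from the law of cosines. This is a hedged sketch of that geometry, not jBot's actual code; the names and numbers are illustrative:

```python
# Estimate the width of a gap between two obstacle returns seen at
# different bearings, using the law of cosines.

import math

def gap_width(bearing_a, range_a, bearing_b, range_b):
    """Straight-line distance (same units as range) between two returns."""
    theta = math.radians(abs(bearing_a - bearing_b))
    return math.sqrt(range_a**2 + range_b**2
                     - 2.0 * range_a * range_b * math.cos(theta))

# Two cars seen at -20 and +20 degrees, both about 60 inches out:
width = gap_width(-20, 60.0, 20, 60.0)  # roughly 41 inches of clearance
```

A short-range sonar never gets the 60" returns in the first place, which is exactly the "nothing out there" failure described above.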

So perhaps we will have no improvements in robotics until we
stop taking the sensors industry has provided us for granted as
the best that can be done. If we were a bit more demanding,
perhaps our success rates would go up. Or at least that was the
inspiration behind the original post.
