Subject: [DPRG] RE: How much do our choices effect our results?
From: John Abshier jabshier at kc.rr.com
Date: Wed Aug 29 16:38:12 CDT 2007

Brainstorming Sensors

 

What sensors we need depends on what we want our robot to do.  If I want a
robot to paint a car body, sensors are not a problem.  An ultrasonic sensor
can detect that a body is in the work station.  Encoders can measure joint
angles, and from those I can calculate the location of the paint nozzle.  A
flow meter can tell me whether paint is actually flowing.  I think what
many of us want is a robot that can navigate in the real world and interact
with or manipulate objects (including people) in that world.  Hobby robotics
contests are simplifications of the real world, much as chess is a
simplification of war.
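
As a concrete sketch (the link lengths and angles are made-up values), the
nozzle position follows from the joint angles of a two-link planar arm like
this:

    import math

    # Hypothetical two-link planar arm; link lengths are assumed values in meters.
    L1, L2 = 0.8, 0.6

    def nozzle_position(theta1, theta2):
        """Forward kinematics: joint angles (radians) -> (x, y) of the nozzle."""
        x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
        y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
        return x, y

    print(nozzle_position(math.radians(30), math.radians(45)))

A real paint arm has more joints and works in 3D, but the idea is the same:
encoders give angles, trigonometry gives the tool position.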

 

Presently we use ultrasonic sensors for obstacle detection, sometimes
specialized to finding walls and doors, and sometimes for localization and
mapping.  Sharp range finders and laser range finders are used for the same
tasks.  Bumper switches are used for obstacle detection.  The primary sensor
that humans use for these tasks is vision.
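
For what it is worth, the ranging math behind an ultrasonic obstacle detector
is just time of flight; here is a rough sketch (the 0.5 m threshold is an
arbitrary choice):

    SPEED_OF_SOUND = 343.0  # m/s at room temperature

    def range_from_echo(echo_time_s):
        """Round-trip echo time -> distance to whatever reflected the ping."""
        return SPEED_OF_SOUND * echo_time_s / 2.0

    def obstacle_ahead(echo_time_s, threshold_m=0.5):
        """Flag an obstacle if the echo came back from closer than the threshold."""
        return range_from_echo(echo_time_s) < threshold_m

    print(range_from_echo(0.0029))  # roughly 0.5 m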

 

Phototransistors are mostly used for line following.  A human would use
vision.
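
The usual two-phototransistor line follower reduces to a few lines; this is
only a sketch, assuming readings normalized so the dark line reads near 0 and
the bare floor near 1:

    def follow_line(left_reading, right_reading, base_speed=0.3, gain=0.5):
        """Steer toward whichever sensor sees the darker surface."""
        error = left_reading - right_reading  # positive -> line is under the right sensor
        left_motor = base_speed + gain * error
        right_motor = base_speed - gain * error
        return left_motor, right_motor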

 

An Inertial Measurement Unit can provide an orientation in 3D space.  A
human would use vision, the inner ear, and general body feel (proprioception
is probably the technical term).  An IMU, a compass, and encoders can do
odometry.  Humans usually do not use odometry to navigate.
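
As a sketch of what encoder odometry looks like for a differential-drive base
(the wheel radius, tick count, and wheel base are assumed values):

    import math

    TICKS_PER_REV = 360     # assumed encoder resolution
    WHEEL_RADIUS = 0.03     # m, assumed
    WHEEL_BASE = 0.20       # m between the drive wheels, assumed

    x, y, heading = 0.0, 0.0, 0.0

    def update_odometry(left_ticks, right_ticks):
        """Dead reckoning: accumulate pose from encoder ticks since the last call."""
        global x, y, heading
        left_dist = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
        right_dist = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
        forward = (left_dist + right_dist) / 2.0
        turn = (right_dist - left_dist) / WHEEL_BASE
        x += forward * math.cos(heading + turn / 2.0)
        y += forward * math.sin(heading + turn / 2.0)
        heading += turn

The errors accumulate, which is one reason neither humans nor good robots
rely on dead reckoning alone.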

 

Thermopiles can detect heat.  A UV sensor can find a flame.  A human would
use vision and/or feeling heat on the skin.

 

Force sensors measure force.  Humans use touch or general body feel.

 

Microphones detect sound.  Humans use ears.  One use of sound is
communications.  For robots there are many radio systems that can be used
(WiFi, Bluetooth, Zigbee, etc.).  Another use of sound is as a cue.  I look
in the direction of the sound.  Some objects can be identified by their
sound signatures.  
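
Using sound as a directional cue can be done with two microphones and the
time difference of arrival; a rough far-field sketch (the microphone spacing
is an assumed value):

    import math

    SPEED_OF_SOUND = 343.0   # m/s
    MIC_SPACING = 0.15       # m between the two microphones, assumed

    def sound_bearing(delta_t):
        """delta_t is (left arrival time - right arrival time) in seconds, so a
        positive value means the right mic heard the sound first.  Returns the
        bearing in degrees: 0 is straight ahead, positive is to the right."""
        path_difference = SPEED_OF_SOUND * delta_t
        ratio = max(-1.0, min(1.0, path_difference / MIC_SPACING))
        return math.degrees(math.asin(ratio))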

 

A clock measures time of day and can be used to measure elapsed time.  I am
not sure how I do it, but I can tell the time of day only roughly and measure
elapsed time only very inaccurately.

 

Although not technically a sensor, the brain fuses all that we sense and
provides a "picture" of the environment.  
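
A tiny taste of that kind of fusion is the complementary filter many hobby
robots use to blend a gyro with an accelerometer; this is only an
illustrative sketch with made-up gains and sample rate:

    import math

    ALPHA = 0.98   # trust the gyro for fast changes, the accelerometer for drift correction
    DT = 0.01      # s, assuming a 100 Hz update rate

    pitch = 0.0

    def fuse(gyro_rate_dps, accel_x_g, accel_z_g):
        """Integrate the gyro for responsiveness, then pull toward the
        accelerometer's gravity-based tilt estimate to cancel long-term drift."""
        global pitch
        accel_pitch = math.degrees(math.atan2(accel_x_g, accel_z_g))
        pitch = ALPHA * (pitch + gyro_rate_dps * DT) + (1 - ALPHA) * accel_pitch
        return pitch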

 

I would put the emphasis on vision.  Several efforts are underway and could
perhaps be combined.  Evolution Robotics uses object recognition (with the
SIFT algorithm) to do localization and mapping.  Flash lidars provide color
and range.  People are working on increasing the dynamic range of cameras.
Some people are mimicking early human vision processing in hardware.  There
was a project to implement SIFT on an FPGA.  LeCun has used neural nets to
recognize objects and to control a robot.  Sadly, the cost of these
technologies will probably not come down quickly.
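
Feature-based recognition of the SIFT variety boils down to matching local
features between a stored view of an object and the live camera frame.  Here
is a minimal sketch using OpenCV's ORB detector instead of SIFT (SIFT itself
is available as cv2.SIFT_create in recent OpenCV builds); the file names and
match threshold are made up:

    import cv2

    # Hypothetical file names; in practice the reference views come from a database.
    reference = cv2.imread("landmark.png", cv2.IMREAD_GRAYSCALE)
    scene = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create()
    kp_ref, des_ref = orb.detectAndCompute(reference, None)
    kp_scene, des_scene = orb.detectAndCompute(scene, None)

    # Brute-force matching with Hamming distance (ORB descriptors are binary strings).
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_ref, des_scene)

    strong = [m for m in matches if m.distance < 40]
    print(len(strong), "strong matches")

Enough strong matches suggest the stored object is in view, which is the kind
of cue a feature-based localizer builds on.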

 

 

 
