Subject: [DPRG] Fwd: Thingiverse Message - OFF ROAD EP
From: Doug Paradis paradug at gmail.com
Date: Wed Dec 17 12:14:48 CST 2014

David A.,
     Here is the response that I got from David Beck concerning his 3D
printed rover test bed robot (http://www.thingiverse.com/thing:470486).
When David sends the links to the video and code, I will share them. He has
responded to the message below and has no issue with me sharing the
information with the list. He also answered my questions concerning
outdoor LiDAR range; I will send those answers in a separate message.

Doug P.
---------- Forwarded message ----------
From: Paradug <paradug at gmail.com>
Date: Wed, Dec 17, 2014 at 1:37 AM
Subject: Re: Thingiverse Message - OFF ROAD EP
To: dbeck at beckengineering.com.au

      Thank you for the detailed response. I would love to have the links
to the videos and code. Would you mind if I shared your response on the
DPRG mailing list?  What is the practical outdoor range of your LiDAR unit?
How badly is it impacted by full sunlight?

Doug Paradis

From: Makerbot Thingiverse <no-reply at thingiverse.com>
Sent: Tuesday, December 16, 2014 5:31 PM
To: doug paradis <paradug at hotmail.com>
Subject: Thingiverse Message - OFF ROAD EP

Hello paradug

The user davidbec08 on Thingiverse has sent you the following message:


Dear Doug,

Thanks for contacting me. I'm very happy to help if I can, and I'm planning
on developing it further so feedback helps.

My email, if you prefer to contact me that way, is:

dbeck at beckengineering.com.au

>Do you have any other documentation or video of the robot besides

I have several versions of code that achieve different things, and movies of:

1) a version that just uses the GPS to navigate a series of waypoints. I
have video of that and would be happy to share the code too. I'm away from
home at the moment but will be back soon (send me an email and I'll send
you the links).

In that version it uses only the compass for direction, rather than the GPS,
because it turns while stationary (it is skid-steer); i.e. when turning it is
not moving, so the GPS can't be used to determine direction.

It is pretty simple: circle on the spot until facing the right direction,
go forward until more than 10 degrees off heading, then re-orientate. It
actually works, but means doing a spin every 10m or so (people like that -
they think it is cute. I don't agree).
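That spin-and-go loop can be sketched roughly as below. This is a minimal sketch of the behaviour described, not David's actual code (which runs on an Arduino Mega); the function names, the 10-degree tolerance constant, and the great-circle bearing math are my assumptions.

```python
import math

HEADING_TOLERANCE_DEG = 10  # re-orientate once we drift more than this off course


def bearing_to_waypoint(lat, lon, wp_lat, wp_lon):
    """Initial great-circle bearing (degrees, 0-360) from the GPS fix to the waypoint."""
    phi1, phi2 = math.radians(lat), math.radians(wp_lat)
    dlon = math.radians(wp_lon - lon)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360


def heading_error(target, compass):
    """Signed smallest angle (degrees, -180..180) from the compass heading to the target."""
    return (target - compass + 180) % 360 - 180


def decide(target_bearing, compass_heading):
    """One control step: circle on the spot until aligned, drive until >10 degrees off."""
    err = heading_error(target_bearing, compass_heading)
    if abs(err) > HEADING_TOLERANCE_DEG:
        return "spin_left" if err < 0 else "spin_right"
    return "forward"
```

Because the robot only uses the compass while turning, `decide` never consults the GPS for direction; the GPS fix only feeds `bearing_to_waypoint`, matching the stationary-turn limitation described above.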

Using the IR sensors for collision avoidance in this version would be easy,
but I didn't bother yet because I'm tackling the technically difficult
things first, like getting the LIDAR to work while everything else is running.

I had a lot of issues getting the compass/gyro/accelerometer to work while
the LIDAR was going. The Mega was just not up to it. So much LIDAR data was
lost that it was not possible, and I was losing acc/comp data too, or rather,
getting rubbish. It simply failed too often (obstacle avoidance was almost
random).

I bought the Intel Galileo 2 to fix that, and got the LIDAR going with
that. The current plan is to use the Mega for everything except the LIDAR,
then have the LIDAR/Galileo side simply send a go/no-go type of message.

In theory this should work for that application.
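The Galileo side of that split could be reduced to something like the sketch below: collapse each LIDAR sweep into a single status byte for the Mega. The byte values, the 500 mm threshold, and the 30-degree forward arc are my assumptions for illustration, not David's actual protocol.

```python
GO, NO_GO = b"G", b"S"    # hypothetical one-byte messages, Galileo -> Mega
STOP_DISTANCE_MM = 500    # assumed safety threshold
FORWARD_ARC_DEG = 30      # only returns within +/-30 degrees of dead ahead matter


def go_no_go(scan):
    """Reduce one LIDAR sweep to a single status byte.

    scan: iterable of (angle_deg, distance_mm) returns, with angle 0 = dead ahead.
    """
    for angle, dist in scan:
        a = (angle + 180) % 360 - 180   # normalise angle to -180..180
        if abs(a) <= FORWARD_ARC_DEG and 0 < dist < STOP_DISTANCE_MM:
            return NO_GO                # something too close in the forward arc
    return GO
```

On the Mega side, anything other than the go byte would hold the drive relays off until the path clears, so the Mega never has to parse LIDAR data itself.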

If you want that code and the video and a close up of the wiring I have no
problem sharing it, though you can figure out the wiring from the code.

Please note though: I am a mining engineer, not an electronics engineer.
You may find it sadly inefficient.

The next step: more complete obstacle avoidance using the LIDAR data, i.e.
use the scan of the current environment around the vehicle to constrain the
detour before returning to its navigation 'mission'.

2) versions that use the LIDAR for object detection (or rather collision
avoidance).

I have video of that failing and working. I gave it random navigation

It misses children and walls, but phone screens are seemingly almost
invisible to LIDAR.

>>Have you been able to get the robot to determine location via the
GPS, IMU, and LiDAR?

It can determine its position and set a bearing using GPS, but it is not
achieving SLAM.

So far all I have achieved is navigation via waypoints using GPS. It
doesn't have gyros, just a compass and accelerometers, so no 'IMU'. What
it does have is just going to be used for detecting tilt, and possibly for
augmenting very basic grid/cell/marker-based SLAM if I ever get there.

My idea was just to get everything basically working first (i.e. sort out
the hardware issues), then work on the more complex programming task of SLAM
once the packaging was sorted.

Since getting it to the stage where it can avoid objects while navigating,
the biggest hurdle has been that I haven't had time, and my children wanted
to make a walking robot. They are the bosses! The walking robots are on
Thingiverse too.

>>The motors appear to be controlled with relays. This seems inconsistent
with the sophistication of the sensor array.

It probably is, but (a) I am a mining engineer, and (b) those motors are
about the minimum power needed for the mobility required, and they draw a
lot of current. In a much smaller prototype I was using L298 H-bridge ICs,
eventually resorting to one per motor, but they still could not handle the
current during a turn. I understand relays, so I went with that.

If there is a simple motor controller that can handle four channels at 2-3 A
each, I would get rid of the relays in a moment. I'm sure there is, but I
just went with the expedient solution. I actually think they look good too!

Send me your email address and I'll forward links to movies and my very
simplistic programming.



Please do not reply to this message. If you would like to respond, visit
their profile and send them a message in response.

Your happy message passing robot,
Thingiverse.com Mailman