12 Dec 2010
robots, ros
It's no secret that I'm a big fan of low-cost sensors. In fact, I'm a bit obsessed with them.
And lately, we've been in cheap-sensor overload. Last month I picked up a Kinect, although
I haven't had a ton of time to play with it yet. Luckily it looks as though the OpenNI+ROS
drivers will be in good shape by the end of the week -- when our semester ends and I'll have
a bit more free time.
On the subject of ROS: I'm a big fan of that, too -- which brings us to the true reason for this
post: I picked up a Neato XV-11 robotic vacuum this week (Thursday) and it arrived Friday
(Amazon Prime $3.99 next-day shipping FTW). Friday night I spent about 4 hours getting the
laser scanner and motor basics lined up, and then drove it around for a little while to
discover numerous bugs. This afternoon I worked out most of the bugs, although I still need
to work on the odometry calculations a bit more. Anyways, here's a quick video:
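For anyone who wants to poke at their own XV-11 in the meantime, the laser side is small enough to sketch out here. This is a minimal, untested sketch -- the serial commands and the CSV reply format are assumptions about the XV-11's test-mode protocol, so verify them against your own unit before trusting any of it:

```python
#!/usr/bin/env python
# Minimal sketch: grab one sweep from the XV-11's laser over USB-serial
# and publish it as a sensor_msgs/LaserScan. The serial commands and the
# CSV reply format here are assumptions about the test-mode protocol.
from math import pi
import serial
import rospy
from sensor_msgs.msg import LaserScan

rospy.init_node('neato_laser')
pub = rospy.Publisher('base_scan', LaserScan)
port = serial.Serial('/dev/ttyUSB0', 115200)
port.write('testmode on\n')        # assumed: unlocks raw motor/laser access
port.write('setldsrotation on\n')  # assumed: spins the laser up

while not rospy.is_shutdown():
    port.write('getldsscan\n')  # assumed: request one 360-degree sweep
    port.readline()             # skip the CSV header line
    ranges = []
    for i in range(360):
        # assumed reply format: AngleInDegrees,DistInMM,Intensity,ErrorCodeHEX
        fields = port.readline().split(',')
        ranges.append(int(fields[1]) / 1000.0)  # mm -> m
    scan = LaserScan()
    scan.header.stamp = rospy.Time.now()
    scan.header.frame_id = 'base_laser'
    scan.angle_min = 0.0
    scan.angle_increment = 2.0 * pi / 360.0
    scan.angle_max = 2.0 * pi - scan.angle_increment
    scan.range_min = 0.06  # ballpark near limit
    scan.range_max = 5.0   # ballpark far limit
    scan.ranges = ranges
    pub.publish(scan)
```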
A couple of thanks to send out -- had it not been for
this blog post
by Hash79 of the Trossen Robotics Community, I probably wouldn't have even bought a Neato --
but all that data was too tempting to pass up! The Neato looks like it could be a very interesting competitor to the iRobot Create
-- hopefully I can get the odometry/laser data to work in gmapping (so far, I've had *no* luck).
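For reference, gmapping needs an odom -> base_link transform published alongside the scan, which is why getting the odometry calculations right matters so much. A rough sketch of what publishing that looks like -- the pose integration from the wheel encoders is omitted (that part is robot-specific), and the frame names are just the usual ROS conventions:

```python
#!/usr/bin/env python
# Sketch: broadcast the odom -> base_link transform that gmapping
# consumes, plus the matching nav_msgs/Odometry message for the rest
# of the navigation stack. (x, y, th) is the pose integrated from the
# wheel encoders; vx and vth are the current velocities.
import rospy
import tf
from nav_msgs.msg import Odometry

rospy.init_node('neato_odometry')
odom_pub = rospy.Publisher('odom', Odometry)
broadcaster = tf.TransformBroadcaster()

def publish_odometry(x, y, th, vx, vth):
    now = rospy.Time.now()
    quat = tf.transformations.quaternion_from_euler(0, 0, th)
    # the transform gmapping actually uses
    broadcaster.sendTransform((x, y, 0.0), quat, now, 'base_link', 'odom')
    # the matching Odometry message
    odom = Odometry()
    odom.header.stamp = now
    odom.header.frame_id = 'odom'
    odom.child_frame_id = 'base_link'
    odom.pose.pose.position.x = x
    odom.pose.pose.position.y = y
    odom.pose.pose.orientation.x = quat[0]
    odom.pose.pose.orientation.y = quat[1]
    odom.pose.pose.orientation.z = quat[2]
    odom.pose.pose.orientation.w = quat[3]
    odom.twist.twist.linear.x = vx
    odom.twist.twist.angular.z = vth
    odom_pub.publish(odom)
```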
More tomorrow -- as well as a code release (after cleaning things up).
01 Nov 2010
robots, ros, slam
Several weeks ago I built a new, small robot platform that could be used as a testbed for the ArbotiX-ROS bindings.
That robot is the Armadillo. The Armadillo has a differential drive base with a pretty decent payload, a FitPC2 brain,
and a 4DOF arm. He's also sporting a Hokuyo URG-04LX-UG01 laser range finder.
The poor Armadillo was sitting around for quite a while until I upgraded his motor drivers to handle the extra weight
of the platform. However, he's now fully operational. To test out whether his odometry would be good enough to
work with the ROS navigation stack, he was driven around to build a series of maps.
The first map is of the first floor of my house. The Armadillo started in the living room, went down the hallway and
into the kitchen, and then returned. The map came out pretty well; I plan to collect a map of the complete house later
this week.
A second, much larger map was made of the CS department hallways. This one had some issues. In particular, the scan
matching was creating false positives, which "shortened" the hallways. I'm still hopeful this can be made to work,
though, with a bit more parameter tweaking. Below, the image on the left is the map from gmapping, and the image on
the right is a raw odometry-based costmap in RViz:
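The knobs I have in mind are gmapping's scan-matching and odometry-error parameters. A sketch of the sort of thing I'm planning to try -- the parameter names come from the slam_gmapping documentation, and the values below are untested starting points, not known-good settings:

```python
#!/usr/bin/env python
# Set private parameters of slam_gmapping before the node starts.
# Names are from the slam_gmapping docs; values are untested guesses.
import rospy

rospy.init_node('gmapping_tuning')
# require a better scan-match score before a match is accepted -- aimed
# at the false positives that were shortening the hallways
rospy.set_param('/slam_gmapping/minimumScore', 100)
# odometry error model: lower values = trust the odometry more
rospy.set_param('/slam_gmapping/srr', 0.05)  # translation error per translation
rospy.set_param('/slam_gmapping/srt', 0.1)   # translation error per rotation
rospy.set_param('/slam_gmapping/str', 0.05)  # rotation error per translation
rospy.set_param('/slam_gmapping/stt', 0.1)   # rotation error per rotation
# update more often, so each scan match is a smaller correction
rospy.set_param('/slam_gmapping/linearUpdate', 0.25)   # meters
rospy.set_param('/slam_gmapping/angularUpdate', 0.25)  # radians
```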
09 Aug 2010
arbotix, robots, ros
I've been a bit ineffective at finishing things lately, and thus haven't had much to say in this blog.
Issy got a new laser-cut body;
some teaser pictures are over at TRC,
but I haven't gotten back to software development on him yet.
I've been a bit distracted playing around with navigation and motion control under ROS. During July I built a "Poor Man's LIDAR"
(or PML for short) out of a long-range IR sensor and an AX-12. The image to the left shows the PML mounted on ROSalyn, an
iRobot Create-based, ROS-powered robot I've recently assembled at the university lab. I'm using a new ArbotiX-derivative board
to control an AX-12 pan-and-tilt and the PML.
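The basic PML loop is simple: step the servo across an arc, take an IR reading at each stop, and publish the whole sweep as one LaserScan. A sketch of the idea -- set_servo_angle() and read_ir_range_m() are hypothetical stand-ins for the low-level ArbotiX calls, and the ranges and timings are ballpark numbers for a Sharp long-range sensor:

```python
#!/usr/bin/env python
# Sketch of the PML loop: sweep a servo across an arc, sample the IR
# sensor at each step, and publish the sweep as a sensor_msgs/LaserScan.
# set_servo_angle() and read_ir_range_m() are hypothetical placeholders
# for the calls that move the AX-12 and sample the Sharp IR sensor.
from math import radians
import rospy
from sensor_msgs.msg import LaserScan

STEPS = 30            # readings per sweep
FOV = radians(180.0)  # arc swept by the AX-12

rospy.init_node('poor_mans_lidar')
pub = rospy.Publisher('base_scan', LaserScan)

while not rospy.is_shutdown():
    start = rospy.Time.now()
    ranges = []
    for i in range(STEPS):
        set_servo_angle(-FOV / 2 + i * FOV / (STEPS - 1))  # hypothetical
        rospy.sleep(0.05)                 # let the servo settle
        ranges.append(read_ir_range_m())  # hypothetical, returns meters
    scan = LaserScan()
    scan.header.stamp = start
    scan.header.frame_id = 'pml_frame'
    scan.angle_min = -FOV / 2
    scan.angle_max = FOV / 2
    scan.angle_increment = FOV / (STEPS - 1)
    scan.scan_time = (rospy.Time.now() - start).to_sec()
    scan.range_min = 0.20  # dead zone of a Sharp long-range IR sensor
    scan.range_max = 5.0   # rough usable limit
    scan.ranges = ranges
    pub.publish(scan)
```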
I actually bought the sensor about 18 months ago -- originally to put on REX (may he rest in pieces) -- but I hadn't gotten
around to actually hooking it up until recently (partially inspired by
the successes that Bob Mottram had).
All in all, it works fairly well -- way better results than I ever had with a sonar ring, but of course nowhere near a true LIDAR.
The PML results are broadcast within ROS as if they were actually produced by a laser scanner. You can see the scan (black dots)
and the resulting costmap_2d generation (red dots are lethal obstacles; blue dots are the inflated version used for motion
planning) in the RViz view. The robot is at the end of our hallway, and the range of the costmap is 3m -- less than the 5 or so
meters that the PML can trace out -- so the walls inside the distant rooms show up only as laser scan dots, with no costmap
generation.
All of this ROS work is towards the goal of producing a very robust and extensive ROS package for the ArbotiX. The core of
the package allows the ArbotiX to control AX-12 servos, read/write digital IO, and read analog inputs -- all within ROS.
There are also extensions to control differential drive robots, or NUKE-powered walkers, using the standard "cmd_vel" topic
-- and to publish odometry so that these robots can be tied into the ROS navigation stack. Version 0.1 is now in SVN, although
the ROS API is quite unstable and will be changing drastically in 0.2 (to a much nicer and more robust interface, which
also sets up several features I want to implement further down the line).
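For the curious, the differential drive piece mostly boils down to the standard kinematics: given the linear velocity v and angular velocity w from a Twist on "cmd_vel", the wheel speeds are v -/+ w * track / 2. A sketch of that mapping -- write_motor_speeds() is a hypothetical stand-in for the actual low-level motor command, and the track width is just an example value:

```python
#!/usr/bin/env python
# Sketch: turn a geometry_msgs/Twist from "cmd_vel" into left/right
# wheel speeds using standard differential drive kinematics.
# write_motor_speeds() is a hypothetical stand-in for the low-level
# ArbotiX motor command; TRACK_WIDTH is an example value.
import rospy
from geometry_msgs.msg import Twist

TRACK_WIDTH = 0.30  # meters between the drive wheels (example)

def cmd_vel_cb(msg):
    v = msg.linear.x   # forward velocity, m/s
    w = msg.angular.z  # rotational velocity, rad/s
    left = v - w * TRACK_WIDTH / 2.0
    right = v + w * TRACK_WIDTH / 2.0
    write_motor_speeds(left, right)  # hypothetical low-level call

rospy.init_node('diff_drive_sketch')
rospy.Subscriber('cmd_vel', Twist, cmd_vel_cb)
rospy.spin()
```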