Bridgeware is the word you need to know for rapidly prototyping electronics and robotics. These components bridge programs running on computers with electronic components such as sensors and motors. There’s a good video about it by Trossen Robotics on YouTube. I bought my components from Trossen Robotics online and I recommend them (they have a number of how-to videos).
Day 1
I got my Phidgets components in the mail and went right to work assembling my pan-tilt camera mount, controlled by two servos from a controller board and connected to my laptop from there by a USB cable. A tiny Allen wrench was included, but not a Phillips screwdriver, which you will also need. You can see the components assembled and connected in the picture below.
The black thing in the back right corner is just a USB hub. This is connected to (on the right) an 8/8/8 controller (with lots of room for more components to plug in), and then to the mini joystick. The other USB connector plugs into a 4-servo controller, and then to the dual servos in the pan-tilt assembly in the back left of the picture.
On the software side, using the Phidgets .NET API was very easy. Just instantiate a Servo object, provide an event handler to run when a device is attached, and from there I was able to set each servo position or read the position back. The only confusing part was realizing that the servo controller board is represented by the Servo class, and the individual servos plugged into that controller are represented by the ServoServo class. (Was this API written in Bora Bora, perhaps?) I would have named them ServoController and Servo, respectively, but I got over it and moved on.
What you see visualized in the test application (see screenshot below) is a custom coordinate-graph control that displays the position of both servos. When I hooked up the mini joystick, I made the mistake of mapping the joystick position to the graph, but then realized that the graph would be useless unless I was controlling everything with the joystick. I wanted to script out servo movements, replay the sequences, and still have their positions represented in the coordinate graph control, so I changed the graph to track the servo positions instead (and ever since have been unable to calibrate the system to center itself).
Hooking up the hardware took only 15 minutes, and most of the application involving the Phidgets API took an hour or two at most, but I spent the rest of the day creating the custom graph control and getting it to translate coordinate systems correctly. The joystick reports values from 0 to 1000 along each axis, the servos have a position range of -22 to 232 on one axis and something close to that on the other, and the graph control was 150 pixels wide. I made it general enough to work with any coordinate ranges on both axes.
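Translating between those coordinate systems boils down to a linear mapping between arbitrary ranges. Here's a minimal sketch of the idea (in Python rather than .NET, with a hypothetical `map_range` helper; the 0–1000, -22–232, and 150-pixel ranges are the ones mentioned above):

```python
def map_range(value, src_min, src_max, dst_min, dst_max):
    """Linearly map value from [src_min, src_max] into [dst_min, dst_max]."""
    fraction = (value - src_min) / (src_max - src_min)
    return dst_min + fraction * (dst_max - dst_min)

# Joystick axis reading (0..1000) -> servo position (-22..232)
servo_pos = map_range(500, 0, 1000, -22, 232)     # mid-stick -> 105.0

# Servo position (-22..232) -> graph pixel (0..150)
pixel_x = map_range(servo_pos, -22, 232, 0, 150)  # -> 75.0, center of the control
```

Because the helper takes both ranges as parameters, the same function handles joystick-to-servo, servo-to-pixel, and any future device with different limits.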
First Impressions
I have to say that it’s really cool to see physical objects interacting with software that you write yourself. (It reminds me of the fun I had with a hydraulic robotic arm I programmed in high school using an Apple IIc and low-level calls to the parallel port, but this is way more powerful.) The bridgeware components are easy to connect, and the APIs are a breeze to use. Building the intelligence between inputs and outputs, however, is the really fun and challenging part. Even though this initial experiment was simple, I can already tell that coordinating a much more complicated set of inputs and outputs will require careful planning and the use of tools such as Microsoft Robotics Studio, which includes the Concurrency and Coordination Runtime and the Decentralized Software Services Protocol.
Now that I’ve gotten my feet wet and have some confidence that I can build, connect, and interface with these components (and have a box full of other goodies like 3-axis accelerometers, light and temperature sensors, sliders, buttons, and switches), I have a bunch of ideas for at least one cool summer project that I hope to announce in the next month or two.