Critical Development

Language design, framework development, UI design, robotics and more.

Archive for the ‘Personal’ Category

Twitter & the “Modeling Oslo” Microblog

Posted by Dan Vanderboom on January 29, 2009

I started using Twitter in December at the insistence of my colleagues at CarSpot, and I really didn’t get it at first.  Now that I’ve been “tweeting” for several weeks, and have my first few “followers”, I’m starting to see it as a useful communication tool (and apparently the fastest growing one on the Internet, according to podcast This Week In Tech).

I’ve been using it for a number of things, though I try (as on my blog) to stick with .NET programming topics.  What I find is that some of my thoughts and avenues of exploration crescendo to the point where I have to blog about it, but most pursuits are smaller and never cross the threshold.  Some of these small adventures in programming with new technologies are worth writing about, but I set such high expectations on myself for completeness of coverage, ability to reference reputable sources, providing adequately tested code samples, writing style including correct grammar and spelling, etc., that I put up a barrier to entry for each thought I might blog about.

Enter Twitter, the casual microblog.  With Twitter, I can share informal tidbits of knowledge, the status of progress on my projects, or just my confusion and frustration at the moment.  As I watch people watch each other on Twitter, I realize they’re learning from each other, and this serves not only as a channel of knowledge transfer, but also as a sort of insider’s scoop, and an early look at larger things brewing (like a blog article, or a book).

I also occasionally make announcements as open invitations to meet somewhere fun, in case you happen to live in the Milwaukee area (or elsewhere when I travel).  So if you’re one of those that has caught the Twitter bug, or you’re just curious to see what I’m working on, feel free to listen in.

http://twitter.com/danvanderboom

Twitter Microblog on Microsoft Oslo

One of the things I’ve been studying intensely and starting to work with is Oslo.  I wrote an article to explain it, or at least to scratch the surface.  As I explore and experiment further, I decided that a Twitter account would be a perfect way to let people know what I’m doing with it.  I also plan to make important announcements about release schedules, suggest good articles and resources, point out key conversations (including some of my own) in the MSDN Oslo Forum and elsewhere, and share insights and fundamental concepts/definitions.

Whether you’re interested in Oslo and don’t know where to begin, are considering incorporating it next year, or you’ve decided to jump in and start using it now like me and simply want to compare notes with someone else in the trenches, feel free to tune in.

http://twitter.com/modelingoslo

The name refers to the fact that Oslo itself must be modeled, and that a discussion about modeling is meta-modeling.

I’m going off the beaten path with this one, so if you have any ideas for the type of content you’d like to see in an Oslo microblog, feel free to leave a comment and share your thoughts!

Posted in Oslo, Personal, Twitter | Leave a Comment »

Alienware M17: Ninja Laptop

Posted by Dan Vanderboom on January 27, 2009

My new laptop, an Alienware M17, arrived earlier this morning.  It’s almost fully loaded, sans dual video cards and dual hard drives (after changing my mind last minute).  First impressions?  In stunning matte black, with its ribbed Skull Cap cover design, a back-lit keyboard, and a soft fingerprint-proof and scratch-resistant surface, it’s absolutely gorgeous!  With the keys glowing red, it makes me want to do my programming in the dark.

See for yourself, though I have to say, it’s even sexier in person.

The only thing that confused me was the pair of mouse buttons, which aren’t separated by any space or visual cue.  When I first saw it, I thought it was some kind of touch-sensitive slider bar.  Then I was afraid they’d given me some kind of Mac mouse, but once I figured out they were separate areas to press for left and right buttons, I was enormously relieved.

I’ve wanted an Alienware ever since I first saw their high-end configurations and sleek designs, and now that they’re owned by Dell, they have the same warranty options for hassle-free, next-day on-site service.  As many problems as I’ve had with Dell hardware, there’s nothing like the peace of mind of knowing that it’ll be taken care of immediately.

The shopping experience was almost perfect.  One minor flaw: their website shows order tracking before it gets shipped out, and after reaching a certain phase of the process (order confirmation, billing, pre-production, etc.), it kept going back to phase 1, Order Confirmation.  I watched it jump several times from being almost ready to ship, back to order confirmation, and had to call to confirm that it was their tracking system and not my order that was messed up.

It was shipped through FedEx, and I missed the delivery by twenty minutes.  On a Saturday.  For some reason, FedEx doesn’t deliver on Sunday or Monday, at least not to my house.  I called to see if I could meet the truck to pick it up, and the dispatcher promised to send the message out to the truck, but I never got a call back.  Not a big deal to wait a few extra days, but you can imagine my excitement, and then my frustration.  To make matters worse, FedEx’s online package tracking sucks.  It’s not real time.  By the time they tried delivering it, I had just seen it show up as leaving its previous stopping point (in another state).  I thought these carriers knew exactly where each package was at all times!  If so, this information does not make it to their website in a timely fashion.

At 3.06 GHz, with 4 GB of DDR3 1066 MHz RAM, and an ATI Mobility Radeon video card with 512 MB RAM (for a software engineer, not a gamer), this machine hit 5.6 on the Windows Vista performance index.  This is even better than the 5.3 that my Bad Ass Development Rig scored, although it’s not a fair comparison (and the Vista performance index isn’t a real measurement of performance anyway).

After building my desktop, I learned that it would cost me $200 or so to publish the results of the PCMark performance tests online.  So if you’re curious to know what my desktop or this laptop scored, feel free to leave a comment (and your email, which isn’t shared), and I’ll be happy to share that privately.

This machine seems to be all about the nice little touches, not unlike the subtle details of a luxury automobile: the soft black finish of the case, a plethora of ports (USB, Firewire, Coaxial, SATA, HDMI, etc.), a 2 megapixel camera built into the lid that can pivot to aim higher or lower, the touch sensitive media control bar at the top of the keyboard, the keyboard’s smooth feel, and so on.

I was expecting it to be extremely heavy, and by laptop standards I’m sure it is (with its 17 inch monitor), but as I hefted the package into the house, I was surprised by how light it felt, so it’s still extremely mobile.  The power brick, on the other hand, is truly a monster, but will be stuffed lovingly anyway into my backpack wherever I go.  It will have to go with me, since my expected battery life is only two hours.

So if you have $3,300 burning a hole in your pocket and need a blazing fast mobile monster of a machine, I highly recommend the Alienware M17.  If not, they do have cheaper configurations starting at around $1,800.

Posted in Development Environment, Hardware, Personal | 6 Comments »

Bad Ass Development Rig

Posted by Dan Vanderboom on August 23, 2008

[The powerful workstation described in this article is now for sale on eBay! Click here to see!]

The Need For Speed

I’m not a gamer, I’m a developer.  When I’m on my computer for eight to ten hours a day, I’m typically not rendering graphics, but rather writing, compiling, and testing code.  The writing part hardly requires any resources, but compiling code completely pegs one of the cores on my dual core laptop (a 2.4 GHz Dell Latitude D830).  Parallel compilers exist, but the C# compiler in Visual Studio is not one of them, and by the sound of things, won’t be for quite some time.  This means that if I’m going to see a significant performance increase for this critical task, I’m going to need the fastest processor I can get (and overclock).

Compiling code is also disk intensive, especially toward the end of a build when output files are written to disk.  I ran some benchmarks of C# builds (in Visual Studio) of SharpDevelop.  I chose this code base because it’s fairly large, similar to my own solutions, and it’s open source so others can repeat our tests.  We tracked utilization of individual processors, disk I/O, etc.
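
For anyone who wants to reproduce a rough version of this benchmark, here’s a minimal sketch (my own, not the exact harness we used) that times a full rebuild by shelling out to msbuild.  The solution path is hypothetical, and it assumes msbuild.exe is on the PATH:

    using System;
    using System.Diagnostics;

    class BuildBenchmark
    {
        static void Main()
        {
            // Hypothetical path to the solution being benchmarked.
            const string solution = @"C:\Source\SharpDevelop\src\SharpDevelop.sln";

            var timer = Stopwatch.StartNew();

            // Shell out to msbuild for a full rebuild.
            var msbuild = Process.Start(new ProcessStartInfo
            {
                FileName = "msbuild.exe",
                Arguments = "\"" + solution + "\" /t:Rebuild /p:Configuration=Debug",
                UseShellExecute = false
            });
            msbuild.WaitForExit();

            timer.Stop();
            Console.WriteLine("Rebuild took {0:F1} seconds (exit code {1}).",
                timer.Elapsed.TotalSeconds, msbuild.ExitCode);
        }
    }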

Why am I so hell bent on compiling code as fast as possible?  Good question.

Micro Development Cycles

Software development consists of nested cycles.  There are organizational cycles that envelop project cycles that envelop several-week development sprints, and at the smallest level, it really all boils down to executing many micro development cycles every day.  You start with a goal such as fixing a bug or implementing a feature, do some generally-informal design in your head, plan out your work (again, typically in your head), write code for a few minutes, compile and fix until the build succeeds, deploy if necessary, test the changes, and repeat this sequence anywhere from 20 to 50 or more times in a productive day.  If you do test driven development, you write your tests before the functional code itself, but the cycle is otherwise essentially the same.

Development Cycle

Some of these steps take longer than others, and some of them, like designing or thinking about what code to write and where that logic belongs, are creative in nature and can’t be rushed.  But when we work on larger solutions in Visual Studio (and other tools), the time for tools to perform critical processing (compiling code in this case) can lead to Twiddling Thumb Syndrome (TTS).  This is not only an unfortunate affliction, it’s also one that can cause Tourette-like symptoms, including swearing at one’s computer, at one’s software tools, and banging on things to entertain oneself while waiting for things to finish.  Sometimes, depending on your projects’ interdependencies and other details, build times can shoot up to several minutes (or worse) per build.  For a long time, I was getting build times in the 5-7 minute range, and it grows as solutions become larger.  Repeat this just 20 times (during which your computer is totally unresponsive) and you’ll quickly get the idea that you’re wasting lots of valuable time (two hours a day!).

Clearly this is unacceptable.  Even if my builds only took a minute, all of the aggregated time spent waiting for progress bars of all kinds (not just compiling) can add up to a significant chunk of wasted time.  In Scott Hanselman’s Ultimate Developer Rig article, which played a part in motivating me to build my own Ultimate Developer Rig, Scott hits the nail on the head:

I don’t want to have time to THINK about what it’s doing while I wait. I wait, in aggregate, at least 15 minutes a day, in a thousand tiny cuts of 10 seconds each, for my computer to finish doing something. Not compile-somethings, but I clicked-a-button-and-nothing-happened-oh-it-was-hung-somethings. Unacceptable. 15 minutes a day is 21.6 hours a year – or three full days – wasted.

I think Scott is being too conservative in his estimate.  It’s easy to waste at least 20-30 minutes a day waiting for general sluggishness, and considerably more when waiting for builds of large solutions.  If you have a computer that’s a few years old, it’s probably worse.  Thirty minutes a day is about 125 hours per year (over 3 weeks), and an hour a day is 6 weeks per year.

Flow = Mental Continuity

Look at it from another perspective.  Even if wasted time isn’t an issue, there’s still a matter of maintaining continuity of thought (and execution).  When we have a plan and are ready to act on it, but are held back behind some bottleneck in the process, we risk losing the fluid flow or mental momentum that we’ve built up.  Often, I have a sequence of pretty good ideas or things I’d like to try, but I end up waiting so long for the first step to finish, that by the time the computer is ready, I’ve lost track of my direction or next step.  This isn’t as much of a problem with long-term planning because those goals and steps tend to be written down, perhaps tracked in some kind of Scrum tool.  But when we’re talking about micro development cycles, a post-it note can be too formal (though when I’m waiting a lot, I do use these).  If we could get near-immediate feedback on our coding choices and reduce the wait time between the execution of tasks, we could maintain this flow better and longer, and our work would benefit from the increased mental continuity.

One analogy is that of reading a programming book.  Some of them are 800-1000 pages or more.  When you read one slowly, say a chapter every other week, it takes so long to read that by the time you finish chapter 10, you have a really hard time remembering what chapter 2 was all about.  But if you focus more and read through the same book in a week, then chapter 2 will still be fresh in your mind when you get to chapter 10, and you’ll be much better able to relate and connect ideas between them.  The whole book sticks in your memory better because all of its content is more cohesive in your mind.

Cost Justification

Scott created a nice computer for the price range he was shooting for, but for my own purposes, I wanted to go with something more extreme.  When I started playing with the numbers, I asked myself what the monthly cost would be for a top-of-the-line, $5,000 to $6,000 power machine.  Spread over 3 years, it comes to only $166 per month.  If you consider the proportion of this cost to the salary of a developer, figure out how much all of our unnecessary wasted time is worth, and realize that this is the primary and constantly-used hardware tool of an engineer, I think it’s very easy to justify.  This isn’t some elliptical trainer that’ll get used for two weeks and then spend the next five years in the garage or the storage shed.  This beast of burden will be put to serious work every day, and will make it easier and more pleasant to get work done.  In an age where we don’t even blink an eye at spending $1,000 on comfortable and ergonomic Herman Miller chairs, I think we’re long overdue for software engineers to start equipping themselves with appropriately-powerful computer hardware.

Compare the cost of a great workstation with the tools of other trades (carpentry, plumbing, automotive repair, etc.) and you’ll find that software development shops like to cut corners and go cheap on hardware because it’s possible to do so, not because it makes the most sense and delivers the greatest possible value.  If you’re in a warehouse and need a forklift, there’s no two ways about it.  But computers are commodities, and though they come in all shapes, sizes, and levels of power, the software you need will normally run on the slowest and most sluggish among them.

Welcome to My Home Office

Welcome to my office.  Since it’s going to appear in the background of many pictures, I thought I’d give a quick tour.  This is my brain dump wall, where many of my ideas first take form.

And around the corner from this room is the greatest Jack Daniel’s bar in the world, built by Christian Trauth (with a little bit of help from myself).

Bad Ass Components

I decided to take a field trip one day, and drove from the Milwaukee area where we live down to Fry’s in Chicago.  This was my first time to a Fry’s.  If you’ve never been to one, just imagine Disney World for computer geeks.  They’re absolutely huge (about 70,000 square feet of computer parts and other electronics).  I bought almost everything I needed there, having ordered a few parts online before this field trip took place.

Here’s what I picked up:

Intel D5400XS “SkullTrail” Motherboard – $575
Intel Core 2 Extreme Processor (QX9775) – $1510 (Tom’s Hardware Review)
  • 3.20 GHz (without overclocking)
  • 1600 MHz FSB
  • 12 MB L2 Cache
ThermalTake Bigwater 760is – $170
  • 2U Bay Drives Liquid Cooling System
Adaptec 5805 RAID Controller – $550
  • 8-Lane PCI Express
  • 512 MB DDR2 Cache
  • Battery Backup
3 Western Digital Velociraptor Hard Drives – $875
  • 900 GB Total
  • 10,000 rpm
  • SATA
8 GB (4 x 2 GB) of PC2-6400 RAM – $400
  • 800 MHz
  • ECC
  • Fully Buffered
GeForce 9800 GTX Video Card – $250
  • PCI Express 2.0
  • SLI Ready
  • 512 MB DDR3
Coolermaster Case – CMStacker 830 SE – $350
  • 1000 Watt Power Supply
  • Lots of Fan Slots
  • Very Modular

Total Damage – $4720

This doesn’t include extra fans (still need to purchase about 11 of them), and the things I already have: a pair of 24 inch monitors, Logitech G15 gaming keyboard (nice for the extra programmable keys), mouse, CD/DVD burner, media card reader, etc.  (When I calculate the cost at $166 per month over 3 years, it’s based on a total price tag of $6,000.)

Building a Bad Ass Development Rig

In the first picture are boxes for the case (and included 1000 Watt power supply), motherboard, video card, memory, and liquid cooling system.  The next two pictures show the motherboard mounted on the motherboard tray, which slides easily into the back of the case.  Notice how huge the video card is (on the right).  It takes up two slots for width, though it doesn’t plug into two slots (I’m not really much of a gamer, so no SLI for me).  The smaller card in the picture on the right is the Adaptec RAID controller.  I chose the slots that I did to maximize airflow; when I first put the graphics card in, it was partially obstructing a fan on the motherboard, so I moved it to the edge.  This blocked a connector on the motherboard, so I ended up moving it again.  Finding the right setup was a matter of trial and error.

Below you can see all the power cables hanging from the case.  They’re wrapped in a strong mesh that keeps the cables bundled together for improved airflow.  On the right, you can see a swinging door with dust filters and empty spaces for four fans (up to 150mm, not included with the case).  Notice the fan on the motherboard tray, and there’s a slot for another one in the roof of the case that you can’t see.  In addition to the fans, the sides, bottom, top, and front all let air pass through for maximum airflow.  The drives on the right are Western Digital Velociraptors: 300 GB and 10,000 rpm.  When set up in a RAID 0 (striping) configuration, they should provide wicked fast disk access, which is important because I’ll be running multiple virtual machines at a time.

Next you can see the modular drive cage, which is easier to install the drives in when it’s removed from the case (a couple screws on each side).  It’s nice that it has a fan on the front.  Overall, I’m very impressed with the case and all the attention to detail that’s gone into its design.  It was extremely easy to work with and reach everything.  It’s been several years since I’ve built a desktop computer, and I remember a lot more frustration when trying to reach inside of cases.  Notice that when I put the drive cage back in, I installed it higher up.  I couldn’t put it any higher because of power cables getting in the way, but I need a place for a CD/DVD burner and maybe a media reader anyway.  I moved it up because I’ll be installing a liquid cooling unit, and the radiator takes up two drive height units (2U).  If there’s any chance of that thing leaking (it better not!), I certainly don’t want water dripping down onto my hard drives.

Now I’m starting to wire everything up: power and reset switches, power LEDs, USB and Firewire ports, power to the motherboard and video card, power to the hard drives, and the interface cables from the RAID controller to the hard drives.  I start twist-tying the slack on cables and stuff the unused power cables into the ceiling of the case, where they stow away nicely (with more twist ties).  The right-most picture shows some of the other stuff included with the case: an IDE cable bound inside a tubular plastic sheath (for better airflow), SATA cables that I didn’t need because I used the ones that came with the RAID controller, a fan mount for one of the 11 heat sinks on the motherboard (fan not included), and a fun Do-Not-Disturb-style doorknob sign (included with the motherboard) that says “Warning: Noobs Beware.  You will be Pwned.”  And indeed you will be!

It’s finally time for some liquid cooling action.  With a motherboard called SkullTrail that was designed for overclocking two socket 771 processors, and a single QX9775 Core 2 Extreme quad core 3.2 GHz processor (to start with), you better believe I’ll be overclocking this bad ass machine to its limit!  I’ve heard rumors that 4.5 GHz is very manageable, and am hoping to be able to pull off upwards of 5 GHz, but we’ll see how it goes (with a second processor, that would total 40 GHz across 8 parallel cores).  So far, the liquid cooling tops all other components in documentation: one user guide and one maintenance guide.  And don’t forget that you can’t take a sip of the cooling fluid, or eat any of the rock candy packets that come with the hard drives.  I know it’s tempting.

I Hit a Snag

The liquid cooling unit doesn’t fit.  It’s close, and I debated whether to let it hang out of the front of the case an inch and a half because of the motherboard being too close.  Not good enough.  Back to the drawing board!

The only way the cooling unit would fit flush in the case was if we cut out one of the aluminum support beams along the top of the case, and inserted the cooling unit in the top drive bays.  This would put the liquid above my hard drives, which I was trying to avoid, but I didn’t have any choice at this point.  So we jumped in his car and stopped at his place to pick up a Dremel.  Ten minutes later we were in the garage, case stripped down and upside down, cutting away.  You can see the end result in the photo on the right, which turned out very nice.

Finally, the liquid cooling system fits flush in the case.  We noticed that the cooling system had a fan speed control rheostat connected to it on a wire, and thought it would be nice if we could expose that through the case somehow, so we drilled a hole and fed it through the top (near the power and reset buttons).  I found a knob that fit on it from a robotics kit I purchased a few months ago, and it even matched the color of the case.  Bonus!  You can see the new knob in the picture below on the right.

Almost ready to boot up!  I’m waiting for the processor to arrive, and expect it any day now.  As soon as that comes in, I’ll be writing the next article in this Bad Ass Development Rig series, and we’ll see how much we can get this bad mamma jamma overclocked (without making it unstable, of course).  After that, I’ll be setting up the virtual machine system, all of my development environments, and then we’ll do some serious benchmarking.

Posted in Development Environment, Hardware, Personal | 16 Comments »

Software Development Meme

Posted by Dan Vanderboom on June 10, 2008

If you haven’t heard the term, “meme” is a word coined by Richard Dawkins.  It’s analogous to a gene, except that instead of being a unit of biological replication, a meme is a unit of cultural replication.  Though it gets a little melodramatic at the end, there’s a good video at TED.com that explains it fairly well.

So to follow the form of this meme, Damon Payne called me out on a series of questions about my experiences with software development.  I’ve answered some of these questions in my About Me page, but I think it’s good to share these things with new developers and those considering entering the field, so I’ll play along (with a little bit of copy-paste cheating).

I’ve also generalized it with new questions in parentheses, because there are exciting careers other than software engineering that people can benefit from hearing about.  Introducing this mutation in the meme, I believe, will provide some great perspective on other types of careers from some very interesting people.

How old were you when you started programming?

(How old were you when you started in your current career?)

I started programming at the age of 9 on an Apple IIc that my parents bought for Christmas, and also played around with a Commodore 64 and a TI-99/4A, and did some gaming on the Atari 2600.  After writing my first few programs, I already knew I wanted to do that for a living.  At the time, it probably would have been considered a very unrealistic career choice considering how immature the industry was, but I worked at it for hours every day for fun, and it’s definitely paid off.  While in grade school, my parents signed me up for a high school programming class over the summer, and despite being by far the youngest and shortest kid in the class (and also because of that), it was a lot of fun.

What was your first language?

(What was the first technology you became familiar with?)

Apple BASIC.  No text editor.  You had to enter lines of logic from a DOS prompt by prefixing each line with the infamous "line number".  Inserting lines between existing lines meant finding an unused line number between them, so lines were written in increments of 10, 100, or whatever.

What was the first real program you wrote?

(What was the first achievement in your education that you were proud of?)

That depends on your definition of a real program.  They’re all real, as far as I can tell.  The first program I wrote for commercial use?  The first genuinely useful program?  That’s so long ago, I couldn’t say.  I was very much into linguistic analysis from an early age.  I used to write natural language processors that would, like Eliza applications years later, understand English statements and respond intelligently.  Mostly, I was breaking down sentences into syntactic trees and trying to determine tenses and so on.

I also played with graphics.  A few years after I started coding, I figured out enough trigonometry to draw points and then wireframes using 3D coordinates and got them to rotate on the screen.

Another hobby was creating Zork-like text games.  Go north.  Look at potion.  Pick up potion.  Drink potion.  Go south.  Fight monster.  That type of thing.  Boring by today’s standards, but representing the world, inventory, character status, etc., it was very gratifying at the time.  Guess you’d have to be there.

What languages have you used since you started programming?

(What other technologies have you learned since you started?)

Apple BASIC, Turbo Pascal, Apple Pascal, C, C++, 8086 Assembler, VBA, VB6, VB.NET, C#, and countless scripting languages.

What was your first professional programming gig?

(What was your first professional white-collar job?)

My first commercial application was written while working for my Uncle Ted at Pitney Bowes.  They were doing a project for UPS, and the DOS computers they were rolling out needed some kind of menu front end to appear and launch applications on boot up.  Using Turbo Pascal, I created a configuration-file-driven menu with multiple menu pages.  The menu buttons had some neat display effect, and they could be selected to launch an application or jump to another menu page.  It took me two half days, if I remember correctly; I was paid in cash, and they paid for and fetched my lunch.  I was 15 at the time, and I loved every minute of it.  I knew then that a career in computer programming was feasible.

If you knew then what you know now, would you have started programming?

(If you knew then what you know now, would you have started in your present career?)

Yes.  Even if I hadn’t pursued programming as a career, I believe that tinkering with programming has become very useful in many careers, especially scientific ones.  I wouldn’t have gone about it in the same way, but then who would?

If there is one thing you learned along the way that you would tell new developers, what would it be?

(If there’s one or two things you learned along the way that you found have been instrumental to your success, what would you like to share with newbies?)

There are many things I would say.  Don’t be intimidated by the technology; even the best start out completely ignorant.  Take a systematic approach to learning and mastering whatever you need to accomplish your goals, and especially master the language and tools you’ll be using.  Have personal goals you want to achieve; don’t write code to serve others exclusively.  Work on fun projects.  Expand your horizons and explore areas you’ve never had experience with before.  But if I could share only one thing, it would be to balance learning of technical skills with the so-called soft skills: communication, presenting, negotiating, planning, etc.

What’s the most fun you’ve ever had programming?

(What’s the most fun you’ve ever had in your career?)

A few moments come to mind.  Programming a robotic arm in high school.  Staying up until 4am drinking a case of Mountain Dew with John Richardson in high school, cranking out code for new games (for TopSoft Software).  More recently, playing with Phidgets robotics and the Microsoft Robotics Developer Studio.

Who am I calling out?

Michael Burnham

John Lichinia

John Richardson

Phil White

Beth Humphries

Posted in Personal | Leave a Comment »

TechEd 2008

Posted by Dan Vanderboom on June 9, 2008

If you weren’t able to make it to TechEd this year, you really missed out on a fantastic conference and countless opportunities to explore, learn, meet, and connect.

I didn’t bring a digital camera, so I take ultimate responsibility for the results, but I was duped by Kodak’s very misleading marketing when I bought a couple of their disposable “digital” cameras.  I found out while developing them at Walgreens that they’re actually film cameras.  Apparently they get away with calling it digital because the price of the camera includes having the pictures burned to a CD, which is a digital object.  I still don’t get how the camera can be called digital.  This is dishonest as far as I’m concerned.  Shame on you, Kodak.

So aside from 50 grainy pictures (of memories that are fuzzy to begin with, due to closing the bar every night of the week), it was a great time.  From sessions on robotics and game development, to Carl Franklin jamming on acoustic guitar at the conference center, to meeting and talking with Microsoft employees and others about emerging technologies, and VIP and MVP parties at Charley’s Steakhouse (phenomenal food and service) and House of Blues (thanks Beth!  hi Theresa!), there was something there for every-nerd.

Here’s another bad picture of something I found pretty funny: it’s Windows rebooting on a kiosk at Universal Studios and informing us that we may want to start in safe mode.

Here’s one more bad picture, this time of me, at Universal Studios, hanging out with Jaws.

I paid particular attention to, and even took notes on, the presenters’ speaking styles and skill levels, technical competence, confidence, enthusiasm, audience engagement and participation, and humor, as well as the tools they used for zooming, screen annotation, altering UI and font sizes for the audience, etc.  I’ve given some serious thought to submitting proposals for future conferences, and during some of the sessions I couldn’t help but think, “That should be me up there!”

Overall, TechEd has gotten me excited, and sessions often left me wanting to write tons of code and build lots of new programs, from small but useful Pocket PC apps to radical new ideas for libraries, UI frameworks, and robotics control systems.  As The Damonator accurately explained, conferences like TechEd are great for getting you re-energized about development.  It’s been a few years since I was at DevConnections, and I hope I’ll be able to attend these events (PDC later this year, for example) more frequently in the future.

Bill Gates’ Keynote

Bill Gates gave his last public speech Tuesday morning as a full-time Microsoft employee.  I’ve seen some videos of him online, and I wasn’t blown away by his presenting style.  It’s not very smooth, and he doesn’t seem very comfortable going through a rehearsed script.

However, when it came time to answer audience questions, his intelligence shone through in spades.  His answers were insightful, articulate, and substantive, even when the questions were confusing, long-winded, or occasionally really lame.

Robots

Toward the end of Bill Gates’ keynote, a robot rolled out balancing on two wheels, featuring an LCD screen with a still picture of Steve Ballmer’s head and highly articulated arms: the Ballmer-bot is $60,000 of hardware, and I can’t even guess the amount for design and software development.  It balanced on its wheels while the arms extended (changing its center of gravity, which requires compensation), and it announced loudly, “Developers! Developers! Developers!”  Over and over again.  Very funny and well done.  The Ballmer-bot handed Gates his “lifetime Xbox Live membership”.  The only disappointing part was the wire that connected this humanoid robot to some kind of game controller.  Why wasn’t it wireless?  As someone pointed out to me, the last thing they wanted for Gates’ last speech was for this robot to get away from them and launch itself into the crowd, injuring someone.  So it must still be in beta.  🙂

I had a chance to meet and talk with Nicolas Delmasso from SimplySim (located in France).  They are experts in 3D simulation.  SimplySim was involved in creating the simulation environment for Microsoft Robotics Developer Studio, which is based on XNA Game Studio.  SimplySim will likely be working on physics support for flight (helicopters, airplanes, etc.) soon, as that has been so frequently requested.  How cool would it be to program autonomous aircraft for search and rescue or fire fighting scenarios?  RoboChamps could create some amazing new competitions based on this.

I also attended a session called Software + Services + Robots, which I think is a clever name.  This was about building the RoboChamps competition itself and all of the technology involved, including social/community aspects, Silverlight media content, writing referee services, cameras that can be watched from their website by spectators, and much more.

Session Highlights

There were so many good sessions to attend.  During a few time slots, I found myself annoyed that there wasn’t much to be excited by, but most of the time slots had so many good sessions scheduled simultaneously, it was difficult to pick just one.  In some cases, I didn’t: I went to one for ten or fifteen minutes, and then changed my mind and went to another.

Unity & Prism – Lightweight IoC & WPF Composite Clients

It was during one of these switch-ups that I wound up catching the tail end of Glenn Block’s talk on the Unity and Prism libraries.  Unity is a lightweight IoC dependency injection container that is almost identical to one I created about two years ago while working for Panatrack, and which I have redesigned while working for CarSpot.  Unity does support some things that I didn’t have any need for, and I really like Unity’s approach: for example, allowing you to plug in your own module loader and module initializer, separately.  Prism is the new composite client framework (they’re cautiously calling it a library now, I think) for WPF, though its concepts can be used in other technologies (like Windows Forms) with some additional work.  This is essentially a redesign and simplification of the same concepts that appeared in the Smart Client Software Factory, and I’m really excited to see support for patterns like MVC and MVP, which I use extensively.  Prism will work with Silverlight (great news!), but neither Unity nor Prism supports Compact Framework currently.  If I end up using one or both of them, I will likely port them to Compact Framework, and will contribute to the project on CodePlex so that everyone will benefit.
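
For anyone who hasn’t played with Unity yet, the basic register-and-resolve pattern looks roughly like this.  This is only a minimal sketch of constructor injection with the container; IOrderRepository, SqlOrderRepository, and OrderService are made-up types for illustration, not anything from the talk:

    using System;
    using Microsoft.Practices.Unity;

    public interface IOrderRepository { void Save(string order); }

    public class SqlOrderRepository : IOrderRepository
    {
        public void Save(string order)
        {
            Console.WriteLine("Saved: " + order);
        }
    }

    public class OrderService
    {
        private readonly IOrderRepository repository;

        // Unity supplies the dependency through this constructor.
        public OrderService(IOrderRepository repository)
        {
            this.repository = repository;
        }

        public void Place(string order) { repository.Save(order); }
    }

    class Program
    {
        static void Main()
        {
            var container = new UnityContainer();

            // Map the interface to a concrete implementation.
            container.RegisterType<IOrderRepository, SqlOrderRepository>();

            // The container builds OrderService and injects its dependencies.
            var service = container.Resolve<OrderService>();
            service.Place("3 widgets");
        }
    }

Swapping SqlOrderRepository for a test double is then just another RegisterType call, which is part of what makes containers like this so handy for the MVC and MVP patterns mentioned above.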

XNA Game Development on Zune

This was a great lunch session.  Andrew Dunn explained that Game Studio, DirectX, and a few other technologies all fall under the XNA brand, and he demonstrated not only how to create a game from the templates installed with Game Studio, but also how to publish the game on Xbox Live so it can be rated and reviewed by other game developers.  I’m definitely going to join the XNA Creators Club so I can play around with this.
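
If you haven’t seen what one of those templates gives you, the skeleton of an XNA game looks roughly like the following.  This is a trimmed-down sketch of the standard Game subclass pattern; the MyZuneGame and Program names are mine, not from the session:

    using Microsoft.Xna.Framework;
    using Microsoft.Xna.Framework.Graphics;

    public class MyZuneGame : Game
    {
        GraphicsDeviceManager graphics;
        SpriteBatch spriteBatch;

        public MyZuneGame()
        {
            graphics = new GraphicsDeviceManager(this);
            Content.RootDirectory = "Content";
        }

        protected override void LoadContent()
        {
            // Created once the graphics device exists; used for 2D drawing.
            spriteBatch = new SpriteBatch(GraphicsDevice);
        }

        protected override void Update(GameTime gameTime)
        {
            // Input, physics, and game logic go here; called every tick.
            base.Update(gameTime);
        }

        protected override void Draw(GameTime gameTime)
        {
            GraphicsDevice.Clear(Color.CornflowerBlue);
            // spriteBatch.Begin(); ... draw sprites ... spriteBatch.End();
            base.Draw(gameTime);
        }
    }

    static class Program
    {
        static void Main()
        {
            using (var game = new MyZuneGame())
            {
                game.Run();
            }
        }
    }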

Unfortunately, XNA is not supported for Windows Mobile devices.  Zune was chosen primarily because it’s a fixed target, but as Zune runs some version or subset of the Compact Framework, hopefully a Windows Mobile version will emerge sometime in the not-so-distant future.  With 3D accelerator cards and VGA or better screens appearing in awesome new phones like the HTC Diamond, this could be a hot new gaming platform.  Zune is very limited, of course, but it still sounds like a lot of fun, especially knowing that up to 16 Zunes can play via the built-in Wifi.

Data Visualization Applications with Windows Presentation Foundation

Tim Huckaby did a great job and attempted to break the record for the most demos done in a single presentation.  I don’t know if he accomplished his goal, but he did do a dazzling number of nice demos.  He showed off the cancer research 3D molecule application (which strangely plugs into SharePoint), and had guest presenters walk through an application that allows administration, monitoring, and flexible visualizations of all of the slot machines in various casinos around the world.

My favorite demo, though, was a system that manages tours of the San Diego Zoo, the largest zoo in the world, and apparently impossible to see in its entirety in a single day.  Visitors can specify which animals and attractions they’re interested in, and the system will map out a path and plan for them, making sure they see animals at the best times (while pandas are eating, at 2pm, for example).

Hardcore Reflection

I eat, sleep, and breathe reflection, so it was a special treat to see Dustin Campbell’s 400-level session on this topic.  I still wasn’t sure I would learn much, but I’m glad I went.  From dispelling myths about reflection’s performance and memory consumption problems (which were real prior to .NET 2.0), to seeing some (albeit simple) examples of creating dynamic methods and emitting IL, I got a few nuggets of goodness out of this.
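
To give a flavor of the dynamic method material, here’s a tiny sketch of my own (not Dustin’s code) that emits an Add method in IL at runtime and invokes it through a delegate:

    using System;
    using System.Reflection.Emit;

    class EmitDemo
    {
        static void Main()
        {
            // Build "int Add(int a, int b)" on the fly.
            var add = new DynamicMethod("Add", typeof(int),
                new[] { typeof(int), typeof(int) });

            ILGenerator il = add.GetILGenerator();
            il.Emit(OpCodes.Ldarg_0);  // push a
            il.Emit(OpCodes.Ldarg_1);  // push b
            il.Emit(OpCodes.Add);      // a + b
            il.Emit(OpCodes.Ret);      // return the sum

            var addFunc = (Func<int, int, int>)add.CreateDelegate(
                typeof(Func<int, int, int>));

            Console.WriteLine(addFunc(2, 3));  // prints 5
        }
    }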

Mock Objects & Advanced Unit Testing

I saw a presentation at a local .NET User Group about mock objects, specifically with Rhino Mocks.  Typemock was mentioned, and something peculiar, interesting, and… amazing was happening: I knew plain C# was incapable of doing what the code up on the screen was doing, and then someone asked how it worked.  It turns out that Typemock uses the Profiler API to rewrite the code as it executes on the desktop.  A similar approach is used for code coverage in NCover.  Because of the dependency on this API, these tools won’t work for Compact Framework software, and so they’re useless to me.
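
For anyone who hasn’t seen the Rhino Mocks style in question, it looks roughly like this.  This is a rough sketch from memory using the arrange-act-assert syntax, with NUnit as the test framework; IPriceService and Checkout are invented types for illustration:

    using NUnit.Framework;
    using Rhino.Mocks;

    public interface IPriceService
    {
        decimal GetPrice(string sku);
    }

    public class Checkout
    {
        private readonly IPriceService prices;

        public Checkout(IPriceService prices) { this.prices = prices; }

        public decimal Total(string sku, int quantity)
        {
            return prices.GetPrice(sku) * quantity;
        }
    }

    [TestFixture]
    public class CheckoutTests
    {
        [Test]
        public void Total_Multiplies_Unit_Price_By_Quantity()
        {
            // Arrange: stub the dependency instead of calling a real service.
            var prices = MockRepository.GenerateStub<IPriceService>();
            prices.Stub(p => p.GetPrice("ABC")).Return(2.50m);

            var checkout = new Checkout(prices);

            // Act
            decimal total = checkout.Total("ABC", 4);

            // Assert
            Assert.AreEqual(10.00m, total);
            prices.AssertWasCalled(p => p.GetPrice("ABC"));
        }
    }

Rhino Mocks can only intercept interfaces and virtual members, which is exactly why Typemock’s profiler-based rewriting looked so startling up on the screen.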

I do have a plan to bring code coverage to Compact Framework, perhaps even plugging into NCover.  I’ll be writing some articles about that this summer, I’m guessing.

Conclusion

Overall, TechEd was a great experience.  I met a lot of interesting people, was inspired with new ideas, and had seriously geeky conversations with some very smart people.  As I took notes for each session, I found myself jotting down specifications for new applications and tools that I’m eager to start working on, and enhancements and new avenues to explore for ongoing projects that I’ve already blogged about.

Posted in Conferences, Personal, Reflection, Robotics | 2 Comments »