Critical Development

Language design, framework development, UI design, robotics and more.

TreeView for Xamarin.Forms – Open Source!

Posted by Dan Vanderboom on August 14, 2014

On a recent enterprise tablet app development project, I decided to use Xamarin and Xamarin.Forms to build Android and iOS apps. I’m having a blast and am delightfully productive compared to all the years I spent building enterprise mobile systems with Windows CE/Mobile.

However, a TreeView control is nowhere to be found. I decided to build my own, having committed to it in the project specs and having immediate need for it, but I didn’t want to bill my client for it. After some discussion, we agreed that I should develop the TreeView control separately as an open source library.

In the demo, tree nodes expand or collapse when you touch them, although currently there’s no visual feedback that a node has been pressed. The list scrolls, and the bar chart on the right side is actually part of each TreeNode’s item template.

I did some work in 2008 studying and writing about Tree data structures in .NET, and developed a simple but powerful non-binary Tree library.

Now I’m extending that library, adding a layer of ViewModel abstractions and a TreeView control to bind it to. I’ve created a GitHub repository of that Tree library as well as the spike I’ve done on the TreeView control itself, along with a working demo app.
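To give a flavor of what a non-binary tree looks like in code, here’s a minimal C# sketch of a generic tree node. The type and member names are illustrative only, not the actual API of the library in the repository linked below.

    using System.Collections.Generic;

    // Minimal sketch of a non-binary (n-ary) tree node. Names are
    // illustrative; the real library's API lives in the repository below.
    public class TreeNode<T>
    {
        private readonly List<TreeNode<T>> _children = new List<TreeNode<T>>();

        public T Value { get; set; }
        public TreeNode<T> Parent { get; private set; }
        public IReadOnlyList<TreeNode<T>> Children { get { return _children; } }
        public bool IsExpanded { get; set; } // drives expand/collapse in a TreeView

        public TreeNode(T value) { Value = value; }

        // Adds a child node and wires up its Parent reference.
        public TreeNode<T> Add(T value)
        {
            var child = new TreeNode<T>(value) { Parent = this };
            _children.Add(child);
            return child;
        }

        // Depth-first enumeration of this node and all of its descendants.
        public IEnumerable<TreeNode<T>> DepthFirst()
        {
            yield return this;
            foreach (var child in _children)
                foreach (var node in child.DepthFirst())
                    yield return node;
        }
    }

The IsExpanded flag is the sort of state a ViewModel layer would expose so a TreeView control can bind expansion to touch input.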

TreeView & Tree GitHub Repository

TreeViewDemo

This is a 6-hour spike, not a finished control library, but the initial prototype is working and proves the concept of its basic design. Next on the agenda: separating the node header’s item template out of the control, adding support for Xamarin.Forms data binding, and defining a NuGet package for it. Beyond those basics, there are a ton of features worth considering.
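As a rough sketch of the data binding piece, the usual Xamarin.Forms pattern is to expose a BindableProperty on the control, something along these lines. The TreeView members shown here are hypothetical, not the control’s settled API:

    using System.Collections;
    using Xamarin.Forms;

    // Hypothetical sketch of a bindable ItemsSource on the TreeView control;
    // the real control may end up looking different.
    public class TreeView : ContentView
    {
        public static readonly BindableProperty ItemsSourceProperty =
            BindableProperty.Create("ItemsSource", typeof(IEnumerable),
                typeof(TreeView), null,
                propertyChanged: (bindable, oldValue, newValue) =>
                    ((TreeView)bindable).RebuildNodes());

        public IEnumerable ItemsSource
        {
            get { return (IEnumerable)GetValue(ItemsSourceProperty); }
            set { SetValue(ItemsSourceProperty, value); }
        }

        // Regenerate the node views whenever the bound source changes.
        private void RebuildNodes()
        {
            // Walk ItemsSource and build node views from the item template here.
        }
    }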

If you’re interested in contributing to this project and you’re familiar with best practices in custom control development in Xamarin.Forms, leave me a comment or email me @ dan at highenergydesign dot net. You could help shape the design direction for this basic element of enterprise user interfaces.

Posted in Uncategorized | 2 Comments »

Phidgets Robotics Project – Source Code Posted

Posted by Dan Vanderboom on October 30, 2009

By request, I’ve made available the source code to my Wii remote controlled pan-tilt camera system using Phidgets components.  See my original article here.

Warning: It’s messy in there.  Don’t look to this as a good example of how robotics applications should be developed.  There are far better patterns and libraries to use.  This was merely my “get it up and running as quickly as possible” approach, to prove out the concept of getting all of the components to work together correctly.

Enjoy!

Posted in Uncategorized | Leave a Comment »

Misadventures in Pursuit of an Immutable Development Virtual Machine

Posted by Dan Vanderboom on May 23, 2008

Problem

Every three to six months, I end up having to rebuild my development computer.  This one machine is not only for development, but also acts as my communications hub (e-mail client, instant messenger, news and blog aggregator), media center, guitar effects and music recording studio, and whatever other roles are needed or desired at the time.  On top of that, I’m constantly installing and testing beta software, technical previews, and other unstable sneak peeks.  After several months of this kind of pounding, it’s a wonder the whole system doesn’t grind to a complete halt.

This is an expensive and tedious operation, not to mention the time lost to poor performance.  It normally takes me a day and a half, sometimes two days, to rebuild my machine with all of the tools I use on a regular or semi-regular basis.  Drivers, SDKs, and applications have to be installed in the correct order; product keys have to be entered and software activated over the Internet and set up the way I want; wireless network access and VPN connections have to be configured; and backups have to be made and application data restored once the applications are reinstalled.  There is never a good time for all of the down time.  A developer’s environment can be a very complicated place!

If it’s not error messages and corruption, it’s performance problems that hurt and annoy the most.  There’s a profound difference in speed between a clean build and one that’s been clogged with half a year or more of miscellaneous gunk.  It can mean the difference in Visual Studio, for example, between a 30 second build and three or four minutes of mindless disk thrashing.

Using an immutable development machine means that any viruses you pick up, and any registry or file corruption that occurs (any problems that arise in the state of the machine), never get saved, and therefore disappear the next time you start it up.  It is important, however, that everything within the environment is set up just the way you want it.  If you set up your image without ever opening Visual Studio, for example, you’ll always be prompted with a choice about the style of setup you want, and then you’ll have to wait for Visual Studio to set itself up for the first time, every time.

Still, if you invest a little today in establishing a solid environment, the benefits and savings over the next year or two can be well worth the effort.  As I discovered over the past week and a half, there are a number of pitfalls and dangers involved.  If you’ve considered setting up something similar, I hope the lessons I’ve learned will be able to save you much of the trouble that I went through.

Solution

After listening to Security Now! and several other podcasts over the past couple of years about virtual machines and how they’re used in software testing to ensure a consistent starting state, I began thinking of how nice it would be if I could guarantee that my development environment would always remain the same.  If I could get all of my tools installed cleanly to maximize performance and stability, and then freeze it that way, while still allowing me to change the state of my source code files and other development assets, I might never have to rebuild my computer again.  At least, not until it’s time to make important software upgrades, but then it would be a controlled update, from which I could easily roll back if necessary.

But how can this be done?  How can I create an immutable development environment and still be able to update my projects and save changes?  By storing mutable state on a separate drive, physical or virtual, which is not affected by virtual machine snapshots.  It turns out to be not so simple, but achievable nonetheless, and I’ll explain why and how.

If the perfect environment is to be discovered, I have several more criteria.  First, it has to support, or at least not prevent, the use of multiple monitors.  Second, I’d like to synchronize as much application data as possible across multiple computers.  Third, as I do a lot of mobile device development, I need access to USB and other ports for connecting to devices.

Implementation

For data synchronization across machines, I’ve been using Microsoft’s Mesh.com, which is based on FeedSync and is led by Ray Ozzie.  Based on my testing over the past two weeks, it actually works very well.  Though it’s missing many of the features you would expect from a mature synchronization platform and toolset, for the purposes of the goals explained in this article it’s a pretty good choice, and it has already saved me a lot of time I would otherwise have spent transferring files to external drives and USB keys, or e-mailing myself files and trying to get around file size and content limitations.  If this is the first time you’ve heard of Mesh, make a point of learning more about it, and sign up for the technical preview to give it a test drive!  (It’s free.)

Originally I wanted to use Virtual PC to create my development virtual machine; however, as of the 2007 version it still does not support USB access, so I had to rule it out immediately.  I downloaded a demo of VMWare Workstation instead, which does support USB and provides a very nice interface and set of tools for manipulating VMs.

The diagram below illustrates the basic system that I’ve created.  You’ll notice (at the bottom) that I have multiple development environments.  I can set these environments up for different companies or software products that each have unique needs or toolsets, and because they’re portable (unlike normal disk images), I can share them with other developers to get them up and running as quickly as possible.

Ideal Development Environment

Partitioning of the hard drive, or the use of multiple hard drives, is not strictly necessary.  However, as I’m working with a laptop and have only a single disk, I decided to partition it due to the many problems I had setting up all of the software involved.  I rebuilt the machine so many times, it became a real hassle to restore the application data over and over again, so putting it on a separate partition allowed me to reformat and rebuild the primary partition without constantly destroying this data.

My primary partition contains the host operating system, which is Windows XP Professional SP3 in my case.  (If you’re using Vista, be aware that Mesh will not work with UAC (user account control) disabled, which I find both odd and irritating.)  The host OS acts as my communication workstation where I read e-mail, chat over messenger, read blogs and listen to podcasts, surf the Internet and save bookmarks, etc.  I always want access to these functions regardless of the development environment I happen to have fired up at the time.

Mesh is installed only on the host operating system.  To install it on each virtual machine would mean keeping multiple copies of the same data on the same physical machine, and clearly this isn’t desirable.  I considered using VMWare’s ESXi server, which allows you to run virtual machines on bare metal instead of requiring a host operating system, but as I always want my communications and now also synchronization software running, their Workstation product seemed like an adequate choice, which is great because it’s only $189 at the time I’m writing this, as opposed to $495 for ESXi Server.

With the everyday software taken care of in the host OS, the development virtual machines can be set up unencumbered by these things, simplifying the VM images and reducing the number of friction points and potential problems that can arise from the interaction of all of this software on the same machine.  That alone is worth considering this kind of setup.

Setting up my development VM was actually easier than installing the host OS since I didn’t have to worry about drivers.  VMWare Workstation is very pleasant to use, and as long as the host OS isn’t performing any resource intensive operations (it is normally idle), the virtual machine actually runs at “near native speed” as suggested by VMWare’s website.  The performance hit feels similar to using disk encryption software such as TrueCrypt.  With a 2.4 GHz dual-core laptop, it’s acceptable even by my standards.  I’m planning to start using a quad-core desktop soon, and ultimately that will be a better hardware platform for this setup.

Hiccup in the Plan (Part 1)

The first problem I ran into was in attempting to transfer a virtual machine from one computer to another.  Wanting to set up my new super-environment on a separate computer from my normal day-to-day development machine, I set up VMWare on one computer and, when I thought I had the basics completed, attempted to transfer the virtual machine to my external hard drive.  Because the virtual disk files exceeded the maximum file size on the FAT32-formatted external drive (FAT32 caps individual files at 4 GB), I was immediately stopped in my tracks.  I tried copying the files over the local network from one computer to the other, but after 30 minutes of copying, Windows kindly informed me that a failure had occurred and the file could not, in fact, be found.  (Ugh.)  Lesson learned: VMWare has an option, when creating a new virtual machine, to break up virtual disks into 2 GB files.  From that point on, I decided not only to use this option, but also to simply build the virtual machines on the actual target computer, just in case.

The next trick with VMWare was in allowing the virtual machine to share a disk with the host operating system.  My first route was to set up a shared folder.  This is a nice feature, and it allows you to select a folder in your host OS to make visible to the virtual machine.  I thought it would be perfect.  However, Visual Studio saw it as non-local and therefore didn’t trust it.  In order to enable Visual Studio to trust it, you have to change machine-level security configuration (in the VM).  There are two ways of doing this: there is a .NET Configuration tool (mscorcfg.msc) with a nice user interface, and then there is the command-line caspol.exe tool with lots of confusing options and syntax to get right.
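For reference, the caspol incantation involved looks something like this, assuming the shared folder is mounted in the VM as drive Z:.  The code group number is only an example (1.2 is commonly the machine-level intranet zone group, but the numbering varies by machine), so treat this as a sketch rather than a recipe:

    rem Grant full trust to assemblies loaded from the shared folder (Z:).
    rem The code group number 1.2 varies by machine; list groups first with:
    rem   caspol -machine -listgroups
    caspol -machine -addgroup 1.2 -url "file://Z:/*" FullTrust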

Navigating to Administrative Tools, I was stumped to find that I didn’t have this nice GUI tool any more.  I’ve fully converted everything to Visual Studio 2008 and no longer work in 2005, so the last time I built my machine, I ran the VS2008 install DVD.  I learned the hard way that Microsoft no longer includes this tool in the new Visual Studio 2008 install DVD.  I Googled around to discover that Microsoft, for reasons unknown, did in fact decide to remove this tool from their installer, and that if I wanted to have it, I would have to install (first) the .NET 2.0 Redistributable, (second) the .NET 2.0 SDK, and (finally) Visual Studio 2008.  This would mean rebuilding the VM… again.  I tried caspol.exe briefly and finally gave up (the example I found in the forums didn’t work), telling myself that I really did want this GUI tool anyway, and if I was going to set up the perfect development environment, it was worth one more rebuild to get it right.

Several blue screens of death, some puzzling file corruption, and two reinstallations later, I was thinking that the prescribed solution I was attempting wasn’t all it was cracked up to be after all.  Whoever suggested it was messing with me, and was probably crazy anyway.  I did eventually get these components installed and working by simply repeating the procedure, and after using the configuration tool, Visual Studio did seem pretty happy to open the solutions and build them for me.

Until I opened my other solution and tried to build it, that is.  I keep custom controls and designers in a separate solution because of a post-build task that runs and registers the control designers (which is itself an infuriating requirement, but that’s for another article).  Whenever I tried building these projects, I would get an error that access was denied to create some kind of resource file.  I looked at the properties of the shared folder and saw that the file system claimed to be HPFS instead of NTFS.  That label comes from VMWare’s shared-folder mechanism, a host-guest file system that tunnels through to the real underlying storage rather than presenting a true NTFS volume, and I wouldn’t be surprised if that had something to do with my problem.  Visual Studio does some finicky things, and VMWare certainly does its share of hocus pocus.  Figuring out the possible interaction between them was going to be beyond my voodoo abilities and resources, so I had to find another way around this shared disk situation if I planned on developing custom controls in this environment.

Hiccup in the Plan (Part 2)

My Dell Latitude D830 is four months old.  I requested a Vostro but was absolutely refused by the company I work for, who shall remain nameless.  Supposedly the Latitudes are a “known quantity”, will have fewer problems, and are therefore better for support reasons.  Regardless, the D830 is for the most part a good, fast machine.  This one in particular, however, became a monster in the past week while I was trying to get this new setup working, costing me a full week of lost time and a great deal of frustration.  Every time I thought I had isolated the cause, some other misbehavior appeared to confuse matters.  Each step of troubleshooting and repair seemed reasonable at the time, and yet as new symptoms emerged, the dangling carrot moved just beyond my reach, my own modern reenactment of Tantalus’s torment.

Not only was I getting Blue Screens of Death several times a day, but CHKDSK would start up before Windows and all kinds of disk problems would be discovered, such as orphaned files and bad indexes.  Furthermore, the same things were happening with the virtual disks, and VMWare reported fragmentation on those disks immediately after installing the operating system and a few tools.  There were folders and files I couldn’t rename or delete, and running the Dell diagnostics software turned up nothing at all.

Finally, since I have a second D830 laptop, the Dell tech suggested swapping hard drives.  After installing my whole environment, plus the VMs, about a dozen times, I really didn’t feel like going through this yet again, but it seemed like a reasonable course of action, and so I went through the process once more.  Getting almost all the way through everything without a problem, I finally (with a smile) rebooted my VM and waited for it to come back up, only to see CHKDSK run and find many pages of problems once again.

Warning: The great thing about Mesh is that you can make changes to your files, such as source code, recompile, and all of those changes shoot up into the cloud in a magical dance of synchronization, and those changes get pushed down to all the other computers in your mesh.  What’s scary, though, is when you have a hard drive with physical defects that corrupts your files.  Those corruptions also get pushed up to the cloud, and this magically corrupts the data on all of the computers in your mesh.  So be aware of this!

The Value of Offline Backups

Make backups.  Check your code into version control.  Mesh is a great tool for synchronizing data, and initially I thought this would be sufficient for backups of at least some of my data, but it falls short in several ways.

First, Mesh doesn’t version your backups.  Once you make a change and you connect to the Internet, everything gets updated immediately.  If data is accidentally deleted or corrupted, these operations will replicate to the cloud and everywhere in your mesh.  Versioned backups, as snapshots in time, are the only way to ensure that you can recover historical state if things go awry as they did for me.

Second, Mesh is great for synchronizing smaller, discrete files that either have no metadata attached, or whose metadata lives in files within the same synchronized folder structure.  By the latter I mean systems such as Visual Studio projects: source files are referenced by project files, and project files by solution files, but all of these are themselves small, discrete files that can be seamlessly synchronized.  When I add a file to a project and save, Mesh will update the added file as well as the project file.

Application data that doesn’t work well is any kind of monolithic data store, such as a SQL Server database or an Outlook data file (.pst).  Every time you got an e-mail and your .pst file changed, the whole file would be sent up to Mesh, and if your e-mail files get as large as mine, that would be a problem.  Hopefully in the future, plug-ins will be developed that can intelligently synchronize this kind of data through Mesh as well.

I’m using and highly recommend Acronis TrueImage for backups.  It really is a modern, first-rate backup solution.

Conclusion

In the end, Dell came and replaced my motherboard, hard drive, and DVD-RW drive (a separate problem!), and I was able to get back to building my immutable development environment.  Instead of using shared folders, VMWare lets you add a hard drive that is actually the physical disk itself, or a partition of it.  Unfortunately, VMWare doesn’t let you take a snapshot of a virtual machine that has such a physical disk mounted.  I don’t know why, and I’m sure there’s a reason, but the situation does suck.  The way I’ve gotten around it is to finish setting up my environment without the additional disk mounted, take a snapshot, and then add the physical disk.  I’ll run with it set up that way for a day or two, allowing state to change, and then I’ll remove the physical disk from the virtual machine, revert to the latest snapshot, and add the physical disk back in to start it up again.  This back-and-forth juggling of detaching and attaching the physical disk is less than ideal, but ultimately not as bad as the alternative of not having an immutable environment, and I haven’t had the last word quite yet.
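Part of that juggling can at least be scripted.  VMWare Workstation ships with a vmrun command-line tool, so a small batch file along these lines can handle the revert-and-restart dance; the paths and snapshot name below are made up, and attaching or detaching the physical disk still has to happen in the Workstation UI (or by editing the .vmx) before the revert:

    rem Revert the dev VM to its clean snapshot, then boot it back up.
    rem Paths and the snapshot name "CleanBaseline" are hypothetical.
    set VMRUN="C:\Program Files\VMware\VMware Workstation\vmrun.exe"
    set VMX="D:\VMs\DevEnv\DevEnv.vmx"
    %VMRUN% revertToSnapshot %VMX% CleanBaseline
    %VMRUN% start %VMX%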

I’ll continue to research and experiment with different options, and will work with VMWare (and perhaps Xen) to come up with the best possible arrangement.  And what I learn I will continue to share with you.

Posted in Custom Controls, Development Environment, Uncategorized, Virtualization, Visual Studio | 5 Comments »