Critical Development

Language design, framework development, UI design, robotics and more.

Archive for the ‘ADO.NET Data Services’ Category

AssignAsync Extension Method for ADO.NET Data Services

Posted by Dan Vanderboom on November 10, 2009

ADO.NET Data Services is a rapidly evolving set of tools that provides data access to remote clients through a set of REST-based services.  The Data Services Client Library for .NET performs the magic of translating your Linq queries to URLs and passing them to the data service back-end, as well as retrieving results and hydrating objects in the client to represent them.
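To make that concrete, here’s a rough sketch of what the client library does with a query; the MyEntities context, Customers entity set, and service address are purely illustrative:

var context = new MyEntities(
    new Uri("http://localhost/MyDataService.svc", UriKind.Absolute));

// Nothing executes locally here.  When the query runs, the client library
// translates it into a request URI along the lines of
// /Customers?$filter=City eq 'Milwaukee', sends it to the service, and
// hydrates Customer objects from the response.
var customers = from c in context.Customers
                where c.City == "Milwaukee"
                select c;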

After running into a number of problems with the current CTP of RIA Services (see my article), I decided to fall back on Data Services to provide data access in my newest project.  Data Services has the advantage of allowing you to write fairly normal Linq queries against Entity Framework entity sets, and entity data models can reside in a dedicated data model assembly (instead of requiring them to be part of the web project).

One of the differences that remain when using Data Services in Silverlight, as opposed to accessing an Entity Framework ObjectContext directly, is that Silverlight doesn’t allow synchronous network calls.  So code like this, which would force a synchronous call (with FirstOrDefault), will fail in Silverlight:

var result = (from p in context.Properties
              where p.Required
              select p).FirstOrDefault();

This forces us to adopt some new patterns for data access.  That isn’t a bad thing, however; it’s part of an inevitable transition toward asynchronous, concurrent program logic.

Here’s a typical example of querying data with Data Services in Silverlight:

var RequiredProperties = from p in context.Properties
                         where p.Required
                         select p;

var dsq = RequiredProperties as DataServiceQuery<Property>;
dsq.BeginExecute(ar =>
    {
        var result = dsq.EndExecute(ar);
        // do something with the result
    }, null);

When using a lambda statement for brevity, the syntax isn’t too bad, but the pattern gets a little more involved when you include error handling logic.  If EndExecute fails, you’ll need the ability to perform some compensating action.
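Spelled out by hand, with a compensating action included, each query ends up looking roughly like this (Property stands in for whatever entity type the query returns, and properties is a hypothetical field receiving the result):

var dsq = RequiredProperties as DataServiceQuery<Property>;
dsq.BeginExecute(ar =>
    {
        IEnumerable<Property> result = null;
        try
        {
            // EndExecute is where a failed request surfaces as an exception
            result = dsq.EndExecute(ar);
        }
        catch (Exception ex)
        {
            // compensating action: log it, notify the user, retry, etc.
            Debug.WriteLine(ex);
            return;
        }
        properties = result;
    }, null);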

So what I’ve done to keep my client code simple is to define an extension method called AssignAsync that encapsulates this whole pattern.

using System;
using System.Collections.Generic;
using System.Data.Services.Client;

public static class DataServicesExtensions
{
    // Starts the query asynchronously; on success the results are handed to
    // Assignment, and if EndExecute throws, the exception is handed to Fail.
    public static void AssignAsync<T>(this IEnumerable<T> expression, 
        Action<IEnumerable<T>> Assignment, 
        Action<Exception> Fail)
    {
        var dsq = expression as DataServiceQuery<T>;
        dsq.BeginExecute(ar =>
            {
                IEnumerable<T> result = null;
                try
                {
                    result = dsq.EndExecute(ar) as IEnumerable<T>;
                }
                catch (Exception ex)
                {
                    Fail(ex);
                    return;
                }
                Assignment(result);
            }, null);
    }
}

This enables me to write the following code:

var RequiredProperties = from p in context.Properties
                         where p.Required
                         select p;
RequiredProperties.AssignAsync(result => properties = result, 
    ex => Debug.WriteLine(ex));

In other words: if the query succeeds, assign the result to the properties collection; if it fails, send the exception object to Debug output.  Either action can be used to send signals to other parts of your application that will respond appropriately.  Instead of Debug.WriteLine, you might add the exception object to some collection that triggers an error dialog to appear and your logging framework to record the event.  Instead of assigning the result to a simple collection, you could convert it to an ObservableCollection and assign it to an ItemsControl in WPF or Silverlight.  Anything is possible.
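For instance, here’s a minimal sketch of the ObservableCollection case.  PropertyList and Errors are hypothetical ObservableCollection members on a view model, with PropertyList bound to an ItemsControl’s ItemsSource; and because BeginExecute’s callback doesn’t arrive on the UI thread, the sketch marshals back through the Dispatcher before touching the bound collection:

RequiredProperties.AssignAsync(
    result => Deployment.Current.Dispatcher.BeginInvoke(() =>
    {
        // repopulate the bound collection; the ItemsControl picks up the changes
        PropertyList.Clear();
        foreach (var p in result)
            PropertyList.Add(p);
    }),
    ex => Errors.Add(ex));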

As I explore Data Services further, I will be looking for ways to share query and other model-centric logic between Silverlight and non-Silverlight clients.  I suspect that the same asynchronous patterns can be used in non-Silverlight projects as well, and that those projects will benefit from this query style.

Posted in ADO.NET Data Services, Design Patterns | Leave a Comment »

MSDN Developer Conference in Chicago

Posted by Dan Vanderboom on January 13, 2009

I just got home to Milwaukee from the MSDN Developer Conference in Chicago, about a two-hour drive.  I knew it would be a rehash of the major technologies revealed at the PDC, which I attended in November, so I wasn’t sure how much value I’d get out of it, but I had a bunch of questions about the new technologies (Azure, Oslo, Geneva, VS2010, .NET 4.0, new language stuff), and it just sounded like fun to go out to Fogo de Chao for dinner (a wonderful Brazilian steakhouse, with great company).

So despite my reservations, I’m glad I went.  I think it also helped that I’ve had since November to research and digest all of this new stuff, so that I could be ready with good questions to ask.  There’ve been so many new announcements, it’s been a little overwhelming.  I’m still picking up the basics of Silverlight/WPF and WCF/WF, which have been out for a while now.  But that’s part of the fun and the challenge of the software industry.

Sessions

With some last minute changes to my original plan, I ended up watching all four Azure sessions.  All of the speakers did a great job.  That being said, “A Lap Around Azure” was my least favorite content because it was so introductory and general.  But the opportunity to drill speakers for information, clarification, or hints of ship dates made it worth going.

I was wondering, for example, whether the ADO.NET Data Services Client Library, which normally talks to a data service backed by SQL Server, can also be used to point to a SQL Data Services endpoint in the cloud.  And I’m really excited to know now that it can, because that means we can use real LINQ (not weird LINQ-like syntax embedded in a URI).  And don’t forget Entities!

I also learned that though my Mesh account (which I love and use every day) is beta, there’s a CTP available for developers that includes new features like tracking of Mesh Applications.  I’ve been thinking about Mesh a lot, not only because I use it, but because I wanted to determine if I could use the synchronization abilities in the Mesh API to sync records in a database.

<speculation Mode=”RunOnSentence”>
If Microsoft is building this entire ecosystem of interoperable services, and one of them does data storage and querying (SQL Data Services), and another does synchronization and conflict resolution (Mesh Services)–and considering how Microsoft is making a point of borrowing and building on existing knowledge (REST/JSON/etc.) instead of creating a new proprietary stack–isn’t it at least conceivable that these two technologies would at some point converge in the future into a cloud data services replication technology?
</speculation>

I’m a little disappointed that Ori Amiga’s Mesh Mobile wasn’t mentioned.  It’s a very compelling use of the Mesh API.

The other concern I’ve had lately is the apparent immaturity of SQL Data Services.  As far as what’s there in the beta, it’s tables without enforceable schemas (so far), basic joins, no grouping, no aggregates, and a need to manually partition across virtual instances (and therefore to also deal with the consequences of that partitioning, which affects querying, storage, etc.).  How can I build a serious enterprise, Internet-scale system without grouping or aggregates in the database tier?  But as several folks suggested and speculated, Data Services will most likely have these things figured out by the time it’s released, which will probably be the second half of 2009 (sooner than I thought).

Unfortunately, if you’re using Mesh to synchronize a list of structured things, you don’t get the rich querying power of a relational data store; and if you use SQL Data Services, you don’t get the ability to easily and automatically synchronize data with other devices.  At some point, we’ll need to have both of these capabilities working together.

When you stand back and look at where things are going, you have to admit that the future of SQL Data Services looks amazing.  And I’m told this team is much further ahead than some of the other teams in terms of robustness and readiness to roll out.  In the future (post 2009), we should have analytics and reporting in the cloud, providing Internet-scale analogues to their SQL Analysis Server and SQL Reporting Services products, and then I think there’ll be no stopping it as a mass adopted cloud services building block.

Looking Forward

The thought that keeps repeating in my head is: after we evolve this technology to a point where rapid UX and service development is possible and limitless scaling is reached in terms of software architecture, network load balancing, and hardware virtualization, where does the development industry go from there?  If there are no more rungs of the scalability ladder we have to climb, what future milestones will we reach?  Will we have removed the ceiling of potential for software and what it can accomplish?  What kind of impact will that have on business?

Sometimes I suspect the questions are as valuable as the answers.

Posted in ADO.NET Data Services, Conferences, Distributed Architecture, LINQ, Mesh, Oslo, Service Oriented Architecture, SQL Analysis Services, SQL Data Services, SQL Reporting Services, SQL Server, Virtualization, Windows Azure | 1 Comment »

Hosted ADO.NET Data Services & Silverlight

Posted by Dan Vanderboom on January 7, 2009

I’ll be the first to admit I’m a novice in the world of web development, so I expected a learning curve and my fair share of hurdles as I started a new project in Silverlight and ASP.NET to be hosted in the cloud.

After a brief stint writing ASP.NET 1.0 code years ago, I quickly learned to loathe web programming and have spent the past five years doing thick-client and mobile device development instead.  But the technologies that have emerged since then (Silverlight 2.0, ADO.NET Data Services, LINQ to Entities), all accessible from within the browser, have forced me to take a second look.  The absolute coolness I’m witnessing suggests that my newest project should take advantage of this much richer Microsoft Interweb stack.

I spent the past month furiously studying WPF in all its gory detail.  I even wrote a 3D demo, using a hot racing car from Google’s 3D Warehouse, converting that model into XAML with 3DPaintBrush (15 day trial available), and animating it through code.  I plan to hook up my Wii remote code to translate twists and turns of the remote into RotateTransforms.

But now I’m learning the differences between Silverlight and WPF, and even the difference between hosting a Silverlight application in a development environment and hosting it with a provider like DiscountASP.net.

In short, I’m very impressed with everything so far, but I ran into a problem with ADO.NET Data Services.  I enjoyed watching the DNRTV episode by Shawn Wildermuth on Silverlight data access, and I followed along to build my application just as he did.  When I published to my web host, however, I ran into problems.

"This collection already contains an address with scheme http.  There can be at most one address per scheme in this collection."

Not being a web guy, I have no idea what this means.  Fortunately, a little bird told me that http://danvanderboom.com and http://www.danvanderboom.com, although they both point to the same place, provide multiple base addresses, and this is apparently too confusing for WCF services to handle.

When the ServiceHost is created, an array of base addresses is passed in.  If it gets more than one, it wigs out.  Rob Zelt posted a good article on his blog about resolving this issue, which involves changing the .svc file’s markup to something like this:

<%@ ServiceHost Language="C#" Factory="CustomHostFactory" 
    Service="devCatalyst.MyDataService" %>

And code like this:

using System;
using System.ServiceModel;
using System.ServiceModel.Activation;

class CustomHostFactory : ServiceHostFactory
{
    protected override ServiceHost CreateServiceHost(Type serviceType, Uri[] baseAddresses)
    {
        // pass along only the first base address, so the host sees a single scheme
        return new CustomHost(serviceType, baseAddresses[0]);
    }
}

class CustomHost : ServiceHost
{
    public CustomHost(Type serviceType, params Uri[] baseAddresses)
        : base(serviceType, baseAddresses)
    { }
}

This is fine for regular WCF services, but when using ADO.NET Data Services, there’s already a factory defined.

<%@ ServiceHost Language="C#"
    Factory="System.Data.Services.DataServiceHostFactory, System.Data.Services, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" 
    Service="FunWithData.Web.MyDataService" %>

In order to supply your own custom host factory without losing any of the functionality of the Data Services host factory, you’ll need to inherit from that class like this:

public class CustomHostFactory : DataServiceHostFactory
{
    protected override ServiceHost CreateServiceHost(Type serviceType, Uri[] baseAddresses)
    {
        // hand the Data Services host a single base address
        return base.CreateServiceHost(serviceType, new Uri[] { baseAddresses[0] });
    }
}

In the CreateServiceHost method, note how I create a new Uri array, adding only a single Uri.  Rob Zelt’s code used an index of 1 erroneously, which will work fine so long as you always have at least two base addresses.  But in one of my tests, I only had one, and thereby found the bug.

I also eliminated the need to define CustomHost, instead simply calling base.CreateServiceHost from within CustomHostFactory, passing in my new Uri array.

Actually, the very first error I got was for having Windows authentication set up in my Web.config file, but removing that line was easy.
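If you haven’t run across it before, that’s the standard ASP.NET authentication element in the system.web section of Web.config, something like this:

<system.web>
  <!-- removing this line cleared the first error -->
  <authentication mode="Windows" />
</system.web>

After writing the custom ServiceHost factory, that error went away and was replaced by this one: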

"IIS specified authentication schemes ‘Basic, Anonymous’, but the binding only supports specification of exactly one authentication scheme. Valid authentication schemes are Digest, Negotiate, NTLM, Basic, or Anonymous. Change the IIS settings so that only a single authentication scheme is used."

It took me a few minutes to find the appropriate area of my web host’s control panel to set authentication schemes, but once I did, I noticed that both Basic and Anonymous were enabled.  I disabled Basic authentication and then the application came to life.

Except there was no data actually displayed.  My relative Uri, copied from Mr. Wildermuth’s example, wasn’t cooperating.

var context = new MyEntities(
    new Uri("/TestDataService.svc", UriKind.Relative));

By changing it to an absolute Uri, it finally worked:

var context = new MyEntities(
    new Uri("http://danvanderboom.com/Test/TestDataService.svc", UriKind.Absolute));

As simple as 1-2-3?  Not quite.  After hours of researching the issues and surreptitiously talking to a web development guru, I was able to throw something together and get it working.  Hopefully these tips can save someone else the time and frustration that I encountered.

Posted in ADO.NET Data Services, ASP.NET, Silverlight | 4 Comments »