If you've worked with ASP.NET Dynamic Data before, you know that you can customize the scaffolding by creating a partial class for the generated Entity Framework/Linq-to-SQL classes. In this partial class, you specify another class as the metadata class, and in the metadata class you create public fields with the same names as the public properties of your data-access classes and use the attributes in System.ComponentModel.DataAnnotations and System.ComponentModel to customize the scaffolding.
There is, of course, no IntelliSense ... you have to get all the field names right manually. And given that there are no "real" dependencies, you won't get any exceptions or compile-time errors either if you get them wrong.
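For a hypothetical Customer entity, such a hand-written pair might look like this:
[MetadataType(typeof(Customer.Metadata))]
public partial class Customer
{
    public class Metadata
    {
        // hide the technical key from the scaffolded UI
        [ScaffoldColumn(false)]
        public object CustomerID;

        // the company name is mandatory and limited to 40 characters
        [Required]
        [StringLength(40)]
        public object CompanyName;
    }
}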
For me, this gets old pretty quickly, so I've written a small code generator that produces the initial scaffolding classes so you don't have to write them by hand. It loads an assembly, looks for all Entity Framework and Linq-to-SQL entity classes, and generates the partial classes and metadata classes as a starting point for your customization.
The code is so short that I hardly dare to post it, but I figured it might help one developer or another. It should work for both Entity Framework and Linq-to-SQL. (Note: you can also download the complete project and a ready-to-run binary at the end of this post.)
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Reflection;
using System.Data.Objects.DataClasses;
using System.Data.Linq.Mapping;

namespace DynamicDataCodeGenerator
{
    public class MetadataGenerator
    {
        public static string GenerateMetadata(string assemblyName)
        {
            StringBuilder bld = new StringBuilder();
            Assembly asm = Assembly.LoadFrom(assemblyName);

            foreach (Type type in asm.GetTypes())
            {
                // Entity Framework entities are marked with [EdmEntityType]
                EdmEntityTypeAttribute[] atts = (EdmEntityTypeAttribute[])type.GetCustomAttributes(typeof(EdmEntityTypeAttribute), true);
                if (atts.Length == 1)
                {
                    AppendCodeForType(bld, type);
                }

                // Linq-to-SQL entities are marked with [Table]
                TableAttribute[] tableAtts = (TableAttribute[])type.GetCustomAttributes(typeof(TableAttribute), true);
                if (tableAtts.Length == 1)
                {
                    AppendCodeForType(bld, type);
                }
            }
            return bld.ToString();
        }

        private static void AppendCodeForType(StringBuilder bld, Type type)
        {
            // qualify the nested Metadata class so the generated attribute resolves regardless of where it's pasted
            bld.Append("[MetadataType(typeof(").Append(type.Name).Append(".Metadata))]").AppendLine();
            bld.Append("public partial class ").Append(type.Name).AppendLine();
            bld.Append("{").AppendLine();
            bld.Append("    [ScaffoldTable(true)]").AppendLine();
            bld.Append("    public class Metadata").AppendLine();
            bld.Append("    {").AppendLine();

            // emit one metadata placeholder field per public property of the entity
            foreach (PropertyInfo prop in type.GetProperties())
            {
                bld.Append("        [ScaffoldColumn(true)]").AppendLine();
                bld.Append("        public object ").Append(prop.Name).AppendLine(";");
                bld.AppendLine();
            }

            bld.Append("    }").AppendLine();
            bld.Append("}").AppendLine();
            bld.AppendLine();
            bld.AppendLine();
        }
    }
}
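Using it is then just a matter of pointing the generator at your data-access assembly and saving the output somewhere, for example with a minimal console driver like this (the file names are of course up to you):
using System.IO;

class Program
{
    static void Main(string[] args)
    {
        // args[0]: path to the assembly containing your EF/Linq-to-SQL classes
        string code = DynamicDataCodeGenerator.MetadataGenerator.GenerateMetadata(args[0]);
        File.WriteAllText("GeneratedMetadata.cs", code);
    }
}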
As some of you know, I enjoy working on high-throughput web applications. One thing I don't particularly like, however, is building the necessary backend administration tools. In a lot of the applications I've worked on, data is fed in via various XML-ish or proprietary data feeds and is usually 99.9% correct. For the remaining 0.1%, support personnel need some way to maintain and change data in every way imaginable. This usually means creating CRUD-style data maintenance applications for dozens or, more likely, hundreds of tables.
As this doesn't really sound like a lot of fun, I had high hopes that ASP.NET Dynamic Data would help our clients a bit in this regard. And yes, it does! Now, to be fair: it's version one and lacks a few features that would be nice to have. But the cool thing is that it's extremely extensible.
As those of you who've looked at Dynamic Data before will know, it bases its view of the world on annotated partial classes for the code generated from an Entity Framework EDMX or for Linq-to-SQL classes. This also means that, by default, you get exactly one representation for each class (one set of configured columns). There is no built-in, automatic way to configure different views with different columns in different orders for each table. For most of the applications my clients work with, this is unfortunately not enough: their users usually want to show or hide different columns depending on the particular use case.
What I wanted to do instead was to define multiple views (say, in an XML file) and assign a key to each view that can then be used to retrieve the desired layout. (Note: the following sample is based on an Entity Framework Dynamic Data web site built on the Northwind database ... but I'm sure you can follow along even without this setup.)
The first step was to define the different columns I'd like to show in the different contexts. To do this, I created a file called ColumnConfiguration.config in the web site's root directory:
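Its content looks roughly like this (the root element name is arbitrary; the field generator shown below only cares about the Configuration elements and their Name, Table and Columns attributes, and the Phonelist entry is the one used at the end of this post):
<Configurations>
  <Configuration Name="Default" Table="Customers" Columns="*" />
  <Configuration Name="Phonelist" Table="Customers" Columns="CompanyName,ContactName,ContactTitle,Phone,Country" />
</Configurations>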
I then created a custom field generator (loosely based on the one that was published last year with the ASP.NET Dynamic Data futures ... from which I also took the ColumnOrderAttribute, which I include below for completeness' sake).
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.DynamicData;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Xml;

public class ConfigurationBasedFieldGenerator : IAutoFieldGenerator
{
    private MetaTable _table;
    private bool _multiItemMode;
    private string _configurationName;

    // configuration cache: key is "<table name>|<configuration name>", value is the list of column names
    public static Dictionary<string, List<string>> _configurations = new Dictionary<string, List<string>>();
    public static DateTime _nextRefresh = DateTime.MinValue;

    public ConfigurationBasedFieldGenerator(MetaTable table, bool multiItemMode)
    {
        _table = table;
        _multiItemMode = multiItemMode;
        // the desired view is selected via the "config" query string parameter (e.g. List.aspx?config=Phonelist)
        _configurationName = HttpContext.Current.Request.QueryString["config"] ?? "";
    }

    private void EnsureConfigurations()
    {
        lock (_configurations)
        {
            if (_nextRefresh < DateTime.Now)
            {
                Dictionary<string, List<string>> newConfigs = new Dictionary<string, List<string>>();
                using (XmlReader rdr = new XmlTextReader(HttpContext.Current.Server.MapPath("~/ColumnConfiguration.config")))
                {
                    rdr.Read();
                    rdr.ReadToDescendant("Configuration");
                    while (rdr.Name == "Configuration")
                    {
                        string name = rdr.GetAttribute("Name");
                        string table = rdr.GetAttribute("Table");
                        string columns = rdr.GetAttribute("Columns");
                        List<string> tmp = new List<string>(columns.Split(','));
                        for (int i = 0; i < tmp.Count; i++)
                        {
                            tmp[i] = tmp[i].Trim();
                        }
                        newConfigs[table + "|" + name] = tmp;
                        rdr.ReadToNextSibling("Configuration");
                    }
                }
                _configurations = newConfigs;
                // re-read the configuration file at most once a minute
                _nextRefresh = DateTime.Now.AddMinutes(1);
            }
        }
    }

    private bool IncludeField(MetaColumn column)
    {
        // hide long text columns in multi-item (list) mode, otherwise honor the scaffolding metadata
        if (_multiItemMode && column.IsLongString) return false;
        return column.Scaffold;
    }

    private ColumnOrderAttribute ColumnOrdering(MetaColumn column)
    {
        // ColumnOrderAttribute implements IComparable, so it can be used directly in the orderby below
        return column.Attributes.OfType<ColumnOrderAttribute>().DefaultIfEmpty(ColumnOrderAttribute.Default).First();
    }

    public ICollection GenerateFields(Control control)
    {
        EnsureConfigurations();

        bool isWildcardConfig = true;
        string key = _table.Name + "|" + _configurationName;
        List<string> columnsToInclude = null;
        bool hasKey = _configurations.TryGetValue(key, out columnsToInclude);
        if (hasKey)
        {
            if (columnsToInclude.Count == 0 || columnsToInclude[0] != "*")
            {
                isWildcardConfig = false;
            }
        }

        if (isWildcardConfig)
        {
            // use standard config, ordered by ColumnOrderAttribute - borrowed from an older version
            // of the ASP.NET futures
            var fields = from column in _table.Columns
                         where IncludeField(column)
                         orderby ColumnOrdering(column)
                         select new DynamicField()
                         {
                             DataField = column.Name,
                             HeaderText = column.DisplayName
                         };
            return fields.ToList();
        }
        else
        {
            var allFields = from column in _table.Columns
                            where IncludeField(column)
                            select new DynamicField()
                            {
                                DataField = column.Name,
                                HeaderText = column.DisplayName
                            };

            // keep only the configured columns, in exactly the configured order
            List<DynamicField> fields = new List<DynamicField>();
            foreach (string col in columnsToInclude)
            {
                DynamicField field = allFields.FirstOrDefault(p => p.DataField == col);
                if (field != null) fields.Add(field);
            }
            return fields;
        }
    }
}
// borrowed from an older version of the asp.net futures
[AttributeUsage(AttributeTargets.Property | AttributeTargets.Field, Inherited = true, AllowMultiple = false)]
public class ColumnOrderAttribute : Attribute, IComparable
{
    public static ColumnOrderAttribute Default = new ColumnOrderAttribute(0);

    public ColumnOrderAttribute(int order)
    {
        Order = order;
    }

    /// <summary>
    /// The ordering of a column. Can be negative.
    /// </summary>
    public int Order { get; private set; }

    public int CompareTo(object obj)
    {
        return Order - ((ColumnOrderAttribute)obj).Order;
    }
}
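Applied in a metadata class, the attribute looks just like the other annotations; a hypothetical example for the Customer table:
[MetadataType(typeof(Customer.Metadata))]
public partial class Customer
{
    public class Metadata
    {
        [ColumnOrder(-1)]   // negative values move a column to the front
        public object CompanyName;

        [ColumnOrder(10)]
        public object Phone;
    }
}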
To enable this custom field generator, I've changed the default List.aspx (and the other views as well) to explicitly use it:
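In the code-behind, this boils down to handing the generator to the GridView once the MetaTable is known; the relevant part of List.aspx.cs looks roughly like this (the surrounding code depends on your project template):
protected void Page_Load(object sender, EventArgs e)
{
    table = GridDataSource.GetTable();

    // replace the default column generation with the configuration-based generator
    GridView1.ColumnsGenerator = new ConfigurationBasedFieldGenerator(table, true);
}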
Et voilà. If I browse to http://localhost/Customers/List.aspx, I get the full list (as configured in the metadata), but if I browse to http://localhost/Customers/List.aspx?config=Phonelist, I only get the fields that have been configured in ColumnConfiguration.config (CompanyName, ContactName, ContactTitle, Phone, Country ... exactly in this sequence!). I could of course also define more than one additional view simply by adding entries to the configuration file.
And to take this one step further, you could imagine also adding attributes like "RoleName" to the configuration file and automatically checking the user's role membership, to ensure that only users with the correct access rights can view certain combinations of columns.
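A rough sketch of what such a check could look like inside the field generator (the Roles attribute, the _requiredRoles dictionary and the fallback behavior are hypothetical additions, not part of the code above):
// hypothetical: EnsureConfigurations would additionally read an optional Roles attribute
// per Configuration element and store it here, keyed like _configurations
private static Dictionary<string, string> _requiredRoles = new Dictionary<string, string>();

// returns true if the current user is allowed to use the given configuration key
private bool UserMayUseConfiguration(string key)
{
    string requiredRole;
    if (!_requiredRoles.TryGetValue(key, out requiredRole) || string.IsNullOrEmpty(requiredRole))
    {
        return true;   // no role restriction configured for this view
    }
    return HttpContext.Current.User.IsInRole(requiredRole);
}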
At next week's TechDays conference in Antwerp, Belgium, I wanted to show an example of how an Azure worker role can update data in a user's Live Mesh desktop. Instead of doing it in the common "demo" way of simply supplying a hardcoded username/password combination (or, in a similar vein, by simply storing the user's Mesh username/password in a database), I wanted to make this as real as possible.
Reality, of course, means delegated authentication. Now, there are a few things harder than getting this to work for the first time, but I'd be hard pressed to name a lot of them. To make a long story short, it seems that delegated authentication is currently only possible for Mesh applications (web sites which run directly inside the user's mesh ... think "Facebook app" for Mesh). But that's not what I wanted ... I wanted to asynchronously update data in the Mesh at a time when the user is not online.
It turns out that this is absolutely possible: you can simply act as if your external web site were a mesh application by registering a "new mesh application" (in the Azure portal) and sending the resulting AppID to Live's delegated authentication URL when your external web site needs the user's consent to interact with Mesh data. Behind the scenes, this will however also cause your application to be installed in the user's Mesh desktop, so I ended up with an "empty" application on my Mesh desktop. But that's something I can absolutely live with.
What I can't live with is something I noticed a few hours later. I planned to deploy the part of my application that uploads a file to the user's Mesh to Azure. For my first tests, I simply used a console application that references Microsoft.LiveFX.Client, Microsoft.LiveFX.ResourceModel and Microsoft.Web from the Live Framework SDK. I created a folder called "DemoFolder" in my Mesh desktop and ran code similar to the following (which worked perfectly well in the console application):
static void UploadWithAPI()
{
    string token = "... delegation token I've received from Live ....";

    LiveOperatingEnvironment loe = new LiveOperatingEnvironment();
    Uri serviceUrl = new Uri("https://user-ctp.windows.net/V0.1");

    Console.WriteLine("Connecting");
    loe.Connect(token, AuthenticationTokenType.DelegatedAuthToken,
        serviceUrl, new LiveItemAccessOptions(true));
    Console.WriteLine("Connected");

    var demoFolder =
        loe.Mesh.MeshObjects.Entries.First(p => p.Resource.Title == "DemoFolder");

    string path = @"C:\images\";
    string filename = "IMG_3957.jpg";

    var feed = demoFolder.DataFeeds.Entries.First(p => true);
    using (FileStream fs = File.OpenRead(path + filename))
    {
        feed.DataEntries.Add(fs, filename);
    }
    Console.WriteLine("File added");
}
Nice and short. Even quite understandable for a non-REST person like myself. The only problem: it won't work in Azure. You'll get a SecurityException telling you that the assembly does not allow partially trusted callers. I think this is simply an oversight (forgetting to apply APTCA), or at least I don't fully understand the rationale. But the end result is the same: I can't run this code in Azure.
Fortunately, Live Mesh's base API is actually HTTP/REST-based, and the .NET API is just a small wrapper on top of it. So I just needed to find out how to pass my delegated authentication token to the Live Mesh service. I googled quite a bit (and used Fiddler ... thank god for SSL interception!) before I found that the missing HTTP header is called AppDelegationToken. Now I could remove the references to the client DLLs and go about my business in a very low-level way:
static void UploadWithREST()
{
    string token = "... delegation token I've received from Live ....";
    string path = @"C:\images\";
    string filename = "IMG_3957.jpg";
    string serviceUrl = "https://user-ctp.windows.net/V0.1/";
    string folderName = "DemoFolder";

    string relativeUrl = GetRelativeUrlForFolder(token, serviceUrl, folderName);
    PostToLive(serviceUrl + relativeUrl, token, path, filename, "image/jpeg");
}

// return the URL for a folder's MediaResources based on the folder's name
private static string GetRelativeUrlForFolder(string token, string baseUrl, string folderName)
{
    string relativeUrl = "Mesh/MeshObjects?$filter=(Title%20eq%20'" + folderName + "')&$expand=DataFeeds";
    XmlDocument doc = GetLiveResponse(baseUrl + relativeUrl, token);

    XmlNamespaceManager mgr = new XmlNamespaceManager(doc.NameTable);
    mgr.AddNamespace("atom", "http://www.w3.org/2005/Atom");
    mgr.AddNamespace("win", "http://user.windows.net");

    // navigate from the MeshObject entry to its DataFeeds and from there to the MediaResources link
    XmlNode nod = doc.SelectSingleNode(
        "/atom:feed/atom:entry/atom:link[@rel='LiveFX/DataFeeds']" +
        "/win:Inline/atom:feed/atom:entry/atom:link[@rel=" +
        "'LiveFX/MediaResources']", mgr);
    relativeUrl = nod.Attributes["href"].Value;
    return relativeUrl;
}

// execute a simple GET request and return the response as an XmlDocument
static XmlDocument GetLiveResponse(string url, string token)
{
    HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create(url);
    req.Headers.Add("AppDelegationToken", token);
    req.Method = "GET";

    XmlDocument doc = new XmlDocument();
    using (HttpWebResponse res = (HttpWebResponse)req.GetResponse())
    {
        using (Stream st = res.GetResponseStream())
        {
            doc.Load(st);
        }
    }
    return doc;
}

// POST a file to the Live Mesh
static void PostToLive(string url, string token, string path, string filename, string contentType)
{
    HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create(url);
    req.Headers.Add("AppDelegationToken", token);
    req.ContentType = contentType;
    req.Headers.Add("Slug", filename);
    req.Method = "POST";

    using (Stream st = req.GetRequestStream())
    {
        byte[] buf = File.ReadAllBytes(path + filename);
        st.Write(buf, 0, buf.Length);
    }
    using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
    {
        // not really needed ...
    }
}
I have to admit that I haven't tried this in Azure yet, but I think it should work ...
Is this thing still on? Is anyone still subscribed after all these years?
After taking a two-years-and-a-bit hiatus from the cutting edge of technology, I'm now about to return full time. But before we get back to the regular program, let me answer the inevitable question: what was up with you?
What happened was that I had the pleasure of finding a few really amazing clients and decided to work with them on some of their medium- to longer-term projects (longer term meaning substantially more than the usual three-day visit to a client to help them with a particular issue or question they might have). I really like to do this every couple of years to get a more "real" feeling for the day-to-day pain points in software development projects. So instead of doing only the regular short-term consulting (which of course was still going on as well), I went back to the trenches and cranked out some real, production code in some real, production projects for a change. It was fun. Really!
On the research side, I've concentrated not so much on tomorrow's technologies, but more on currently shipping stuff which could be used in these projects. I've also worked a *lot* with WinDbg (and its possibilities constantly amaze me ... I still think that every serious development team should have at least one WinDbg expert). With this tool, it's just *so* much easier to find problems in production code which happen only on production machines far away from Visual Studio or any other debugger.
During the second half of 2008, I started to feel the urge to return to the cutting edge (and the other thinktecture guys were already making some serious fun of me and my dedication to already-shipping ASP.NET stuff). But I couldn't find a compelling topic to look into. WPF would be nice, but frankly, I'm more of a server-side person. WF? Yes, sure, I use it on a regular basis. WCF? Just a tool for me, no religious feelings about it.
But then along came October. As was the case for a lot of developers, last year's PDC gave me the opportunity to see some really cool upcoming stuff (publicly, and even more so in private meetings). I've since started to scale back the longer-term commitments to return to researching the cutting edge. In 2009, I'll return to the conference track with these topics, I plan to write more blog posts (well ... it would be hard to write less than I did, anyway), and I'm actually also currently discussing an idea for a new book. I think there's some fun time ahead.