Soon after I introduced a Domain Model into the application I was building for my employer, I found myself at a team meeting trying to explain to my fellow programmers, and to my manager in particular, why the change was beneficial. I started by describing how, because data flows from the relational database to the object-oriented runtime environment and back, we have an inherent need to translate between two different data models. The manager stopped me in my tracks when she said that I sounded “too academic”. Although left speechless in the moment, in retrospect (and at the risk of sounding like a smart ass) I should have responded, “the law of gravity may sound academic when described by a physicist, but that alters neither its truth nor its practical consequences.” Just as the laws of physics describe our physical world, the world our applications live in can be described in terms of data models. The problem stemming from the presence of two data models, instead of just one, is called the Object-Relational Impedance Mismatch.
The Object-Relational Impedance Mismatch is an “academic”-sounding term (to quote my former manager) for a practical problem facing software application development.
…work directly against the relational structure of database tables.
This is all well and good for simple databases and simple applications. However, as a database gets richer and an application more sophisticated, we see increasing divergence between the way an application needs to look at the data—its “conceptual” or “object” model—and the way information is structured in the database—the “storage” or “relational” model. A manufacturer's ordering system, for example, may store its order information in multiple related tables, yet the application programmer really wants to work with a single, conceptual “order” entity without having to do a complex JOIN in every order-related query. (This is a case of what’s called the “object-relational impedance mismatch,” if you’ve heard that term.)
The above quote is taken from an article in the MSDN Library titled Microsoft Data Development Technologies: Past, Present, and Future, by Kraig Brockschmidt and Lorenz Prem, Program Managers at Microsoft.
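To make the mismatch concrete, here is a minimal sketch of what “a complex JOIN in every order-related query” looks like when written by hand. The Orders/OrderLines schema, the class names, and the query are illustrative assumptions of mine, not taken from the article; the point is the translation code needed to re-assemble flat relational rows into a single conceptual Order.

    using System.Collections.Generic;
    using System.Data.SqlClient;

    // The conceptual model: one Order entity that owns its lines.
    public class Order
    {
        public Order()
        {
            Lines = new List<OrderLine>();
        }

        public int OrderId { get; set; }
        public string CustomerName { get; set; }
        public List<OrderLine> Lines { get; private set; }
    }

    public class OrderLine
    {
        public string Product { get; set; }
        public int Quantity { get; set; }
    }

    public static class OrderRepository
    {
        // The relational model: the same data lives in two tables, so the
        // JOIN and the row-to-object translation must be written by hand.
        public static Order LoadOrder(string connectionString, int orderId)
        {
            Order order = null;
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "SELECT o.OrderId, o.CustomerName, l.Product, l.Quantity " +
                "FROM Orders o JOIN OrderLines l ON l.OrderId = o.OrderId " +
                "WHERE o.OrderId = @id", connection))
            {
                command.Parameters.AddWithValue("@id", orderId);
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // The first row carries the order header...
                        if (order == null)
                        {
                            order = new Order
                            {
                                OrderId = reader.GetInt32(0),
                                CustomerName = reader.GetString(1)
                            };
                        }
                        // ...and every row carries one order line.
                        order.Lines.Add(new OrderLine
                        {
                            Product = reader.GetString(2),
                            Quantity = reader.GetInt32(3)
                        });
                    }
                }
            }
            return order;
        }
    }

Every query that touches orders must repeat some variation of this row-to-object translation, and that repetition is precisely the practical cost of the impedance mismatch.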
Microsoft released the ADO.NET Entity Framework in August 2008 as part of the .NET Framework version 3.5 SP1, and then an improved version in the .NET Framework 4, released in April 2010. What is the ADO.NET Entity Framework? From its overview:
The ADO.NET Entity Framework enables developers to create data access applications by programming against a conceptual application model instead of programming directly against a relational storage schema. The goal is to decrease the amount of code and maintenance required for data-oriented applications.
The “conceptual application model” is commonly known as the Domain Model.
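For contrast, here is a sketch of the earlier order lookup written against the conceptual model with LINQ to Entities. OrderingContext and its Orders set are stand-ins for the context the Entity Framework generates from a model like the one above; those names, and the Lines navigation property, are my assumptions for illustration.

    using System.Linq;

    public static class OrderQueries
    {
        public static Order FindOrder(int orderId)
        {
            // OrderingContext stands in for the generated ObjectContext;
            // the name is hypothetical.
            using (var context = new OrderingContext())
            {
                // One query against the conceptual Order entity. The
                // framework translates it into the JOIN across the
                // underlying tables and materializes the object graph.
                return context.Orders
                              .Include("Lines") // eager-load the related lines
                              .Single(o => o.OrderId == orderId);
            }
        }
    }

The hand-written JOIN and row-to-object translation from the earlier sketch are gone; the framework derives them from the mapping between the conceptual and storage models.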
With the Entity Framework finally here, it is clear that Microsoft has at long last embraced the Domain Model pattern. This is significant given that, in the nearly twelve years since Microsoft’s last data-access paradigm shift, the release of ADO in October 1996, they have been careful in choosing the emblem of application architecture for years to come. Microsoft is largely immune to “fads”. They tend to do things their own way, and avoid industry standards until those standards become so ingrained that the obvious choice is to embrace them or get left behind. Microsoft’s embrace of the Domain Model is evidence, in my eyes, that it has become just such an ingrained industry standard.
For those of us using the .NET Framework to build applications, Microsoft’s commitment to the Domain Model goes beyond confirming an industry standard. With Visual Studio 2010 installed on our machines, and relied upon so heavily for development, we also have access to the extensive tooling support Microsoft has built around the pattern. From the MSDN library: “conceptual and relational representations can be easily created using designers in Visual Studio, and the Entity Framework designer in Visual Studio will create a default mapping without any effort on your part.” It’s a safe bet that this tooling support will continue to improve with future Visual Studio releases.