Archive for August, 2010

The Entity Framework is easy to slide into for greenfield applications; isn't everything easy when you have a clean slate to work with? But what about retrofitting it into an existing application? One of the challenges here is the connection string itself. In a legacy system (by legacy I mean one that does not use EF 🙂 ) we have the connection string stored in the connectionStrings section of the App or Web config file. We certainly do not want to introduce another connection string just for our EDMX, as that gives us more points of maintenance. We need to be able to build up one of those funny EDMX connection strings from an existing regular connection string. And no, string concatenation is NOT the answer here.

Here is the code (in VB.NET):

Imports System.Configuration
Imports System.Data.EntityClient
Imports System.Data.Metadata.Edm
Imports System.Data.Objects
Imports System.Data.SqlClient
Imports System.Reflection

Namespace Repository
    Public Class ConnectionFactory
        Public Shared Function GetEDMXConnectionString(Of T As ObjectContext)(ByVal connectionName As String) As EntityConnection
            ' Pull the plain connection string from the config file
            Dim connectionString = ConfigurationManager.ConnectionStrings(connectionName).ConnectionString
            Dim dbConnection As New SqlConnection(connectionString)

            ' Point the metadata workspace at the model resources embedded in the context's assembly
            Dim resourceArray As String() = {"res://*/"}
            Dim assemblyList As Assembly() = {GetType(T).Assembly}
            Dim metaData As New MetadataWorkspace(resourceArray, assemblyList)

            ' Marry the metadata to the stock SQL connection
            Dim edmxConnection As New EntityConnection(metaData, dbConnection)
            Return edmxConnection
        End Function
    End Class
End Namespace

There are a few things to look at here.

  1. We use a generic type parameter for our ObjectContext. This is needed to get hold of the assembly containing the metadata.
  2. We create a stock SqlConnection.
  3. We create a MetadataWorkspace.
  4. We then create an EntityConnection, passing in the metadata and the SQL connection.

You can then pass the resulting EntityConnection straight into your ObjectContext constructor.
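
For example, assuming a generated context called MyEntities and a config entry named MyConnection (both names are hypothetical stand-ins), wiring it up looks roughly like this:

' A minimal usage sketch; MyEntities and MyConnection are hypothetical names
Dim edmxConnection = ConnectionFactory.GetEDMXConnectionString(Of MyEntities)("MyConnection")

Using context As New MyEntities(edmxConnection)
    ' Query away as usual; the context is now using the existing connection string
End Using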


That's right, I have skipped part 3 for now; we will return to it. Let's take a step back and look at some inheritance and splitting scenarios that the Entity Framework supports. Relational inheritance is a way of describing your relational data in an OO form, where the rules of polymorphism hold true for your model. Splitting is a way of classifying your data so that it maps onto sets of entities in your model.

Inheritance Scenarios

 

TPT: Table per Type
Scenario

I have a 1-1 table relationship in my database that represents a base class – sub class structure

Example
  • Person is a contact
  • Business is a contact
How To
  • Drag the 2 entities onto the designer
  • Link them with the inheritance tool

Or

  • Use the context menus.

This is the default behaviour for model-first design.

Notes
  • Polymorphic rules apply to loading: querying the base type will materialise the appropriate subtypes along with their respective data (see the sketch after this note).
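
As a quick illustration, assuming a context exposing a Contacts set with Person and Business subtypes (hypothetical names matching the example above):

' A hedged sketch of polymorphic loading over a TPT hierarchy (names hypothetical)
Using ctx As New MyEntities()
    ' Querying the base set returns Person and Business instances, each with its own data
    For Each contact As Contact In ctx.Contacts
        Console.WriteLine(contact.GetType().Name)
    Next

    ' OfType narrows the query to one branch of the hierarchy
    Dim people = ctx.Contacts.OfType(Of Person)().ToList()
End Using
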
TPH: Table per Hierarchy
Scenario

I have a single table in the database where sets of columns identify different inheritance structures, and there is a condition column (lookup or code) that identifies the entity type.

Example

A User table with a user-type lookup that changes as the user passes through my user lifecycle workflow (i.e. Anonymous -> Registered -> Made Purchase -> VIP); as they do so, more methods and options open up.

How To
  • Drag an entity onto the designer and link it to the table.
  • Mark this entity as abstract
  • Delete the non-common properties, including the condition (lookup) column.
  • Create another entity that derives from the base entity
  • Under the mapping, set a condition (or conditions) on the switch column(s)
  • Add the additional columns unique to the child entity.
Notes
  • Polymorphic rules apply to loading: querying the base type will materialise the appropriate subtypes along with their respective data.
TPC: Table per Concrete Class
Scenario

I have multiple identical tables that represent an inheritance structure. Conceptually they are the same, but structurally there is additional information that is mutually exclusive (e.g. relationships).

Example
  • Employee -> CurrentEmployee that maps to Employees table with relationships to Manager and Subordinates
  • Employee -> PreviousEmployee that maps to the PreviousEmployee table. It is merely a record of prior employees.
How To
  • This is not supported by the designer.
  • On the designer, create the two entities with their default mappings
  • View the designer in XML
  • Copy one of your entity elements from the conceptual (CSDL) layer and paste it.
  • Mark it as abstract and name it as the base type.
  • Delete all the common properties from the original entity and set its base type to your new base type.
  • For the other entity, mark its base type as your base type.
  • What this does is retain the original table mapping for the second entity whilst classifying the conceptual entity as a subtype of the base type.
Notes
  • Polymorphic rules apply to loading: querying the base type will materialise the appropriate subtypes along with their respective data.

Splitting Scenarios

Vertical Splitting
Scenario

We have two tables with a 1 – 1 relationship that we want to represent as a single entity. This is like TPT inheritance mapping without the inheritance structure.

Example

I have a User table with a 1 – 1 relationship with a Location table, since in my application a user is associated with a single location. I want to represent the User entity with the user's location information in it.

How To
  • Add both entities onto the designer.
  • Select and cut the Location properties from the Location entity
  • Paste them into the User Entity
  • Delete the Location Entity.
  • When prompted if you want to remove the Location entity from the Store layer, click No
  • Go to the mapping details on the User entity and add the Location table to the mappings list. This will automatically populate the property mappings.
Notes
  • Selecting against this entity transparently creates an inner join (see the sketch after these notes)
  • Because this is NOT inheritance, polymorphic loads do not apply.
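
To make the first note concrete, here is a rough sketch from calling code (context, entity and property names are hypothetical):

' Vertical splitting sketch: one entity, two underlying tables (names hypothetical)
Using ctx As New MyEntities()
    ' A single SELECT with an inner join across the User and Location tables
    Dim user = ctx.Users.First(Function(u) u.UserName = "bob")

    ' Columns from either table are just properties on the one entity;
    ' SaveChanges issues UPDATEs against both tables as needed
    user.City = "Cape Town"
    ctx.SaveChanges()
End Using
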
Horizontal Splitting
Scenario

I have two similar tables with the same column names and I want only one entity; however, I still want to distinguish between the two via a Boolean flag.

Example

I have two tables that represent batches of data: Processed and Unprocessed. I want to easily remove a row from the Unprocessed table and insert it into the Processed table when I have finished processing it.

How To
  • Add the two entities to the designer
  • Delete the Processed entity
  • When prompted if you want to remove its data from the Store Layer, click No
  • Select the Unprocessed entity and add the Processed table to its mappings
  • Right click the Unprocessed entity and add a new scalar property called “IsProcessed” of type Boolean
  • Go to the EDMX XML, as this next step is not supported by the designer
  • In the mapping layer, under the Unprocessed ENTITY you will see two mapping fragments, one for the Processed table and one for the Unprocessed table. Under the Processed TABLE element, add a condition element
    <Condition Name="IsProcessed" Value="true" />
  • Under the Unprocessed TABLE element, add a condition element
    <Condition Name="IsProcessed" Value="false" />
Notes
  • Selecting from the entity set creates a union across the two tables
  • When changing the Boolean flag between conditions, the row is transparently deleted from the one table and inserted into the other (see the sketch below).
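
A sketch of that last note in code, assuming the remaining entity set is called Batches (hypothetical name):

' Horizontal splitting sketch: flipping the condition flag moves the row (names hypothetical)
Using ctx As New MyEntities()
    Dim batch = ctx.Batches.First(Function(b) b.IsProcessed = False)

    ' ... do the processing work ...

    batch.IsProcessed = True
    ' On SaveChanges, EF deletes the row from the Unprocessed table
    ' and inserts it into the Processed table for us
    ctx.SaveChanges()
End Using
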
Table Splitting
Scenario

I have one table, but I want to expose some of its columns through a navigation property on the same entity.

Example

I have a Person entity with a picture column on it. I do not want to load this image each time I query the entity. Rather I want to have a Picture property that I can navigate to and defer the loading of the image.

How To
  • Add the entity to the designer
  • Copy and paste the entity
  • Rename the second entity
  • Pluralize the second entity's Entity Set Name from its properties.
  • Delete all unwanted properties from the second entity
  • Set the mapping on the second entity to the same Store as the first.
  • Delete the opposite (now duplicated) properties from the first entity
  • Add an association element to the designer
  • Set the multiplicity to 1 – 1
  • Double click on the association entity
  • Set the Principal to the second entity
Notes
  • Setting the context's ContextOptions.LazyLoadingEnabled = False will cause the entity to NOT load the navigation property, EVEN when it is navigated to. It will instead return Nothing.
  • With lazy loading enabled, the navigation property is queried when it is first accessed, as per expected lazy-load behaviour.
  • Using the Include(string path) method on the query allows you to force the loading of the navigation property upfront (both behaviours are sketched below).
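
A quick sketch of those loading behaviours (entity and property names hypothetical):

' Table splitting sketch: deferred vs. eager loading of the Picture half (names hypothetical)
Using ctx As New MyEntities()
    ctx.ContextOptions.LazyLoadingEnabled = True

    Dim person = ctx.People.First()
    Dim picture = person.Picture               ' lazy: a second query fires here

    ' Eager: pull the picture up front in the same query
    Dim withPicture = ctx.People.Include("Picture").First()
End Using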

Whilst playing with the Entity Framework, I thought it would be pretty cool if the auto-generated code would create and implement an interface along with the standard data context and entity classes. This would make unit testing with a mocking framework really easy, and allow you to write tests without having to hit an actual data store. Usually I write a wrapping repository to mock out, but this is a little tiresome. Since the Entity Framework is really flexible, this shouldn't be too much trouble. So let's give it a bash.

I used the POCO class T4 templates as a basis. You can read more about them here. I am no T4 expert, but the code is intuitive enough to make sense of it, and more importantly to make some modifications to it.

Making the Changes

We first create a new class library project and create a folder for our EDMX and supporting classes in the repo.

01-New Project

After this we create a standard EDMX

02-Add New EDMX

I am going to play with a basic invoicing model

03-Model

I am also going to be doing this in design first mode, but it is no different if you are generating off an existing database. If we take a look at the generated code behind we can see a couple of things.

04-Designer Code

We see that our container inherits from the ObjectContext base class, we also see that outside of this class is a region for all the generated entities. These entities by default derive from the EntityObject class. The objective of the POCO T4 template is to decouple this using dynamic proxies and some black voodoo magic. Also if we take a look at the properties for the ObjectContext we notice that the Code Generation Strategy is set to Default.

05-Code Generation Strategy

What we are going to do next is to tell the EDMX to use our own custom code generation strategy. Right clicking on the design surface, we select “Add Code Generation Item..” from the context menu.

06-Add Code Generation Item

If you have installed the POCO generator extension then you will see it in the dialog box. Select this.

07-POCO Entity Generator

What this does is add two T4 templates to the project, one for generating the new ObjectContext and the other for generating the entities. These .TT files have the generated code-behind .CS files. You can explore them as well as the template that generated them. To help in viewing the T4 templates I recommend downloading a copy of Tangible T4 Editor; this gives you some nice highlighting and some IntelliSense on the T4 template code.

08-Template Files

Next, we need to copy the .Context.TT file and paste it into the folder; this will give us a file to work with for the interface. Rename it to something more appropriate like “Context.Interface.tt”. We now dive into this new file and make some changes so that it generates an interface off the EDMX file.

10-Interface Definition

We prefix an “I” onto the container name and have the interface implement IDisposable. Also make sure to change “class” to “interface”. We then remove all the unnecessary stuff like the constructor definitions.

11- Property Definition

For the property definitions we change the return type to IObjectSet(Of T). This is so we can mock out the return types for any LINQ / lambda queries against the interface. Also make sure we only declare the Get portion of the property, and get rid of all the implementation around it so that only the definition is left.

On to the function imports. We do a similar thing: reduce them to definitions and remove the implementation.

12-Function Imports

For the VB.NET version of the POCO class generator there is a bug in the function import code; I blogged on this here. You will have to code around it. To be fair, I haven't tested the ObjectResult return type, but if you can't instantiate it then you will have to do something similar to the properties and find an appropriate interface to substitute.

13-Generated Interface

If all goes well, on saving the template we get a nice clean interface generated against our context entities.

We also need to add some of the functionality found in the ObjectContext, like SaveChanges, so it may be invoked through the interface. Add what you need.

18-Additional Context Methods
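
The end result is something along these lines; a rough sketch only, and the container and entity names (IInvoicingContainer, Invoice) are hypothetical stand-ins for whatever your model generates:

' A hedged sketch of the generated interface (names hypothetical)
Public Interface IInvoicingContainer
    Inherits IDisposable

    ReadOnly Property Invoices() As IObjectSet(Of Invoice)

    ' Extra ObjectContext members surfaced so callers can work purely against the interface
    Function SaveChanges() As Integer
End Interface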

The next task is now modifying the generator for the ObjectContext so that it implements our interface.

14-Context Implementing Interface

We do this by modifying the definition of the ObjectContext to implement the interface.

We also change the property definitions to return IObjectSet. Also note that for VB.NET, since you need to implement interfaces explicitly, you will also need to add an Implements clause on the end of the property.

15-Properties Implement IObjectset

We can now inspect the new generated ObjectContext.

16-New Generated Context
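
After regenerating, the context ends up looking roughly like this (a sketch with the same hypothetical names as before; CreateObjectSet is the stock ObjectContext factory method):

' A sketch of the regenerated context implementing the interface (names hypothetical)
Public Class InvoicingContainer
    Inherits ObjectContext
    Implements IInvoicingContainer

    Public Sub New()
        MyBase.New("name=InvoicingContainer", "InvoicingContainer")
    End Sub

    Public ReadOnly Property Invoices() As IObjectSet(Of Invoice) _
        Implements IInvoicingContainer.Invoices
        Get
            ' ObjectSet(Of T) implements IObjectSet(Of T), so this satisfies the interface
            Return MyBase.CreateObjectSet(Of Invoice)("Invoices")
        End Get
    End Property

    ' Surface SaveChanges through the interface (Shadows because the base class already defines it)
    Public Shadows Function SaveChanges() As Integer Implements IInvoicingContainer.SaveChanges
        Return MyBase.SaveChanges()
    End Function
End Class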

Testing it Out

Let's write some implementation code to test. Typically one writes some form of service that consumes the data context. We also want to unit test the service by mocking out the repository.

Let's create a basic service.

17-Create Service

Some things to note here. We use constructor dependency injection to allow us to inject our own flavor of repository. Also, we have a method called “CreateInvoice” that receives some information (via a front end or wherever) and creates the header and associative entities for an invoice. You should be able to eyeball the implementation.
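
Something along these lines; the entity and member names (Invoice, InvoiceLine, CustomerId and so on) are hypothetical stand-ins for whatever your model actually contains:

' A hedged sketch of the service described above (all names hypothetical)
Public Class InvoiceService

    Private ReadOnly _repository As IInvoicingContainer

    ' Constructor injection lets tests hand in a mocked repository
    Public Sub New(ByVal repository As IInvoicingContainer)
        _repository = repository
    End Sub

    Public Sub CreateInvoice(ByVal customerId As Integer, ByVal lineDescriptions As IEnumerable(Of String))
        ' Build the header plus its associative line entities
        Dim invoice As New Invoice With {.CustomerId = customerId, .InvoiceDate = DateTime.Now}
        For Each description In lineDescriptions
            invoice.InvoiceLines.Add(New InvoiceLine With {.Description = description})
        Next

        _repository.Invoices.AddObject(invoice)
        _repository.SaveChanges()
    End Sub
End Class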

We now need a little helper class that implements the IObjectSet interface so we can return it. We also don't want it to be completely dumb, as it would be nice if we could base our assertions off this object to verify the integrity of the data the service method is producing. So we base it on a List class. This was pulled off MSDN.

19-Fake Object Set

We call this FakeObjectSet, appropriately. Notice that it is generic.
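
For reference, the shape of that helper is roughly as follows; a sketch along the lines of the MSDN example, backed by a List(Of T):

Imports System.Collections
Imports System.Collections.Generic
Imports System.Data.Objects
Imports System.Linq

' A fake IObjectSet backed by an in-memory list, so tests can inspect its contents
Public Class FakeObjectSet(Of T As Class)
    Implements IObjectSet(Of T)

    Private ReadOnly _data As New List(Of T)()
    Private ReadOnly _query As IQueryable(Of T)

    Public Sub New()
        _query = _data.AsQueryable()
    End Sub

    Public Sub AddObject(ByVal entity As T) Implements IObjectSet(Of T).AddObject
        _data.Add(entity)
    End Sub

    Public Sub Attach(ByVal entity As T) Implements IObjectSet(Of T).Attach
        _data.Add(entity)
    End Sub

    Public Sub DeleteObject(ByVal entity As T) Implements IObjectSet(Of T).DeleteObject
        _data.Remove(entity)
    End Sub

    Public Sub Detach(ByVal entity As T) Implements IObjectSet(Of T).Detach
        _data.Remove(entity)
    End Sub

    Public ReadOnly Property ElementType() As Type Implements IQueryable.ElementType
        Get
            Return _query.ElementType
        End Get
    End Property

    Public ReadOnly Property Expression() As System.Linq.Expressions.Expression Implements IQueryable.Expression
        Get
            Return _query.Expression
        End Get
    End Property

    Public ReadOnly Property Provider() As IQueryProvider Implements IQueryable.Provider
        Get
            Return _query.Provider
        End Get
    End Property

    Public Function GetEnumerator() As IEnumerator(Of T) Implements IEnumerable(Of T).GetEnumerator
        Return _data.GetEnumerator()
    End Function

    Private Function GetEnumeratorNonGeneric() As IEnumerator Implements IEnumerable.GetEnumerator
        Return _data.GetEnumerator()
    End Function
End Class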

We also add Rhino Mocks as our mocking framework.

20- Add Rhino Mocks

We then create a fancy mocking test to verify the behaviour of the service method. I am not going to worry about philosophies of mocking vs. stubbing here. This is just an example of a test.

21- Create fancy mock test

Some interesting points here are:

  • We generate our FakeObjectSets with fake data relevant to the test.
  • We use the mocking framework to return this data to the service.
  • The repo is mocked.
  • We can query the repo (via the fake sets) after the fact to verify the data in it. We compare this to our input data to ensure all business rules have been applied.
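
Roughly what that looks like with Rhino Mocks (a sketch only, reusing the hypothetical names from the earlier sketches; MSTest and the Rhino Mocks AAA syntax are assumed):

' A hedged sketch of the mocking test (names hypothetical)
<TestMethod()>
Public Sub CreateInvoice_AddsTheInvoiceToTheRepository()
    ' Arrange: a fake set to capture what the service produces
    Dim fakeInvoices As New FakeObjectSet(Of Invoice)()
    Dim repo = MockRepository.GenerateMock(Of IInvoicingContainer)()
    repo.Stub(Function(r) r.Invoices).Return(fakeInvoices)

    Dim service As New InvoiceService(repo)

    ' Act
    service.CreateInvoice(1, {"Widget", "Gadget"})

    ' Assert: query the fake set to verify what the service built, and check persistence was requested
    Assert.AreEqual(1, fakeInvoices.Count())
    Assert.AreEqual(2, fakeInvoices.First().InvoiceLines.Count)
    repo.AssertWasCalled(Sub(r) r.SaveChanges())
End Sub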

We then run the test.

22- Verify Results

Green lights baby…green lights.

For getting started with the Entity Framework, follow my multi-part tutorial.

Windows Live Writer

Posted: August 6, 2010 in Tools

This blog post is kind of experimental and informational. I decided to try out Windows Live Writer which is a blog posting tool by Microsoft. It is meant to work seamlessly with most popular blogging engines, and gives you the nice rich UI experience as well as a few extra goodies. Let me run through some of the features.

Editing and Formatting

Formatting is done via a pretty intuitive toolbar at the top. Most common layout features are available.

image

The three tabs at the bottom let you see the final result without having to post. This is a pretty neat feature.

image

What it does is download your theme from your blog and then set up a small render preview for you. Also, because you are working on your local machine, all rich copy and paste tooling works. This is great for images when coupled with the snipping tool that ships with Windows 7.

image

The context menu docked on the right gives you some nice formatting options for the currently selected element. For images it gives you some nice pixel effects like drop shadow and the regular suspects.

Elements

There are a couple of elements available to you:

  • Hyperlink
  • Picture
  • Photo Album
  • Table
  • Maps
  • Tags
  • Video

Plus there is an option to add some additional plugins. The Maps element looks interesting, so I am going to give that a try.

Map picture

Not too sure I like the pushpin graphic, but hey it gets the job done and is pretty easy to do.

It also handles tagging and categories. All in all, I would say it is pretty neat. I will be using it from now on, I reckon.

OK, I give up. Is someone able to explain to me why the VB POCO generator T4 template refuses to generate my function imports? Not only that, but it doesn't even generate the region for the function imports.
This pic shows the T4 code clearly defining the function import region

And this pic clearly shows the code NOT being generated.

Any ideas?

UPDATE: Found the problem

To answer my own question, it seems that yes, there is a bug. Looking at the template I came across this nugget:

If edmFunction.ReturnParameter Is Nothing Then
  Continue For
End If

Essentially what this does is check the return type of the function import and skip the rest of the loop iteration if there is no return type. Seems logical? NO.

You see, in VB, methods that don't return a value are Subs. Since the generator only emits Functions, any function import that doesn't return anything will not be generated. This is an issue for non-query stored procedures.

You have a couple of options:

  1. Ensure your procs ALWAYS return something. Not always feasible.
  2. Modify the T4 template to handle Subs

I will try option 2.

Keep you posted

UPDATE: Found a potential solution

OK, I did my workaround and replaced the “Function Imports” section in the T4 template with the following:

<#
region.Begin("Function Imports")
#>
<#
For Each edmFunction As EdmFunction In container.FunctionImports
Dim parameters As IEnumerable(Of FunctionImportParameter)  = FunctionImportParameter.Create(edmFunction.Parameters, code, ef)
Dim paramList As String = String.Join(", ", parameters.Select(Function(p) "ByVal " & p.FunctionParameterName & " As " & p.FunctionParameterType).ToArray())
If edmFunction.ReturnParameter Is Nothing Then
#>
<#=Accessibility.ForMethod(edmFunction)#> Sub <#=code.Escape(edmFunction)#>(<#=paramList #>)
<#
For Each parameter As FunctionImportParameter In parameters
If Not parameter.NeedsLocalVariable Then
Continue For
End If
#>
Dim <#=parameter.LocalVariableName #> As ObjectParameter
If <#=If(parameter.IsNullableOfT, parameter.FunctionParameterName & ".HasValue", parameter.FunctionParameterName & " IsNot Nothing")#> Then
<#=parameter.LocalVariableName#> = New ObjectParameter("<#=parameter.EsqlParameterName#>", <#=parameter.FunctionParameterName #>)
Else
<#=parameter.LocalVariableName#> = New ObjectParameter("<#=parameter.EsqlParameterName#>", GetType(<#=parameter.RawClrTypeName #>))
End If
<#
Next
#>
MyBase.ExecuteFunction("<#=edmFunction.Name#>"<#=code.StringBefore(", ", String.Join(", ", parameters.Select(Function(p) p.ExecuteParameterName).ToArray()))#>)
End Sub
<#
Else
Dim returnTypeElement As String = code.Escape(ef.GetElementType(edmFunction.ReturnParameter.TypeUsage))
#>
<#=Accessibility.ForMethod(edmFunction)#> Function <#=code.Escape(edmFunction)#>(<#=paramList #>) As ObjectResult(Of <#=returnTypeElement #>)
<#
For Each parameter As FunctionImportParameter In parameters
If Not parameter.NeedsLocalVariable Then
Continue For
End If
#>
Dim <#=parameter.LocalVariableName #> As ObjectParameter
If <#=If(parameter.IsNullableOfT, parameter.FunctionParameterName & ".HasValue", parameter.FunctionParameterName & " IsNot Nothing")#> Then
<#=parameter.LocalVariableName#> = New ObjectParameter("<#=parameter.EsqlParameterName#>", <#=parameter.FunctionParameterName #>)
Else
<#=parameter.LocalVariableName#> = New ObjectParameter("<#=parameter.EsqlParameterName#>", GetType(<#=parameter.RawClrTypeName #>))
End If
<#
Next
#>
Return MyBase.ExecuteFunction(Of <#=returnTypeElement#>)("<#=edmFunction.Name#>"<#=code.StringBefore(", ", String.Join(", ", parameters.Select(Function(p) p.ExecuteParameterName).ToArray()))#>)
End Function
<#
End If
Next
region.End()
#>

It builds and generates what seems to be a decent Sub. Please note, this is UNTESTED.