Sunday, April 14, 2013

GOTO Zurich 2013

Last week I attended the GOTO software conference in Zurich. It was an intensive two days with a lot of good presentations about Agile, Polyglot Programming, Multi-Core Programming, Continuous Delivery and more. There was a good mix of international and local speakers. My favourite speakers were Dan North, Bruce Tate and Greg Young.

In his presentation "8 Lines of Code" Greg Young talked about simplicity in software programming. He showed us a code example which used an AOP-style approach to implement cross-cutting concerns. He showed us the dependencies and complexity hidden in these few lines of code. Multiple times he raised the question: "Can you explain this to a junior?". He then showed how the code can be restructured to remove the dependencies. He ended up with a simple design with no dependencies on external code.
Simplicity was also the topic of Greg's keynote the day after. The keynote had the provocative title "Developers have a mental disorder". He went after our sacred cows like ORM, relational databases, design patterns, DRY and unit testing. He didn't say these patterns and practices are bad. His point was that they are often overused or used without thinking. Regarding DRY he mentioned the disorder that programmers try to apply DRY to the extreme, even in unit test code. I can confirm this observation. I have too often seen completely unreadable unit tests where every bit of common code was factored out into helper classes or methods.

Bruce Tate gave two talks with insights from his journey Seven Languages in Seven Weeks. It was a very fascinating tour of programming languages like Clojure, Haskell, Io, Prolog, Scala and Erlang. Even though this was all exciting, I don't know how this wide choice of languages can help me as a developer who spends most of his time building business applications. In my opinion the "Right Tool for the Job" is a myth. In the last 10 years I cannot remember a single project where I could choose the language. It was always given by the organization I worked with or by the customer.

Besides the talks I very much enjoyed getting in touch with other attendees and speakers. My boss and I had a quick chat with Greg Young which motivated us to dig deeper into the implementation details of the event sourcing pattern, which we already use for persistence in a product we are developing. These special moments are at least as valuable as the content of the presentations themselves.

Sunday, July 11, 2010

Automating the creation of your installation packages

Windows Installer XML (WiX) is a great technology for creating installation packages for your .NET based applications. The tools from the WiX toolset do a great job if you have to create frequent releases of your application. Tools like heat.exe help you to create WiX fragments, but you have to maintain those files by hand. If you practice continuous deployment this is not an option.

With Visual Studio and some small chunks of MSBuild it's still possible to automate the creation of your installation package. The solution to automate your setup is based on the following tools:
  • WiX setup project for Visual Studio 2010 (part of WiX 3.5)
  • HeatDirectory MSBuild task (part of WiX 3.5)
  • for web applications: Visual Studio 2010 Web Deployment
I'm showing you how to do this with Visual Studio 2010, but it's also possible with Visual Studio 2008 (with the use of Visual Studio Web Deployment Projects). Even though the solution is based on Visual Studio, there is no need to install Visual Studio on your build server.

First you have to add a new setup project to your Visual Studio solution. You could generate WiX fragment files for every component with heat.exe and include them in your setup project. But that's not what we want here. Our goal is that the fragment files are generated on every build. If we add new output files to a project in our solution, these files will then automatically show up in our setup.

Fortunately WiX comes with an MSBuild task called HeatDirectory that helps us accomplish our goal. We can hook into the build process of the setup project and call this task for every output directory of our solution in the BeforeBuild target. As an example, if your solution contains a Windows service project in a folder WindowsService, you would modify the setup project file as follows:

 <Target Name="BeforeBuild">
<ItemGroup>
<CrtHeatDir Include="..\WindowsService\bin\$(Configuration)" />
</ItemGroup>
<HeatDirectory OutputFile="$(ProjectDir)\ProductCrtDirRef.wxs" Directory="@(CrtHeatDir)" ComponentGroupName="CrtComponentGroup" DirectoryRefId="INSTALLDIR_CRT" AutogenerateGuids="true" PreprocessorVariable="var.CrtDirPath" SuppressRegistry="true" SuppressRootDirectory="true" ToolPath="$(WixToolPath)" NoLogo="true" />
</Target>


Notice that we pass a preprocessor variable CrtDirPath to the HeatDirectory task. It serves as the source path of the WiX component files in our generated fragment file. We have to define it as a constant in our setup project file:

<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
  <DefineConstants>Release;CrtDirPath=$(SolutionDir)WindowsService\bin\$(Configuration)</DefineConstants>
</PropertyGroup>


Creating output files for Web Application Projects

In our example we could create our component files based on the output directory of our Windows service project. If you have a web application in your solution you have to make a little detour, as there is no such output directory. With the help of Visual Studio 2010 Web Deployment we can create the output of the web project and use it as input for heat. If you right-click a web application project in Visual Studio 2010 there is a command called "Build Deployment Package". This is the command that we want to integrate into our setup project. The MSBuild target behind this command is called Package. We can call this target of the web application project from within our setup project with a call to the MSBuild task:

<Target Name="BeforeBuild">
<ItemGroup>
<WebProjects Include="..\WebServices\WebServices.csproj" />
<SwsHeatDir Include="..\WebServices\obj\$(Configuration)\Package\PackageTmp" />
</ItemGroup>
<Msbuild Projects="@(WebProjects)" Targets="Package" Properties="Configuration=$(Configuration);Platform=AnyCPU">
</Msbuild>
<HeatDirectory OutputFile="$(ProjectDir)\ProductSwsDirRef.wxs" Directory="@(SwsHeatDir)" ComponentGroupName="SwsComponentGroup" DirectoryRefId="INSTALLDIR_SWS" AutogenerateGuids="true" PreprocessorVariable="var.SwsDirPath" SuppressRegistry="true" SuppressRootDirectory="true" ToolPath="$(WixToolPath)" NoLogo="true" />
</Target>


Package creates its output in the web application's obj directory. The directory PackageTmp contains all the files that we have to include in our deployment.

The next step in automating our setup would be to include the build of our setup in a continuous integration or nightly build. During this build it is often practical to include the product version in the name of the MSI and to copy it to a drop location where it can be downloaded by testers or even customers.

Even though WiX leans towards a manual authoring strategy for deployment packages, the toolset provides the tools for automation. Thanks to MSBuild and Visual Studio you only have to do a little bit of additional work to automate the creation of your installation package.

Download sample source

Saturday, January 16, 2010

MSF Agile goes SCRUM

My new company uses Microsoft Team Foundation Server 2008. The last time I had a look at Team Foundation and its supporting processes from MSF was five years ago, when I had to evaluate the product for my former company. It was 2005 and Microsoft had created yet another agile software development process called MSF Agile. In an STSC CrossTalk article MSF Agile creator Randy Miller praised the "new innovative techniques" which the process introduced. I never understood which new techniques he was talking about, as it appeared to me that MSF Agile took advantage of practices from the agile community and gave them other names. Scenarios (aka user stories), the scenario list (aka backlog), shadowing (aka incremental architecture), iteration-based planning and time-boxing are all proven agile practices that even at that time had existed for years.
Last week I was wondering what the new version of MSF Agile, based on Team Foundation Server 2010, would look like. I was very surprised when I read the pre-release documentation of MSF for Agile Software Development v5.0 on MSDN. MSF for Agile v5 is based on Scrum and the engineering practices from XP. It now also uses the same terminology as our industry has used for years. So no more scenarios, scenario lists etc. I think this is a wise decision from Microsoft. We don't need another terminology for the same thing. It's sometimes hard enough that there is no common style of software development in our industry.

Saturday, December 5, 2009

ASP.NET MVC - Thoughts and advice

It's been almost a year since I got into web development with ASP.NET MVC. We prototyped with the betas, started using the release candidate to develop a new release of a product in early 2009 and switched to the final 1.0 when it was released. We went for MVC instead of Web Forms because we liked the simplicity of the web MVC concept. We also wanted increased testability of our code.

We never regretted our decision. Here are some thoughts, advice, pros and cons about ASP.NET MVC 1.0 after using it on a day-to-day basis for almost a year:

  • The MVC pattern and the web really are a good duo. I was fascinated when I saw how Rails implemented the pattern with concepts like actions, filters and routes. Fortunately Microsoft didn't reinvent the wheel and implemented most of these proven concepts in ASP.NET MVC.
  • ASP.NET MVC is a presentation layer framework for the web. It's not an application framework like Rails. There is nothing like ActiveRecord, migrations or test fixtures in the framework. The positive thing is that you can choose whatever technologies you want to add. The negative thing is that you waste time in debates while choosing these technologies.
  • ASP.NET MVC is not a very opinionated piece of software. It provides the hooks to add conventions and functionality, but doesn't provide much in the way of structure itself. I couldn't find any advice or guidelines from Microsoft. More opinions are available from the guys at CodeBetter or in the book ASP.NET MVC in Action.
  • Model binders and action filters are very helpful to keep controller actions clean (see the filter sketch after this list). Use these guidelines for deciding when to use them:
    • Consider moving logic to an action filter when you start repeating yourself in your controller code.
    • Consider using built-in model binding whenever you have to access the request's form collection in a controller action.
    • Consider writing your own model binder if you can't bind your form data with the default model binder.

  • Even though you can do everything with HTML, CSS and JavaScript, the cost of a nice and rich user interface in the browser is not to be sneezed at. In Web Forms you have a wide range of controls for almost every UI element. In ASP.NET MVC you have to build a lot of UI stuff (at least at the time of this writing) on your own. The wonderful jQuery library and its plugins provide basics like auto-complete text boxes, tree views, calendars and popups. But you always have to write your own HTML helpers to integrate them into your views (a helper sketch follows this list).
  • The ASP.NET MVC team did a good job and built the framework with testability in mind. Calling controller actions from within a unit test and asserting on the action's result is very easy - as long as you don't have real-world code in your controller (a minimal test sketch follows this list). Even if you do a good job layering your application, the presentation layer is the place where everything is put together. That means that you will mostly end up with many dependencies in your controllers, and that's exactly the scenario where unit testing gets hard. Isolation frameworks like Typemock, Moq or Rhino Mocks can help you overcome these issues. But the use of these tools often leads to over-specified tests.
  • Lately we had to extend our product with an Outlook add-in with a reduced feature set in comparison to the main web client. We decided to implement the web services as RESTful services. ASP.NET MVC 1.0 has no built-in RESTful web service support, but with a little help from the REST for ASP.NET MVC SDK and the WCF REST Starter Kit it was very easy to build a web API with MVC.
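
Here is the filter sketch I promised above - a minimal example of the first guideline, in the style of ASP.NET MVC 1.0. The filter name and the ViewData key are made up for this illustration:

using System.Web.Mvc;

// Hypothetical filter that factors a repeated lookup out of controller actions.
public class CurrentUserFilterAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        // Every decorated action gets the user name in ViewData for free,
        // instead of repeating this line in each action body.
        filterContext.Controller.ViewData["CurrentUser"] =
            filterContext.HttpContext.User.Identity.Name;
    }
}

An action is then simply decorated with [CurrentUserFilter] and stays focused on its actual work.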
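
And since I mentioned writing your own HTML helpers, here is a sketch of what such a helper can look like in MVC 1.0, where helpers return plain strings. The extension name and the jQuery UI datepicker wiring are illustrative, not from our code base:

using System.Web.Mvc;
using System.Web.Mvc.Html;

public static class CalendarExtensions
{
    // Renders a text box plus the script that turns it into a jQuery UI datepicker.
    public static string Calendar(this HtmlHelper html, string name)
    {
        return html.TextBox(name, null, new { @class = "datepicker" })
             + "<script>$(function () { $('#" + name + "').datepicker(); });</script>";
    }
}

In a view you would then write <%= Html.Calendar("dueDate") %>.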
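
Finally, the minimal test sketch on testability. The controller and the NUnit test below are hypothetical and only show the mechanics of calling an action and asserting on its result:

using System.Web.Mvc;
using NUnit.Framework;

public class GreetingController : Controller
{
    public ActionResult Hello(string name)
    {
        ViewData["Message"] = "Hello " + name;
        return View("Hello");
    }
}

[TestFixture]
public class GreetingControllerTests
{
    [Test]
    public void Hello_PutsMessageInViewDataAndRendersHelloView()
    {
        var controller = new GreetingController();

        // The action is just a method call - no web server involved.
        var result = (ViewResult)controller.Hello("GOTO");

        Assert.AreEqual("Hello", result.ViewName);
        Assert.AreEqual("Hello GOTO", result.ViewData["Message"]);
    }
}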
In summary I can say developing web applications with ASP.NET MVC is fun. I'm looking forward to version 2, which will ship with Visual Studio 2010.

Sunday, January 18, 2009

Layered Architecture with LINQ to SQL (Part 2)

In my last post I mentioned two popular approaches for structuring your data access logic when choosing the domain model: the Active Record and the pure domain model way. In this post I want to explain how I would implement a data access layer for LINQ to SQL for a pure domain model. There are two reasons why I prefer this to Active Record:
  • I like the idea of persistence ignorance, clean ordinary classes where you focus on the business problem. I'm not too dogmatic about that. As an example, I don't care about the LINQ to SQL mapping attributes I have to put in my domain classes.
  • I don't want to run most of my unit tests against the database. The Repository pattern helps a lot in achieving this.
Repositories, Unit Of Work and Entities with LINQ to SQL

The central object of LINQ to SQL is the DataContext object. It tracks changes to all retrieved entities. It implements the Unit of Work and the Identity Map patterns and also provides query functionality on a per-table basis. It's similar to NHibernate's Session object. Too bad that Microsoft didn't define an interface for this class (like the NHibernate team did with the ISession interface). Such an interface is important to provide a stubbed implementation during unit testing. So let's define our own interface and name it IDataContext:
public interface IDataContext : IDisposable
{
    void Commit();

    void DeleteOnSubmit<T>(T entity) where T : class;

    ChangeSet GetChanges();

    IQueryable<T> GetTable<T>() where T : class;

    IQueryable<T> GetTable<T>(Expression<Func<T, bool>> predicate) where T : class;

    void InsertOnSubmit<T>(T entity) where T : class;
}

The class that implements this interface is just an adapter for the DataContext class. The code is straightforward:
public class LinqToSqlDataContextAdapter : IDataContext
{
    private readonly DataContext _dataContext;
    private bool _disposed;

    public LinqToSqlDataContextAdapter(IDbConnectionConfiguration connectionConfiguration)
        : this(new DataContext(connectionConfiguration.ConnectionString))
    {
    }

    protected LinqToSqlDataContextAdapter(DataContext dataContext)
    {
        _dataContext = dataContext;
    }

    public void Commit()
    {
        _dataContext.SubmitChanges();
    }

    public void DeleteOnSubmit<T>(T entity) where T : class
    {
        _dataContext.GetTable<T>().DeleteOnSubmit(entity);
    }

    //... more adapter code
}

Let's continue with the Repository pattern. According to Fowler, a Repository "provides a layer of abstraction over the mapping layer where query construction code is concentrated" to "minimize duplicate query logic". A Repository usually provides a set of query operations for an entity. In addition, objects can be added to and removed from the Repository. My interface for a generic Repository looks like this:
public interface IRepository<T> where T : IGuidIdentityPersistence
{
    void Add(T entity);

    long Count();

    long Count(Expression<Func<T, bool>> predicate);

    void Delete(T entity);

    bool Exists();

    bool Exists(Expression<Func<T, bool>> predicate);

    T FindFirst(Expression<Func<T, bool>> predicate);

    T Find(object id);

    IQueryable<T> FindAll();

    IQueryable<T> FindAll(Expression<Func<T, bool>> predicate);
}

I decided to use IQueryable instead of a collection as the return value of the FindAll methods. This makes the usage very flexible. IQueryable represents a deferred query, so clients can add filters as needed. For more specialized methods I think it's better to return a collection than a query.
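
A quick sketch of what this flexibility buys us. The Book entity with its Title property appears later in this post; the dataContext variable is assumed to be an IDataContext already in scope:

IRepository<Book> books = new Repository<Book>(dataContext);

// Nothing is executed yet: the client composes filtering, ordering
// and paging onto the deferred query as needed.
var page = books.FindAll()
    .Where(b => b.Title.StartsWith("A"))
    .OrderBy(b => b.Title)
    .Take(20)
    .ToList(); // the combined query runs against the database here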

Here is the generic implementation of IRepository<T>:
public class Repository<T> : IRepository<T> where T : class, IGuidIdentityPersistence
{
    private readonly IDataContext _dataContext;

    public Repository(IDataContext dataContext)
    {
        _dataContext = dataContext;
    }

    public Repository()
    {
        _dataContext = UnitOfWork.Current;
    }

    private IDataContext DataContext
    {
        get { return _dataContext; }
    }

    public void Add(T entity)
    {
        DataContext.InsertOnSubmit(entity);
    }

    public long Count()
    {
        return DataContext.GetTable<T>().Count();
    }

    public long Count(Expression<Func<T, bool>> predicate)
    {
        return DataContext.GetTable(predicate).Count();
    }

    public void Delete(T entity)
    {
        DataContext.DeleteOnSubmit(entity);
    }

    public bool Exists()
    {
        return DataContext.GetTable<T>().Count() > 0;
    }

    public bool Exists(Expression<Func<T, bool>> predicate)
    {
        return DataContext.GetTable(predicate).Count() > 0;
    }

    public T FindFirst(Expression<Func<T, bool>> predicate)
    {
        return FindAll(predicate).FirstOrDefault();
    }

    public T Find(object id)
    {
        return DataContext.GetTable<T>().Where(e => e.Id.Equals(id)).FirstOrDefault();
    }

    /// <summary>
    /// Returns a query for all objects in the table for type T
    /// </summary>
    public IQueryable<T> FindAll()
    {
        return DataContext.GetTable<T>();
    }

    /// <summary>
    /// Returns a query for all objects in the table for type T that match the predicate
    /// </summary>
    public IQueryable<T> FindAll(Expression<Func<T, bool>> predicate)
    {
        return DataContext.GetTable<T>(predicate);
    }
}

I made the class concrete on purpose. As the Repository class already defines a lot of helpful methods, it can be used in situations where you don't need a custom Repository. Below is an example where a generic Repository is used in an ASP.NET MVC controller:
public class BookController : Controller
{
    private IRepository<Location> _locationRepository;

    public BookController(IRepository<Location> locationRepository)
    {
        _locationRepository = locationRepository;
    }

    //...
    [AcceptVerbs("Post")]
    public ActionResult Edit(Guid id, string title, string author, Guid locationId)
    {
        Book book = GetBook(id);
        UpdateModel(book, new string[] { "Title", "Author" });
        Location selectedLocation = _locationRepository.Find(locationId);
        book.AddLocation(selectedLocation, GetCurrentUserName());
        UnitOfWork.Current.Commit();
        return View("Show", book);
    }
}

A custom Repository could look like this:
public class BookRepository : Repository<Book>, IBookRepository
{
    public BookRepository(IDataContext dataContext)
        : base(dataContext)
    {
    }

    public BookRepository()
    {
    }

    public IEnumerable<Book> FindByTitle(string title)
    {
        return FindAll().Where(b => b.Title.Contains(title)).ToList();
    }
}

The implementation of the Repository pattern I showed in this post adds a layer of abstraction on top of LINQ to SQL and results in a more decoupled architecture. As a side effect, the design simplifies database-independent testing. The generic Repository provides easy and flexible usage for simpler situations.
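
To make the testing claim concrete, here is a minimal sketch of an in-memory IDataContext stub that could back the Repository in unit tests. This class is illustrative, not part of the sample code; GetChanges is unsupported here because System.Data.Linq's ChangeSet cannot be created outside the framework:

using System;
using System.Collections;
using System.Collections.Generic;
using System.Data.Linq;
using System.Linq;
using System.Linq.Expressions;

public class InMemoryDataContext : IDataContext
{
    // One list per entity type plays the role of a database table.
    private readonly Dictionary<Type, IList> _tables = new Dictionary<Type, IList>();

    private List<T> Table<T>() where T : class
    {
        IList table;
        if (!_tables.TryGetValue(typeof(T), out table))
        {
            table = new List<T>();
            _tables[typeof(T)] = table;
        }
        return (List<T>)table;
    }

    public void Commit() { } // nothing to flush in memory

    public void DeleteOnSubmit<T>(T entity) where T : class { Table<T>().Remove(entity); }

    public ChangeSet GetChanges() { throw new NotSupportedException(); }

    public IQueryable<T> GetTable<T>() where T : class { return Table<T>().AsQueryable(); }

    public IQueryable<T> GetTable<T>(Expression<Func<T, bool>> predicate) where T : class
    {
        return Table<T>().AsQueryable().Where(predicate);
    }

    public void InsertOnSubmit<T>(T entity) where T : class { Table<T>().Add(entity); }

    public void Dispose() { }
}

A test can then new up a Repository<Book> with this stub and exercise add, delete and query logic without touching a database.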

Layered Architecture with LINQ to SQL (Part 1)

As with many other Microsoft technologies, LINQ to SQL mainly supports a RAD (rapid application development) style of development. You can drag and drop tables into the LINQ to SQL designer and Visual Studio will define a simple 1:1 mapping, generate domain classes and a typed DataContext for your database. All set, ready to hack! Let's create a WinForm, drag some controls onto it, code some LINQ to SQL queries in a click event handler and bind them to a grid. Sounds easy, right? Well, it's easy and you can do it, but consider the following problems:
  • Writing database queries in your presentation layer is not as bad as using SQL code in it, but it’s still a questionable practice. If you do it, you’re mixing data access code with business logic and presentation code. This means that you’ve decided to abandon the benefits of the separation of concerns.
  • LINQ to SQL queries are scattered around in your business logic or presentation code. Sooner or later you run into a maintenance problem.
  • There is no concept for code reuse. Queries that need to be used at different places have to be duplicated.
But that doesn't mean that LINQ to SQL is not suitable for building layered enterprise applications. There are a lot of resources available that can help you to structure your data access logic. Most of them are documented in Martin Fowler's Patterns of Enterprise Application Architecture (PEAA) book. If you choose the Domain Model approach there are mainly two options for your DAL:
  • Active Record. Active Record puts data access logic directly in the domain object. In most implementations you can see a set of static finder methods on the domain objects and instance methods for operations like save, update and delete (see the sketch after this list).
  • Pure Domain Model with a Unit of Work and Repositories. This approach separates the data access logic from the domain objects. With the combination of these patterns, different objects exist for different concerns: the Unit of Work is responsible for change tracking, Repositories for data access and domain objects for domain logic.
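For illustration, here is the schematic Active Record shape mentioned in the first option. The class and method bodies are placeholders, not the API of any particular framework:

using System;

public class Book
{
    public Guid Id { get; set; }
    public string Title { get; set; }

    // Static finders put query logic on the domain class itself...
    public static Book FindById(Guid id)
    {
        throw new NotImplementedException("would query the books table");
    }

    // ...and instance methods put persistence logic there too.
    public void Save() { throw new NotImplementedException("insert or update this row"); }

    public void Delete() { throw new NotImplementedException("delete this row"); }
}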
In my next post I want to show you how I would apply the pure Domain Model patterns in the context of LINQ to SQL.

Sunday, December 7, 2008

ThoughtWorks Cruise: First Impressions

We use CruiseControl.NET as our continuous integration server in our company. CruiseControl.NET is a great tool and helps us a lot in adopting our continuous integration practice. Over the years a lot of CruiseControl projects have accumulated. Currently we have 8 build servers with, all together, over 100 projects. Most of the time some servers just do nothing while others are under heavy load. So we're looking for a solution to use our hardware more efficiently.

What we really need is a tool that can distribute builds to different machines. Fortunately most of the commercial continuous integration servers support this kind of feature. Most of them use some sort of server/agent based concept to distribute builds. ThoughtWorks provides a good overview of several CI products that are available on the market. I decided to have a deeper look at Cruise, the enterprise version of CruiseControl from ThoughtWorks.

So I downloaded the free edition of Cruise 1.1 and installed it on my laptop. Cruise consists of a server (Cruise Server) and several agents (Cruise agents) that receive work delegated from the server. Therefore a build can be processed in parallel by several Cruise agents. What I really liked about Cruise is their implementation of the concept of the deployment pipeline. Cruise allows monitoring of changes to an application as they progress from initial check-in to functional testing, performance testing, user acceptance testing, staging, and release. A pipeline consists of one or more stages, which in turn consist of jobs. You can consider a pipeline configuration as a workflow of a particular release process. In that way Cruise goes beyond the basic continuous integration that many products support these days.

What's interesting is that not every stage transition has to be automatically triggered. Some stages may need manual approval before the release process can go further. For instance, in my current project most of the tests are carried out manually by domain experts outside of the development team. During an iteration or at the end of an iteration we provide builds for manual testing. In Cruise, this process could be implemented by a stage that needs manual approval.

When I hit Cruise's dashboard for the first time I was disappointed by the few features the product seemed to provide. The main page has just four tabs for viewing the current activity, managing pipelines and agents, and an administration tab. Everything looked so minimalistic. But when I began to use it, I was surprised that most of the things I wanted to do I could do.

But there are some negative points too. The current version 1.1 supports just Subversion, Mercurial, Git and Perforce for source control integration. There are some other features that are missing in comparison to CruiseControl.NET. There is no support for visualizing reports from tools like FxCop or NCover; only NUnit is supported. They don't provide a plugin mechanism like CruiseControl does, so there is currently no way to contribute or extend the product yourself.

To summarize, Cruise looks very promising to me. I really liked the concept of pipelines and the minimalistic user interface. From a CruiseControl.NET user's perspective there are some key features missing in the current version 1.1. But I'm sure that the agile guys from ThoughtWorks will soon come out with a new version that may include those missing features.