Sunday, April 3, 2011

Do you want to work with me?

Do you have demanding challenges, and work at a company or are putting together a project that needs an engaged, positive and competitive tech lead/developer/architect? One who expects and demands a lot, who continuously works towards improvement, and who wants to grow in both technology and leadership? And who, not least, always keeps a strong focus on the goal?


In short, I have

  • Project experience from all the main technical areas of an IT system, gained through everything from short development and advisory engagements to two and a half years on a large insurance system.

  • Organizational experience from a year and a half as competency manager at Objectware/Itera Consulting, with responsibilities spanning professional leadership and planning, arranging competency days and evenings, interviews, proposal work and customer meetings.

  • External experience as a former leader and current board member of NNUG Oslo, and as a speaker at NNUG, at the XP and Smidig conferences, and at this year's upcoming Norwegian Developer Conference (NDC).


After more than four highly instructive years at Objectware/Itera Consulting I have decided to change employers. I look back on my time at Itera very fondly: I have worked with incredibly skilled colleagues, had plenty of challenges both internally and in projects, and learned a great deal.


Continuing to develop is very important to me. I believe that happens best through demanding challenges, and if you have them I hope you will get in touch. If you want more detail on my project experience, it is available in my CV (pdf). In addition, the following blog post tries to give an insight into what I consider important in IT projects, admittedly from a high-level perspective: link.


In practical terms, I am serving my notice period until May 31, so I expect one of the first things I do for a new employer will be to take the stage at NDC in early June. If you would like to get in touch, I am available by email: rune...@gmail.com, phone: see CV, LinkedIn and Twitter.

Sunday, March 20, 2011

Quality in software – a summary

Who should read this: Software developers or other participants in IT projects who want a high level overview of what is needed to make projects succeed, with a technical bias.

 

All work in a project must be done to increase the likelihood of success. The process imposed, the roles involved, the people chosen and the technologies used should all work towards this goal.

 

For the record, I apologize for simplifying, omitting and simply forgetting important areas. I am trying to give you the pure and simple truth, however:

 

The pure and simple truth is rarely pure and never simple.

- Oscar Wilde

 

 

Process

IT projects are complex. The Standish Group’s Chaos Report has attempted to cover this over many years (2009 report in pdf). To handle the complexities of software, the industry appears to be moving towards streamlining project processes in increasingly agile ways. A common denominator for this work is the well-known Manifesto for Agile Software Development, devised in early 2001:

 

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

That is, while there is value in the items on
the right, we value the items on the left more.

 

As anyone involved in real projects knows, you need to adapt to changes and challenges, deliver and receive feedback at a variety of levels, and continuously work to reduce waste. You’ll never know everything in advance, and unexpected changes will occur.

 

The process within IT consists of both the main methodology used and the more technical practices followed.

 

Overall process

The industry today is steadily working towards being agile, solving problems through iterations and improving continuously. From a developer’s perspective, this is usually something you will have to accept. But even if the process isn’t your responsibility, that doesn’t mean you can’t or shouldn’t influence it. Being part of a project team means you share responsibility for the project’s success. Communicate your concerns. Help inform your coworkers and management.

 

Within agile there are many different processes: Lean, Scrum, Kanban, and many more. Which is the right one? The choice depends on experience, your organization and the challenge ahead. But never forget: mix and match what works for you - being agile certainly applies to both the product created and the process used. (And be wary of certified [process]-masters. Are you sure the certification means what you think it does?)

 

Whenever you care about the results of what you are doing, skilled people are a must. But even the brightest people fail because of misunderstandings or general problems with communication. One very important point to focus on is improving the feedback loop. Have the customer and business experts work closely with you throughout the process, and increase the frequency of releases tenfold (at least to test environments). Practicing and delivering new releases reduces the risks involved, increases the feedback from users, and enables you to deliver the right functionality at the right time.

 

Before moving on, I thought it proper to address a common misconception about agile - that agile equals little or no planning. I have to insist that this is a false claim; the reality is quite the opposite. If you want to learn more, this is a good article explaining the importance of planning and at what levels it should occur.

 

Technical process

The technical process is pretty much anything that directly relates to what is produced: estimation, tasks, quality assurance, and releasing new versions.

 

In project and sprint planning, estimation is usually considered an important practice. Estimation is useful for considering what is involved in producing a feature, visualizing the cost of implementing it, and ensuring that everyone has a similar understanding of the scope. How estimation, planning and tasks are handled varies, and you should experiment to find the method that works for you. Do ensure that you work to build transparency and trust, as described in this quote from Gabrielle Benefield, Director of Agile Product Development at Yahoo!:

 

Keeping everything transparent, and letting the business know of any changes as they come up, means that the business can adapt quickly to make the best decisions. At my last company I saw us go from a state of permanent chaos, where we had an extremely ambitious roadmap but couldn’t deliver products, to a predictable state where we could genuinely sign up for projects that we could deliver. The business said they might not always like the answers (they always want things tomorrow, after all), but at least they believed our answers and were not frustrated from feeling that they were being consistently lied to.

 

The quote, and significantly more, can be found in Agile Estimating and Planning by Mike Cohn.

 

There will often be doubt when planning something: which technology is right, what main flow of logic is appropriate, and so on. Don’t be afraid to use prototyping to verify assumptions. Creating an end-to-end working example and/or ensuring that the toughest problems can be solved can save you a lot of time in the future.

 

I mentioned feedback earlier as a key principle. Within the technical process, this should be supported through a few good practices. Code reviews, pair programming and static code analysis are useful tools for getting quick feedback on quality. Automated tests are another, with unit, integration and acceptance tests (and sometimes a few more) as common categories. Continuous integration is yet another practice that helps increase feedback and quality: by automating compilation and running all tests on each check-in, you add another step for ensuring quality, increasing feedback and visualizing the current state of the code.
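
To make the feedback from automated tests concrete, here is a minimal sketch of a unit test, assuming NUnit; the PremiumCalculator class and its rule are made up purely for illustration:

using NUnit.Framework;

// Hypothetical domain class, invented for this example only
public class PremiumCalculator
{
    private readonly decimal _baseRate;

    public PremiumCalculator(decimal baseRate)
    {
        _baseRate = baseRate;
    }

    public decimal CalculateFor(decimal riskFactor)
    {
        return _baseRate * riskFactor;
    }
}

[TestFixture]
public class PremiumCalculatorTests
{
    [Test]
    public void Premium_is_doubled_when_the_risk_factor_is_two()
    {
        var calculator = new PremiumCalculator(100m);

        decimal premium = calculator.CalculateFor(2.0m);

        Assert.AreEqual(200m, premium);
    }
}

Run by a continuous integration server on every check-in, tests like this turn “did I break something?” into a question that is answered within minutes.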

 

The practices mentioned above are often introduced by the technical personnel in the project. But to support the overall process need for more frequent releases to test and production, a number of practices are key: automating deployments, including moving code, configuring machines and web servers, handling environment-specific configuration settings and setting up the database. It won’t come free of charge, but it can bring great benefits.

 

A lot more can be said about improving the release and deployment process. I recommend reading the book Continuous Delivery to get a better overview.

 

 

Technology

Beyond the general practices, what is important in technical development?


At both higher and lower levels, having a good overview of the alternatives is necessary to select the right technology or solution (never forget the hammer-and-nail analogy). It is impossible to know the details of everything, but learning the qualities, pros and cons of the alternatives enables you to better select the right one. Subsequent tests might show that your initial knowledge about the technology was wrong, but that is exactly why you prototype in the first place.


When it comes to code, I am a strong supporter of simplicity and readability. If what you’re creating can’t be understood without hassle, how is anyone going to be able to maintain it? Refactor as your understanding improves or changes occur, or be aware that you will end up with code that, in the near or distant future, will take time to understand, even more energy to change, and will undoubtedly breed bugs. And automate testing of anything moderately complex. Pretty please?
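
To make that concrete, here is a small, invented example of the kind of refactoring I mean - extracting an intention-revealing method so the next reader does not have to decode the conditional. The Order and Customer types and the discount rule are made up for illustration only:

public class OrderDiscountService
{
    // Before: the rule was buried inline in the calling code
    //   if (order.Total > 1000m && order.Customer.YearsActive >= 3 && !order.Customer.HasOverduePayments)
    //       order.ApplyDiscount(0.1m);

    // After: the rule has a name, reads like the requirement, and can be unit tested on its own
    public void ApplyDiscounts(Order order)
    {
        if (QualifiesForLoyaltyDiscount(order))
        {
            order.ApplyDiscount(0.1m);
        }
    }

    private static bool QualifiesForLoyaltyDiscount(Order order)
    {
        return order.Total > 1000m
               && order.Customer.YearsActive >= 3
               && !order.Customer.HasOverduePayments;
    }
}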


Beyond mastering the technologies chosen, what more is there? Never forget that your system will go into production one day. Security, logging, monitoring, error handling, integrations, multithreading, scalability, hardware and firewall issues are areas you need to focus on. Now is the time to go ahead and read Release It!.
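
As a tiny illustration of the kind of defensive code Release It! argues for, here is a sketch of an integration call that fails fast and logs instead of hanging or silently swallowing errors. The CustomerServiceClient proxy, the Customer type and the Log object are assumptions (a generated WCF client and a log4net-style logger), not code from any real project:

public Customer LookupCustomer(string customerId)
{
    var client = new CustomerServiceClient(); // hypothetical generated WCF proxy
    try
    {
        // Fail fast instead of letting a slow integration tie up threads
        client.InnerChannel.OperationTimeout = TimeSpan.FromSeconds(5);

        Customer customer = client.GetCustomer(customerId);
        client.Close();
        return customer;
    }
    catch (TimeoutException ex)
    {
        Log.Warn("Customer lookup timed out for id " + customerId, ex);
        client.Abort();
        return null; // or a cached/fallback value, depending on the requirements
    }
    catch (CommunicationException ex)
    {
        Log.Error("Customer lookup failed for id " + customerId, ex);
        client.Abort();
        throw;
    }
}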

 

So much more can be said about technology and technical development, but it is hard to generalize. Find the right people and all will be good with the world.

 


People and roles

A software project can consist of a variety of people: software developers, UI specialists, database administrators, operations, business experts, testers, a project leader, a project owner (and a variety of other project-something roles).


No matter which roles exist, make sure communication between all the parties involved is frequent. If you can co-locate, it will help reduce misunderstandings.


Management 3.0 has a lot to say about leading people in IT projects. As the author says:

I think that people are the most important parts of an organization and that managers must do all they can to keep people active, creative and motivated.


Sunday, August 1, 2010

Converting WCF WSDL to single file (FlatWSDL) - correctly!

Note! This post is slightly different from most FlatWSDL posts, and fixes the bug where the <xs:import>s are dropped.

 

The team behind WCF chose to split all generated WSDL definitions into multiple files, one for each namespace, schema, etc. That is a valid approach, most modern tools support it out of the box, and it is done correctly according to the WSDL standard.

 

However, not all tools support it. Various blog posts discuss different tools; in my case it was a mainframe integration that caused the problem. I needed to get WCF to output everything as one file. The good news is that’s really simple: you can use either FlatWSDL by Thinktecture or FlatWSDL in WCFExtras (they’re almost identical).

 

The only problem is that both solutions emit a WSDL file without the appropriate <xs:import>s for the schemas. Many tools are able to resolve the references without the imports, but not all are as forgiving. This post describes a code change to WCFExtras that will fix it, but I’ll add mine here as well.

 

What you need to do is to download one of the FlatWSDL approaches (This post describes it, as well as a host of others), then change the code in ExportEndpoint in FlatWsdl.cs to what I’ve included below, and you’re good to go.

 

public void ExportEndpoint(WsdlExporter exporter, WsdlEndpointConversionContext context)
{
    XmlSchemaSet generatedXmlSchemas = exporter.GeneratedXmlSchemas;

    // Inline every schema referenced (directly or indirectly) into each generated WSDL,
    // instead of leaving them as separate documents referenced by location
    foreach (ServiceDescription generatedWsdl in exporter.GeneratedWsdlDocuments)
    {
        var referencedXmlSchemas = FindAllReferencedXmlSchemasRecursively(generatedWsdl, generatedXmlSchemas);
        ClearWsdlOfExistingSchemas(generatedWsdl);
        AddAllReferencedSchemas(generatedWsdl, referencedXmlSchemas);
    }

    RemoveSchemaLocationFromXmlSchemaImports(exporter, generatedXmlSchemas);
}

private static IEnumerable<XmlSchema> FindAllReferencedXmlSchemasRecursively(ServiceDescription wsdl, XmlSchemaSet generatedXmlSchemas)
{
    var referencedXmlSchemas = new List<XmlSchema>();
    foreach (XmlSchema schema in wsdl.Types.Schemas)
    {
        AddReferencedXmlSchemasRecursively(schema, generatedXmlSchemas, referencedXmlSchemas);
    }
    return referencedXmlSchemas;
}

/// <summary>
/// Recursively extract the list of imported schemas
/// </summary>
/// <param name="schema">Schema to examine</param>
/// <param name="generatedXmlSchemas">SchemaSet with all referenced schemas</param>
/// <param name="referencedXmlSchemas">List of all referenced schemas</param>
private static void AddReferencedXmlSchemasRecursively(
    XmlSchema schema,
    XmlSchemaSet generatedXmlSchemas,
    List<XmlSchema> referencedXmlSchemas
    )
{
    foreach (XmlSchemaImport import in schema.Includes)
    {
        ICollection realSchemas = generatedXmlSchemas.Schemas(import.Namespace);
        foreach (XmlSchema ixsd in realSchemas)
        {
            if (!referencedXmlSchemas.Contains(ixsd))
            {
                referencedXmlSchemas.Add(ixsd);
                AddReferencedXmlSchemasRecursively(ixsd, generatedXmlSchemas, referencedXmlSchemas);
            }
        }
    }
}

private static void ClearWsdlOfExistingSchemas(ServiceDescription wsdl)
{
    wsdl.Types.Schemas.Clear();
}

private static void AddAllReferencedSchemas(ServiceDescription wsdl, IEnumerable<XmlSchema> referencedXmlSchemas)
{
    foreach (XmlSchema schema in referencedXmlSchemas)
    {
        wsdl.Types.Schemas.Add(schema);
    }
}

private static void RemoveSchemaLocationFromXmlSchemaImports(WsdlExporter exporter, XmlSchemaSet schemaSet)
{
    // Work on a copy of the schema set, so schemas can be removed from the exporter's collection while iterating
    var mySchemaSet = new XmlSchemaSet();
    mySchemaSet.Add(schemaSet);
    foreach (XmlSchema schema in mySchemaSet.Schemas())
    {
        exporter.GeneratedXmlSchemas.Remove(schema);
    }
}
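
One practical note: ExportEndpoint only runs if the class is attached as a behavior. As far as I recall, both the Thinktecture and the WCFExtras variants also implement IEndpointBehavior, so when self-hosting you can wire it up roughly as sketched below (MyService and IMyService are placeholders for your own service; when hosting in IIS you would typically register it through configuration instead, via a behavior extension element):

// Rough sketch; assumes the FlatWsdl class above also implements IEndpointBehavior
var host = new ServiceHost(typeof(MyService), new Uri("http://localhost:8080/MyService"));
host.Description.Behaviors.Add(new ServiceMetadataBehavior { HttpGetEnabled = true });

var endpoint = host.AddServiceEndpoint(typeof(IMyService), new BasicHttpBinding(), "");
endpoint.Behaviors.Add(new FlatWsdl());

host.Open();
// http://localhost:8080/MyService?wsdl should now serve a single, flattened WSDL document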

Wednesday, July 28, 2010

Generating sample data for objects using AutoFixture, extending it with collections support, and using it as Silverlight sample data generator

This blog post consists of three parts

- On generating sample data in general

- Extending AutoFixture with automatic collections generation support

- Adding a sample data generator in Silverlight for Visual Studio/Blend screen previews


[Note: Windows Live Writer did a horrible job of transferring the post to the blog. I've fixed quite a bit of the formatting, but it's still of poor quality. Apologies for that. Let me know if you want the code samples without the formatting.]


Generating sample data in general

Creating objects with sample data is a challenge developers often need to handle. The most common use case is for testing purposes, where you need an object with various values set. Some might be important, but many just need to be set to anything (Also called an anonymous variable).


I’ve seen and used various solutions for this. Often you see it done manually, with an object mother, with test data builders, or by loading existing objects from a database. Many people start out writing a simple reflection tool to handle it, but stop a few hours down the road once the actual complexity involved becomes apparent.


I had a new use case for it this time, and wanted to avoid many of the problems with the solutions above. After searching for and trying various reflection based tools, I found AutoFixture, and was impressed from the get-go. It pretty much does exactly what you’d expect, and has a clean interface as well. Some examples:


- Creating an anonymous string

var anonymousText = fixture.CreateAnonymous<string>();

Result:

- anonymousText: aa48c714-6c6a-4ac4-9c27-3658c9e78d5f


- Creating an anonymous object

public class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
}

..

var personWithAnonymousData = new Fixture().CreateAnonymous<Person>();

Result:

- Name: Nameb7c2a9fc-a5b0-4836-83d9-be1b057e0ff1

- Age: 1



Take a look at the AutoFixture cheat sheet for a number of good examples. There are many things you can do to customize the behavior and output.
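
To give a flavor of the customization options, here are two of the ones I reach for most often, roughly as they appear on the cheat sheet (exact method names may vary slightly between AutoFixture versions):

var fixture = new Fixture();

// Use a fixed value whenever an int is requested
fixture.Register(() => 42);

// Override individual properties when creating a specific object
var person = fixture.Build<Person>()
                    .With(p => p.Age, 33)
                    .CreateAnonymous();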


That covers the introduction to AutoFixture. If you need a sample data creator, for test data or other purposes, I advise you to check it out. Over to the next point.




Extending AutoFixture with automatic collections generation support

There was one feature I was missing in AutoFixture that I really wanted. If an object has a collection (a list, or any other sort of .NET collection) of something, that collection will only be initialized to an empty collection automatically. If I make the change to the Person object below and rerun, the PhoneNumbers list will be an empty list of strings.


public class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
    public List<string> PhoneNumbers { get; set; }
}


AutoFixture has functionality to handle this. If you run the line below first..


fixture.Register(() => fixture.CreateMany<string>().ToList());

..then the output of PhoneNumbers will be something similar to:

[0]: 635bc212-8d16-4029-8423-1f45016fe020

[1]: fcbed54b-94f6-4eea-b773-e7ec5e75b6dd

[2]: 814c120a-7e9c-487a-b99e-9e23d9d511b0



However, I would like this to be handled automatically. Doing it the AutoFixture way, you need to know the internal structure of a class in order to define that a certain collection should be handled. Another problem is collection interfaces: if PhoneNumbers was defined as IList<string>, I would be forced to register it, since it otherwise causes a runtime exception of:

“AutoFixture was unable to create an instance from System.Collections.Generic.IList`1[System.String], most likely because it has no public constructor.”
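
One way around that, for the record, is yet another explicit registration along the same lines as the Register call above:

fixture.Register<IList<string>>(() => fixture.CreateMany<string>().ToList());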


So I set out to see if I could extend AutoFixture to handle collections automatically. It seemed like a fun way of getting to know AutoFixture better, with some good reflection exercise thrown in. The requirements I set were to handle generic collections, and interfaces to collections, automatically.



First, how to use it (only the second line is new):


var fixture = new Fixture();
fixture.Customizations.Add(new CollectionsGenerator(fixture));
var personWithAnonymousData = fixture.CreateAnonymous<Person>();

The new CollectionsGenerator:


/// <summary>
/// Support for handling automatic generation of anonymous data for collections
/// </summary>
internal class CollectionsGenerator : ISpecimenBuilder
{
    private readonly Fixture _parentFixture;

    /// <param name="parentFixture">Requires parentFixture to ensure existing options are used.</param>
    public CollectionsGenerator(Fixture parentFixture)
    {
        _parentFixture = parentFixture;
    }

    public object Create(object request, ISpecimenContext context)
    {
        if (!ReflectionHelper.IsGenericDotNetCollectionOrInterface(request))
            return new NoSpecimen(request);

        Type collectionType;
        if (ReflectionHelper.ObjectIsADotNetCollectionInterface(request))
            collectionType = ReflectionHelper.GetConcreteListTypeToMatchInterface(request);
        else
            collectionType = ReflectionHelper.GetUnderlyingSystemType(request);

        var returnCollection = (IList)Activator.CreateInstance(collectionType);
        AddAnonymousValuesToCollection(returnCollection, _parentFixture);
        return returnCollection;
    }

    private static void AddAnonymousValuesToCollection(IList collection, Fixture parentFixture)
    {
        Type genericType = collection.GetType().GetGenericArguments()[0];
        var createAnonymousMethod = typeof(SpecimenFactory)
            .GetMethod("CreateAnonymous", new Type[] { typeof(ISpecimenBuilderComposer) })
            .MakeGenericMethod(new[] { genericType });
        for (int i = 0; i < parentFixture.RepeatCount; i++)
        {
            collection.Add(createAnonymousMethod.Invoke(null, new ISpecimenBuilderComposer[] { parentFixture }));
        }
    }
}


And an accompanying ReflectionHelper:

public static class ReflectionHelper
{
    private const string UnderlyingSystemTypeString = "UnderlyingSystemType";

    public static bool CanRetrieveUnderlyingSystemTypeFromObject(object input)
    {
        return (input != null)
               && (input.GetType().GetProperty(UnderlyingSystemTypeString) != null)
               && (input.GetType().GetProperty(UnderlyingSystemTypeString).GetValue(input, null) != null);
    }

    public static bool InputIsAssignableFrom(object request, Type ofType)
    {
        return (request != null)
               && (request.GetType().GetProperty(UnderlyingSystemTypeString) != null)
               && (request.GetType().GetProperty(UnderlyingSystemTypeString).GetValue(request, null) != null)
               && (ofType.IsAssignableFrom((Type)request.GetType().GetProperty(UnderlyingSystemTypeString).GetValue(request, null)));
    }

    public static Type GetUnderlyingSystemType(object input)
    {
        return (Type)input.GetType().GetProperty(UnderlyingSystemTypeString).GetValue(input, null);
    }

    public static bool ObjectHasGenericTypeSpecified(object input)
    {
        return GetUnderlyingSystemType(input).IsGenericType && GetUnderlyingSystemType(input).GetGenericArguments().Length > 0;
    }

    public static bool IsGenericDotNetCollectionOrInterface(object request)
    {
        if (!CanRetrieveUnderlyingSystemTypeFromObject(request))
            return false;

        return (ObjectIsADotNetCollection(request) || ObjectIsADotNetCollectionInterface(request))
               && (ObjectHasGenericTypeSpecified(request));
    }

    public static bool ObjectIsADotNetCollection(object request)
    {
        return InputIsAssignableFrom(request, typeof(IList));
    }

    public static bool ObjectIsADotNetCollectionInterface(object request)
    {
        var objectTypeName = GetUnderlyingSystemType(request).ToString();

        var dotNetCollectionTypes = new List<string> //.NET collection interfaces
        {
            "System.Collections.Generic.IList",
            "System.Collections.Generic.IEnumerable",
            "System.Collections.Generic.IEnumerator",
            "System.Collections.Generic.ICollection",
            "System.Collections.Generic.ISet",
            "System.Collections.IList",
            "System.Collections.IEnumerable",
            "System.Collections.IEnumerator",
            "System.Collections.ICollection",
        };

        return dotNetCollectionTypes.Any(objectTypeName.Contains);
    }

    public static Type GetConcreteListTypeToMatchInterface(object request)
    {
        Type genericType = GetUnderlyingSystemType(request).GetGenericArguments()[0];

        string genericListTypeName = "System.Collections.Generic.List`1"
                                     + "[[" + genericType.AssemblyQualifiedName + "]]"
                                     + ","
                                     + Type.GetType("System.Collections.IList").Assembly.FullName;
        return Type.GetType(genericListTypeName);
    }
}



Note that this isn’t release quality. There are a few issues, such as recursion if a complex type contains an instance of itself, and the way .NET collections are identified.



Adding a sample data generator in Silverlight for Visual Studio/Blend screen previews

The actual use case I was trying to solve was creating automatic sample data for XAML pages in Silverlight. This would let me better see how pages looked in Visual Studio and Blend, and give me a way of testing the data bindings as well.


Note that AutoFixture doesn’t support Silverlight just yet, but there is a fork available that supports it (If you want to use a newer version, you only need to do a couple of changes to trunk to make it work).


I wanted a more natural interface for the sample data, so I created a SampleDataGenerator class:


/// <summary>
/// Creates sample data for an object or collection of objects recursively
/// </summary>
public static class SampleDataGenerator
{
    public static T GenerateObjectWithData<T>()
    {
        return CreateFixtureWithDefaultSetup<T>().CreateAnonymous<T>();
    }

    public static T GenerateObjectWithData<T>(int collectionItemAmount)
    {
        var fixture = CreateFixtureWithDefaultSetup<T>();
        fixture.RepeatCount = collectionItemAmount; // Object count generated per list
        return fixture.CreateAnonymous<T>();
    }

    public static IEnumerable<T> CreateMany<T>()
    {
        return CreateFixtureWithDefaultSetup<T>().CreateMany<T>();
    }

    public static IEnumerable<T> CreateMany<T>(T seed)
    {
        return CreateFixtureWithDefaultSetup<T>().CreateMany(seed);
    }

    public static IEnumerable<T> CreateMany<T>(int count)
    {
        return CreateFixtureWithDefaultSetup<T>().CreateMany<T>(count);
    }

    public static IEnumerable<T> CreateMany<T>(T seed, int count)
    {
        return CreateFixtureWithDefaultSetup<T>().CreateMany<T>(seed, count);
    }

    private static Fixture CreateFixtureWithDefaultSetup<T>()
    {
        var fixture = new Fixture();
        fixture.Customizations.Add(new StringGenerator(() => ""));
        fixture.Customizations.Add(new CollectionsGenerator(fixture));
        return fixture;
    }
}

At the bottom of the code above I have included use of the CollectionsGenerator class. I also changed the default behavior of StringGenerator, so it only outputs the property name instead of property name + guid.


You can then use it in a sample object that inherits from the ViewModel:


public class EditPersonSampleData : EditPersonViewModel
{
    public EditPersonSampleData()
    {
        Person = SampleDataGenerator.GenerateObjectWithData<Person>();
    }
}


And then in the EditPersonView.xaml, include:

xmlns:vm="clr-namespace:SomeNamespace.EditPerson"

...

<Grid x:Name="LayoutRoot" Background="White" d:DataContext="{d:DesignInstance Type=vm:EditPersonSampleData, IsDesignTimeCreatable=True}">


That’s all you need to get sample data visible in a XAML viewer. It’s a good way of checking that the interface looks OK and that the bindings are set up correctly.


Vote here to get similar support in the actual AutoFixture product.