# Tuesday, July 9, 2013

LeanKit, if you haven’t heard of it, is a popular Kanban board available as a service over the Internet. I’ve used it myself and seen it used on several client sites in the past. However, the one question I seem to be hearing lately is “can’t we use LeanKit for our tasks and TFS for code?”

It’s a reasonable question. Team Foundation Server, depending on what version you are using, gives you quite a lot of functionality out of the box. You could argue that if you have TFS 2012 you wouldn’t need a separate Kanban board. However, just because you have a Kanban board doesn’t necessarily mean you’re writing code, and an organisation may have been set on LeanKit well before TFS was introduced.

So can we have both?
You certainly can, and one sleepy afternoon I decided to see if I could get the two to talk to each other. The initial integration between the two was a lot easier than I thought it would be; anyone who is familiar with REST and the TFS API would find it a breeze! And yes, it answered the question of whether it could be done, but it also raised several more questions:

  • Why?
  • What about conflicts?
  • Does this add value?

Now, the “why” is probably self-explanatory from the paragraphs above. However, conflicts and “does this type of integration actually add value?” weren’t that straightforward. My conundrum wasn’t getting them to work together, but rather what to do when someone changes something in TFS while someone else changes something in LeanKit.

My Solution
So I came up with the following scenario.

“If you are storing tasks and editing them in LeanKit in the first place, you’re probably not that likely to want to edit them in TFS. However, when you check in your code you will probably want to associate your check-in with a task you have been working on.”

While I appreciate the above probably doesn’t apply to everyone, this was the way I came up with to integrate the two with the least amount of effort while still adding value. There are several more approaches that I will go into in more detail later.

Firstly, I created a program that polled LeanKit for updates (I would have preferred an event service but couldn’t find one in the LeanKit API). I’ve called my simple polling app “LeanKit Kanban Caller”; it basically goes over the tasks in LeanKit and ensures they exist in TFS.


If a task doesn’t exist in TFS the program creates one; otherwise it just updates any existing tasks that have changed.
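The create-or-update decision can be modelled as a small pure function. The sketch below is a minimal illustration, not the actual app: the Card class, its field names and the Reconcile method are hypothetical stand-ins for the LeanKit card data and the lookup of TFS work items by their stored LeanKit ID.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical stand-in for a LeanKit card as seen by the poller.
public class Card
{
    public long LeanKitId;
    public string Title;
    public string Description;
}

public static class LeanKitSync
{
    // Given the LeanKit cards and the TFS work items we already created
    // (keyed on the LeanKit ID stored on the work item), decide which
    // cards need a new work item and which need an update.
    public static (List<Card> Creates, List<Card> Updates) Reconcile(
        IEnumerable<Card> cards, IDictionary<long, Card> tfsItems)
    {
        var creates = new List<Card>();
        var updates = new List<Card>();
        foreach (var card in cards)
        {
            if (!tfsItems.TryGetValue(card.LeanKitId, out var existing))
            {
                creates.Add(card);   // no work item yet: create one
            }
            else if (existing.Title != card.Title ||
                     existing.Description != card.Description)
            {
                updates.Add(card);   // mapped fields differ: update it
            }
            // otherwise the two are in sync and nothing is done
        }
        return (creates, updates);
    }
}
```

In the real app the creates and updates would then be applied through the TFS work item tracking API; the decision logic itself stays this simple.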


The program knows which task to associate with LeanKit because I altered the default Product Backlog Item WIT from the Scrum 2.0 template in TFS 2012 to store the LeanKit ID (note this type of integration should also work with TFS 2010 on a different template).


For the purposes of my experiment I only mapped the title, description and size fields across, but mapping others is easy. I didn’t bother with the State of the item in TFS, partly because the state in LeanKit can be changed so easily by adding more columns, while in TFS it’s a question of editing the workflow on the WIT. While it is possible to map these to each other, it would end up making your board inflexible, and for the purposes of this scenario we’re not actually using that field.

I have assumed that in TFS everything is a PBI and child tasks are redundant for the purpose of this scenario.

Pros and Cons of my approach

Pros:

  • The polling app can sit on any machine that has the Team Explorer client installed and an Internet connection
  • You can associate code check-ins with tasks from your LeanKit board, as they now exist in TFS
  • Using LeanKit is business as usual, as no changes are made to your LeanKit board

Cons:

  • Changing a task in TFS will not update the task in LeanKit
  • The states in TFS are not mapped to those in LeanKit
  • The analysis tools in TFS for the backlog are of no use in this scenario

How else could we have worked?
There are several other ways in which I could have made this solution work.

  • Treat TFS as the master and overwrite anything in LeanKit (no real value).
  • Allow updates to go both ways and use the datetime stamp to determine whether TFS updates LeanKit or LeanKit updates TFS (could lead to conflicts).
  • Make TFS and LeanKit responsible for different fields. This way both can update each other safe in the knowledge that no data will be overwritten.
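The third option can be sketched as a simple field-ownership merge. This is a minimal illustration under assumed field names (Title, Description, State, Effort are my own examples, not a fixed mapping); the real thing would move values through the LeanKit REST API and the TFS work item object model.

```csharp
using System;
using System.Collections.Generic;

// Each system owns a disjoint set of fields, so a sync in either
// direction can never overwrite data the other side is responsible for.
public static class FieldOwnershipMerge
{
    static readonly HashSet<string> LeanKitOwned =
        new HashSet<string> { "Title", "Description" };
    static readonly HashSet<string> TfsOwned =
        new HashSet<string> { "State", "Effort" };

    // Copy only the LeanKit-owned fields from 'source' onto the TFS copy.
    public static void ApplyLeanKitFields(IDictionary<string, string> source,
                                          IDictionary<string, string> target)
    {
        foreach (var field in LeanKitOwned)
            if (source.ContainsKey(field)) target[field] = source[field];
    }

    // Copy only the TFS-owned fields from 'source' onto the LeanKit copy.
    public static void ApplyTfsFields(IDictionary<string, string> source,
                                      IDictionary<string, string> target)
    {
        foreach (var field in TfsOwned)
            if (source.ContainsKey(field)) target[field] = source[field];
    }
}
```

Because the owned sets are disjoint, running both directions in either order produces the same result, which is what removes the conflict problem.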

Are there any other scenarios? If you have any suggestions I’d love to hear them.

Tags: LeanKit | TFS | TFS Tools

Tuesday, July 9, 2013 2:31:00 PM (GMT Daylight Time, UTC+01:00)  #    Comments [0]

# Thursday, June 13, 2013

A while ago I had to do a bit of work with the TFS subscription service. The annoying problem was that I needed to find a particular subscription, and the tools that came out of the box weren’t very helpful. I also wanted to keep the original subscription details but just add to them. So I created this little app.


This little app will enable you to:

  • View TFS event subscriptions from a machine that has the TFS 2010 or TFS 2012 client installed
  • Create a new event subscription from an existing one
  • Unsubscribe from an event


  • Type in your TFS URL; in most cases this would be http://tfs:8080/tfs. If you have several instances, ensure you place the correct instance onto the URL; the above will usually give you the default instance. If you have an instance called “test” the URL would be http://tfs:8080/tfs/test.
  • Type in your TFS username and password and hit the “List Subscriptions” button.
  • Select the subscription you’d like to look at and this will be shown in more detail in the bottom right hand section of the app called “Selected Event”
  • You can now unsubscribe or create a new event from the event you just selected. Ensure you change the event first, or you may get an error from having duplicate events.
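The copy step above can be illustrated with a small sketch. The Subscription class and CopyWithFilter method here are hypothetical stand-ins for the real subscription objects in the TFS client API; the point is the duplicate check before the copy is registered, which is what the last bullet warns about.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical model of a TFS event subscription; the real objects live
// in the TFS client API, but the duplicate problem shows with plain data.
public class Subscription
{
    public string EventType;        // e.g. "WorkItemChangedEvent"
    public string FilterExpression; // condition that selects events
    public string Address;          // delivery endpoint
}

public static class SubscriptionCopier
{
    // Create a new subscription from an existing one. If the copy would be
    // identical to one already registered, refuse: this mirrors the
    // duplicate-event error you can get if you forget to change the event.
    public static Subscription CopyWithFilter(Subscription source, string newFilter,
                                              IEnumerable<Subscription> existing)
    {
        var copy = new Subscription
        {
            EventType = source.EventType,
            FilterExpression = newFilter,
            Address = source.Address
        };
        foreach (var sub in existing)
        {
            if (sub.EventType == copy.EventType &&
                sub.FilterExpression == copy.FilterExpression &&
                sub.Address == copy.Address)
                throw new InvalidOperationException(
                    "A matching subscription already exists - change the event first.");
        }
        return copy;
    }
}
```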

In the sample picture you can see I am using the app to make a change to the Scrum For Team System event service by adding more WITs to it.

Most of the credit for the example code in this app should go to Rene van Osnabrugge’s article found here; all I have done is put a GUI on the front and added a few more functions that enable me to copy events.

Before using this app I would like to point out the following (pretty standard stuff)

  • Use of this app is completely at your own risk; neither I nor my employer accept any responsibility whatsoever for any unintended effects as a result of its usage. If you are worried, I would recommend downloading the code and stepping through it.
  • The code is rough and ready (I created it in a hurry) and is provided as is; by all means look at the code before running the executable.
  • Please let me know if it was at all helpful.

Download the Code

Download the TFS 2010 Executable Only 

Download the TFS 2012 Executable Only

Tags: TFS | TFS Tools

Thursday, June 13, 2013 2:21:06 PM (GMT Daylight Time, UTC+01:00)  #    Comments [2]

# Thursday, May 2, 2013

My reason for looking into this particular subject was being on a client site and sitting with a BA who was feeling a bit downhearted. She couldn’t figure out how to properly communicate to the team how a part of the system worked so they could refer back to it for a piece of work they were currently doing. I asked her why she didn’t just document it; her response was “but agile means you don’t write documentation”. I must admit it threw me when she said that. I had always, without even thinking about the methodology itself, provided anything I possibly could to help a team out when they needed it. If it made their lives easier to do a job and to burn through those tasks, what was the harm in that? If in doubt about how to provide a piece of information to a team, or if you are wondering whether it adds value, why not ask them?

For example, sometimes to help a team focus on what they are building I suggest taking a wireframe (if you are indeed using wireframes) or a rough sketch of what they are working on and placing it on the wall. I may even suggest taking a copy of the PBI and sticking it next to it; the tasks that come off that PBI can float around the wireframe, and you can use bits of string (nothing high tech!) to point at the various bits they are starting on. After a while the team would start doing this themselves and use it as a handy visual aid for what they were actually working on. In the morning stand-ups they could point to the bit of functionality they were having problems with (if any). This didn’t replace the task board; it helped visualise an aspect of it that they were currently working on.

The approach worked well for a top-down style of development. Another approach I found worked, once they had worked their way down the wireframes, was doing a rough diagram of the stack that sat underneath the wireframe they were currently developing. They could then point their tasks at what they were working on if it helped them communicate to the team where they were, or helped them visualise what they had to do next. When I say rough, it would make sense for the team to collaboratively draw this in marker pen on a whiteboard. Doing so enables them to change or improve it.

Talking to others about this approach…

  • They found it was good as long as the wireframes the team were using were ones they had contributed towards creating in the first place, and the team accepted they were not set in stone, i.e. they could evolve with the team’s input.
  • Using sketches instead of wireframes gives the idea that the design is not permanent and can be changed at any time.
  • It was fine as long as the wireframes were modelling what had already been produced.
  • It should be “taken to the team”, i.e. let the team decide if it adds value and/or helps them.

What are your thoughts? Are there any visual aids that you as a developer may have found helpful in the past?

Please note..
I must point out that I am not an Agile coach, but have worked as a developer, lead developer or architect on several teams in the past.

Tags: Agile

Thursday, May 2, 2013 11:37:13 AM (GMT Daylight Time, UTC+01:00)  #    Comments [0]

# Tuesday, April 23, 2013

A while ago I wrote an article on how I managed to get the test results from a build that was using PSAKE to surface in TFS’s summary view. I provided one solution to this problem in that article that would “just get you by”, and have since been reminded that I said I would write another article on my second solution to the problem: the PSAKE Incubator.

It’s been a long time, so it has taken me a while to recollect what I did, and I no longer have the old dev TFS server I originally got this working on. I will provide what I have done below to help anyone who is looking for a similar solution and needs a steer in the right direction.

Basically, I removed the need for an InvokeProcess in the build XAML workflow and created a new custom activity called PSAKE Incubator. This custom activity invokes the PSAKE PowerShell script natively, allowing you to make use of the TFS code activity context directly in your PSAKE PowerShell scripts. In addition, the PSAKE Incubator takes its parameters as an array and passes them through to PSAKE.

Below is the code I used to create the custom workflow activity; to compile it you will need a custom workflow activity project, and Ewald Hofman has a great example here. Please also note that, if you do not want to modify PSAKE, if memory serves correctly you may need to create a custom activity host to handle the console output from PSAKE. I hope this will be of use to someone; when I get some more time I’ll try to get a TFS box set up to test this out again.

Please note the code below is used at your own risk and I would advise testing on a dev environment before using on your production system.

If you do find the code below useful, please drop me a line and let me know how it went.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Collections.ObjectModel;
using System.Management.Automation;
using System.Management.Automation.Runspaces;
using System.IO;

namespace TfsWorkflowActivities
{
    using System.Activities;
    using System.Collections;
    using System.Globalization;
    using Microsoft.TeamFoundation.Build.Client;
    using Microsoft.TeamFoundation.Build.Workflow.Activities;
    using Microsoft.TeamFoundation.Build.Workflow.Services;

    public sealed class PSAKEIncubator : CodeActivity
    {
        public InArgument<string> TFSURL { get; set; }
        public InArgument<string> TFSTeamProject { get; set; }
        public InArgument<string> TFSBuildURI { get; set; }
        public InArgument<Hashtable> LogTargets { get; set; }
        public InArgument<string> SourceDirectory { get; set; }
        public InArgument<string> BuildFile { get; set; }
        public InArgument<string[]> TaskList { get; set; }
        public InArgument<Hashtable> Parameters { get; set; }
        public InArgument<Hashtable> Properties { get; set; }
        public InArgument<string> PSAKEModuleDIR { get; set; }
        public InArgument<string> LogName { get; set; }
        public InArgument<string> OutPutDir { get; set; }

        public OutArgument<int> ExitCode { get; set; }

        public string InformationNodeId { get; set; }

        protected override void Execute(CodeActivityContext context)
        {
            context.TrackBuildMessage("PSAKEIncubator Started", BuildMessageImportance.Low);

            string tfsURL = context.GetValue(this.TFSURL);
            string tfsTeamProject = context.GetValue(this.TFSTeamProject);
            string tfsBuildURI = context.GetValue(this.TFSBuildURI);
            string outPutDir = context.GetValue(this.OutPutDir);
            string sourceDirectory = context.GetValue(this.SourceDirectory);
            string buildFile = context.GetValue(this.BuildFile);
            string[] taskList = context.GetValue(this.TaskList);
            Hashtable properties = context.GetValue(this.Properties);
            Hashtable parameters = context.GetValue(this.Parameters);
            string pSAKEModuleDIR = context.GetValue(this.PSAKEModuleDIR);
            string logName = context.GetValue(this.LogName);

            // These lines become a small script that sets global variables
            // for use inside the PSAKE build scripts.
            string arguments = String.Format("$global:tfsUrl = '{0}'\n", tfsURL);
            arguments += String.Format("$global:tfsTeamProject = '{0}'\n", tfsTeamProject);
            arguments += String.Format("$global:TfsBuildUri = '{0}'\n", tfsBuildURI);
            arguments += "$global:logTargets = @{}\n";
            arguments += "$global:logTargets.Add('TfsBuild', @{verbosity='Progress'})\n";
            arguments += String.Format("$global:logTargets.Add('LogFile', @{{verbosity='Debug'; logDir='{0}\\_Logs'; logFilename='build-output.log'}});\n", outPutDir);

            context.TrackBuildMessage("Source Directory:" + sourceDirectory, BuildMessageImportance.Low);

            // Replace the line below with the location and name of the powershell script you use to kick off your PSAKE build
            string scriptPath = string.Format("{0}\\run-psake.ps1", Path.Combine(sourceDirectory, "BuildAndDeploy\\buildframework"));
            context.TrackBuildMessage("Script Path:" + scriptPath, BuildMessageImportance.Low);

            List<CommandParameter> parametersArgument = new List<CommandParameter>();

            // PowerShell dislikes null hashtable values, so replace them with empty strings.
            Hashtable parametersCleaned = new Hashtable();
            foreach (DictionaryEntry valHash in parameters)
            {
                if (valHash.Value == null)
                    parametersCleaned.Add(valHash.Key, string.Empty);
                else
                    parametersCleaned.Add(valHash.Key, valHash.Value);
            }

            Hashtable propertiesCleaned = new Hashtable();
            foreach (DictionaryEntry valHash in properties)
            {
                if (valHash.Value == null)
                    propertiesCleaned.Add(valHash.Key, string.Empty);
                else
                    propertiesCleaned.Add(valHash.Key, valHash.Value);
            }

            parametersArgument.Add(new CommandParameter("buildFile", buildFile));
            parametersArgument.Add(new CommandParameter("tasklist", taskList));
            parametersArgument.Add(new CommandParameter("outputDir", outPutDir));
            parametersArgument.Add(new CommandParameter("parameters", parametersCleaned));
            parametersArgument.Add(new CommandParameter("properties", propertiesCleaned));
            parametersArgument.Add(new CommandParameter("psakeModuleDir", pSAKEModuleDIR));
            parametersArgument.Add(new CommandParameter("logName", logName));

            context.TrackBuildMessage("PSAKEIncubator passing arguments to RunScript", BuildMessageImportance.Low);
            int exitResult = RunScript(scriptPath, arguments, context, parametersArgument.ToArray());
            context.TrackBuildMessage("PSAKEIncubator exit code " + exitResult, BuildMessageImportance.Low);
            context.SetValue<int>(this.ExitCode, exitResult);
        }

        private int RunScript(string scriptFile, string scriptText, CodeActivityContext context, CommandParameter[] parameters)
        {
            // create the PowerShell runspace
            Runspace runspace = RunspaceFactory.CreateRunspace();

            IActivityTracking activityTracking = context.GetExtension<IBuildLoggingExtension>().GetActivityTracking((ActivityContext)context);
            InformationNodeId = activityTracking.Node.Id.ToString("D", (IFormatProvider)CultureInfo.InvariantCulture);

            // open it
            context.TrackBuildMessage("PSAKEIncubator Opening runspace", BuildMessageImportance.Low);
            runspace.Open();

            // We set these variables so they become available in our PSAKE powershell scripts.
            // You will need to ensure these variables exist in your scripts to make use of them.
            runspace.SessionStateProxy.SetVariable("InformationNodeId", InformationNodeId);

            // This enables us to make use of the CodeContext directly from within PSAKE
            runspace.SessionStateProxy.SetVariable("CodeContext", context);

            // create a pipeline and feed it the global-variable script followed by the PSAKE bootstrap script
            Pipeline pipeline = runspace.CreatePipeline();
            context.TrackBuildMessage("PSAKEIncubator adding script to pipeline", BuildMessageImportance.Low);
            pipeline.Commands.AddScript(scriptText);

            Command parameterCommands = new Command(scriptFile);

            context.TrackBuildMessage("PSAKEIncubator adding parameters to pipeline", BuildMessageImportance.Low);
            foreach (CommandParameter item in parameters)
            {
                context.TrackBuildMessage("processing param name:" + item.Name, BuildMessageImportance.Low);
                context.TrackBuildMessage("processing param value:" + item.Value, BuildMessageImportance.Low);

                if (item.Value is Hashtable)
                {
                    // Log the contents of hashtable parameters to aid debugging.
                    Hashtable hashTable = (Hashtable)item.Value;
                    foreach (DictionaryEntry valHash in hashTable)
                    {
                        string valueType = valHash.Value != null ? valHash.Value.GetType().ToString() : "NULL";
                        context.TrackBuildMessage("HashVal:" + valHash.Key + ":" + valHash.Value + "(" + valueType + ")", BuildMessageImportance.Low);
                    }
                    context.TrackBuildMessage("---End of HASH---", BuildMessageImportance.Low);
                }

                parameterCommands.Parameters.Add(item);
            }

            pipeline.Commands.Add(parameterCommands);

            // execute the script
            Collection<PSObject> results = pipeline.Invoke();

            // close the runspace
            context.TrackBuildMessage("PSAKEIncubator closing runspace", BuildMessageImportance.Low);
            runspace.Close();

            // write the script output to the build log
            context.TrackBuildMessage("PSAKEIncubator iterating through results", BuildMessageImportance.Low);
            foreach (PSObject obj in results)
            {
                context.TrackBuildMessage(obj.ToString(), BuildMessageImportance.Low);
            }

            return 0;
        }
    }
}


Tuesday, April 23, 2013 2:07:51 PM (GMT Daylight Time, UTC+01:00)  #    Comments [0]

# Tuesday, March 26, 2013

If like me you’ve written code that makes use of Microsoft.TeamFoundation.Framework.Server you may also have made use of Microsoft.TeamFoundation.Framework.Server.TeamFoundationApplication. This is the class that contains the TFS Logging method.

If you’ve recently ported your applications over to TFS 2012 or have updated your TFS to Update 1 you may have got the following error.

Could not load type 'Microsoft.TeamFoundation.Framework.Server.TeamFoundationApplication' from assembly 'Microsoft.TeamFoundation.Framework.Server

It appears Microsoft have changed the name of the class from Microsoft.TeamFoundation.Framework.Server.TeamFoundationApplication to Microsoft.TeamFoundation.Framework.Server.TeamFoundationApplicationCore.

You’ll have to change your code to reference the renamed code in order for it to work again. I hope this helps someone.

Tags: TFS

Tuesday, March 26, 2013 5:46:16 PM (GMT Standard Time, UTC+00:00)  #    Comments [0]

# Wednesday, February 27, 2013

I’ve posted this more as a reminder to myself next time I create TFSJobAgent plugins.

If you ever encounter the error below when adding a new TFSJobAgent plugin:

TF53010: The following error has occurred in a Team Foundation component or extension:
Date (UTC): 2/25/2013 2:04:52 PM
Machine: VSTSR-VM2012RTM
Application Domain: TfsJobAgent.exe
Assembly: Microsoft.TeamFoundation.Framework.Server, Version=, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a; v4.0.30319
Service Host:
Process Details:
  Process Name: TFSJobAgent
  Process Id: 8996
  Thread Id: 6656

Detailed Message: There was an error during job agent execution. The operation will be retried. Similar errors in the next five minutes may not be logged.
Exception Message: An item with the same key has already been added. (type ArgumentException)
Exception Stack Trace:    at System.Collections.Generic.Dictionary`2.Insert(TKey key, TValue value, Boolean add)
   at Microsoft.TeamFoundation.Framework.Server.TeamFoundationExtensionUtility.LoadExtensionTypeMap[T](String pluginDirectory)
   at Microsoft.TeamFoundation.Framework.Server.JobApplication.SetupInternal()
   at Microsoft.TeamFoundation.Framework.Server.JobServiceUtil.RetryOperationsUntilSuccessful(RetryOperations operations, Int32& delayOnExceptionSeconds)

The above error was caused by me having more than one copy of my component in the plugins folder:

C:\Program Files\Microsoft Team Foundation Server 11.0\Application Tier\TFSJobAgent\plugins

You’re probably wondering how this is possible. I actually had another folder inside the plugins folder containing a copy of my component. The TFSJobAgent uses reflection to find all components that implement the ITeamFoundationJobExtension interface at runtime, and this includes all subfolders under the plugins directory.
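A minimal sketch of what appears to be going wrong inside the job agent, using a hypothetical PluginScanner in place of the real TeamFoundationExtensionUtility: building a dictionary keyed on the plugin type throws the same ArgumentException the moment the same plugin is discovered twice during the recursive scan.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical reconstruction: the job agent appears to build a map of
// plugin type name to assembly while scanning the plugins folder (and its
// subfolders), so a second copy of the same assembly breaks the scan.
public static class PluginScanner
{
    public static Dictionary<string, string> BuildTypeMap(
        IEnumerable<(string TypeName, string AssemblyPath)> discovered)
    {
        var map = new Dictionary<string, string>();
        foreach (var plugin in discovered)
        {
            // Dictionary.Add throws ArgumentException on a duplicate key:
            // "An item with the same key has already been added", which is
            // exactly the exception in the job agent's event log.
            map.Add(plugin.TypeName, plugin.AssemblyPath);
        }
        return map;
    }
}
```

So the fix is simply to make sure each plugin assembly exists exactly once anywhere under the plugins directory tree.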

Tags: TFS

Wednesday, February 27, 2013 9:30:22 AM (GMT Standard Time, UTC+00:00)  #    Comments [0]

# Monday, February 25, 2013

There are some really useful new tools for administrators in TFS 2012. I’ve only just seen this on Grant Holliday's blog: you can now do things such as look at the TFS Activity Log, and one of the most important for me is TFS Job Monitoring.

You can get to all of these from your TFS installation by going to the following link.


For more information visit Grant’s blog. Enjoy!

Tags: TFS

Monday, February 25, 2013 2:14:57 PM (GMT Standard Time, UTC+00:00)  #    Comments [0]

# Friday, February 1, 2013

If you haven’t seen it already, check it out here. TFS will host Git repositories (100% compatible); see Brian Harry’s blog for more details and links to tutorials.


There is also more about it on the Team Foundation Blog

Tags: TFS

Friday, February 1, 2013 4:28:05 PM (GMT Standard Time, UTC+00:00)  #    Comments [0]

# Tuesday, January 29, 2013

TFS has some pretty powerful functionality out of the box, and one example is the merge functionality. However, I’ve always felt it missed what I thought was a rather nice feature to have: the ability to see a complete list of PBIs that would be affected by a merge from one branch to another. Now, you could argue that you’ll get this anyway by just looking at your PBIs. However, if you find yourself cherry-picking changesets from one branch to the other, e.g. you want to use a finished component another team has just completed in your own branch, or you only want to release a certain PBI to live, it’s nice to know if that PBI turns up.

I must point out before continuing that this approach is far from infallible and relies on good housekeeping on the part of your developers, i.e. they associate their checked-in changesets with work items in TFS. You could also argue that TFS already gives you the PBIs associated with a changeset. It does, but you have to go through a bit of pain to actually get to them: you open each changeset, then find the work items associated with that changeset, and then the PBI associated with each work item.

The Code
I have put this together as a console app. You will need to reference the following assemblies in order for this to work. If you have both the TFS 2010 and TFS 2012 clients installed, be sure you do not mix DLL versions!

  • Microsoft.TeamFoundation.Client
  • Microsoft.TeamFoundation.VersionControl.Client
  • Microsoft.TeamFoundation.WorkItemTracking.Client

Firstly, we just do a bit of setup, such as the TFS server URL and credentials, the from and to branches, etc.

//The url of your tfs server
//Don't forget if you have more than one collection you will have to indicate that in the URL!
private const string TfsServer = "http://localhost:8080/tfs";

//Replace these with your tfs username and password
private const string TfsUserName = "tfsusername";
private const string TfsPassword = "tfspassword";

//This is the branch that you are merging from
private const string FromBranch = "$/TestSource/Dev";

//This is the branch you are merging to.
private const string ToBranch = "$/TestSource/Main";

//In my TFS the PBI is called a Product Backlog Item; it may be called
//something else depending on the template you use.
const string ProductBackLogItem = "Product Backlog Item";
const string SprintBacklogTask = "Sprint Backlog Task";

static readonly List<int> WorkItemCache = new List<int>();

static WorkItemStore workItemStore;

static TfsTeamProjectCollection tfsTeamProjectCollection;

static TfsTeamProjectCollection GetTeamProjectCollection()
{
    var teamProjectCollection = new TfsTeamProjectCollection(
        new Uri(TfsServer),
        new System.Net.NetworkCredential(TfsUserName, TfsPassword));

    teamProjectCollection.EnsureAuthenticated();

    return teamProjectCollection;
}

static void Main(string[] args)
{
    tfsTeamProjectCollection = GetTeamProjectCollection();

    //First we get the version control server.
    var versionControl = tfsTeamProjectCollection.GetService<VersionControlServer>();

    workItemStore = new WorkItemStore(tfsTeamProjectCollection);

    //Second we get a list of merge candidates between our two branches (very simple)
    var mergeCandidates =
        versionControl.GetMergeCandidates(FromBranch, ToBranch, RecursionType.Full);

    //Thirdly we get a list of work items from our changesets (using some recursion)
    var workItems = GetWorkItemsForChangesets(mergeCandidates);

    //And last we output these all to the screen
    foreach (var workItem in workItems)
    {
        Console.WriteLine(string.Format("{0} {1}", workItem.Id, workItem.Title));
    }

    Console.WriteLine("Complete");
    Console.ReadLine();
}

We get our TFS collection first and then pull out our changesets from TFS using the VersionControlServer; as you can see, TFS gives us this functionality straight out of the box.

Next we go and get our work items from TFS by iterating through our merge candidates and then their work items. If we find a PBI at this level we store it. If we find Tasks we then look inside them for more PBIs.

//In the example below I have deliberately not used LINQ statements so you can see what is happening more clearly.
//These lines of code can quite easily be condensed
static IEnumerable<WorkItem> GetWorkItemsForChangesets(IEnumerable<MergeCandidate> mergeCand)
{
    var workItems = new List<WorkItem>();

    foreach (var itemMerge in mergeCand)
    {
        var changeSet = itemMerge.Changeset;
        var workItemsCollection = changeSet.WorkItems;

        foreach (WorkItem item in workItemsCollection)
        {
            if (ProductBackLogItem == item.Type.Name)
            {
                if (!WorkItemCache.Contains(item.Id))
                {
                    WorkItemCache.Add(item.Id);
                    workItems.Add(item);
                }
            }

            if (item.WorkItemLinks.Count > 0 && SprintBacklogTask == item.Type.Name)
            {
                WorkItemCache.Add(item.Id);
                var collectedWorkItems = GetProductBacklogItems(item);

                if (collectedWorkItems != null && collectedWorkItems.Count > 0)
                {
                    workItems.AddRange(collectedWorkItems);
                }
            }
        }
    }

    return workItems;
}

The code below will take a Task item and check its links for a parent item such as a PBI.

static List<WorkItem> GetProductBacklogItems(WorkItem workItem)
{
    var workItems = new List<WorkItem>();

    foreach (WorkItemLink workItemLink in workItem.WorkItemLinks)
    {
        //We only want parent items so we look for "Implements"
        if (workItemLink.TargetId > 0 && (workItemLink.LinkTypeEnd.Name == "Implements"))
        {
            var tempWorkItem = workItemStore.GetWorkItem(workItemLink.TargetId);

            if (ProductBackLogItem == tempWorkItem.Type.Name)
            {
                if (!WorkItemCache.Contains(tempWorkItem.Id))
                {
                    WorkItemCache.Add(tempWorkItem.Id);
                    workItems.Add(tempWorkItem);
                }
            }
        }
    }

    return workItems;
}

That's all there is to it. I’m pretty sure the code can be refactored to work better, and I would urge care when using it: searching through work items can take a lot of time, so if you do end up using it, try it out on a test TFS or run it within a debugger so you can see what it’s doing.

Tags: TFS

Tuesday, January 29, 2013 1:03:54 PM (GMT Standard Time, UTC+00:00)  #    Comments [0]

# Monday, January 28, 2013

I was recently brought into a client site where they had made use of PSAKE to handle their build process. The build would be kicked off from the traditional Workflow in TFS using an InvokeProcess. Everything was working perfectly until they spotted that when the build failed there was no way of viewing which unit tests had failed from within TFS. In short, PowerShell was giving precious little back to the TFS summary view.

The question was: how could we get the rich logging information you get in the build summary when doing a traditional build using Workflow? Setting up a traditional build and observing how MSBuild is called from TFS starts to shed some light on the situation:

C:\Windows\Microsoft.NET\Framework64\v4.0.30319\MSBuild.exe /nologo /noconsolelogger "C:\Builds\1\Scratch\Test Build\Sources\user\Test\Build.proj" /m:1 /fl /p:SkipInvalidConfigurations=true  /p:OutDir="C:\Builds\1\Scratch\Test Build\Binaries\\" /p:VCBuildOverride="C:\Builds\1\Scratch\Test Build\Sources\user\Test\Build.proj.vsprops" /dl:WorkflowCentralLogger,"C:\Program Files\Microsoft Team Foundation Server 2010\Tools\Microsoft.TeamFoundation.Build.Server.Logger.dll";"Verbosity=Normal;BuildUri=vstfs:///Build/Build/111;InformationNodeId=6570;
http://mytfshost:8080/tfs/Test%20Collection;"*WorkflowForwardingLogger,"C:\Program Files\Microsoft Team Foundation Server 2010\Tools\Microsoft.TeamFoundation.Build.Server.Logger.dll";"Verbosity=Normal;"


In the example above, the /dl (distributed logger) switch pointing at WorkflowCentralLogger is the section responsible for the summary view you usually see when kicking off a build from TFS. I discovered this with a bit of guesswork and some time in Reflector to see what was going on inside MSBuild. Googling for WorkflowCentralLogger gives precious little back about how it works, and rather more about the errors people have encountered with it.

Getting to the solution
You would be forgiven for thinking the answer is simply adding the missing WorkflowCentralLogger switch (with arguments) to your MSBuild command line in PowerShell/PSAKE. Sadly it’s not that simple. See the InformationNodeId in the command line above? This appears to tell the WorkflowCentralLogger where it needs to append its logging information. Passing it into the Invoke Process was my first thought; the problem is that nothing in the standard workflow will give it to you, and I wasn’t able to find it anywhere.

So how do you get it to work then?
The answer is, you need to build a custom workflow activity. A custom workflow activity has access to the current build context; to get at it you inherit from the CodeActivity class. It’s up to you how you use this custom activity, and you have two options:

  • Place it above the Invoke Process in your workflow, get the InformationNodeId and pass it as an OutArgument to the Invoke Process below it (not fully tested).
  • Or invoke PowerShell from within the custom activity using a runspace and pass it the CodeActivityContext (fully tested).
   3:  namespace MyWorkflowActivities
   4:  {
   5:      using System;
   6:      using System.Collections.Generic;
   7:      using System.Linq;
   8:      using System.Text;
   9:      using System.Collections.ObjectModel;
  10:      using System.Management.Automation;
  11:      using System.Management.Automation.Runspaces;
  12:      using System.IO;
  13:      using System.Activities;
  14:      using System.Collections;
  15:      using System.Globalization;
  17:      using Microsoft.TeamFoundation.Build.Client;
  18:      using Microsoft.TeamFoundation.Build.Workflow.Activities;
  19:      using Microsoft.TeamFoundation.Build.Workflow.Services;
  21:      [BuildActivity(HostEnvironmentOption.All)]
  22:      public sealed class GetInformationNodeId : CodeActivity
  23:      {
  24:          //The node id is exposed as an OutArgument so the workflow can pass it on
  25:          public OutArgument<string> InformationNodeIdOut { get; set; }
  26:          protected override void Execute(CodeActivityContext context)
  27:          {
  29:              context.TrackBuildMessage("Getting the Information Node Id", BuildMessageImportance.Low);
  30:              IActivityTracking activityTracking = context.GetExtension<IBuildLoggingExtension>().GetActivityTracking((ActivityContext) context);
  31:              string informationNodeId = activityTracking.Node.Id.ToString("D", (IFormatProvider)CultureInfo.InvariantCulture);
  33:              context.SetValue<string>(this.InformationNodeIdOut, informationNodeId);
  34:          }
  35:      }
  37:  }

The code above illustrates the first solution. It’s a lot simpler, but you’ll have to pass that node id to MSBuild yourself when you construct its command line in PowerShell. Lines 30 and 31 are where all the magic takes place; I managed to find them using Reflector on MSBuild. If you have never written a custom activity before, Ewald Hofman has a short summary of one here

The diagram below illustrates where GetInformationNodeId (the code above) sits, just above the InvokeProcess activity that calls PowerShell.



The second solution, which I actually went with, is slightly more complex, and I’ll blog about how I did it in another article. You might be wondering what the immediate benefits of one over the other are. The beauty of the second solution is that you can make use of the code activity context within your PowerShell scripts. So, for example, instead of writing your PowerShell events out to the host, you could wrap that call in context.TrackBuildMessage (as illustrated on line 29 above).
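To give a flavour of the second approach, here is a rough sketch of a custom activity that runs a script inside a PowerShell runspace and hands it the build context. This is illustrative only: the activity name, the ScriptPath argument and the $tfsContext variable name are all mine, not part of any TFS API.

```csharp
namespace MyWorkflowActivities
{
    using System.Activities;
    using System.IO;
    using System.Management.Automation;
    using System.Management.Automation.Runspaces;
    using Microsoft.TeamFoundation.Build.Client;
    using Microsoft.TeamFoundation.Build.Workflow.Activities;

    //Sketch only: runs a PowerShell script and exposes the activity context
    //to it, so the script can log straight into the TFS build summary.
    [BuildActivity(HostEnvironmentOption.All)]
    public sealed class RunPowerShellBuildStep : CodeActivity
    {
        //Path to the .ps1 file to execute (hypothetical argument name)
        public InArgument<string> ScriptPath { get; set; }

        protected override void Execute(CodeActivityContext context)
        {
            using (Runspace runspace = RunspaceFactory.CreateRunspace())
            {
                runspace.Open();

                //Expose the context to the script as $tfsContext so the
                //script can call $tfsContext.TrackBuildMessage(...) itself
                runspace.SessionStateProxy.SetVariable("tfsContext", context);

                using (Pipeline pipeline = runspace.CreatePipeline())
                {
                    pipeline.Commands.AddScript(
                        File.ReadAllText(this.ScriptPath.Get(context)));

                    //Relay any script output into the build summary log
                    foreach (PSObject result in pipeline.Invoke())
                    {
                        context.TrackBuildMessage(result.ToString(),
                            BuildMessageImportance.Normal);
                    }
                }
            }
        }
    }
}
```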

I’d be interested to hear about other people’s experiences.


Monday, January 28, 2013 3:51:49 PM (GMT Standard Time, UTC+00:00)  #    Comments [0]