# Sunday, 22 January 2017

Yesterday I migrated one of our TFS collections to VSTS using Microsoft's migration guide for moving from TFS to VSTS. I won't lie, it was a pretty long process and it took a lot of going back and forth to make sure I fully understood the guide, which is a 58-page PDF. The guide comes with several checklists covering the things you need to check and prepare before your migration.

A very rough outline of what happens is this: you run a check against your TFS instance using the tool provided to ensure everything is exportable. If there are problems, you fix them following suggestions from the tool, then run the check again until you are ready to go. Next you run a prep step that generates some files you will need to map your users across, followed by making a database backup as a DACPAC package and entering your import invite codes (provided by Microsoft). These assets are then uploaded to an Azure storage account and you kick off the migration process, which uses them to import your data into a brand new VSTS instance.
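To give a feel for the shape of the process, the tool is driven from the command line in roughly this sequence. The collection URL is a hypothetical example and I am writing the verbs and switches from memory, so treat the guide as authoritative:

```
TfsMigrator validate /collection:http://tfsserver:8080/tfs/DefaultCollection
TfsMigrator prepare  /collection:http://tfsserver:8080/tfs/DefaultCollection
TfsMigrator import   /importFile:import.json
```

The prepare step is what generates the specification and user mapping files you then fill in before the import.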

I won't go into detail about how to do the migration, as this is covered in the guide. However, I will highlight some things you should take into account before you migrate from TFS to VSTS using the TfsMigrator tool provided with the guide.

Azure Active Directory

You are going to have to make sure you have this in place, or have at least thought about it. If you use Active Directory in your organisation, a good thing to look at is replicating it to Azure Active Directory, as your migration is going to need this. If you are not using Active Directory but just accounts on the box, as I did for this migration, you can easily map these across to Azure Active Directory accounts. If you have Office 365, then you already have access to an Azure Active Directory setup (depending on your subscription) and you can make use of this. The reason Azure Active Directory is important is that this is how VSTS will authenticate your users once you have migrated across.

Plan for some downtime to make backups

Even when doing a test migration, as I did, you need to plan for some downtime. One of the reasons for this is that you will need to generate a DACPAC of your TFS collection. In order to do this you have to take the collection offline and then detach it from TFS. If you have not done this before, you may be put off by the ominous warnings from the TFS Admin Console asking you to tick a box stating you have made a backup of your TFS databases.

After you have detached your TFS Collection and made a DACPAC of it, you can then reattach your collection so your team can continue working as usual.

Learn what a DACPAC is

Yes, I had never used one before. The guide gives you some details with a sample command line to use to create one. DACPAC is short for Data-tier Application Package. These are generated from SQL Server itself. It is basically a way of exporting your whole TFS collection database with everything needed to re-create it: "tables, views, and instance objects, including logins – associated with a user's database". The DACPAC package is uploaded to an Azure storage blob that the migration tool uses.
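The package is created with SqlPackage.exe once the collection database is detached. Along the lines of the guide's sample, the command looks something like this (server, database and output path are placeholders, and the guide lists the exact properties it wants set):

```
SqlPackage.exe /a:Extract /ssn:localhost /sdn:Tfs_DefaultCollection /tf:C:\DACPAC\Tfs_DefaultCollection.dacpac /p:ExtractAllTableData=true /p:IgnoreUserLoginMappings=true
```

The `/a:Extract` action is what pulls the schema and data out of the live database into the .dacpac file.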

Learn about Azure Storage Accounts and SAS

While I have used Azure Storage Accounts before, I found this part quite complicated and it took me a while to get it right. Basically, the DACPAC package you create from your TFS collection database gets uploaded to an Azure storage account along with a mapping file for user accounts. The hardest part I found was working out how to create a SAS token URL to where I had stored these in the storage account. The guide provides a link to some PowerShell you can use that will generate this URL for you. I am not sure why the Azure portal couldn't create this link for me (I did try), but the PowerShell provided worked first time.

Azure PowerShell tools

Make sure you have the Azure PowerShell tools installed; you will need these for running some PowerShell to generate a SAS token URL to your Azure Storage account (see above).
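The guide links to the script you should actually use, but for reference, generating a container SAS URL with the Azure PowerShell storage cmdlets looks roughly like this (account name, key and container name are placeholders):

```powershell
# Build a storage context from the account name and key
$ctx = New-AzureStorageContext -StorageAccountName "mymigrationstore" `
    -StorageAccountKey "<storage-account-key>"

# Create a read/list SAS token for the container holding the DACPAC and
# user mapping file, valid for a few days, returned as a full URL
New-AzureStorageContainerSASToken -Name "import" -Permission rl `
    -ExpiryTime (Get-Date).AddDays(3) -FullUri -Context $ctx
```

The resulting URL is what you hand to the import process so it can read the assets out of your storage account.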

Final Notes

I would recommend reading the guide fully before getting started. Also note that currently you have to request an import code in order to use the service. You will get two of these: one is for a dry run to ensure everything works, and the other is for your production import, for when you are fully committed and feel confident it all went to plan in the dry run.



Tags: TFS | VSTS

Sunday, 22 January 2017 11:18:16 (GMT Standard Time, UTC+00:00)  #    Comments [0]


# Thursday, 08 December 2016

It has been a while since I last blogged about MS Visual Studio Team Services Release Hub. The last time I blogged about Release Hub the product was very rough around the edges and quite a few of its parts were in early preview.

Release Hub has matured quite a bit since then; however, deploying to production or test environments in the real world can be considerably different from the examples of using Release Hub available online. The areas I find many customers have difficulty with, where the documentation is sparse or spread across many older versions of the product, are:

  • Tokenisation – it's surprising how difficult this can sometimes be, and documentation following the whole process from start to finish isn't readily available. (I will cover some of this in this article)
  • WinRM – setting up Windows Remote Management, which is used for many of the deployment tasks, is easy on a test environment. On a production environment, where everything has to be carefully managed through change control processes, this can be more challenging. (I hope to cover some of this in a future article)

The sample I have put together here is mostly for my own reference, but if you have any suggestions or improvements I would love to hear from you. I will try to expand on this article with more examples that don't fit the norm in future blog articles.

The scenario I am going through here is an ASP.NET website created from a build, where that same build needs to be deployed to more than one environment with its configuration changed for each environment.

The steps involved will be to:

  1. Prepare an ASP.NET project for web deployment and tokenisation
  2. Prepare a build to produce the assets needed for deployment
  3. Create a Release that consumes the build mentioned above and replaces configuration variables based on the environment being deployed to

Preparing an existing ASP.NET web project for Release Hub

Before you can deploy a web project you need to prepare it for deployment. You may have already used this functionality to deploy directly to an Azure website from Visual Studio, or to an on-premises server. It can also be used to create the deployment packages that we will use later with Release Hub.

Step 1
Right click on your web project and select Publish. Don't worry, this won't publish your site, but it will let us set up a deployment profile for it that we will use later.

image

Step 2
From the dropdown that appears select "New Custom Profile", type in a name for your new profile and select OK. In this case our profile is called "Website1WebPackage".

image

Step 3
In the dropdown box that appears next, select "Web Deploy Package" and type in a name for your deployment package. We will be using MS Web Deploy to deploy our site later, but in order to do this we need to set our site up to create a deployment package. In addition, we are putting in a token called __SITENAME__; this will be replaced at deployment time when we actually deploy our application. I will talk more about this later.

image

Step 4
Here the publish wizard will display any database connection strings, which you can also replace with tokens of your own. Tokens start with "__", end with "__" and are in capitals.

image

Step 5
You can now hit the Publish button. All this will do is create a Web Deploy zip package in the root of your project. You should now see the following files.

image

We are only really interested in website1webpackage.SetParameters.xml and website1webpackage.zip. When using the correct switches on your build, these files will be generated each time. If you open up the SetParameters file you will notice it contains the tokens we created earlier.

image

Step 6
In the root of your web project create a parameters.xml file. You will see in our parameters file that we are using an XPath match to replace settings in our web.config file. The scope is basically looking for a database connection string called DefaultConnection, and we are saying that when you find that value, replace it with __DBCONNECTION__. We are doing the same with another key in our web.config called MailAddress.

<?xml version="1.0" encoding="utf-8" ?>
<parameters>
  <parameter name="DefaultConnection" description="DB Connection" defaultValue="__DBCONNECTION__" tags="">
    <parameterEntry kind="XmlFile" scope="\\web.config$" match="/configuration/connectionStrings/add[@name='DefaultConnection']/@connectionString" />
  </parameter>

  <parameter name="EmailAddress" description="MailAddress" defaultValue="__EMAILADDRESS__" tags="">
    <parameterEntry kind="XmlFile" scope="\\web.config$" match="/configuration/appSettings/add[@key='MailAddress']/@value" />
  </parameter>
</parameters>

You can see how the parameters above relate to the web.config below.

image

 

Step 7
Publish your project again by right clicking on the project, selecting Publish and then the profile you created earlier. If you now check the SetParameters file, you will notice the new tokens we added in the parameters.xml file are also in here. This file is automatically updated with these tokens when you run the publish profile, and is key to how we replace variables in our configuration files.

image

We are now ready to check in our code and create a build. Ensure you check in the parameters.xml file and your new publish profile (highlighted below).

image

 

Create a build

Step 1
You may already have a build for your solution; if so, you can alter it to produce the assets you need for deploying your solution. Below I have set up an out-of-the-box Visual Studio build pointing to my solution, but I have added some arguments to it.

image

Those arguments are:

/p:DeployOnBuild=true;PublishProfile=Website1WebPackage /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true

Note above that the publish profile is set to the one we created earlier in the tutorial, when we prepared our ASP.NET project: "Website1WebPackage". We are also telling MSBuild that we want it to create a package for us and that we want everything in a single file.

Step 2
Click on the Copy Files task; in the Contents textbox you will see we have two entries. We are telling this task that all we want from the finished build is the website1webpackage.zip and website1webpackage.SetParameters.xml files we covered in the earlier steps. These are generated automatically by the build because of the publish profile we set up earlier.
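The Contents box takes minimatch patterns, one per line. For the two files above the entries would look something like this (the exact relative paths depend on your project layout):

```
**\website1webpackage.zip
**\website1webpackage.SetParameters.xml
```

The `**\` prefix means the task will find the files wherever they end up in the build's output folder structure.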

image

Step 3
Run your build; at the end, if you look at its artefacts you should see the following files. We will use these in our release to help with tokenisation.

image

Create a Release

Step 1
Go into Release Hub and create an empty release.

image

In this example I am using the build I created in the previous steps.

image

Step 2
If you don't already have one, you will need to go to the VSTS marketplace and pick a tokenisation task. I like to use https://marketplace.visualstudio.com/items?itemName=TotalALM.totalalm-tokenization but there are several more you can choose from.

Step 3
Add your tokenisation task. In mine I have set the working directory of my solution as the target path, using the VSTS variable $(System.DefaultWorkingDirectory). I have set the Target Filenames to the SetParameters file that the build we created in the previous steps generates.
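To make it clearer what the task is doing for us: conceptually it scans the target file for __TOKEN__ placeholders and substitutes the matching release variables. An illustrative sketch (not the task's actual implementation; here the values come from environment variables of the same name):

```powershell
$path = "website1webpackage.SetParameters.xml"
$content = Get-Content $path -Raw

# Replace each __NAME__ placeholder with the value of the variable NAME,
# leaving the placeholder alone if no matching variable is defined
$content = [regex]::Replace($content, '__([A-Z]+)__', {
    param($match)
    $value = [Environment]::GetEnvironmentVariable($match.Groups[1].Value)
    if ($value) { $value } else { $match.Value }
})

Set-Content $path $content
```

This is why keeping the token names and the release variable names identical matters: the substitution is purely name-based.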

image

Step 4
In the environment I was working in, we weren't allowed to use Windows File Copy as it was considered insecure. However, we did have WinRM available to us. Provided you have PowerShell 5 installed, it is possible to copy files to your destination server from a PowerShell command line. You can skip this task and use the Windows File Copy task if this is open on your network.

image

In my example I have done just that using the PowerShell task. The PowerShell I use is below, parameterised by variables stored in VSTS on the Variables tab.

# Build a credential from the $(username) and $(password) release variables
$password = ConvertTo-SecureString "$(password)" -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ("$(username)", $password)

# Open a remoting session to the target server and copy the package files across
$session = New-PSSession -ComputerName myserver01 -Credential $cred
Copy-Item '$(System.DefaultWorkingDirectory)\WebApp1 Build\drop\WebApplication1\website1webpackage.zip' -Destination 'c:\drops' -ToSession $session
Copy-Item '$(System.DefaultWorkingDirectory)\WebApp1 Build\drop\WebApplication1\website1webpackage.SetParameters.xml' -Destination 'c:\drops' -ToSession $session

I am basically using PowerShell's Copy-Item command here to get the files to the server, into a folder on its C drive called "drops". I worked out the path to the files by temporarily adding a Windows File Copy task to show me the path variables, then deleting it afterwards.

image

Step 5
Now that I have set up my WinRM file copy, I can use the IIS WinRM task to deploy to my web server.

image

In the example I am using the package files that were copied to the web server in the previous step.

Step 6
Remember those tokens you set up in the previous steps? Now is the time to start giving them values. Click on the Variables tab and start putting in entries for those tokens. You will also notice that the username and password we use in our release tasks are stored here too, and we can refer to them as $(username) and $(password).
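As an illustration, the Variables tab for a test environment might end up with entries like these (all values hypothetical; use the padlock icon to mark the password as a secret):

```
SITENAME      Website1-Test
DBCONNECTION  Server=sql01;Database=Website1;Integrated Security=True
EMAILADDRESS  test@example.com
username      DOMAIN\deployuser
password      ******** (stored as a secret)
```

Each name matches a __TOKEN__ in the SetParameters file or a $(variable) reference in a task.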

image

Step 7
You should now be able to run your Release and deploy.

Step 8 (Optional)
If you have more than one environment, you can clone the existing environment and then replace the server names with the next environment's server names.

image



Tags: Release | VSTS

Thursday, 08 December 2016 23:25:19 (GMT Standard Time, UTC+00:00)  #    Comments [0]


# Tuesday, 04 October 2016

I ran across this error when installing a new release agent, when it got to the Azure File Copy step. Many of the solutions on the Internet point to this being caused by incorrect times on the agent machine or the target server. However, all my servers had the correct time and were in the same time zone.

My problem appeared to be caused by the token endpoint connecting VSTS to Azure. When I renewed this endpoint's certificate, the Azure File Copy task magically worked. The only difference I could see from the previous agent was that my new agent was on a new virtual machine, compared to the older one which was a Classic Azure virtual machine.

I hope this helps someone.



Tags: VSTS

Tuesday, 04 October 2016 17:46:37 (GMT Daylight Time, UTC+01:00)  #    Comments [0]