# Sunday, January 22, 2017

Yesterday I migrated one of our TFS collections to VSTS using Microsoft's migration guide for moving from TFS to VSTS. I won’t lie, it was a pretty long process and it took a lot of going back and forth to make sure I fully understood the guide, which is a 58-page PDF. The guide comes with several checklists and things you need to check and prep before your migration.

A very rough outline of what happens: you run a check against your TFS instance using the tool provided to ensure everything is exportable. If there are problems, you fix them following the tool's suggestions and run the check again until you are ready to go. Next you run a prep step that generates the files you will need to map your users across, followed by making a database backup as a DACPAC package and entering your import invite codes (provided by Microsoft). These are then uploaded to an Azure storage account, and you kick off the migration process, which uses these assets to import your data into a brand new VSTS instance.

I won’t go into the details of how to do the migration as this is covered in the guide. However, I will highlight some things you should take into account before you migrate from TFS to VSTS, which is done using a tool provided with the guide called the TFSMigrator.
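Just to give a feel for the flow, the command-line steps look roughly like the sketch below. This is from memory and heavily simplified; the collection URL, tenant name, and file paths are placeholders, and the exact switches for your version of the tool are in the guide.

```powershell
# Rough sketch of the TFSMigrator flow - paths and names are placeholders,
# check the guide for the exact switches your version of the tool supports.

# 1. Validate the collection to find anything that would block an import
TfsMigrator validate /collection:http://localhost:8080/tfs/DefaultCollection

# 2. Prepare the import - this generates the import specification and the
#    identity mapping file you fill in with your Azure AD users
TfsMigrator prepare /collection:http://localhost:8080/tfs/DefaultCollection /tenantDomainName:contoso.onmicrosoft.com

# 3. Once the DACPAC and mapping file are uploaded to Azure storage,
#    kick off the import using the generated specification file
TfsMigrator import /importFile:C:\Migration\import.json
```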

Azure Active Directory

You are going to have to make sure you have this in place, or have at least thought about it. If you use Active Directory in your organisation, a good thing to look at is replicating it to Azure, as your migration is going to need this. If you are not using Active Directory but just accounts on the box, as I did for this migration, you can easily map these across to Azure Active Directory accounts. If you have Office 365, then you already have access to an Azure Active Directory setup (depending on your subscription) and you can make use of this. The reason Azure Active Directory is important is that this is how VSTS will authenticate your users once you have migrated across.

Plan for some downtime to make backups

Even when doing a test migration as I did, you need to plan for some downtime. One of the reasons for this is that you will need to generate a DACPAC package of your TFS collection. In order to do this you have to take the TFS collection offline and then detach it from TFS. If you have not done this before, you may be put off by the ominous warnings from the TFS Admin Console asking you to tick a box stating you have made a backup of your TFS databases.

After you have detached your TFS Collection and made a DACPAC of it, you can then reattach your collection so your team can continue working as usual.

Learn what a DACPAC is

Yes, I had never used one before. The guide gives you some details along with a sample command line to create one. DACPAC is short for Data-tier Application Package. These are generated from SQL Server itself, and are basically a way of exporting your whole TFS collection database with everything it needs to be re-created: “tables, views, and instance objects, including logins – associated with a user’s database”. The DACPAC package gets uploaded to an Azure storage blob that the migration tool uses.
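For reference, creating the DACPAC looks roughly like the snippet below. Treat the guide as the authoritative source for the command; the collection database name and output path here are placeholders, and the switches are the ones I recall the guide suggesting.

```powershell
# Run from a machine with the SQL Server tooling installed - SqlPackage.exe
# ships with SQL Server / SSDT. Database name and paths are placeholders.
SqlPackage.exe /action:extract `
  /sourceconnectionstring:"Data Source=localhost;Initial Catalog=Tfs_DefaultCollection;Integrated Security=True" `
  /targetFile:"C:\Migration\Tfs_DefaultCollection.dacpac" `
  /p:ExtractAllTableData=true `
  /p:IgnoreUserLoginMappings=true `
  /p:IgnorePermissions=true `
  /p:Storage=Memory
```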

Learn about Azure Storage Accounts and SAS

While I have used Azure Storage accounts before, I found this part quite complicated and it took me a while to get it right. Basically, the DACPAC package you create from your TFS collection database gets uploaded to an Azure Storage account along with a mapping file for user accounts. The hardest part for me was working out how to create a SAS token URL to the container where I had stored these. The guide provides a link to some PowerShell you can use that will generate this URL for you. I am not sure why the Azure portal couldn't create this link for me (I did try), but the PowerShell provided worked first time.
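If you want a feel for what that PowerShell is doing, the sketch below generates a container-level SAS URL using the Azure PowerShell cmdlets of the time. The account name, key, and container name are placeholders, and the exact permissions and expiry the import service expects are spelled out in the guide, so use the script it links to for the real thing.

```powershell
# Sketch only - account name, key and container are placeholders.
$accountName = "mymigrationstorage"
$accountKey  = "<storage account key>"
$container   = "tfsimport"

# Build a context for the storage account
$context = New-AzureStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey

# Generate a container-level SAS token with read/list permissions, valid for a limited time
$sasToken = New-AzureStorageContainerSASToken -Name $container -Context $context `
    -Permission rl -ExpiryTime (Get-Date).AddDays(7)

# The URL handed to the import is the container URL plus the SAS token
"https://$accountName.blob.core.windows.net/$container$sasToken"
```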

Azure PowerShell tools

Make sure you have the Azure PowerShell tools installed; you will need them to run the PowerShell that generates the SAS token URL for your Azure Storage account (see above).
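If you don't already have them, installing from the PowerShell Gallery is the quickest route. A minimal sketch, assuming the AzureRM module that was current at the time:

```powershell
# Install the Azure PowerShell module from the PowerShell Gallery
# (AzureRM was the current module at the time of writing)
Install-Module -Name AzureRM -Scope CurrentUser

# Sign in if you also want to manage the storage account from PowerShell
Login-AzureRmAccount
```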

Final Notes

I would recommend reading the guide fully before getting started. Also note that currently you have to request an import code in order to use the service. You will get two of these: one is for a dry run to ensure everything works, and the other is for your production import, for when you are fully committed and confident it all went to plan in the dry run.

Tags: TFS | VSTS
