Monthly Archives: March 2007


Last night I had a dream about the future of TFS. Wouldn't it be nice if we had one single portal on which all team projects and their web interfaces reside? You could easily search across libraries and projects, and relate those projects by using site hierarchies. Reports and other functionality would really exist in those sites, and all options would be fully integrated. I know it's a dream, but some dreams ....

If you have read Brian Harry's blog about the future of TFS, there is a possibility MS is aiming at this goal too. He mentions Office SharePoint Server 2007, and when I think of MOSS in combination with TFS, this picture immediately pops up in my head.

TFS portal on which all team projects and their sub projects reside

All Team sites have all the nice features of WSS v3.

For example:

  • Customize your sites quickly
  • Better control with master pages
  • Send e-mail to a SharePoint site
  • Get mobile access to a SharePoint list
  • Poll people more effectively with improved surveys
  • Receive updates about lists and libraries with RSS Feeds
  • Track your work with improved calendars
  • Copy documents easily to another location
  • Better management of versions
  • Better document recovery

Team Project running on WSS v3 site

Note: You can already use WSS v3 in a TFS v1 environment. To do so follow this guide.

Another thing which was fully integrated in my dream was the reporting functionality. Reporting Services can be integrated in WSS v3 and Office SharePoint Server 2007, as you can read in an old blog of mine. This way all permissions can be set within WSS, which gives you a consolidated place to manage security. Nowadays, when using Team Foundation Server, you have to set rights in TFS, SharePoint and Reporting Services. Wouldn't it be nice to only support TFS and SharePoint?

Reporting fully integrated in SharePoint

Just a couple of days ago MS acquired DevBiz Business Solutions, the makers of the popular TeamPlain Web Access for Team System. To read the complete article about this acquisition, check this blog of Brian Harry. Again a great tool for TFS, but also a new web site with its own management and its own look and feel. In my dream, release 3 of this product will fulfill my wishes, and TeamPlain will use SharePoint as its container.

TeamPlain integrated in SharePoint

Personally I don't think my dream will be realized in Orcas, but hopefully in Rosario. Another interesting part missing from the UI of a standard TFS environment is integration with Outlook. Who doesn't want to sync their tasks with TFS? So come on guys, acquire Personify Design and you can release TeamLook as a VSTS Power Tool as well. I know there's a free Outlook add-in utility from the SRL Team (works great, by the way), but sometimes you just want to have dreams. 😉



Let's take a quick look at what these extensions can do for you. As you can expect from me, I'm only interested in the Web Part section of this product. For all other options, check out the blog called Walkthrough - Windows SharePoint Services 3.0 Tools: Visual Studio 2005 Extensions. About handling Web Parts, Mart Muller wrote an excellent blog about how to create a SharePoint 2007 Web Part step by step. With the new extensions you only have to be aware of the steps Mart wrote; you don't have to execute them by hand anymore.

After installing the product you have the ability to select a Web Part project from the SharePoint project types. Web Parts were first introduced in Windows SharePoint Services v2. The concept was so compelling that the model was improved and added to the second generation of ASP.NET: ASP.NET 2.0. This wasn't the only addition to ASP.NET 2.0. Other improvements and enhancements allowed the Windows SharePoint Services team to revamp their underlying architecture, which resulted in it being rebuilt on top of ASP.NET 2.0 (previously, WSS v2 had its own rendering engine).


After selecting this project, it will create a base project with a template file as shown below. As you can see it will inherit from the ASP.NET 2.0 WebPart class (System.Web.UI.WebControls.WebParts.WebPart), but you can also build Web Parts that inherit from the WSS v3 WebPart class (Microsoft.SharePoint.WebPartPages.WebPart). To decide from which Web Part class you should derive, check Creating Web Parts in Windows SharePoint Services. In most cases you can stick to the default settings. As you would expect from a custom control, we do need to implement its rendering. After doing so, check the settings of the project.

using System;
using System.Runtime.InteropServices;
using System.Web.UI;
using System.Web.UI.WebControls.WebParts;
using System.Xml.Serialization;

using Microsoft.SharePoint;
using Microsoft.SharePoint.WebControls;
using Microsoft.SharePoint.WebPartPages;

namespace My_First_Extension_Web_Part
{
    [Guid("e17149f4-56e6-4c5d-909c-3cec0a1d0f27")]
    public class My_First_Extension_Web_Part : System.Web.UI.WebControls.WebParts.WebPart
    {
        public My_First_Extension_Web_Part()
        {
            this.ExportMode = WebPartExportMode.All;
        }

        protected override void Render(HtmlTextWriter writer)
        {
            // TODO: add custom rendering code here.
            writer.Write("My First Extensions Web Part!");
        }
    }
}

The extensions introduce a new tab for a SharePoint solution. On this tab you can adjust the settings for the solution, the feature and even the element itself. This gives you the ability to easily set these settings without creating additional files. Besides this extra tab, the project will also sign the assembly with a strong name key, so it can be easily added to the GAC. Note: this key is identical for every newly created project on the same machine.

In the next screen dump you can see the project is deployed to the web server which contains SharePoint. It will create a *.wsp file, which will automatically add the Web Part to the GAC, mark your Web Part as safe in the web.config of the SharePoint site, populate your Web Part gallery and reset the server.
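For reference, the safe-control registration that the deployment performs corresponds to an entry like the following in the site's web.config (the version number and public key token below are illustrative placeholders, not the actual values from my project):

```xml
<SafeControl Assembly="My_First_Extension_Web_Part, Version=1.0.0.0, Culture=neutral, PublicKeyToken=9f4da00116c38ec5"
             Namespace="My_First_Extension_Web_Part"
             TypeName="*"
             Safe="True" />
```

Before these extensions you had to add this entry by hand; now the deployment step takes care of it.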

------ Build started: Project: My First Extension Web Part, Configuration: Debug Any CPU ------

C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Csc.exe /noconfig /nowarn:1701,1702 /errorreport:prompt /warn:4 /define:DEBUG;TRACE /reference:"C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\ISAPI\Microsoft.SharePoint.dll" /reference:C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\System.dll /reference:C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\System.Web.dll /reference:C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\System.Xml.dll /debug+ /debug:full /keyfile:Properties\Temporary.snk /optimize- /out:"obj\Debug\My First Extension Web Part.dll" /target:library Properties\AssemblyInfo.cs "My First Extension Web Part\My First Extension Web Part.cs"


Compile complete -- 0 errors, 0 warnings

My First Extension Web Part -> D:\Projects\WebParts\My First Extension Web Part\bin\Debug\My First Extension Web Part.dll

------ Deploy started: Project: My First Extension Web Part, Configuration: Debug Any CPU ------

------ Generate My First Extension Web Part.wsp file and setup batch file------

Creating solution ...

Operation completed successfully.


Creating setup batch file ...

Operation completed successfully.


------ Add and deploy My First Extension Web Part.wsp to the SharePoint ------

Adding solution ...

Operation completed successfully.


Deploying solution ...

Operation completed successfully.


------ Activate features in solution if necessary ------

Activating feature My_First_Extension_Web_Part ...

Operation completed successfully.


Restarting IIS ...

After deploying you can add your newly created Web Part to your SharePoint site like you've done thousands of times before.

It's good to have these new extensions for Web Parts. Of course you still have to know the steps mentioned by Mart, but you don't have to follow them by hand anymore. I personally think it's a great effort by the SharePoint team and I'd like to thank you guys.



Why do we need virtual machines? A virtual machine (VM), sometimes called a hardware virtual machine, is one of a number of discrete, identical execution environments on a single computer, each of which runs an operating system. This can allow applications written for one OS to be executed on a machine which runs a different OS, or provide execution "sandboxes" which offer a greater level of isolation between processes than is achieved when running multiple processes on the same instance of an OS. Furthermore, it gives you the ability to test software before it's RTM'd.

In most of our classes at Class-A we use VMs too, which gives us the opportunity to run several classes a week and use the newest software from MS without influencing the base system. Every machine has got its own set of VMs, dedicated and branded for that specific machine. This way we can easily give network access to the VMs without disturbing each other. The most annoying part is that when you've finished setting up such a dedicated VM, you have to clean it manually.

When investigating 'Orcas' I found a nice tool on the hard drive of the VPC, called Clean Virtual Machine.

With this tool you're able to clean the virtual machine resources very quickly—in 10 or 15 seconds—so new virtual machines are quickly made available to the next user. Besides cleaning you can also test if your VM is cleared, optimized and contains the latest version of Virtual Machine Additions.  

I immediately adopted it, and will use it for all my other VM preparations. While searching the Internet, I couldn't find any information related to this product, so it must be an internal MS tool. If you're interested in it, look for VMClean.exe on your 'Orcas' VPC.




Like most of you, I wanted to have a close look at some new features of Orcas. While starting the newer applications I became more and more frustrated by the minute. In the most basic screens I got all kinds of strange errors. The funny part is, I'm using the VPC Microsoft is delivering on their website. As with .NET 3.0, .NET 3.5 still relies on .NET 2.0, and I think this is where some of the problems originate from.

Update from Alex:

If you have trouble or failures like Mike and a lot of others did, don't use the Virtual PC image, but instead create your own. Download the self-extracting installers for VS Orcas and TFS Orcas. Also, do NOT install VS2005 and Orcas side-by-side. This is the cause of all your pain and error messages (well, not all, but a lot of the ones related to this subject). Why, oh why did Microsoft install VS2005 Team Suite together with Orcas when even the release notes say this causes these issues?

Alex, you hit the nail on the head.

Here are some screendumps.

Microsoft .NET Framework 2.0 Configuration (Snap-in failed to initialize)

Project Properties (Microsoft.VisualStudio.Shell.WindowPane.OnCreate())

Workflow Designer (Microsoft.VisualStudio.Shell.WindowPane.GetService(System.Type))



This is so MS. I hope to find a solution for these problems, but I think I already know it (April CTP).



Who isn't interested in the new features of Orcas? I really am, and especially all the features of TFS. Brian Harry posted a great blog a couple of months ago about all the new features. Since the latest CTP of Orcas you can check it out yourself.

In this blog I explain the new features of Team Build within Orcas.

  • Improved ability to manage multiple build machines.

  • Improved ability to specify what source, versions of source, etc to include in a build.
  • Simplified ability to specify what tests get run as part of a build.

    Unfortunately I couldn't show you this feature because I couldn't create any unit tests with Orcas.

  • The ability to store build definitions anywhere in the version control hierarchy.
  • Support for retention policies. Specify how builds should be retained on the build server.
  • Continuous Integration – There are many components to this, including build queuing and queue management, drop management (so that users can set policies for when builds should be automatically deleted), and build triggers that allow configuration of exactly how and when CI builds should be triggered, for example: every check-in, rolling builds (completion of one build starts the next), etc.
  • Stop and delete builds from within VS.
  • Support multi-threaded builds with the new MSBuild.
  • Improved extensibility of the build targets – such as ability to easily execute targets before and after each solution/project is built.

    I couldn't find this option yet.

The team completely redesigned Team Build to fulfill all the changes requested by their customers. One of the nicest things is that you can now reopen your build definition in the wizard instead of as an XML file. I personally think they did a great job. Keep up the good work, guys.




Who isn't interested in the new features of Orcas? I really am, and especially all the features of TFS. Brian Harry posted a great blog a couple of months ago about all the new features. Since the latest CTP of Orcas you can check it out yourself.

In this blog I explain the new features of Version Control within Orcas.

  • Destroy - The version control destroy operation provides administrators with the ability to remove files and folders from the version control system. The destroyed files and folders cannot be recovered once they are destroyed. Destroy allows administrators to achieve SQL Server disk space usage goals without constantly needing to add more disks to the data tier machine. Destroy also facilitates removing versioned file contents that must be permanently removed from the system for any other reason.

    The TFS team extended the tf command-line tool with a destroy instruction. This command can only be run by an administrator.
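A hedged sketch of what an invocation looks like (the project path is an illustrative placeholder, and the exact switches may still change between CTPs):

```
REM Preview what would be destroyed without touching anything
tf destroy $/MyTeamProject/ObsoleteFolder /preview

REM Destroy the content but keep the change history records
tf destroy $/MyTeamProject/ObsoleteFolder /keephistory
```

Running a preview first is the sensible default, since destroyed items really are gone for good.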


  • Annotate - Annotate is a feature that allows developers to inspect a source code file and see at line-by-line level of detail who last changed each section of code. It brings together changeset data with difference technology to enable developers to quickly learn change history inside a source file.

    This option is already part of the Microsoft TFS Power Toys, which you can use with your TFS today.

  • Folder Diff - Team Foundation Server now supports compare operations on folders, whereby the contents of the folder are recursively compared to identify files that differ. Folder diff can compare local folders to local folders, local folders to server folders, and server folders to server folders. It’s a great way of identifying differences between branches, files that you’ve changed locally, and files that have changed between two points in time.

    Update: a folder diff is also included in the Microsoft TFS Power Toys, but it is not the same as the feature in Orcas. Check the Folder Difference blog of Tan Phan for the exact differences. Throughout April 2007 he'll be posting details about the new Folder Difference feature currently in Visual Studio Orcas.

  • Get Latest on Checkout - As an optional setting on a team project or on an individual basis, you can have Team Foundation Server always download the latest version of a file when you check it out. This helps ensure that you don’t have to merge your changes with somebody else’s when you check the file back in.

    This feature is implemented because lots and lots of people requested it. There are still good reasons why the team didn't include it in the first place. I won't discuss them here; just read about it on all the great forums covering this topic.


  • Performance improvements – A variety of Version Control performance enhancements that will improve virtually all aspects of version control performance.  The gains for smaller servers/projects (< 10,000 files) will be modest.  The gains for larger projects (particularly where the file count approaches 100,000’s) will be substantial.

  • Scale improvements – Fixed out of memory problems on the server when operating on more than a few hundred thousand files at a time.

The biggest changes to Version Control are on the improvements side, which you can't see in your IDE, but which you can measure. I didn't expect rocket science in Orcas, but it would have been nice. Maybe because Microsoft has given us so much every time, you come to expect big improvements every time.




Like most of you TFS addicts, I also immediately installed the new TFS Power Tools. As Marcel de Vries wrote in his blog, "A must have is the new embedded in Visual Studio Process Template Editor". This editor is almost the same as the one I used before, but it uses Visual Studio as a container to modify the process. I compared the code of both products and must say that most of the code isn't changed at all. They renamed some variables and replaced strings containing messages with resources. The only new part they added is the new Domain Specific Language (DSL) for the workflow of a Work Item. As you can see, it is a great improvement for showing the complete process of a Work Item.

I took the liberty of 'stealing' the picture Marcel used in his post.

The only difference I could find between the two tools was that PTE gave me the ability to select which user groups are allowed to perform a specified transition. All groups which I added to the process template are immediately available to select in each transition. The Power Tools will only give you a group of users when you're connected to a Team Foundation Server.

VSTS Customization Toolkit

Team Foundation Server Power Tools 1.2

Another annoying thing is that the new tool will create an extra dialog for each transition you select. Every dialog becomes its own instance within your taskbar.

Again the team delivered a great tool, but I hope they will fix the small bug and change the appearance of each dialog. Maybe they can lock it so you can't click on another transition unless you've closed the dialog. I didn't have the time to dig into the code to see where things go wrong. When I have some extra time I'll update this blog. Furthermore, I hope TFS in Orcas will use WF for its workflows and show all states and transitions in the Workflow Designer. For a complete outline of the Power Tools, check the blog of Buck Hodges.



How often do you bump into problems with users and security when you move a SQL Server database to another server? The most annoying problem is when you try to create a user under Security, and the user already exists within the moved database. When you try to apply user mapping and permissions you get an error like this:

Error 15023: User, group or role '%s' already exists in the current database.

Normally you don't have to reapply the user permissions because SQL Server uses a special background process to associate user accounts in a database with logins on the server. This could take some time, but luckily there’s a quick way to do this:

USE YourDatabase
GO

EXEC sp_change_users_login 'auto_fix', 'UserName'

This will output the following result.

The row for user ' ' will be fixed by updating its login link to a login already in existence.
The number of orphaned users fixed by updating users was 1.
The number of orphaned users fixed by adding new logins and then updating users was 0.
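Before fixing anything, you can also just list which users are orphaned; to the best of my knowledge the same procedure supports a report mode that changes nothing:

```sql
USE YourDatabase
GO

-- List orphaned users and their security identifiers without modifying anything
EXEC sp_change_users_login 'Report'
```

This is handy when you move a database with many users and want to fix them one by one.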

Microsoft has a knowledge base article, Q240872, that explains the problem and the solution.




If you’re going to analyze Visual SourceSafe databases for migration to Team Foundation source control, and then migrate them, you’ll eventually meet VSSConverter. VSSConverter is a command-line utility that uses XML configuration files to transfer SourceSafe projects to TFS. Eyal Post of EPocalipse Software has built a GUI front-end for it. In this post I will describe which steps need to be taken.

Converting your projects
Before you can use VSSConverter you need to have the correct tools and rights. VSSConverter depends on Visual Studio Team Explorer, which helps you connect to TFS and Source Control to determine its projects.
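Once everything is in place, the conversion itself is a two-step command-line affair; the settings file name below is an illustrative placeholder for your own XML configuration file:

```
REM Analyze the SourceSafe database first and review the generated report
VSSConverter analyze settings.xml

REM Then perform the actual migration to Team Foundation Server
VSSConverter migrate settings.xml
```

The analyze step is worth the time: it points out problems (corrupt history, unsupported items) before you commit to the migration.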

Admin password of SourceSafe
If you want to retrieve data from your SourceSafe environment, you need full control to do so. In some cases it's necessary to hack SourceSafe, because no one remembers the password of the Admin account. Luckily you can easily overwrite the passwords for a SourceSafe environment, which are stored in binary form in the um.dat file. This file can be found in the root and contains all passwords of each user. Open um.dat in a hexadecimal editor and determine the offset of your Admin account; the file is ordered alphabetically by login, so if you don't have any user name that sorts before "Admin", it will appear at offset 80. To remove the Admin password, change the bytes starting at offset 80 to exactly what's shown below (all numbers are hex), including all bytes through offset a0.

0:80  55 55 bc 7f 41 64 6d 69 6e 00 00 00 00 00 00 00
0:90  00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
0:a0  00 00 00 00 90 6e 00 00 a8 01 00 00 00 00 00 00

You now have full access to your SourceSafe database.
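The manual hex edit above can also be scripted. A minimal sketch in Python, assuming the layout described above (the record really sits at offset 80, i.e. "Admin" is the first login alphabetically); um_path is a placeholder, and you should always patch a backup copy of um.dat, never the live file:

```python
# Sketch: reset the SourceSafe Admin password by rewriting the Admin
# record in um.dat. Offsets and byte values are taken from the hex dump
# above; this assumes the Admin record is located at offset 0x80.

ADMIN_RECORD = bytes.fromhex(
    "5555bc7f41646d696e00000000000000"  # 0x80: header bytes + "Admin" + padding
    "00000000000000000000000000000000"  # 0x90: zeroed bytes (blank password)
    "00000000906e0000a801000000000000"  # 0xa0: remaining record bytes
)

def reset_admin_password(um_path):
    """Overwrite offsets 0x80-0xAF of um.dat with the blank-password record."""
    with open(um_path, "r+b") as f:
        f.seek(0x80)
        f.write(ADMIN_RECORD)
```

After running it against a copy, verify with a hex editor that offsets 80 through af match the dump above before replacing the original file.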

SQL Server
Furthermore, you need SQL Server rights and the client tools to connect to the database server, or you will receive the error below.

Need a connection and correct rights on SQL Server

ERROR: TF60022:  Unable to connect to Microsoft SQL Server.

VSSConverter requires either SQL Express or SQL Server for the migration. By default it looks for SQL Express, which is installed with VS 2005 in a default configuration. To override the default you can specify another SQL Server.

User Mapping
After selecting my project source and destination, the tool gives you the ability to map the users from SourceSafe to TFS. It's nice to see that only the VSS users which are involved in the project are listed. After selecting the correct TFS users, my migration still failed.

ERROR: TF60014:  The username WINNT://SERVER/Mike in the user map file C:\temp\vssToTfsUsermap.xml is invalid.
User mapping

ERROR: TF60087: Failed to initialize user mapper


I don't know why, but in my case I needed to remove the WinNT:// prefix for each user, as you can see in the code below. You can do this by editing the UserMap.xml.

<?xml version="1.0" encoding="utf-8"?>
<UserMappings xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <UserMap From="MIKE" To="SERVER\Mike" />
  <UserMap From="ADMIN" To="SERVER\TFSSetup" />
</UserMappings>

Team Foundation Server
The last step which has to be taken is to give the user who performs the migration the "Check in other users' changes" permission on TFS. If you forget to do so, this error will occur.

ERROR: TF14098: Access Denied: User SERVER\mike needs Checkin, CheckinOther permission(s) for $/Playground/Source/dummy.txt.

After correcting all the steps I was able to convert one of my SourceSafe projects. You could easily migrate several projects, but I think you should only migrate those which are still usable and important. All other projects can be left in SourceSafe, with a backup on disk or CD.

The Microsoft team for TFS delivered a great product to really upgrade your sources to a fully transactional and more secure source control environment. It's not hard to set up a migration, but some steps are not fully documented. One of the nicest things about the migration is that it gives you the ability to still see your previous history. Files checked in at the same time will now show up as a complete changeset.



On a recent assignment they asked me to implement two Integration Services instances on the same machine. Their server park isn't that big, but they want to simulate separate test and development environments. With DTS this wasn't a real problem, but how do you configure SSIS to use multiple instances, since SSIS can only be installed once on a single machine and works totally differently than DTS? Before I explain how you can configure SSIS, you first need to know the differences between DTS and SSIS.

Traditional (DTS) warehouse loading

  • Integration process simply conforms data and loads the database server
  • The database performs aggregations, sorting and other operations
  • Database competes for resources from user queries
  • This solution does not scale very well


Warehouse loading with SSIS

  • SQL Server Integration Services conforms the data
  • But also aggregates and sorts, and loads the database
  • This frees up the database server for user queries

SSIS includes a configuration file for configuring the Integration Services service. By default, the file is located in the folder Program Files\Microsoft SQL Server\90\DTS\Binn, and the file name is MsDtsSrvr.ini.xml.

The default configuration file contains the following settings:

  • The root folders to display for Integration Services in Object Explorer of SQL Server Management Studio are the MSDB and File System folders.
  • The packages in the file system that the Integration Services service manages are located in %Program Files%\Microsoft SQL Server\90\DTS\Packages.

You can modify the configuration file to display additional root folders in Object Explorer, or to specify a different folder or additional folders in the file system to be managed by the Integration Services service. The example below shows how I configured the configuration file to use more than one MSDB database, stored in separate database instances.
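A sketch of what such a MsDtsSrvr.ini.xml can look like, with two MSDB folders pointing at separate instances (the instance names, folder display names and packages path below are illustrative placeholders for your own environment):

```xml
<?xml version="1.0" encoding="utf-8"?>
<DtsServiceConfiguration xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <StopExecutingPackagesOnShutdown>true</StopExecutingPackagesOnShutdown>
  <TopLevelFolders>
    <!-- MSDB package store of the development instance -->
    <Folder xsi:type="SqlServerFolder">
      <Name>MSDB (Development)</Name>
      <ServerName>SERVER\DEV</ServerName>
    </Folder>
    <!-- MSDB package store of the test instance -->
    <Folder xsi:type="SqlServerFolder">
      <Name>MSDB (Test)</Name>
      <ServerName>SERVER\TEST</ServerName>
    </Folder>
    <!-- File system store, shared by both environments -->
    <Folder xsi:type="FileSystemFolder">
      <Name>File System</Name>
      <StorePath>..\Packages</StorePath>
    </Folder>
  </TopLevelFolders>
</DtsServiceConfiguration>
```

Each Folder element becomes a root folder under Integration Services in Object Explorer, so both instances are administrable from one Management Studio connection.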

As you can see in Management Studio, I can store my packages in different databases as well as in multiple file systems.

The registry key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSDTS\ServiceConfigFile specifies the location and name for the configuration file that the Integration Services service uses. The default value of the registry key is C:\Program Files\Microsoft SQL Server\90\DTS\Binn\MsDtsSrvr.ini.xml. You can update the value of the registry key to use a different name and location for the configuration file.
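If you want to script that registry change, something like the following should do it (the new configuration file path is an illustrative placeholder):

```
reg add "HKLM\SOFTWARE\Microsoft\MSDTS" /v ServiceConfigFile /t REG_SZ /d "D:\SSIS\Test\MsDtsSrvr.ini.xml" /f
```

Restart the Integration Services service afterwards so it picks up the new configuration file.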

The SQL Team did a great job for letting you choose how many instances you want to use on one machine. They're fully configurable and administrable and you're able to create a real test and development environment on one machine. Great job guys.