CD Pipelines For .NET in ThoughtWorks GO!

I’ve mentioned in many previous posts that we use ThoughtWorks GO! to orchestrate our CD pipelines.  We implemented GO! about three years ago.  At the time we were experimenting with CD improvements, and ThoughtWorks had recently open sourced GO!, so it was the low-cost, low-risk option.  Three years later we have 50+ pipelines and deploy something to PROD every 4 hours, which to us makes it a very successful improvement.  In this post, I will describe how we configure GO! to accelerate value delivery to our customers.

Pipeline Templates

A large part of our success with GO! is due to our use of the pipeline templates feature.  All of our pipelines use a single generic pipeline template that is driven by a json configuration file.  This made it much easier for developers to create a new pipeline.  They didn’t have to become GO! experts or create a custom pipeline for each product.  Everyone used the same template in a standardized way.  Additionally, when we found a bug in the template or implemented a new feature, all the pipelines benefited from the template change.

The basic flow is as follows:

  1. Get the deploy.json file from TFS source control
  2. Queue the packaging builds listed in the deploy.json file and store their resulting packages as artifacts in the pipeline
  3. Deploy to each environment
    1. Deploy each of the packages to the target servers listed in the deploy.json file for the environment
    2. Execute the TFS test builds listed in the deploy.json file for the environment

Step 1 - Get json; Step 2 - Queue packaging builds and store packages as artifacts in pipeline; Step 3.1 - Deploy packages to target servers in json file; Step 3.2 - Queue test builds for environment
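To make the flow concrete, here is a minimal sketch of what such a deploy.json could look like.  Every field and value here is an illustrative assumption, not our exact schema:

```json
{
  "packagingBuilds": ["FeatureFlag.Package"],
  "environments": [
    {
      "name": "DEV",
      "targetServers": ["devweb01", "devweb02"],
      "testBuilds": ["FeatureFlag.UnitTests"]
    },
    {
      "name": "QA",
      "targetServers": ["qaweb01"],
      "testBuilds": ["FeatureFlag.IntegrationTests", "FeatureFlag.FeatureTests"]
    }
  ]
}
```

Keeping all of this in one file per product is what lets a single generic template drive every pipeline.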

I detailed the PowerShell script we use for queueing builds in a previous post.  We actually use that script for queueing both the packaging builds and the test builds (unit, integration, feature, performance, etc.).

The script we use to deploy the packages basically calls msdeploy.exe with the appropriate arguments for each target environment. I will detail this script in a future post.


Materials

Materials are the primary input to a pipeline.  They specify the source code you wish to orchestrate.  GO! supports many sources from which you can pull source code:

Add Materials Options - Subversion, Git, Mercurial, Perforce, Team Foundation Server, Pipeline, Package

Our source is all managed in TFS, so we use the Team Foundation Server material type.  Here you can see we are pulling the FeatureFlag source code and also the PowerShell scripts we utilize in the pipeline.  For each material, you must specify the server, username, password and code path to pull.

Shows 2 TFS Materials
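In cruise-config.xml terms, a pair of TFS materials like the ones in the screenshot looks roughly like this.  The server URL, credentials and project paths below are placeholders, not our real values:

```xml
<materials>
  <!-- the product source code -->
  <tfs url="http://tfs-server:8080/tfs/DefaultCollection"
       username="build-user" domain="CORP" password="*****"
       projectPath="$/Products/FeatureFlag"
       dest="FeatureFlag" materialName="FeatureFlag" />
  <!-- the shared PowerShell scripts used by the pipeline -->
  <tfs url="http://tfs-server:8080/tfs/DefaultCollection"
       username="build-user" domain="CORP" password="*****"
       projectPath="$/Tools/PipelineScripts"
       dest="Commands" materialName="Commands" />
</materials>
```

The dest attribute controls the folder each material is checked out to in the agent’s working directory, which is how tasks can reference the scripts by a stable relative path.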

Template Configuration

You likely noticed the “Kanban1” label in the screenshot above.  That is our pipeline template name.  We named it to represent the template for teams using the Kanban process at a level 1 (beginner) maturity.  Pipeline templates are optional in GO! but are super helpful when you want to use the same process/steps for many product pipelines (see the section above).

The screenshot below shows the stages, jobs and tasks within the template.  If you choose not to use templates, your stages and tasks are associated directly with your pipeline.  Our stages roughly mimic our Kanban board stages of DEV, QA, Mock and PROD, each with a doing and done stage.

Showing the stages of our pipeline template and the tasks configuration for a job

Stages, jobs and tasks can be confusing if you do not have a clear understanding of their differences:

  • Stage – highest level of work within a pipeline;  stages are executed serially.
  • Job – child of a stage;  jobs are executed in parallel within the context of the stage.
  • Task – child of a job;  tasks are executed serially within the context of the job.
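The hierarchy maps directly onto the configuration XML.  A trimmed-down sketch (job and script names are illustrative):

```xml
<stage name="In-DEV">                      <!-- stages run serially -->
  <jobs>
    <job name="PackageAndDeployToDEV">     <!-- jobs in a stage run in parallel -->
      <tasks>
        <exec command="powershell">        <!-- tasks in a job run serially -->
          <arg>Commands\QueueBuild.ps1</arg>
        </exec>
        <exec command="powershell">
          <arg>Commands\DeployPackage.ps1</arg>
        </exec>
      </tasks>
    </job>
  </jobs>
</stage>
```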

The screenshot above shows the tasks for the PackageAndDeployToDEV job in the In-DEV stage of the Kanban1 pipeline template.  One task queues the package builds and another deploys those packages to our DEV environment.

In this workflow, progression to each stage in the pipeline is manually triggered except for the first stage.  “In-DEV” is automatically triggered by watching the TFS material for changes.  Any change within the configured source control path will automatically kick off a new pipeline run.  This is accomplished by setting the material to poll for changes and setting the stage type to “On Success”:

Shows option to "Poll for new changes"

Showing Stage Type set to "On Success"
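In the underlying XML, these two settings correspond to the material’s autoUpdate flag and each stage’s approval type.  A sketch, with placeholder values:

```xml
<!-- poll the material for new changes -->
<tfs url="http://tfs-server:8080/tfs/DefaultCollection"
     projectPath="$/Products/FeatureFlag" autoUpdate="true" />

<!-- first stage: runs automatically when the material updates -->
<stage name="In-DEV">
  <approval type="success" />
</stage>

<!-- later stages: wait for a manual trigger -->
<stage name="In-QA">
  <approval type="manual" />
</stage>
```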


Artifacts

An artifact is a file or folder that we wish to make available to pipeline users via the GO! user interface and/or utilize in future stages/jobs of the pipeline.  We can define artifacts in any job by specifying the source file/folder from the working directory and a destination file/folder.  In this case, we are publishing two artifacts (the MSDeployPackages folder and the BuildUri.txt file) to the root artifacts directory during the In-DEV.PackageAndDeployToDEV stage/job.

Showing 2 artifacts
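Declared in the config XML, those two artifacts look something like this (paths per the screenshot; the exact element shape dates from the GO! version we ran, so treat this as a sketch):

```xml
<artifacts>
  <!-- folder artifact: the packages produced by the packaging builds -->
  <artifact src="MSDeployPackages" />
  <!-- file artifact: records which TFS build produced the packages -->
  <artifact src="BuildUri.txt" />
</artifacts>
```

Omitting the dest attribute publishes the artifact at the root of the job’s artifact directory.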

In later stages we can fetch artifacts using the Fetch Artifact task type based on stage, job and artifact name.  The example below is pulling the deployment packages from the first stage and deploying them to the QA environment.  We perform the same steps for the In-Mock and In-Prod stages as well.

Showing custom Fetch Artifact and Custom Command task types
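A Fetch Artifact task is addressed by stage, job and artifact name.  In the XML it looks roughly like this (the srcdir and dest values are assumptions matching the artifact sketch above):

```xml
<fetchartifact stage="In-DEV" job="PackageAndDeployToDEV"
               srcdir="MSDeployPackages" dest="MSDeployPackages">
  <runif status="passed" />
</fetchartifact>
```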


The second task in the job is a Custom Command task type.  This task type can call any command line tool available on the agent.  You can see in the properties the task is calling the PowerShell command with a set of arguments.  The first argument is referencing one of our PowerShell scripts from the second material in our pipeline.
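Expressed in XML, a Custom Command task is an exec element whose arguments are passed through to the command line.  A hedged sketch of the task shown in the screenshot:

```xml
<exec command="powershell">
  <!-- script checked out from the second (scripts) material -->
  <arg>Commands\DeployPackage.ps1</arg>
  <arg>-configName:%GO_PIPELINE_NAME%</arg>
  <arg>-TFSUri:%TFSURL%</arg>
</exec>
```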

There are several different types of tasks available in GO! by default but others can be added via plugins as well:

Task options = Ant, NAnt, Rake, Fetch Artifact, More...

Environment Variables

You likely also noticed the tokens (surrounded by percentage signs) listed after the PowerShell script in the task above.

Command: powershell Arguments: Commands\DeployPackage.ps1 -configName:%GO_PIPELINE_NAME% -TFSUri:%TFSURL% -environmentName:%QA% -verbose;

There are about a dozen environment variables that GO! sets automatically; %GO_PIPELINE_NAME% is one example.  You can also create or override environment variables at the pipeline, stage or job levels.  GO! also supports secure variables, which are hidden from the UI and logs, for passwords and other sensitive content.

Showing sample environment variables from our pipeline template
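Plain and secure variables are declared like this at the pipeline, stage or job level (the names and values below are placeholders):

```xml
<environmentvariables>
  <!-- plain variable: visible in the UI and build logs -->
  <variable name="TFSURL">
    <value>http://tfs-server:8080/tfs/DefaultCollection</value>
  </variable>
  <!-- secure variable: masked in the UI and logs -->
  <variable name="DeployPassword" secure="true">
    <encryptedValue>aB3dE5fG7hI9==</encryptedValue>
  </variable>
</environmentvariables>
```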

Environment variables are a valuable feature, especially in combination with pipeline templates.  They allow us to use a generic template and still customize key bits of information for specific product pipelines.  Parameters provide similar functionality.  In hindsight, we probably should have used parameters, but either works for this use case.
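For comparison, a parameter is declared on the product pipeline and referenced inside the shared template with #{} syntax.  The parameter name here is illustrative:

```xml
<!-- on the product pipeline -->
<params>
  <param name="deployConfigName">FeatureFlag</param>
</params>

<!-- referenced inside the shared template -->
<exec command="powershell">
  <arg>Commands\DeployPackage.ps1</arg>
  <arg>-configName:#{deployConfigName}</arg>
</exec>
```

Unlike environment variables, parameters are substituted into the config at resolution time rather than exposed to the running process.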


As you can see, we have invested in GO! and developed a CD process that works really well for our use cases.  Our specific implementation, however, is very opinionated (by design) to support our development teams and the .NET applications/systems they are building.  In my opinion, GO! could work equally well or better for Java developers currently using Jenkins or Rails developers using GitHub.  It’s a VERY flexible tool with many capabilities/plugins and a large user community.  Best of all, it’s FREE!

For more information, you can find really great documentation for GO! at –

If I were starting from scratch to build release orchestration, I would probably start by trying the new VSTS release automation system, especially for .NET teams, but GO! would be a close second option.


I hope this content was helpful.  I have a few more post ideas related to our implementation of GO!  If you would like to learn more on this topic or have other suggestions, please let me know by leaving a comment below.
