Sooner or later, every project needs an automated deployment pipeline (Continuous Integration). This is especially true when several people work on a project and there are multiple environments (stages). In this example, a Wizdom project is automated with the help of Visual Studio Team Services. The goal is to publish the branding as well as the custom-module artifacts to Office 365 and Azure.

Project setup

The project was divided into two Visual Studio projects:

  • The branding project includes CSS, JS and Wizdom templates
  • The custom modules project includes more complex structures such as WebServices, WebParts, Forms and Scripts

The technologies used for this project:

  • Visual Studio 2017
  • Visual Studio Team Services (VSTS)
  • npm
  • Git Extensions
  • Git
    • Bitbucket repository
    • VSTS Git repository

Creating the Build Configuration

To map the deployment process, a new project must be created under https://tenantname.visualstudio.com. Each project contains processes, with one process mapping the deployment for one environment. A process uses an existing Git repository as its source, as well as a branch name as a unique identifier. Once a process has been created, it must be configured. This includes:

  • Agent Queue: the agent's operating system is selected here. In our case, we use "Hosted VS 2017."
  • Get Sources: the repository to attach. It is important to specify the name of the branch.
  • Variables: global process variables can be stored here. To get more logging, it is advisable to add the entry system.debug = true.
  • Triggers: the interval and the branch in which the repository is examined for changes are defined here.
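The process above is configured through the classic editor, but the same basic settings can be sketched as a YAML pipeline definition. The branch name and agent image below are illustrative assumptions, not values from the original project:

```yaml
# Hypothetical sketch of the build configuration described above.
trigger:
  branches:
    include:
      - develop            # the branch that uniquely identifies this environment

pool:
  vmImage: 'vs2017-win2016'   # corresponds to the "Hosted VS 2017" agent queue

variables:
  system.debug: true          # enables verbose logging for troubleshooting
```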

Create the Actions for the Custom Modules

Now that the process has been created, the steps to be performed must be defined. These are:

  • Node v8.11.2: ensures that the correct Node version is installed on the agent
  • NuGet Restore
  • Run npm install
  • Gulp task for the build
  • Build Solution
  • Set web.config settings: this is not available out of the box. The Marketplace, however, offers the extension "Build & Release Tools"; after installing it, a "Set web.config" action is available.
  • FTP upload of the artifacts: the custom modules are hosted on the Azure App Service and uploaded via FTP.
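As a rough orientation, the step sequence for the custom modules could look like this in YAML form. The task versions are the ones commonly available in VSTS; the solution pattern, gulp target, and FTP server are illustrative assumptions (the marketplace "Set web.config" step is omitted here because its task name depends on the installed extension):

```yaml
steps:
  - task: NodeTool@0
    inputs:
      versionSpec: '8.11.2'          # pin the Node version on the agent

  - task: NuGetCommand@2
    inputs:
      command: 'restore'
      restoreSolution: '**/*.sln'

  - task: Npm@1
    inputs:
      command: 'install'

  - task: Gulp@1
    inputs:
      gulpFile: 'gulpfile.js'
      targets: 'build'               # hypothetical gulp task name

  - task: VSBuild@1
    inputs:
      solution: '**/*.sln'

  - task: FtpUpload@2
    inputs:
      serverUrl: 'ftp://example.ftp.azurewebsites.windows.net'  # hypothetical
      username: '$(FtpUser)'
      password: '$(FtpPassword)'
      rootDirectory: '$(Build.ArtifactStagingDirectory)'
      remoteDirectory: '/site/wwwroot'
```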

Create the Actions for Branding

The branding is not hosted on the App Service but in Azure Storage. The steps needed for this:

  • Run npm install
  • Run Gulp task (the Sass files are compiled)
  • Azure File Copy of the CSS files
  • Azure File Copy
  • Azure File Copy of the Wizdom templates
  • Azure File Copy of the Wizdom site templates

The Azure File Copy tasks must be configured correctly, which is not trivial. Here are the settings for the CSS files; the other File Copy tasks can be configured following the same pattern:

  • Source: the path in the solution, for example D:\a\1\s\projectname\Wizdom Intranet\css
  • Azure Connection Type: Azure Resource Manager
  • Azure Subscription: click the Manage button and configure a new endpoint
  • Destination Type: Azure Blob
  • RM Storage Account: the storage account name. This can be found in the PublishingProfile file
  • Container Name: wizdom365public
  • Blob Prefix: CustomStyles/templates/
  • Additional Arguments: /Pattern:*.css /SetContentType:text/css
  • Storage Container URI: TargetContainer
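The same File Copy settings could be sketched as a YAML task definition. The service-endpoint and storage-account names are illustrative assumptions; the container name and arguments follow the settings above:

```yaml
- task: AzureFileCopy@2
  inputs:
    SourcePath: '$(Build.SourcesDirectory)/projectname/Wizdom Intranet/css'
    azureSubscription: 'my-arm-endpoint'     # hypothetical ARM endpoint name
    Destination: 'AzureBlob'
    storage: 'mystorageaccount'              # hypothetical; see PublishingProfile
    ContainerName: 'wizdom365public'
    BlobPrefix: 'CustomStyles/templates/'
    # copy only stylesheets and serve them with the correct MIME type
    AdditionalArgumentsForBlobCopy: '/Pattern:*.css /SetContentType:text/css'
```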

Execution and testing

Once everything is configured, testing can begin. When a commit is pushed to the branch, the process starts after a short delay. After a few minutes, the process has run through and the result can be checked. In the overview, older runs can be reviewed and the build history can be viewed. Unfortunately, it is not possible to debug the process at runtime.


  • Repetitive tasks can be outsourced to task groups. Variables can be referenced in the tasks using placeholders such as $(CustomVariable)
  • npm install takes a very long time in the agent queue "Hosted VS2017." It is much faster to use the queue "Hosted macOS." The only downside: some tasks are not supported under macOS
  • With the help of the Azure Storage Explorer tool, the contents of the storage account can be viewed. This is helpful for checking whether the artifacts have been deployed correctly
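To illustrate the variable-placeholder syntax mentioned above, here is a minimal hypothetical fragment; the variable name and its value are assumptions:

```yaml
variables:
  CustomVariable: 'wizdom365public'   # hypothetical pipeline variable

steps:
  # $(CustomVariable) is expanded before the step runs
  - script: echo Deploying to container $(CustomVariable)
```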