Configuring Azure DevOps project and task sync
The following videos walk through the setup process, which is covered in more detail below:
After setting up the Dataflow, don't forget to set up the External System entry:
Note
There is no audio in these videos
Dataflow | Query Name | Sync Table |
---|---|---|
Project | sensei_devops_project | sensei_devops_project |
Task | sensei_devops_task_delta | sensei_devops_task_delta |
Task | sensei_devops_assignment | sensei_devops_assignment |
To connect a Dataflow to your Azure DevOps instance, you can follow these steps:
- Navigate to https://make.powerapps.com and log in with your administrative account
- Ensure that you have selected the Power Apps Environment that contains your Altus installation
- From the Quick Launch menu, select to expand Data and then select Dataflows
If the sample Dataflows already exist:
- Locate the Dataflow that you wish to configure, open the ellipsis (…) menu, and then select Edit. Then continue with the steps below from 'Populate the following parameters'.
If the sample Dataflows do not exist:
- Select New dataflow > Import template (Preview)
- Select the .pqt file of the Dataflow template that you wish to upload. (Links to the example Dataflows are in the Initial Setup section.)
- Click Create
- Populate the following parameters (a sketch of how they are typically used in a query follows the table):
Parameter | Description | Example |
---|---|---|
DevOpsOrgName | Name of your Azure DevOps instance. Typically the same as your domain from the DevOps URL. | YOUR-COMPANY |
EmailAddress | The email address that you use to sign into Azure DevOps. | you@your-company.com |
ApiToken | The PAT (Personal Access Token) that you created with your Azure DevOps account. See this help page from Microsoft on how to create an API Token. | 8yy15DMCYVzpAtY0fd1886fr |
SenseiIQUrl | The URL of the Power Apps environment that hosts Altus. | https://orgabc123.crm.dynamics.com |
ProjectNameParameter | For the Epics or Features dataflow only. Nominate the project to source the work items from. | My Project Name |
WorkItemTypeParameter | For the Epics or Features dataflow only. Declare if this dataflow should sync work items of type "Epic" or "Feature" | Epic |
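The exact queries differ between the Project and Task templates, but the following is a rough sketch of how these parameters are typically combined in a Power Query (M) query. It assumes the standard Azure DevOps REST endpoint at dev.azure.com and PAT-based Basic authentication; it is illustrative only, not the actual template code.
```
// Illustrative Power Query (M) sketch only - not the actual template query.
let
    // Values supplied via the Dataflow parameters described above
    DevOpsOrgName = "YOUR-COMPANY",
    EmailAddress  = "you@your-company.com",
    ApiToken      = "8yy15DMCYVzpAtY0fd1886fr",

    // Azure DevOps accepts a PAT via HTTP Basic authentication:
    // the Authorization header carries base64("<email>:<PAT>")
    AuthHeader = "Basic " & Binary.ToText(
        Text.ToBinary(EmailAddress & ":" & ApiToken),
        BinaryEncoding.Base64),

    // Example call: list the projects in the organisation
    Response = Json.Document(
        Web.Contents(
            "https://dev.azure.com/" & DevOpsOrgName & "/_apis/projects?api-version=7.0",
            [Headers = [Authorization = AuthHeader]])),
    Projects = Table.FromRecords(Response[value])
in
    Projects
```
Because the credentials travel in the request header rather than on the connection itself, the connection's Authentication kind can be left as Anonymous in the steps below.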
- Since DevOps uses access tokens, we need to enable a special check box. Select Options (Project options) from the ribbon bar.
- Check 'Allow combining data from multiple sources'
- Select the dataset query under the Queries heading, then select Configure connection.
- If there is no existing connection for your Dataflow, select 'Create new connection'
- Enter the following details:
Setting | Value |
---|---|
Connection name | {Enter a name for your connection (if not populated automatically)} |
Authentication kind | Anonymous |
Privacy level | - |
- Select Connect. If successful, data from the external system will load into the preview in the Power Query window.
- Select the Next button in the bottom right of the screen.
- Ensure that the Parameter queries are configured with the Load setting set to Do not load (this information does not need to be loaded anywhere).
Follow these steps for each remaining query in your dataflow:
- Select a query from the left-hand panel.
- Select Load to existing table, and select the correct destination table (see the sync table reference earlier in this article).
- Check the 'Delete rows that no longer exist in the query output' checkbox.
- In the right-hand panel, map all of the columns present. Click 'Auto map' to complete this quickly.
- Select to run your Dataflow on a schedule. The frequency depends on both your business requirements and the volume of data that needs to be sent, particularly if you are synchronizing tasks. We recommend running Dataflows once per day.
- Verify that your Dataflow runs successfully.
Use the screenshots below to verify your column mappings:
- sensei_devops_project Dataflow Mapping:
- sensei_devops_task_delta Dataflow Mapping:
- sensei_devops_assignment Dataflow Mapping:
Note
If the Dataflow contains a query called Tasks_Raw, select 'Do not load' as the Load setting for that query (it is used internally by the Dataflow only).
Once the dataflow is running and populating the sync table, you're ready to complete the second part of the configuration process, which consists of the following steps:
- Enable the disabled External System record(s) by navigating to the 'Settings' area, then to 'External Systems'. Change the view to 'Inactive External Systems', select the record for the system you wish to activate, and click 'Activate' on the ribbon.
Once the record is active, review the column mappings by opening the External System record and navigating to the 'Projects' tab. Altus has mapped the standard columns that we typically see our clients using in DevOps to columns that exist for an Altus Project. If there are new or additional columns that you require to be mapped, first ensure that your Dataflow is configured to load the new column data into a new column on the target table, then create a 'Metadata Mapping' from the desired column to a column in the sensei_project table. This ensures that the data retrieved by the Dataflow carries through to the Altus project that is linked to the external project.
Configure the 'Project URL Pattern' and optionally the 'Task URL Pattern' to provide a direct link to the items in the external system. The pattern for Azure DevOps projects needs to be updated with the instance name, e.g.:
https://devops.azure.com/YOURORGNAME/{ID}/_boards
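As an illustration of how the pattern is used (assuming {ID} is replaced with the external project ID held in sensei_externalprojectid; the substitution itself is performed by Altus):
```
// Illustrative only: how the {ID} token in a URL pattern is substituted.
let
    Pattern   = "https://devops.azure.com/YOUR-COMPANY/{ID}/_boards",
    ProjectId = "MyProject",   // hypothetical external project ID
    Url       = Text.Replace(Pattern, "{ID}", ProjectId)
in
    Url   // https://devops.azure.com/YOUR-COMPANY/MyProject/_boards
```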
Use the tables below as a reference for the default configuration of each of the metadata mapping tables.
Project
Project Sync Entity Name: sensei_devops_project
Project URL Pattern: https://devops.azure.com/ORGNAME/{ID}/_boards
Name | External Column Name | External Column Type |
---|---|---|
sensei_description | sensei_projectdescription | |
sensei_externalprojectid | sensei_devops_projectid_key | ID |
sensei_name | sensei_name | Name |
Task
Task Sync Entity Name: sensei_devops_task
Task Sync Delta Entity Name: sensei_devops_task_delta
Task URL Pattern: https://devops.azure.com/ORGNAME/{sensei_devops_projectid_key}/_workitems/edit/{ID}
Name | External Column Name | External Column Type |
---|---|---|
sensei_bucket | sensei_bucket | Bucket |
sensei_externalprojectid | sensei_devops_projectid_key | Parent Project ID |
sensei_externaltaskid | sensei_devops_taskid_key | ID |
sensei_name | sensei_name | Name |
sensei_order | sensei_order | Bucket Order |
sensei_wiplimit | sensei_wiplimit | Bucket WIP |
Assignment
Assignment Sync Entity Name: sensei_devops_assignment
Name | External Column Name | External Column Type |
---|---|---|
sensei_externalassignmentid | sensei_devops_assignmentid_key | ID |
sensei_externalresourceid | sensei_devops_resourceid_key | Resource ID |
sensei_externaltaskid | sensei_devops_taskid_key | Parent Task ID |
sensei_name | sensei_name | Name |
sensei_unit | sensei_unit | |
DevOps and Percent Complete
Because DevOps is a Kanban-style task management tool, there is no field that can be directly imported as a Percent Complete value. Altus does not attempt to assume a translation for Percent Complete in DevOps because this will differ from organisation to organisation.
As a result, the Task created in Altus will not be populated with a Percent Complete value, and DevOps tasks will therefore not follow the same pattern as other task types regarding whether or not they display in the 'My Active Project Tasks' and 'All Active Project Tasks' views.
The following steps can be followed if there is a requirement to map a Percent Complete value for DevOps tasks:
Note
This section assumes that you have already previously configured the DevOps Dataflows and that you are now making changes to those Dataflows, Sync Tables and Metadata Mappings.
Define your Percent Complete Calculations
First, determine how you wish to calculate a Percent Complete value for DevOps tasks. You can define the value any way you wish; as long as you can calculate it in a Dataflow, you can translate it into Altus. In this example, we will translate Percent Complete based on the Board Column of the item as follows:
Board Column | Percent Complete |
---|---|
In Progress | 50 |
Done | 100 |
{Any other column} | 0 |
Add Columns to the DevOps Task Sync Tables
Before altering the Dataflow, let's first create the fields that will store the Percent Complete value in the DevOps Task sync tables. In an unmanaged Solution (e.g. 'Enhancements') in your environment, add references to the existing tables 'DevOps Task' and 'DevOps Task Delta' (there is no need to add any of the other properties, just a reference to each table).
Select the 'DevOps Task' table
Select 'Columns'
Select 'New'
Enter the following details, then press Save
Property | Value |
---|---|
Display name | Percent Complete |
Data type | Number > Whole number |
Take note of the internal Name field for the newly created column. You will need to refer to this later.
Note
Each environment has a different default publisher prefix, and you can also create your own.
Repeat these steps for the DevOps Task Delta table, ensuring that you have named the column identically to the one in the DevOps Task table.
Create DevOps Task Metadata Mapping
Next, let's define the mapping between the new Percent Complete columns and the task Percent Complete column. As an Altus Admin User, open the Altus App and navigate to the Settings area.
From the left menu, select External Systems.
Then select Azure DevOps.
Navigate to the Tasks tab.
Select to create a New Metadata Mapping record.
Enter the following details, then click Save & Close:
Field Name | Value |
---|---|
External System | Azure DevOps |
Mapping Type | Task |
Altus Field Name | sensei_percentcomplete |
External Field Name | {Enter the Name of the field you created earlier in the DevOps Task table} |
External Field Type | {leave blank} |
Verify that your new mapping column appears in the list of Task Metadata Mappings.
Update DevOps Task Dataflow
Next, edit the Azure DEVOPS - Task Sync Dataflow in your environment.
Select the Tasks_Raw query
Ensure that the last Applied Step is selected in the right pane and then from the ribbon select Add Column > Conditional column
As mentioned, this example deals with translating the board column values (which are in the sensei_bucket column) to a Percent Complete value. Our conditional column rules for this example were configured as follows:
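Expressed as Power Query (M), those rules are roughly equivalent to the sketch below. The column name 'cr123_percentcomplete' is a placeholder for the internal name (including publisher prefix) of the column you created on the sync tables, and the sample rows simply stand in for the output of the last applied step.
```
let
    // Sample rows standing in for the output of the last applied step in Tasks_Raw
    Source = Table.FromRecords({
        [sensei_name = "Design API",  sensei_bucket = "In Progress"],
        [sensei_name = "Write tests", sensei_bucket = "Done"],
        [sensei_name = "Plan sprint", sensei_bucket = "Backlog"]
    }),
    // Conditional column: In Progress -> 50, Done -> 100, anything else -> 0.
    // "cr123_percentcomplete" is a placeholder - use the internal name of the
    // column you created on the DevOps Task sync tables.
    AddedPercentComplete = Table.AddColumn(
        Source,
        "cr123_percentcomplete",
        each if [sensei_bucket] = "In Progress" then 50
             else if [sensei_bucket] = "Done" then 100
             else 0,
        Int64.Type)
in
    AddedPercentComplete
```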
Note that the column name was set to the same name as the column we added earlier to the sync tables. You should now see your new column in the query output.
Next, select the sensei_devops_task_delta query from the Queries pane
From the Home tab, select Advanced editor
On the line that defines DynamicsUrl, add the name of the field that you created on the DevOps Task Sync table earlier so that it is returned by that query.
On the line that defines UpdateRows, add the name of the field that you created on the DevOps Task Sync table earlier to both sides of the join.
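The internal structure of the template query is not reproduced here, but conceptually the two edits look something like the sketch below. All names are illustrative: 'cr123_percentcomplete' is the placeholder used earlier, the entity set name and URL are assumptions, and the sample tables stand in for the rows already in the sync table and the rows arriving from DevOps.
```
let
    // (1) The new column is added to the $select list so that its current value
    //     is returned when the query reads the existing sync table rows
    //     (entity set name and URL shown for illustration only)
    SenseiIQUrl = "https://orgabc123.crm.dynamics.com",
    DynamicsUrl = SenseiIQUrl & "/api/data/v9.2/sensei_devops_task_deltas"
        & "?$select=sensei_devops_taskid_key,sensei_name,cr123_percentcomplete",

    // Stand-in tables for the existing sync table rows and the incoming DevOps rows
    ExistingRows = Table.FromRecords({
        [sensei_devops_taskid_key = "101", sensei_name = "Design API", cr123_percentcomplete = 0]
    }),
    NewRows = Table.FromRecords({
        [sensei_devops_taskid_key = "101", sensei_name = "Design API", cr123_percentcomplete = 50]
    }),

    // (2) The new column appears on both sides of the join, so a change in
    //     Percent Complete alone is enough to flag a row as needing an update
    UpdateRows = Table.NestedJoin(
        NewRows,      {"sensei_devops_taskid_key", "sensei_name", "cr123_percentcomplete"},
        ExistingRows, {"sensei_devops_taskid_key", "sensei_name", "cr123_percentcomplete"},
        "Existing",
        JoinKind.LeftAnti)
in
    UpdateRows
```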
Click OK
Click Next
Select the sensei_devops_task_delta query and either press 'Auto map', or manually map the field in your query to the field you created in your sync table.
Ensure that the Tasks_Raw query remains set to 'Do not load'
Click Publish
Tasks synced into Altus from DevOps should now be populated with the Percent Complete value as per your mapping in the Dataflow.
DLP Policy Details
This integration will create/utilise the following Power Platform connection types:
Downloads
Dataflows for the Project and Task Templates can be found on the Initial setup page.