Configuring Jira project and task sync

Note

Currently only Jira Cloud is supported for Task Sync. While it is possible to synchronize items from on-premises Jira servers, doing so requires some customization of the API calls within the dataflow.

The following videos walk through the setup process covered in more detail below:

After setting up the dataflow, don't forget to set up the External System entry:

Note

There is no audio in these videos

Data Flow | Query Name | Sync Table
Project | sensei_jira_project | sensei_jira_project
Task | sensei_jira_task_delta | sensei_jira_task_delta
Task | sensei_jira_assignment | sensei_jira_assignment

To connect a Dataflow to your Jira Cloud instance, you can follow these steps:

  • Navigate to https://make.powerapps.com and log in with your administrative account
  • Ensure that you have selected the Power Apps Environment that contains your Altus installation
  • From the Quick Launch menu, select to expand Data and then select Dataflows

If the sample Dataflows already exist,

  • Locate the Dataflow that you wish to configure, open the ellipsis (…) menu and then select Edit, then continue with the steps below from 'Populate the following parameters'.

If the sample Dataflows do not exist,

  • Select New dataflow > Import template (Preview)

This image displays the Power Apps import template menu

  • Select the .pqt file containing the Dataflow template that you wish to upload. (Links to the example Dataflows are in the Initial Setup section, above.)
  • Click Create

This image displays the Dataflow import page and its fields

  • Populate the following parameters:
Parameter | Description | Example
JiraUrl | The URL of your Jira instance. | https://your-organization.atlassian.net/
EmailAddress | The email address that you use to sign in to Jira Core. | you@your-company.com
ApiToken | The API token that you created with your Jira account. See this help page from Atlassian on how to create an API Token. | 8yy15DMCYVzpAtY0fd1886fr
SenseiIQUrl | The URL of the Power Apps environment that hosts Altus. | https://orgabc123.crm.dynamics.com
PropertiesParameter | A comma-delimited list of custom project properties to fetch. Only available if you are using the "advanced properties" template. | department,startDate,endDate,proposalCost,financialBenefit
Note

Currently Jira Cloud doesn't provide a native interface for adding custom properties to your Jira projects. However, there are third party add-ins available via the Atlassian Marketplace that add this functionality.
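If you are using the "advanced properties" template, the PropertiesParameter value is a single comma-delimited string. As a rough, hypothetical Power Query (M) sketch (not the template's actual query text), a dataflow query could turn it into a list of individual property keys like this:

    // Hypothetical sketch: split the comma-delimited PropertiesParameter into property keys
    let
        PropertiesParameter = "department,startDate,endDate,proposalCost,financialBenefit",
        PropertyKeys = List.Transform(Text.Split(PropertiesParameter, ","), Text.Trim)
    in
        PropertyKeys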

  • Since Jira uses access tokens, we need to enable a special check box. Select Options from the ribbon bar to open the Project options pane.
  • Check 'Allow combining data from multiple sources'

This image displays the Project Options pane and its fields

  • Select the dataset query from under the Queries heading, then select Configure connection
  • If there is no existing connection for your Dataflow, select 'Create new connection'
  • Enter the following details:
Column | Value
Connection name | {Enter a name for your connection (if not populated automatically)}
Authentication kind | Anonymous
Privacy level | None
Note

Jira Core uses Basic Authentication for access to its REST API, which Dataflows do not support at the time of writing. So long as you provide a correct Jira instance URL and API token, the connection should be established even if this interface shows an error, or you are prompted to re-enter the connection values upon subsequent edits of this Dataflow. If it does prompt you to re-enter authentication, make sure that you have checked 'Allow combining data from multiple sources' on the Project options screen and that your access token setting is correct.
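In practice, the Anonymous connection works because the template's queries supply the Jira credentials themselves, as an HTTP Basic Authorization header built from the EmailAddress and ApiToken parameters. The following is a minimal Power Query (M) sketch of that pattern, using placeholder values; the template's actual query text may differ:

    // Minimal sketch: call the Jira Cloud REST API over an Anonymous connection,
    // passing Basic credentials (email:apiToken, base64-encoded) in the request header.
    let
        JiraUrl = "https://your-organization.atlassian.net/",
        EmailAddress = "you@your-company.com",
        ApiToken = "your-api-token",
        AuthHeader = "Basic " & Binary.ToText(Text.ToBinary(EmailAddress & ":" & ApiToken), BinaryEncoding.Base64),
        Response = Json.Document(
            Web.Contents(
                JiraUrl,
                [
                    RelativePath = "rest/api/3/myself",
                    Headers = [Authorization = AuthHeader]
                ]
            )
        )
    in
        Response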

  • Select Connect

  • If successful, data from the external system will load to the preview in the Power Query window.

  • Select the Next button in the bottom right of the screen

  • Ensure that the Parameters queries are configured with the Load setting set to Do not load (this information does not need to be loaded anywhere).

  • Follow these steps for each remaining query in your dataflow:

    • Select a query from the left-hand panel.
    • Select Load to existing table, and select the correct destination table (see the reference tables in this article).
    • Check the 'Delete rows that no longer exist in the query output' checkbox.
    • In the right-hand panel, map all of the columns present. Click 'Auto Map' to complete this quickly.
    • Select to run your Dataflow on a schedule. The frequency depends on both your business requirements as well as the volume of data that needs to be sent, particularly if you are synchronizing tasks. We recommend running dataflows once per day.
    • Verify that your Dataflow runs successfully.
  • Use the screenshots below to verify your column mappings:

  • sensei_jira_project Dataflow Mapping: This image displays the Power Query Column mapping configuration page for projects

  • sensei_jira_task_delta Dataflow Mapping: This image displays the Power Query Column mapping configuration page for tasks

  • sensei_jira_assignment Dataflow Mapping: This image displays the Power Query Column mapping configuration page for assignments

Note

If the Dataflow contains a query called Tasks_Raw, select 'Do not load' as the Load setting for that query (it is used internally by the Dataflow only).

Once the dataflow is running and populating the sync table, you're ready to complete the second part of the configuration process, which consists of the following steps:

  • Enable the disabled External System record(s): navigate to the 'Settings' area, then to 'External Systems'. Change the view to 'Inactive External Systems', select the record for the system you wish to activate, and click 'Activate' from the ribbon.

This image shows the Inactive External Systems page and how to Activate an item

  • Once the record is active, review the column mappings by opening the External System record and navigating to the 'Projects' tab. Altus has mapped the standard columns that we typically see our clients using in Project for the Web and Project Online to columns that exist for an Altus project. If there are new or additional columns that you require to be mapped, first ensure that your dataflow is configured to load the new column data into a new column on the target table, then create a 'Metadata Mapping' from the desired column to a column in the sensei_project table. This ensures that the data retrieved by the dataflow carries through to the Altus project that is linked to the external project.

  • Configure the 'Project URL Pattern' and optionally the 'Task URL Pattern' to provide a direct link to the items in the external system. The 'Pattern' for Jira projects needs to be updated with your instance name, e.g. your-domain.atlassian.net

Use the tables below as a reference for the default configuration of each of the metadata mapping tables.

Project

Project Sync Table Name: sensei_jira_project

Project URL Pattern: https://YOURDOMAIN.atlassian.net/browse/{ID}

Name | External Column Name | External Column Type | JIRA Field
sensei_description | sensei_projectdescription | | description
sensei_externalprojectid | sensei_jira_projectid_key | ID | key
sensei_name | sensei_name | Name | name
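The mapping above can be pictured as a reshaping of each Jira project record into a row using the external column names. The following hypothetical Power Query (M) sketch shows the idea with a hard-coded sample record; it is an illustration of the mapping, not the template's actual query:

    // Hypothetical shaping of one Jira project record into the sync-table columns above
    let
        Project = [id = "10001", key = "ABC", name = "Example project", description = "Demo project"],
        Row = [
            sensei_jira_projectid_key = Project[key],
            sensei_name = Project[name],
            sensei_projectdescription = Project[description]
        ],
        Shaped = Table.FromRecords({Row})
    in
        Shaped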

Project Dataflow

Endpoints

  • Get projects paginated
    • Query: https://orgname.atlassian.net/rest/api/3/project/search?&expand=description,lead,projectKeys

Out of the box settings

  • JiraIDPerPage (Items per Page) = 50
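The sketch below illustrates how the paginated project endpoint can be walked 50 items at a time, matching the JiraIDPerPage default. It uses the same placeholder credentials as the earlier authentication sketch and is an assumption about the general approach, not the template's exact query:

    // Illustrative only: page through /rest/api/3/project/search, 50 projects per request
    let
        JiraUrl = "https://orgname.atlassian.net/",
        EmailAddress = "you@your-company.com",
        ApiToken = "your-api-token",
        AuthHeader = "Basic " & Binary.ToText(Text.ToBinary(EmailAddress & ":" & ApiToken), BinaryEncoding.Base64),
        PageSize = 50,  // mirrors the JiraIDPerPage setting
        GetPage = (startAt as number) =>
            Json.Document(
                Web.Contents(
                    JiraUrl,
                    [
                        RelativePath = "rest/api/3/project/search",
                        Query = [
                            expand = "description,lead,projectKeys",
                            startAt = Text.From(startAt),
                            maxResults = Text.From(PageSize)
                        ],
                        Headers = [Authorization = AuthHeader]
                    ]
                )
            ),
        // Request pages until the response reports the last page
        Pages = List.Generate(
            () => GetPage(0),
            each _ <> null,
            each if [isLast] = true then null else GetPage([startAt] + [maxResults]),
            each [values]
        ),
        Projects = List.Combine(Pages)
    in
        Projects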

Task

Task Sync Table Name: sensei_jira_task

Task Sync Delta Table Name: sensei_jira_task_delta

Task URL Pattern: https://YOURDOMAIN.atlassian.net/browse/{ID}

Name | External Column Name | External Column Type | JIRA Field
sensei_bucket | sensei_bucket | Bucket |
sensei_externalprojectid | sensei_jira_projectid_key | Parent Project ID | key (Project)
sensei_externaltaskid | sensei_jira_taskid_key | ID | key (Issue)
sensei_name | sensei_name | Name | fields.summary
sensei_order | sensei_order | Bucket Order |
sensei_wiplimit | sensei_wiplimit | Bucket WIP |

Task Dataflow

Endpoints

  • Get projects paginated
    • Query: https://orgname.atlassian.net/rest/api/3/project/search?&$select=name,id,key
  • Search for issues using JQL (GET)
    • Query: https://orgname.atlassian.net/rest/api/3/search?&jql=project={projectId} AND assignee != EMPTY & fields=key,summary,status
  • Get all boards
    • Query: https://orgname.atlassian.net/rest/agile/1.0/board?&projectKeyOrId={projectId}&maxResults=1
  • Get configuration
    • Query: https://orgname.atlassian.net/rest/agile/1.0/board/{boardId}/configuration

Out of the box settings

  • JiraIDPerPage (Items per Page) = 50
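As a rough sketch of the JQL search above (placeholder credentials and a hard-coded project id; the template derives these values from its other queries), a single request for a project's assigned issues could look like this in Power Query (M):

    // Illustrative only: fetch assigned issues for one project via the JQL search endpoint
    let
        JiraUrl = "https://orgname.atlassian.net/",
        EmailAddress = "you@your-company.com",
        ApiToken = "your-api-token",
        AuthHeader = "Basic " & Binary.ToText(Text.ToBinary(EmailAddress & ":" & ApiToken), BinaryEncoding.Base64),
        ProjectId = "10001",  // hypothetical project id
        Response = Json.Document(
            Web.Contents(
                JiraUrl,
                [
                    RelativePath = "rest/api/3/search",
                    Query = [
                        jql = "project=" & ProjectId & " AND assignee != EMPTY",
                        fields = "key,summary,status",
                        maxResults = "50"  // mirrors the JiraIDPerPage setting
                    ],
                    Headers = [Authorization = AuthHeader]
                ]
            )
        ),
        Issues = Response[issues]
    in
        Issues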

Assignment

Assignment Sync Table Name: sensei_jira_assignment

Name | External Column Name | External Column Type | JIRA Field
sensei_externalassignmentid | sensei_jira_assignmentid_key | ID | key_assignee.accountId
sensei_externalresourceid | sensei_jira_resourceid_key | Resource ID | assignee.accountId
sensei_externaltaskid | sensei_jira_taskid_key | Parent Task ID | key
sensei_name | sensei_name | Name | assignee.displayName
sensei_unit | sensei_unit | | N/A
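Based on the mapping above, the external assignment id appears to combine the issue key and the assignee's account id. The following Power Query (M) snippet is a hypothetical illustration only (sample values, not the template's query text):

    // Hypothetical illustration: composing an assignment id of the form key_assignee.accountId
    let
        IssueKey = "ABC-123",                        // issue "key"
        AccountId = "5b10ac8d82e05b22cc7d4ef5",      // "assignee.accountId" (sample value)
        AssignmentId = IssueKey & "_" & AccountId    // e.g. "ABC-123_5b10ac8d82e05b22cc7d4ef5"
    in
        AssignmentId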

Task Dataflow

Endpoints

Out of the box settings

  • JiraIDPerPage (Items per Page) = 50

Troubleshooting / Known Issues

When setting up the task sync dataflow, we have noticed there can be timeout issues when mapping the sensei_jira_task_delta table if there is a large amount of data to be processed (we believe the dataflow is attempting to analyse the input in its entirety).

To work around the timeout issue, moving back to the Power Query screen and then forward again will usually prompt the dataflow to map the tables correctly, although it has been observed that this may take a few attempts.

Furthermore, based on our observations, it is best for the consultant to discuss with the client the requirements for the tasks being brought across to Altus; there may not be a requirement to pull across all historical tasks, which will lower the total amount of data.
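For example (a hypothetical adjustment, not part of the shipped template), the JQL used by the task query could be narrowed to recently updated issues so that older, historical tasks are never requested:

    // Hypothetical JQL restriction: only issues updated in the last 90 days
    let
        ProjectId = "10001",
        Jql = "project=" & ProjectId & " AND assignee != EMPTY AND updated >= -90d"
    in
        Jql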

DLP Policy Details

This integration will create/utilise the following Power Platform connection types:

This image shows connections in use by the dataflow

Downloads

Dataflows for the Project and Task Templates can be found on the Initial setup page.