I have Power BI receiving all the records from a Jira project via a function. I then use another function to step through all the keys and get their full changelog, not just 40 entries per key. Has anyone gotten this right?
I can get my functions to work in Power BI Desktop, but they fail in the Power BI Service when I publish the report, with this error:
"This dataset includes a dynamic data source. Since dynamic data sources aren't refreshed in the Power BI service, this dataset won't be refreshed. Learn more: https://aka.ms/dynamic-data-sources."
It also seems that if another attempted script feeds the same table using two identical URLs, I get a Formula.Firewall error that stops me from refreshing even locally, despite changing the privacy levels. I believe this will also fail in the Power BI Service if I mark the data source as Organizational or None.
Thank you for the help
Hi @D van zyl and welcome to the Atlassian Community!
I'm not sure I'm understanding you 100% correctly, but here is how I did something similar to get all work items from a Jira project along with their worklogs. I don't remember the details exactly, but the key point is that for both endpoints I used the pagination parameters. You cannot always have an endpoint return all results at once; sometimes you really do need to paginate.
I hope this helps. And if it doesn't: please elaborate/ reformulate your issue a bit :-)
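To illustrate the pagination pattern outside Power BI, here is a minimal Python sketch of paging through an issue's worklogs with startAt/maxResults until total is reached. `fetch` is a stand-in for whatever authenticated HTTP call you use; the endpoint path and field names follow Jira's worklog API.

```python
def fetch_all_worklogs(fetch, issue_key, page_size=100):
    """Collect every worklog for one issue, advancing startAt until total is reached.

    fetch(path, params) is assumed to return the decoded JSON page as a dict.
    """
    out, start = [], 0
    while True:
        page = fetch(f"/rest/api/3/issue/{issue_key}/worklog",
                     {"startAt": start, "maxResults": page_size})
        out.extend(page.get("worklogs", []))
        start += page_size
        if start >= page.get("total", 0):  # no more pages to request
            return out
```

In practice you would pass a small wrapper around `requests.Session().get(...).json()` as `fetch`.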
Thanks. Here is my code so far, to explain the error relating to Atlassian's new endpoint for Jira (Sept 2025).

Pagination works fine using my parameters and Functions 1 & 2. This also works as a data source in Power BI, including in the Power BI Service once published. When I call the table code at the bottom of the snippet, I also call Functions 3 and 4, and that works fine too: I get the whole changelog. That is locally, in Power BI Desktop.

When I publish it, refreshes fail because the data source has an issue: two source functions are being called in the same table. The Power BI Service errors with "This dataset includes a dynamic data source. Since dynamic data sources aren't refreshed in the Power BI service, this dataset won't be refreshed." The dynamic call to get each key's changelog is the issue. Do you have a fix?
1. Function: FetchPage
let
    FetchPage = (baseUrl as text, jql as text, pageSize as number, optional token as nullable text) as table =>
        let
            queryParams =
                if token = null then
                    [
                        jql = jql,
                        maxResults = Text.From(pageSize),
                        fields = "*all",
                        expand = "changelog"
                    ]
                else
                    [
                        jql = jql,
                        maxResults = Text.From(pageSize),
                        nextPageToken = token,
                        fields = "*all",
                        expand = "changelog"
                    ],
            contents = Web.Contents(
                URL & "/rest/api/3/search/jql?jql=" & FILTER,
                [
                    //RelativePath = "rest/api/3/search", // DO NOT USE AS THE POWERBI SERVICE ONLINE FAILS REFRESHES
                    Query = queryParams,
                    Headers = [Accept = "application/json"]
                ]
            ),
            json = Json.Document(contents),
            issues = try json[issues] otherwise {},
            table = if List.Count(issues) > 0 then
                Table.FromList(issues, Splitter.SplitByNothing(), {"Column1"})
            else
                #table({"Column1"}, {}),
            nextToken = try json[nextPageToken] otherwise null
        in
            table meta [nextPageToken = nextToken]
in
    FetchPage
2. Function: FetchPages
let
    FetchPages = (baseUrl as text, jql as text, pageSize as number) as table =>
        let
            // GenerateByPage is a separate helper (not shown here), presumably based on
            // the Table.GenerateByPage function from Microsoft's custom-connector samples.
            Source = GenerateByPage(
                (previous) =>
                    let
                        token = if previous = null then null else Value.Metadata(previous)[nextPageToken],
                        table = if previous = null or token <> null then FetchPage(baseUrl, jql, pageSize, token) else null
                    in
                        if table <> null and Table.RowCount(table) > 0 then table else null,
                type table [Column1 = any]
            )
        in
            Source
in
    FetchPages
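Since GenerateByPage is not shown in the post, here is a rough Python sketch of the loop it presumably performs (modeled on the Table.GenerateByPage pattern from Microsoft's connector samples). The names are illustrative; `get_page(token)` stands in for one FetchPage call and returns the page's rows together with its next-page token.

```python
def generate_by_page(get_page):
    """Keep fetching pages, chaining each page's next-token into the next call;
    stop when a page comes back empty or carries no further token."""
    rows_out, token = [], None
    while True:
        rows, token = get_page(token)
        if not rows:          # empty page: nothing more to read
            break
        rows_out.extend(rows)
        if token is None:     # no nextPageToken: this was the last page
            break
    return rows_out
```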
3. Function: FetchChangelog
let
    FetchChangelog = (baseUrl as text, issueKey as text, startAt as number, optional pageSize as nullable number) as record =>
        let
            size = if pageSize = null then 100 else pageSize,
            contents = Web.Contents(
                baseUrl,
                [
                    RelativePath = "rest/api/3/issue/" & issueKey & "/changelog",
                    Query = [
                        startAt = Text.From(startAt),
                        maxResults = Text.From(size)
                    ],
                    Headers = [Accept = "application/json"]
                ]
            ),
            json = Json.Document(contents)
        in
            json
in
    FetchChangelog
4. Function: FetchAllChangelog
let
    FetchAllChangelog = (baseUrl as text, issueKey as text, optional pageSize as nullable number) as table =>
        let
            size = if pageSize = null then 100 else pageSize,
            // Page through the changelog, emitting every page exactly once and
            // stopping after the page where isLast = true.
            Pages = List.Generate(
                () => [page = FetchChangelog(baseUrl, issueKey, 0, size), start = 0, stop = false],
                each not [stop],
                (prev) => [
                    page = FetchChangelog(baseUrl, issueKey, prev[start] + size, size),
                    start = prev[start] + size,
                    stop = prev[page][isLast] = true
                ],
                each [page]
            ),
            Values = List.Combine(List.Transform(Pages, each _[values])),
            TableOut = Table.FromList(Values, Splitter.SplitByNothing(), {"Changelog"})
        in
            TableOut
in
    FetchAllChangelog
5. Parameter: URL
value= https://somecompany.atlassian.net
6. Parameter: FILTER
value= project=CC
7. Table Code
let
    Source = FetchPages(URL, FILTER, 100),
    #"Expanded Column1" = Table.ExpandRecordColumn(
        Source,
        "Column1",
        {"expand", "id", "self", "key", "fields"
            //, "changelog" // Use this if a maximum of 40 items from it is enough
        },
        {"expand", "id", "self", "key", "fields"
            //, "changelog"
        }
    ),
    // The following caused Formula.Firewall issues; to get the full changelog, rather use Getissues with fullchangelog
    // Start of changelog fix to get past the 40-item limit
    WithChangelog = Table.AddColumn(
        #"Expanded Column1",
        "changelog",
        each FetchAllChangelog(URL, [key], 100)
    )
in
    WithChangelog
After some back and forth with ChatGPT, it seems that getting all changelogs into a published report is now impossible without using Python or another tool to pre-aggregate the data into a source your published Power BI report can read from. What a pain.
"There is no native way to have a fully refreshable Power BI dataset that loops through all issues and fetches every changelog directly from Jira’s REST API inside Power BI M code.
You can preview or run it locally, but for scheduled refresh in the Service, the changelog step must be pre-aggregated somewhere static."
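For what it's worth, the pre-aggregation step can be quite small. Here is a hedged Python sketch assuming the same two endpoints the M code above uses (/rest/api/3/search/jql with nextPageToken paging, and /rest/api/3/issue/{key}/changelog with startAt/isLast paging). `fetch` is a placeholder for your authenticated HTTP call (e.g. a `requests.Session` wrapper); the combined rows would then be written somewhere static for the published report to read.

```python
def search_issue_keys(fetch, jql, page_size=100):
    """Yield all issue keys for a JQL query, following nextPageToken until exhausted.

    fetch(path, params) is assumed to return the decoded JSON page as a dict.
    """
    token = None
    while True:
        params = {"jql": jql, "maxResults": page_size, "fields": "key"}
        if token:
            params["nextPageToken"] = token
        page = fetch("/rest/api/3/search/jql", params)
        for issue in page.get("issues", []):
            yield issue["key"]
        token = page.get("nextPageToken")
        if token is None:  # no token on the final page
            break

def fetch_changelog(fetch, issue_key, page_size=100):
    """Yield every changelog entry for one issue, paging with startAt until isLast."""
    start = 0
    while True:
        page = fetch(f"/rest/api/3/issue/{issue_key}/changelog",
                     {"startAt": start, "maxResults": page_size})
        yield from page.get("values", [])
        if page.get("isLast", True):
            return
        start += page_size
```

Run something like this on a schedule, dump the rows to CSV or a database table, and point the published dataset at that static source; the Service then refreshes without the dynamic-data-source complaint.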
Hmm now that you mention it: that might be how I solved it.
Get the issues in a table and get the worklogs (changelogs in your case). Then aggregate it.
I cannot check it for you this week since I'm at the Team Europe event and on a different PC. I can post an update early next week.
Sorry!