Requirement: I'm looking for a way to link my Power BI, which uses a data lake as a data source, based on environment. We want to connect our Power BI semantic model to a data lake per environment (dev/staging/prod). The dynamic parameters should be set in a CI/CD pipeline via script. For the data lake URL this already works, but I'm struggling to set up the access keys using a service principal.
What I did:
I tried using the Power BI REST API with the keys directly, but got an error:
$dataSourceCredentialsUrl = "https://api.powerbi.com/v1.0/myorg/groups/${workspaceId}/datasets/${datasetId}/Default.UpdateDatasources"
$dataSourceCredentials = @{
    dataSourceConnections = @(
        @{
            connectionDetails = @{
                url = "https://vtsonlinetestdatalake.dfs.core.windows.net"
            }
            credentialDetails = @{
                credentialType = "Key"
                credentials = @{
                    key = $clientSecret
                }
                privacyLevel = "None"
            }
        }
    )
}
Invoke-RestMethod -Uri $dataSourceCredentialsUrl -Method Post -Body (ConvertTo-Json $dataSourceCredentials -Depth 10) -ContentType 'application/json' -Headers $headers
Error: "API is not accessible for application".
Actually, I don't want to put the keys somewhere in a script, but rather add a connection in the M query with parameters (as I already do for the data lake URLs) and just switch the Azure Key Vault based on environment.
Does anyone have an idea or better approach?
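For clarity, here is a minimal sketch (in Python, for illustration only; the original attempt is PowerShell) of how the UpdateDatasources request above is assembled. The workspace/dataset IDs, lake URL, and secret are placeholders, and whether this body/credential type is accepted for ADLS Gen2 is exactly what is in question here.

```python
import json

def build_update_datasources_request(workspace_id, dataset_id, lake_url, secret):
    # Mirrors the PowerShell body from the attempt above. IDs, URL and
    # secret are placeholders, not working values.
    url = (
        f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
        f"/datasets/{dataset_id}/Default.UpdateDatasources"
    )
    body = {
        "dataSourceConnections": [
            {
                "connectionDetails": {"url": lake_url},
                "credentialDetails": {
                    "credentialType": "Key",
                    "credentials": {"key": secret},
                    "privacyLevel": "None",
                },
            }
        ]
    }
    return url, json.dumps(body)
```

The tuple this returns would be fed to the HTTP POST; the error reported above suggests the problem is the service principal's access to the API, not the body itself.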
Asked Nov 21, 2024 at 7:40 by Thomas Gebetsberger; edited Nov 21, 2024 at 7:57.

1 Answer
Just giving this as an answer. Since ADLS Gen2 is not supported there, you need to use the Update Parameters In Group API call instead, as mentioned here.
Here is a sample of how to make the Update Parameters API call:
$body = @{
    updateDetails = @(
        @{
            name     = "ADLS_Connection"
            newValue = "https://<prod_account>.dfs.core.windows.net/<prod_container>"
        }
    )
}
Invoke-RestMethod -Uri $apiUrl -Method POST -Body ($body | ConvertTo-Json -Depth 10) -Headers $headers -ContentType "application/json"
You need to give the name of the parameter you want to change in name and the updated value in newValue.
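The per-environment switch described above can be scripted by mapping each environment to its value and building the Update Parameters body from that map. A sketch in Python (the account and container names here are hypothetical, and "ADLS_Connection" must match the Power Query parameter actually defined in the semantic model):

```python
import json

# Hypothetical per-environment lake URLs; replace with real accounts/containers.
ENV_LAKES = {
    "dev":     "https://devdatalake.dfs.core.windows.net/dev-container",
    "staging": "https://stagingdatalake.dfs.core.windows.net/staging-container",
    "prod":    "https://proddatalake.dfs.core.windows.net/prod-container",
}

def build_update_parameters_body(environment: str) -> str:
    """Return the JSON body for Default.UpdateParameters for one environment."""
    return json.dumps({
        "updateDetails": [
            {"name": "ADLS_Connection", "newValue": ENV_LAKES[environment]}
        ]
    })
```

A CI/CD pipeline would pick the environment from a pipeline variable and POST the resulting body with the same headers used elsewhere in the script.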
https://api.powerbi.com/v1.0/myorg/groups/{WorkspaceID}/datasets/{datasetId}/Default.UpdateDatasources

Let me know if it works? – Rukmini, Nov 21, 2024 at 9:06