As per the docs, Python can access OneLake programmatically, so I tried the following to list the files in a directory:
from azure.storage.filedatalake import (
    DataLakeServiceClient,
    DataLakeDirectoryClient,
    FileSystemClient
)
from azure.identity import DefaultAzureCredential

# Set your account, workspace, and item path here
ACCOUNT_NAME = "onelake"
WORKSPACE_NAME = "xxx"
DATA_PATH = "lakehouse_staging.Lakehouse/Files/directory_one"

def test():
    # Create a service client using the default Azure credential
    account_url = f"https://{ACCOUNT_NAME}.dfs.fabric.microsoft.com"
    token_credential = DefaultAzureCredential()
    service_client = DataLakeServiceClient(account_url, credential=token_credential)

    # Create a file system client for the workspace
    file_system_client = service_client.get_file_system_client(WORKSPACE_NAME)

    # List a directory within the filesystem
    paths = file_system_client.get_paths(path=DATA_PATH)
    for path in paths:
        print(path.name + '\n')

test()
But I get this error instead:
DefaultAzureCredential failed to retrieve a token from the included credentials.
Attempted credentials:
EnvironmentCredential: EnvironmentCredential authentication unavailable. Environment variables are not fully configured.
Visit https://aka.ms/azsdk/python/identity/environmentcredential/troubleshoot to troubleshoot this issue.
ManagedIdentityCredential: ManagedIdentityCredential authentication unavailable, no response from the IMDS endpoint.
SharedTokenCacheCredential: SharedTokenCacheCredential authentication unavailable. No accounts were found in the cache.
AzureCliCredential: Azure CLI not found on path
AzurePowerShellCredential: PowerShell is not installed
AzureDeveloperCliCredential: Azure Developer CLI could not be found. Please visit https://aka.ms/azure-dev for installation instructions and then, once installed, authenticate to your Azure account using 'azd auth login'.
To mitigate this issue, please refer to the troubleshooting guidelines here at https://aka.ms/azsdk/python/identity/defaultazurecredential/troubleshoot.
I am a workspace admin but not tenant admin. What am I doing wrong?
Hi @smpa01 ,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find the answer faster.
Thank you.
Hi @smpa01 ,
Thank you for reaching out to the Microsoft Fabric Community Forum.
You're correct: DefaultAzureCredential() relies on several credential sources, and by default it attempts to use local development credentials such as an Azure CLI or Visual Studio sign-in. This is why you're seeing failures when none of those sources are available in your environment.
For production or automated scenarios, you'll need to use a credential method that supports non-interactive authentication. One common approach (outside the scope of Microsoft Fabric support) involves authenticating through a workload identity, such as a service principal or managed identity, depending on where your script is hosted.
While the Fabric team doesn't support Azure identity configuration directly, you can refer to Azure SDK identity docs for setting up a secure authentication method suitable for production environments.
Additionally, ensure that the identity being used has the correct OneLake access permissions for the Fabric workspace and item. You must be assigned the appropriate role in the Fabric workspace to programmatically access the Lakehouse files via OneLake APIs.
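For example, here is a minimal sketch of the environment-variable route: EnvironmentCredential, the first credential in the DefaultAzureCredential chain, picks up a service principal from three well-known variables (the values below are placeholders for your own app registration):

import os
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholders: substitute your own app registration values.
os.environ["AZURE_TENANT_ID"] = "<your-tenant-id>"
os.environ["AZURE_CLIENT_ID"] = "<your-app-client-id>"
os.environ["AZURE_CLIENT_SECRET"] = "<your-client-secret>"

# EnvironmentCredential now succeeds, so no interactive login is needed.
credential = DefaultAzureCredential()
service_client = DataLakeServiceClient("https://onelake.dfs.fabric.microsoft.com", credential=credential)

In practice you would set these variables in the hosting environment rather than in code, so the secret never lands in source control.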
I hope this resolves your issue. If you need any further assistance, feel free to reach out.
If this post helps, then please give us Kudos and consider accepting it as a solution to help other members find it more quickly.
Thank you.
@v-tsaipranay Let me explain my confusion here.
The client I work for uses Databricks and provisions each account with ADLS Gen2 storage as an external location. You can write files to the storage container from a Databricks notebook with the provided SAS token, such as:
from azure.storage.filedatalake import DataLakeServiceClient
import json
import io

# https://learn.microsoft.com/en-us/python/api/overview/azure/storage-file-datalake-readme?view=azure-python
blob_account = "account"
blob_container = "container"
scope = "scope"
sas_token = "token"

# Account URL for Data Lake Storage Gen2
account_url = f"https://{blob_account}.dfs.core.windows.net"

# COMMAND ----------

# Initialize the DataLakeServiceClient
datalake_service_client = DataLakeServiceClient(account_url=account_url, credential=sas_token)

# Get the FileSystemClient for the container
fs_client = datalake_service_client.get_file_system_client(file_system=blob_container)

# List all files and directories in the container
print("Listing files and directories:")
paths = fs_client.get_paths()
for path in paths:
    print(path)
Now, based on what the doc says, I was under the impression that I could extend the same methods to Fabric as well, but somehow I can't programmatically access OneLake the way I can in the above scenario. If I could, I could just recycle in Fabric all the libraries I have already written to programmatically access ADLS Gen2 for Databricks.
Also,
"Additionally, ensure that the identity being used has the correct OneLake access permissions for the Fabric workspace and item. You must be assigned the appropriate role in the Fabric workspace to programmatically access the Lakehouse files via OneLake APIs." - I am the workspace Admin (not tenant admin, though) and I have an SPN that I use programmatically with the Fabric and Power BI REST APIs. I am not sure which role you are referring to?
Hi @smpa01 ,
Thanks for providing those details and code examples. Your understanding is logical, but here's where the key difference lies:
Although OneLake is built on ADLS Gen2 and supports the same APIs and SDKs, it does not support SAS token authentication. OneLake exclusively uses Microsoft Entra ID (Azure AD)-based authentication, which is more secure and aligned with Fabric’s enterprise governance model.
So while you can continue to use DataLakeServiceClient, you must authenticate using a supported credential type like ClientSecretCredential or DefaultAzureCredential with a registered service principal.
To enable this: register an application (service principal) in Microsoft Entra ID, grant it access to the Fabric workspace (Contributor or higher), and, if needed, have a tenant admin enable the setting that allows service principals to use Fabric APIs. A sketch of the resulting code pattern follows below.
While the APIs are compatible, the authentication model is not, and that’s the main reason your current Databricks pattern doesn’t apply directly to OneLake.
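As a minimal sketch of that pattern (the tenant ID, client ID, secret, and workspace/lakehouse names below are placeholders for your own values):

from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholders: substitute your own app registration values.
credential = ClientSecretCredential(
    tenant_id="<your-tenant-id>",
    client_id="<your-app-client-id>",
    client_secret="<your-client-secret>",
)

# Same ADLS Gen2 SDK, but pointed at the OneLake endpoint and authenticated
# with Microsoft Entra ID instead of a SAS token.
service_client = DataLakeServiceClient(
    "https://onelake.dfs.fabric.microsoft.com", credential=credential
)
file_system_client = service_client.get_file_system_client("<workspace-name>")
for path in file_system_client.get_paths(path="<lakehouse>.Lakehouse/Files"):
    print(path.name)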
If this post helps, then please give us Kudos and consider accepting it as a solution to help other members find it more quickly.
Thank you.
Hi @smpa01 ,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.
Hi @smpa01, try modifying or creating an environment in the workspace settings, then try again.
Did that help?
That implies it can't be used programmatically and put in PROD?
Ensure you're authenticated via Azure CLI or PowerShell:
az login --tenant <your-tenant-id>
This creates local credentials that `DefaultAzureCredential` can detect through its AzureCliCredential step.
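As a quick sanity check after logging in, you can confirm a token is obtainable (a minimal sketch; https://storage.azure.com/.default is the scope the ADLS/OneLake SDK requests):

from azure.identity import DefaultAzureCredential

# After `az login`, AzureCliCredential in the chain can supply tokens.
token = DefaultAzureCredential().get_token("https://storage.azure.com/.default")
print("Token acquired, expires at:", token.expires_on)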