smpa01
Super User

Accessing OneLake

As per the docs, Python can access OneLake programmatically, so I tried the following to list the files in a directory:

from azure.storage.filedatalake import (
    DataLakeServiceClient,
    DataLakeDirectoryClient,
    FileSystemClient
)
from azure.identity import DefaultAzureCredential

# Set your account, workspace, and item path here
ACCOUNT_NAME = "onelake"
WORKSPACE_NAME = "xxx"
DATA_PATH = "lakehouse_staging.Lakehouse/Files/directory_one"

def test():
    #Create a service client using the default Azure credential

    account_url = f"https://{ACCOUNT_NAME}.dfs.fabric.microsoft.com"
    token_credential = DefaultAzureCredential()
    service_client = DataLakeServiceClient(account_url, credential=token_credential)

    #Create a file system client for the workspace
    file_system_client = service_client.get_file_system_client(WORKSPACE_NAME)
    
    #List a directory within the filesystem
    paths = file_system_client.get_paths(path=DATA_PATH)

    for path in paths:
        print(path.name + '\n')

test()

 

But I get this instead:

DefaultAzureCredential failed to retrieve a token from the included credentials.
Attempted credentials:
	EnvironmentCredential: EnvironmentCredential authentication unavailable. Environment variables are not fully configured.
Visit https://aka.ms/azsdk/python/identity/environmentcredential/troubleshoot to troubleshoot this issue.
	ManagedIdentityCredential: ManagedIdentityCredential authentication unavailable, no response from the IMDS endpoint.
	SharedTokenCacheCredential: SharedTokenCacheCredential authentication unavailable. No accounts were found in the cache.
	AzureCliCredential: Azure CLI not found on path
	AzurePowerShellCredential: PowerShell is not installed
	AzureDeveloperCliCredential: Azure Developer CLI could not be found. Please visit https://aka.ms/azure-dev for installation instructions and then,once installed, authenticate to your Azure account using 'azd auth login'.
To mitigate this issue, please refer to the troubleshooting guidelines here at https://aka.ms/azsdk/python/identity/defaultazurecredential/troubleshoot.
DefaultAzureCredential failed to retrieve a token from the included credentials.

 

I am a workspace admin but not tenant admin. What am I doing wrong?

Did I answer your question? Mark my post as a solution!
Proud to be a Super User!
My custom visualization projects
Plotting Live Sound: Viz1
Beautiful News:Viz1, Viz2, Viz3
Visual Capitalist: Working Hrs
1 ACCEPTED SOLUTION

Hi @smpa01 ,

 

Thanks for providing those details and code examples. Your reasoning is logical, but here's the key difference:

Although OneLake is built on ADLS Gen2 and supports the same APIs and SDKs, it does not support SAS token authentication. OneLake exclusively uses Microsoft Entra ID (Azure AD)-based authentication, which is more secure and aligned with Fabric’s enterprise governance model.

So while you can continue to use DataLakeServiceClient, you must authenticate using a supported credential type like ClientSecretCredential or DefaultAzureCredential with a registered service principal.

To enable this:

  1. Use your SPN and authenticate via Entra ID (no SAS tokens).
  2. Ensure your SPN has at least the Viewer or Member role in the Fabric workspace, or is granted access to the Lakehouse item via “Manage Access”.
  3. Use the OneLake endpoint:
    https://onelake.dfs.fabric.microsoft.com
    (the workspace name is passed as the file system, not as part of the hostname).

While the APIs are compatible, the authentication model is not, and that’s the main reason your current Databricks pattern doesn’t apply directly to OneLake.

 

If this post helps, then please give us Kudos and consider accepting it as a solution to help other members find it more quickly.

 

Thank you.


10 REPLIES
v-tsaipranay
Community Support

Hi @smpa01 ,

 

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will be helpful for other community members who have similar problems to solve it faster.

 

Thank you.

 

v-tsaipranay
Community Support

Hi @smpa01 ,

Thank you for reaching out to the Microsoft Fabric Community Forum.

 

You're correct: DefaultAzureCredential() tries several credential sources in order, and by default it falls back to local development credentials such as an Azure CLI or Visual Studio sign-in. That is why you see failures when none of those sources are available in your environment.

 

For production or automated scenarios, you'll need to use a credential method that supports non-interactive authentication. One common approach (outside the scope of Microsoft Fabric support) involves authenticating through a workload identity, such as a service principal or managed identity, depending on where your script is hosted.

 

While the Fabric team doesn't support Azure identity configuration directly, you can refer to Azure SDK identity docs for setting up a secure authentication method suitable for production environments.

 

Additionally, ensure that the identity being used has the correct OneLake access permissions for the Fabric workspace and item. You must be assigned the appropriate role in the Fabric workspace to programmatically access the Lakehouse files via OneLake APIs.

 

I hope this resolves your issue. If you need any further assistance, feel free to reach out.

If this post helps, then please give us Kudos and consider accepting it as a solution to help other members find it more quickly.

 

Thank you.

@v-tsaipranay, let me explain my confusion here.

The client I work for uses Databricks and provisions each account with ADLS Gen2 storage as an external location. You can write files to the storage container from a Databricks notebook with the provided SAS token, like so:

from azure.storage.filedatalake import DataLakeServiceClient
import json
import io

# https://learn.microsoft.com/en-us/python/api/overview/azure/storage-file-datalake-readme?view=azure-python
blob_account = "account"
blob_container = "container"
scope = "scope"
sas_token = "token"

# Account URL for Data Lake Storage Gen2
account_url = f"https://{blob_account}.dfs.core.windows.net"

# COMMAND ---------
# Initialize the DataLakeServiceClient
datalake_service_client = DataLakeServiceClient(account_url=account_url, credential=sas_token)

# Get the FileSystemClient for the container
fs_client = datalake_service_client.get_file_system_client(file_system=blob_container)

# List all files and directories in the container
print("Listing files and directories:")
paths = fs_client.get_paths()
for path in paths:
    print(path)

 

Now, the following is what the doc says. I was under the impression that I could extend the same methods to Fabric as well, but somehow I can't programmatically access OneLake the way I can in the scenario above. If I could, I could simply recycle all the libraries I have already written to programmatically access ADLS Gen2 from Databricks.

[attached screenshot: smpa01_0-1747248221841.png]

 

Also,

Also, regarding "Additionally, ensure that the identity being used has the correct OneLake access permissions for the Fabric workspace and item. You must be assigned the appropriate role in the Fabric workspace to programmatically access the Lakehouse files via OneLake APIs." - I am the workspace Admin (not tenant admin, though), and I have an SPN that I use programmatically with the Fabric and Power BI REST APIs. I am not sure which role you are referring to.

 

 



Hi @smpa01 ,

 

I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.


Thank you.


Hakesh
Regular Visitor

Hi @smpa01, try modifying or creating an environment in the workspace settings, then try again.

Did that help?

smpa01
Super User

That implies it can't be used programmatically and put in PROD?

nilendraFabric
Community Champion

Ensure you're authenticated via the Azure CLI or PowerShell:

az login --tenant <your-tenant-id>

This creates local credentials that `DefaultAzureCredential` can detect.
