dimitragav10_
Frequent Visitor

Read files Lakehouse

Hello,

I’m working on a process where I use a PowerShell script to retrieve data from API activity events, which generates JSON files. These files are then uploaded to Lakehouse files. From there, a notebook processes the uploaded data and inserts it into tables.

I’d like to automate the step of uploading the JSON files to the Lakehouse files dynamically, instead of doing it manually each time. What would be the recommended approach to achieve this?

Is there a way to connect the Lakehouse directly to read files from Azure Blob Storage (or another storage service) so that the data ingestion can be streamlined without manual uploads?

Thanks in advance.


10 REPLIES
jennratten
Super User

Hello @dimitragav10_ - the approach described by @Cookistador is a valid method for ingesting data into lakehouses (and, generally speaking, for copying data from one location to another). In your specific scenario, I agree with @ObungiNiels that creating a shortcut would be better; one of the many benefits of Fabric is being able to reference data in other locations without having to copy it. The article linked below gives a good walkthrough of how to create the shortcut.

https://support.fabric.microsoft.com/pl-pl/blog/new-shortcut-type-for-azure-blob-storage-in-onelake-... 

If this post helps to answer your questions, please consider marking it as a solution so others can find it more quickly when faced with a similar challenge.

Proud to be a Microsoft Fabric Super User

v-kpoloju-msft
Community Support

Hi  @dimitragav10_,

Thank you for reaching out to the Microsoft Fabric community forum, and thank you to @ObungiNiels and @Cookistador for their input on this issue. Below are some approaches that may help resolve it.

You're absolutely on the right track with automating the ingestion of JSON files into your Lakehouse. Manually uploading files each time can indeed be a bottleneck, but this process can be fully automated using either OneLake shortcuts or Azure Data Factory (ADF).

Use OneLake shortcuts to Azure Blob Storage: This is the simplest and most efficient approach if your PowerShell script is already saving the JSON files to Azure Blob Storage or ADLS Gen2.

  • You create a shortcut in your Lakehouse that points directly to your Azure storage container.
  • Any new files written to Blob will automatically appear in the Lakehouse file browser, ready for processing by your notebook or pipeline.
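If your script currently writes the JSON files to a local folder, the only change needed is to push them to the Blob container that the shortcut points to. Here is a rough sketch using the Az.Storage module; the local path, storage account name and container name ("C:\activity-events", "mystorageacct", "my-json-data") are placeholders, so adjust them to your environment:

# Sketch: upload the exported JSON files into the container the OneLake shortcut points to
Connect-AzAccount   # interactive sign-in; use a service principal for unattended runs
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -UseConnectedAccount

Get-ChildItem -Path "C:\activity-events" -Filter *.json | ForEach-Object {
    # -Force overwrites a blob of the same name if the file is re-exported
    Set-AzStorageBlobContent -File $_.FullName -Container "my-json-data" -Blob $_.Name -Context $ctx -Force
}

Because the shortcut is only a reference, the new blobs should show up under Files in the Lakehouse as soon as they land in the container, and your notebook can read them without any extra copy step.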

Kindly refer to the following link for more information:

Use Azure Data Factory to Copy Files into Lakehouse: If you'd like more control (e.g., pre-processing, filtering), you can build an ADF pipeline that automatically copies JSON files from Blob Storage into your Lakehouse.

  • Configure a scheduled or triggered pipeline in ADF.
  • The pipeline picks up new JSON files from your Blob container and writes them directly into the Lakehouse Files location.
  • Your notebook can then pick up and process the ingested files.
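If you want the ingestion to run right after the export finishes, the same PowerShell script could also start the pipeline run itself. A rough sketch with the Az.DataFactory module; the resource group, factory and pipeline names below are made-up placeholders:

# Sketch: kick off the ADF copy pipeline once the JSON upload has completed
Invoke-AzDataFactoryV2Pipeline -ResourceGroupName "rg-ingestion" -DataFactoryName "adf-activity-events" -PipelineName "CopyJsonToLakehouse"

Alternatively, a storage event trigger on the ADF pipeline removes the need for any explicit call: the pipeline fires whenever a new blob lands in the container.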

Kindly refer to the following links for more information:

If this post helps, then please give us ‘Kudos’ and consider accepting it as a solution to help other members find it more quickly.

Thank you for using Microsoft Community Forum.

Hello @v-kpoloju-msft 
Thank you for your reply!

My intention is indeed to use the OneLake Shortcuts, but I'm encountering an issue while trying to establish the connection to my Azure Blob Storage. Each time I attempt to set it up—regardless of the authentication method used—I receive the following error:

An exception occurred: DataSource.Error: AzureBlobs failed to get the response: 'The remote name could not be resolved:'

Do you have any ideas on what might be causing this?

 

Hi @dimitragav10_,

Thanks for the update and the detailed error. The error “The remote name could not be resolved” typically points to a DNS or network access issue when creating a OneLake shortcut to Azure Blob Storage. Here’s a checklist to help resolve it:

Verify Storage Account Accessibility: Open this URL in a browser: https://<your-storage-account>.blob.core.windows.net/

If the page does not load or times out, the storage account is either firewalled or not publicly accessible.

Review Firewall Settings in the Azure Portal: Navigate to Storage Account > Networking and ensure public access is allowed, or configure access using a Private Endpoint + Managed VNet (currently in preview for Microsoft Fabric).
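If it is easier than a browser check, the same test can be scripted from the machine that runs your export ("mystorageacct" is a placeholder):

# Confirms the blob endpoint resolves and port 443 is reachable through any firewall/proxy
Test-NetConnection -ComputerName "mystorageacct.blob.core.windows.net" -Port 443

# A 400/404 error response still proves DNS and network access are fine;
# only a "remote name could not be resolved" failure reproduces the Fabric error
Invoke-WebRequest -Uri "https://mystorageacct.blob.core.windows.net/" -UseBasicParsing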

Verify Shortcut Configuration: Use only the storage account name and container name in the shortcut dialog; do not paste full URLs. Example: storage account: mystorageacct; container: my-json-data.

Test DNS Resolution: On your machine, run: 

Resolve-DnsName mystorageacct.blob.core.windows.net


If this fails, Fabric will likely not be able to resolve the name either, which points to a networking or name-resolution issue.

Try Different Authentication Methods: Test with both Azure AD (ensure your user or service principal has the Storage Blob Data Reader role) and an account key (ensure the key is valid and key access is enabled on the account).
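If you have the Az.Storage module installed, both authentication paths can be verified locally before retrying the shortcut. A rough sketch; the account, container and key below are placeholders:

# Azure AD / Microsoft Entra ID (requires the Storage Blob Data Reader role on the account or container)
$aadCtx = New-AzStorageContext -StorageAccountName "mystorageacct" -UseConnectedAccount
Get-AzStorageBlob -Container "my-json-data" -Context $aadCtx | Select-Object -First 5

# Account key
$keyCtx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey "<account-key>"
Get-AzStorageBlob -Container "my-json-data" -Context $keyCtx | Select-Object -First 5

If both of these work locally but the shortcut still fails, the problem is more likely on the Fabric-to-storage network path than with the credentials.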

Kindly refer to the following link for more information:
Set up your Azure Blob Storage connection - Microsoft Fabric | Microsoft Learn

This issue is discussed and resolved in the Fabric Community forum here:
Solved: Dataflow refresh Failed - AzureBlobs failed to get the response

In that thread, users reported the same error and resolved it by addressing firewall/network restrictions and DNS accessibility.

If your storage account is private or behind a firewall, try temporarily enabling public access to test the shortcut creation. If it works, then move forward with setting up secure access using private endpoints or managed networks.

If this post helps, then please give us ‘Kudos’ and consider accepting it as a solution to help other members find it more quickly.

Thank you for using Microsoft Community Forum.

Hi @dimitragav10_,

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution; this will help other community members with similar problems find the answer faster.

Thank you.

Hi @dimitragav10_,

I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.

Hi @dimitragav10_,

I hope this information is helpful. Please let me know if you have any further questions or if you'd like to discuss this further. If this answers your question, please Accept it as a solution and give it a 'Kudos' so others can find it easily.
Thank you.

ObungiNiels
Resolver III

Hi @dimitragav10_ ,

I'd recommend leveraging shortcuts to connect your lakehouse to your Blob Storage. Azure Blob Storage is in preview as a shortcut destination, and Azure Data Lake Storage Gen2 shortcuts have been available for a while now.

When you open your lakehouse and navigate to the Files section, click the three dots and the menu option "New shortcut" should appear. From there, you can follow the instructions in the GUI.

Please let me know if this resolves your query. 🙂 

Kind regards,

Niels 

Thank you for your response.
I've already attempted to connect using a shortcut, but I'm having trouble configuring the connection to my Blob Storage. Every time I try to establish the connection, regardless of the authentication method I use, I encounter the following error:

An exception occurred: DataSource.Error: AzureBlobs failed to get the response: 'The remote name could not be resolved:'

Am I missing something?

Thank you again!

Cookistador
Super User

Hello @dimitragav10_ 

 

Your PowerShell script still generates the JSON files, but places them into Azure Blob Storage; this is your "landing zone."
A Fabric Data Pipeline then monitors for, or is triggered by, new files and ingests them into the Lakehouse via a Copy data activity, as sketched below.
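For the landing-zone upload, azcopy is a lightweight alternative to the Az.Storage cmdlets shown earlier. A rough sketch; the local path, account and container names are placeholders, and azcopy must already be signed in (for example via azcopy login):

# Push the day's exported JSON files into the landing-zone container; the pipeline's Copy activity picks them up from there
azcopy copy "C:\activity-events\*.json" "https://mystorageacct.blob.core.windows.net/my-json-data/"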
