Hi all,
Let me start with this - I am 99.99% new to Fabric...
What I have done so far is load some tables from Azure Synapse into Fabric in a Lakehouse via a Dataflow Gen2.
What I also want to do is load a custom SQL function I have created in T-SQL in Azure Synapse into Fabric in this Lakehouse and be able to use it there.
When I do that, again via Dataflow Gen2, I don't have the option to choose a 'Data destination':
I can use the function in this screen, but I can't load it to my lakehouse and I can't use it in any current SQL query.
Hopefully someone can advise on what I am doing wrong.
Thanks,
Maria
Hi @mtomova
Please go through this official documentation: Preprocess data with a stored procedure before loading into Lakehouse - Microsoft Fabric | Microsoft... It might help you solve your problem.
Thank you!
Did I answer your question? Mark my post as a solution!
Proud to be a Super User!
Hi, thank you for the link you have shared.
Unfortunately, that doesn't seem to work for me, because my SQL function does not create a table.
I have created the function in Fabric itself and I can use it when I pass parameters, but when I try to use it in another query (my function counts working days between two dates), the query just runs forever with no result.
Thanks,
Maria
Hi @mtomova,
Thank you for following up to the conversation. I understand that you're facing an issue where your SQL function in Fabric works when passing parameters directly but runs indefinitely when used inside another query.
This is likely happening due to how Fabric’s SQL engine handles functions, particularly with recursive or row-by-row operations. Unlike Azure Synapse, Fabric does not fully optimize T-SQL scalar functions, which can cause performance issues when applied to large datasets.
Instead of using a function to count working days dynamically, you can create a working days calendar table and use a direct query. Create a calendar table that marks weekends and holidays as non-working days; once the table is created, you can use a simple query instead of a function to count working days between two dates.
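A minimal T-SQL sketch of that calendar-table approach (table and column names like `dbo.Calendar`, `CalendarDate`, and `IsWorkingDay` are illustrative, not from this thread):

```sql
-- Sketch only: names are illustrative assumptions.
-- Populate this table once with every date in your range,
-- setting IsWorkingDay = 0 for weekends and holidays.
CREATE TABLE dbo.Calendar (
    CalendarDate DATE NOT NULL PRIMARY KEY,
    IsWorkingDay BIT  NOT NULL
);

-- Set-based query replacing the scalar function:
-- count working days between two dates.
SELECT COUNT(*) AS WorkingDays
FROM dbo.Calendar
WHERE CalendarDate >= '2024-01-01'
  AND CalendarDate <  '2024-02-01'
  AND IsWorkingDay = 1;
```

Because this is a plain set-based query against a table, the engine can optimize it normally, avoiding the row-by-row evaluation that makes scalar functions slow.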
If I misunderstand your needs or you still have problems on it, please feel free to let us know.
Best Regards,
Hammad.
Community Support Team
If this post helps then please mark it as a solution, so that other members find it more quickly.
Thank you.
Hi @v-mdharahman ,
Thanks for the idea. I will probably explore it, because as you have said, T-SQL scalar functions cause performance issues in Fabric.
Thanks,
Maria
Hi @mtomova,
Thanks for reaching out to the Microsoft Fabric community forum.
It looks like you want to migrate a custom T-SQL function from Azure Synapse Analytics to Microsoft Fabric's Lakehouse using Dataflow Gen2. This issue arises because Microsoft Fabric does not support directly storing or executing user-defined functions (UDFs) created in T-SQL within a Lakehouse. Unlike Synapse, where you can define and use SQL functions, Fabric’s Lakehouse architecture is based on Delta Lake, which does not natively support T-SQL functions like a traditional SQL database.
As @suparnababu8 already responded to your query, please go through his response and mark it as solution if it answers your query.
I would also like to take a moment to thank @suparnababu8 for actively participating in the community forum and for the solutions you've been sharing. Your contributions make a real difference.
If I misunderstand your needs or you still have problems on it, please feel free to let us know.
Best Regards,
Hammad.
Community Support Team
If this post helps then please mark it as a solution, so that other members find it more quickly.
Thank you.
Since Dataflow Gen2 uses Power Query rather than SQL, the function must be rewritten in that language.
1. If the function performs calculations on columns, implement the logic with calculated columns in Power Query.
2. If the function is more complex (for example, conditional transformations or aggregations), use custom functions in the Power Query language. These are user-defined functions for reusing logic and applying transformations dynamically in Dataflow Gen2 or Power BI. They can be used for calculations, data cleansing, or advanced transformations.
3. If the function uses scalar or table expressions, you can use calculated columns, referenced queries, or transformations in Dataflow Gen2.
4. If the T-SQL function is very complex (for example, processing across several tables or advanced calculations), you can use Notebooks in Fabric with PySpark or Spark SQL to replicate it.
5. If the function depends on advanced SQL and cannot easily be converted, it can be moved to a Data Warehouse in Fabric, where you can use T-SQL stored procedures with partial compatibility.
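As a sketch of option 2, a custom Power Query function for this thread's working-days case might look like the following (the function name and parameters are illustrative assumptions, not an established API):

```powerquery-m
// Sketch only: counts working days (Mon–Fri) between two dates,
// excluding an explicit list of holiday dates.
let
    CountWorkingDays = (startDate as date, endDate as date, holidays as list) as number =>
        let
            // Every date in the inclusive range
            days = List.Dates(
                startDate,
                Duration.Days(endDate - startDate) + 1,
                #duration(1, 0, 0, 0)
            ),
            // Keep Mon–Fri (0–4 when the week starts on Monday)
            // that are not in the holiday list
            working = List.Select(
                days,
                each Date.DayOfWeek(_, Day.Monday) < 5
                    and not List.Contains(holidays, _)
            )
        in
            List.Count(working)
in
    CountWorkingDays
```

Defined as a query in Dataflow Gen2, such a function can then be invoked from other queries or applied to table columns, which avoids trying to carry the T-SQL function over unchanged.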