Use AI tools and models in Azure Functions

Azure Functions provides serverless compute resources that integrate with AI and Azure services to streamline building cloud-hosted intelligent applications. This article provides a survey of the breadth of AI-related scenarios, integrations, and other AI resources that you can use in your function apps.

Some of the inherent benefits of using Azure Functions as a compute resource for your AI-integrated tasks include:

  • Rapid, event-driven scaling: you have compute resources available when you need them. With certain plans, your app scales back to zero when no longer needed. For more information, see Event-driven scaling in Azure Functions.
  • Model Context Protocol (MCP) servers: Functions enables you to easily create and deploy remote MCP servers to make your data and functions available to AI agents and large language models (LLMs).
  • Built-in support for Azure OpenAI: the OpenAI binding extension greatly simplifies interacting with Azure OpenAI for both retrieval-augmented generation (RAG) and agentic workflows.
  • Broad language and library support: Functions lets you interact with AI using your choice of programming language, plus you're able to use a broad variety of AI frameworks and libraries.
  • Orchestration capabilities: while function executions are inherently stateless, the Durable Functions extension lets you create complex workflows that your AI agents require.

This article is language-specific, so make sure you choose your programming language at the top of the page.

The combination of built-in bindings and broad support for external libraries provides you with a wide range of potential scenarios for augmenting your apps and solutions with the power of AI.

The following sections introduce some key AI integration scenarios supported by Functions.

Retrieval-augmented generation

Because Functions can handle multiple events from various data sources simultaneously, it's an effective solution for real-time AI scenarios, like RAG systems that require fast data retrieval and processing. Rapid event-driven scaling reduces the latency your customers experience, even in high-demand situations.

Here are some reference samples for RAG-based scenarios:

For RAG, you can use SDKs, including the Azure OpenAI and Azure SDKs, to build your scenarios.

One sample shows you how to create a friendly chat bot that issues simple prompts, receives text completions, and sends messages, all in a stateful session using the OpenAI binding extension.
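The following is a minimal sketch of that SDK-based approach, assuming an HTTP-triggered function in the Python v2 programming model that retrieves supporting documents from an Azure AI Search index and grounds an Azure OpenAI chat completion in them. The index name, document field, environment variable names, and API version are placeholder assumptions, and a production app would typically use managed identities instead of keys.

```python
import os
import azure.functions as func
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

app = func.FunctionApp()

@app.route(route="ask", methods=["POST"])
def ask(req: func.HttpRequest) -> func.HttpResponse:
    question = req.get_json().get("question", "")

    # Retrieve the most relevant documents from a (hypothetical) Azure AI Search index.
    search = SearchClient(
        endpoint=os.environ["SEARCH_ENDPOINT"],
        index_name="docs-index",
        credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
    )
    context = "\n".join(doc["content"] for doc in search.search(search_text=question, top=3))

    # Ground the chat completion in the retrieved context by using an Azure OpenAI deployment.
    client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_KEY"],
        api_version="2024-06-01",
    )
    completion = client.chat.completions.create(
        model=os.environ["CHAT_DEPLOYMENT_NAME"],
        messages=[
            {"role": "system", "content": f"Answer by using only this context:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return func.HttpResponse(completion.choices[0].message.content)
```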

Remote MCP servers

The Model Context Protocol (MCP) provides a standardized way for AI models and agents to communicate with external systems to determine how to best make use of their capabilities. An MCP server lets an AI model or agent (client) make these determinations more efficiently. You can use an MCP server to publicly expose specific resources as tools, which are then called by agents to accomplish specific tasks.

When you build or host your remote MCP servers in Azure Functions, you get dynamic scaling, serverless pricing models, and platform security features.

Functions supports these options for creating and hosting remote MCP servers:

  • Use the MCP binding extension to create and host custom MCP servers as you would any other function app. A sketch of this option follows this list.
  • Self-host MCP servers that you create by using the official MCP SDKs. This hosting option is currently in preview.
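As a rough illustration of the first option, here's a minimal sketch of a custom MCP tool exposed through the MCP binding extension by using the generic trigger form of the Python v2 programming model. The trigger type name, the toolProperties schema, and the shape of the incoming context string are assumptions based on the Functions MCP samples, so check the MCP binding extension documentation for the exact syntax.

```python
import json
import azure.functions as func

app = func.FunctionApp()

# Describes the tool's input arguments to MCP clients (a single, hypothetical "city" property).
tool_properties = json.dumps([
    {"propertyName": "city", "propertyType": "string", "description": "City to look up."}
])

# Exposes this function as an MCP tool through the MCP binding extension.
@app.generic_trigger(
    arg_name="context",
    type="mcpToolTrigger",
    toolName="get_weather",
    description="Gets a simple weather report for a city.",
    toolProperties=tool_properties,
)
def get_weather(context: str) -> str:
    # The trigger payload is a JSON string; the tool arguments are assumed to be under "arguments".
    args = json.loads(context).get("arguments", {})
    city = args.get("city", "an unknown city")
    # A real tool would call a data source here; a canned answer keeps the sketch self-contained.
    return f"It's sunny in {city}."
```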

Here's a comparison of the current MCP server hosting options provided by Functions:

| Feature | MCP binding extension | Self-hosted MCP servers |
| --- | --- | --- |
| Current support level | GA | Preview* |
| Programming model | Functions triggers and bindings | Standard MCP SDKs |
| Stateful execution | Supported | Not currently supported |
| Languages currently supported | C# (isolated process), Python, TypeScript, JavaScript, Java | C# (isolated process), Python, TypeScript, JavaScript, Java |
| Other requirements | None | Streamable HTTP transport |
| How implemented | MCP binding extension | Custom handlers |

*Configuration details for self-hosted MCP servers might change during the preview.

Here are some options to help you get started hosting MCP servers in Functions:

| Options | MCP binding extension | Self-hosted MCP servers |
| --- | --- | --- |
| Documentation | MCP binding extension | n/a |
| Samples | Remote custom MCP server | Weather server |
| Copilot prompts (Visual Studio Code) | n/a | Setup prompt |
| Templates | HelloTool | n/a |

PowerShell isn't currently supported for either MCP server hosting option.

The deployment helper chat prompt is currently experimental.

Function calling

Function calling gives your AI agent the ability to dynamically invoke specific AI tools or APIs based on the context of a conversation or task. These MCP-enabled behaviors let your agents interact with external systems, retrieve data, and perform other actions.

Functions is ideal for implementing function calling in agentic workflows. In addition to scaling efficiently to handle demand, binding extensions simplify the process of using Functions to connect agents with remote Azure services. If there's no binding for your data source or you need full control over SDK behaviors, you can manage your own client SDK connections in your app.
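To make the pattern concrete, here's a minimal sketch of SDK-managed function calling against an Azure OpenAI chat deployment, which is one way to implement custom function calling without a binding. The get_order_status tool, its schema, and the environment variable names are hypothetical, and in a function app this logic would typically run inside an HTTP-triggered or queue-triggered function.

```python
import json
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_KEY"],
    api_version="2024-06-01",
)
deployment = os.environ["CHAT_DEPLOYMENT_NAME"]

# Advertise a callable tool to the model (a hypothetical order-status lookup).
tools = [{
    "type": "function",
    "function": {
        "name": "get_order_status",
        "description": "Look up the status of an order by its ID.",
        "parameters": {
            "type": "object",
            "properties": {"order_id": {"type": "string"}},
            "required": ["order_id"],
        },
    },
}]

def get_order_status(order_id: str) -> str:
    # Stand-in for a real data source, such as a database or an API call.
    return json.dumps({"order_id": order_id, "status": "shipped"})

messages = [{"role": "user", "content": "Where is order 42?"}]
response = client.chat.completions.create(model=deployment, messages=messages, tools=tools)
message = response.choices[0].message

# A production app should check whether the model actually requested a tool call.
if message.tool_calls:
    call = message.tool_calls[0]
    messages.append(message)
    messages.append({
        "role": "tool",
        "tool_call_id": call.id,
        "content": get_order_status(**json.loads(call.function.arguments)),
    })
    # Send the tool result back so the model can produce its final answer.
    response = client.chat.completions.create(model=deployment, messages=messages, tools=tools)

print(response.choices[0].message.content)
```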

Here are some reference samples for function calling scenarios:

Uses an Azure AI Foundry Agent Service client to call a custom remote MCP server implemented using Azure Functions.

Uses function calling features for agents in Azure AI SDKs to implement custom function calling.

Agentic workflows

It's common for AI-driven processes to autonomously determine how to interact with models and other AI assets. However, there are many cases where you need a higher level of predictability or where the steps are well defined. These directed agentic workflows are orchestrations of discrete tasks or interactions that your agents must follow.

The Durable Functions extension helps you take advantage of the strengths of Functions to create multistep, long-running operations with built-in fault tolerance. These capabilities work well for directed agentic workflows. For example, a trip planning solution might first gather requirements from the user, search for plan options, obtain user approval, and finally make the required bookings. In this scenario, you can build an agent for each step and then coordinate their actions as a workflow by using Durable Functions.
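Here's a minimal sketch of how that trip planning example might map to a Durable Functions orchestration in the Python v2 programming model. The activity names, the approval event name, and the stubbed activity bodies are placeholders, and the client (starter) function is omitted; each activity is where an individual agent or model call would live.

```python
import azure.functions as func
import azure.durable_functions as df

app = df.DFApp(http_auth_level=func.AuthLevel.FUNCTION)

@app.orchestration_trigger(context_name="context")
def plan_trip(context: df.DurableOrchestrationContext):
    request = context.get_input()

    # Each step can be its own agent; the orchestrator sequences them with built-in fault tolerance.
    requirements = yield context.call_activity("gather_requirements", request)
    options = yield context.call_activity("search_plan_options", requirements)

    # Pause durably until a person approves one of the options.
    approved = yield context.wait_for_external_event("ApprovalEvent")
    if not approved:
        return {"status": "canceled"}

    booking = yield context.call_activity("make_bookings", options)
    return {"status": "booked", "booking": booking}

@app.activity_trigger(input_name="request")
def gather_requirements(request: dict) -> dict:
    # A real implementation might call a chat model here to extract trip requirements.
    return {"destination": request.get("destination"), "budget": request.get("budget")}

@app.activity_trigger(input_name="requirements")
def search_plan_options(requirements: dict) -> list:
    # A real implementation might call a search agent here.
    return [{"itinerary": "sample option", "requirements": requirements}]

@app.activity_trigger(input_name="options")
def make_bookings(options: list) -> dict:
    # A real implementation might call booking APIs here.
    return {"confirmation": "ABC123", "selected": options[0]}
```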

For more workflow scenario ideas, see Application patterns in Durable Functions.

AI tools and frameworks for Azure Functions

Functions lets you build apps in your preferred language and with your favorite libraries. Because of this flexibility, you can use a wide range of AI libraries and frameworks in your AI-enabled function apps.

Here are some of the key Microsoft AI frameworks and services that you should be aware of:

| Framework/library | Description |
| --- | --- |
| Azure AI Foundry Agent Service | A fully managed service for building, deploying, and scaling AI agents with enterprise-grade security, built-in tools, and seamless integration with Azure Functions. |
| Azure AI Services SDKs | By working directly with client SDKs, you can use the full breadth of Azure AI services functionality directly in your function code. |
| OpenAI binding extension | Easily integrate the power of Azure OpenAI in your functions and let Functions manage the service integration. |
| Semantic Kernel | Lets you easily build AI agents and integrate AI models into your apps. |

Functions also lets your apps reference third-party libraries and frameworks, which means that you can use all of your favorite AI tools and libraries in your AI-enabled functions.
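As an example of the Azure AI Services SDKs row in the table above, here's a minimal sketch of an HTTP-triggered function that calls the Azure AI Language service directly through its Text Analytics client SDK. The route, environment variable names, and key-based authentication are placeholder assumptions.

```python
import os
import azure.functions as func
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

app = func.FunctionApp()

@app.route(route="sentiment", methods=["POST"])
def sentiment(req: func.HttpRequest) -> func.HttpResponse:
    text = req.get_json().get("text", "")

    # Call the Azure AI Language service directly by using its client SDK.
    client = TextAnalyticsClient(
        endpoint=os.environ["LANGUAGE_ENDPOINT"],
        credential=AzureKeyCredential(os.environ["LANGUAGE_KEY"]),
    )
    result = client.analyze_sentiment(documents=[text])[0]
    return func.HttpResponse(result.sentiment)
```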