[MS] Building a Model Context Protocol Server with Semantic Kernel - devamazonaws.blogspot.com

This is the second MCP-related blog post in a series covering how to use Semantic Kernel (SK) with the Model Context Protocol (MCP). This post demonstrates how to build an MCP server using the MCP C# SDK and SK, expose SK plugins as MCP tools, and call those tools from the client side via SK. Here are a few reasons why you might want to build an MCP server with SK:
  • Interoperability: Existing SK plugins need to be reused and exposed as MCP tools so they can be consumed by non-SK applications or by SK for a different platform.
  • Content safety: Each tool call needs to be validated before it is executed using SK Filters.
  • Observability: Tool call-related logs, traces, and metrics need to be collected (see SK observability) and saved to an existing monitoring infrastructure.
For more information about MCP, please refer to the documentation. The sample described below uses the official ModelContextProtocol NuGet package. Its runnable source code is available in the Semantic Kernel repository.

Build MCP Server and Expose SK Plugins as MCP Tools

Let's start by declaring SK plugins that will be exposed as MCP tools by the server in the next step:
internal sealed class DateTimeUtils
{
    [KernelFunction, Description("Retrieves the current date time in UTC.")]
    public static string GetCurrentDateTimeInUtc()
    {
        // Use UtcNow so the returned value actually matches the function's description
        return DateTime.UtcNow.ToString("yyyy-MM-dd HH:mm:ss");
    }
}

internal sealed class WeatherUtils
{
    [KernelFunction, Description("Gets the current weather for the specified city and specified date time.")]
    public static string GetWeatherForCity(string cityName, string currentDateTimeInUtc)
    {
        return cityName switch
        {
            "Boston" => "61 and rainy",
            "London" => "55 and cloudy",
            ...
        };
    }
}
Now we will create an MCP server and import the SK plugins:
// Create a kernel builder and add plugins
IKernelBuilder kernelBuilder = Kernel.CreateBuilder();
kernelBuilder.Plugins.AddFromType<DateTimeUtils>();
kernelBuilder.Plugins.AddFromType<WeatherUtils>();

// Build the kernel
Kernel kernel = kernelBuilder.Build();

var builder = Host.CreateEmptyApplicationBuilder(settings: null);
builder.Services
    .AddMcpServer()
    .WithStdioServerTransport()
    // Add kernel functions to the MCP server as MCP tools
    .WithTools(kernel.Plugins.SelectMany(p => p.Select(f => f.AsAIFunction())));
await builder.Build().RunAsync();
The code above creates a kernel and imports the SK plugins. It then creates an MCP server and converts the SK plugin functions to AI functions, which are eventually exposed as MCP tools by the server. The code can be further extended to register function invocation filters that intercept and validate function calls before execution, and to enable observability to collect tool call-related logs, traces, and metrics.
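As a rough sketch of that extension point, an SK function invocation filter can intercept every tool call before it runs. The filter class and its validation rule below are illustrative assumptions, not part of the sample; substitute your own content-safety checks:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;

// Hypothetical filter: rejects weather lookups with an empty city name.
internal sealed class ToolCallValidationFilter : IFunctionInvocationFilter
{
    public async Task OnFunctionInvocationAsync(
        FunctionInvocationContext context,
        Func<FunctionInvocationContext, Task> next)
    {
        if (context.Function.Name == "GetWeatherForCity" &&
            context.Arguments["cityName"] is not string { Length: > 0 })
        {
            // Short-circuit: the underlying tool is never executed.
            context.Result = new FunctionResult(
                context.Function, "Rejected: a non-empty city name is required.");
            return;
        }

        // Validation passed; proceed with the actual tool call.
        await next(context);
    }
}

// Register the filter before building the kernel, e.g.:
// kernelBuilder.Services.AddSingleton<IFunctionInvocationFilter, ToolCallValidationFilter>();
```

Because the filter runs for every kernel function invocation, it also covers the functions exposed as MCP tools, and it is a natural place to emit logs or metrics for each call.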

Build MCP Client and Use MCP tools in SK

Next we will create an MCP client:
await using var mcpClient = await McpClientFactory.CreateAsync(
    new McpServerConfig()
    {
        Id = "MCPServer",
        Name = "MCPServer",
        TransportType = TransportTypes.StdIo,
        TransportOptions = new()
        {
            ["command"] = Path.Combine("..", "..", "..", "..", "MCPServer", "bin", "Debug", "net8.0", "MCPServer.exe");
        }
    },
    new McpClientOptions()
    {
        ClientInfo = new() { Name = "MCPClient", Version = "1.0.0" }
    }
);
Then we get the MCP tools from the server, import them into SK, add the OpenAI chat completion service, and build the kernel:
IList<AIFunction> tools = await mcpClient.GetAIFunctionsAsync().ConfigureAwait(false);

IKernelBuilder kernelBuilder = Kernel.CreateBuilder();
kernelBuilder.Plugins.AddFromFunctions("Tools", tools.Select(aiFunction => aiFunction.AsKernelFunction()));
kernelBuilder.Services.AddOpenAIChatCompletion(serviceId: "openai", modelId: config["OpenAI:ChatModelId"] ?? "gpt-4o-mini", apiKey: apiKey);

Kernel kernel = kernelBuilder.Build();
The code above iterates over the MCP tools, represented as AI functions, converts them to SK functions, and adds them to the kernel so they can be used by AI models. Now we can prompt the AI model, and it will automatically call the imported kernel functions, which delegate the calls to the MCP tools to answer the prompt:
// Enable automatic function calling
OpenAIPromptExecutionSettings executionSettings = new()
{
    Temperature = 0,
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(options: new() { RetainArgumentTypes = true })
};

// Execute a prompt using the MCP tools. The AI model will automatically call the appropriate MCP tools to answer the prompt.
var prompt = "What is the likely color of the sky in Boston today?";
var result = await kernel.InvokePromptAsync(prompt, new(executionSettings)).ConfigureAwait(false);
Console.WriteLine(result);
After calling the MCP tools, the AI model returns the following result:
The likely color of the sky in Boston today is gray, as it is currently rainy.

What's Next?

We recommend trying out the sample code in the Semantic Kernel repository to get started. Next, we plan to create more samples to demonstrate how to use Semantic Kernel for building MCP prompts and resources on the server side, and how to use them with the SK client side.
Post Updated on March 28, 2025 at 01:32PM