Mar 11, 2020

Running .NET Core 3.1 on AWS Lambda

AWS Lambda supports multiple languages through the use of runtimes. To use a language that is not natively supported, you can implement a custom runtime, which is a program that invokes the Lambda function's handler method. The runtime must be included in the deployment package as an executable file named bootstrap. Here is the list of things you need to do to run .NET Core 3.1 on AWS Lambda.

bootstrap

Since .NET Core 3.1 is not a natively supported runtime, you need to include a bootstrap file, a shell script that the Lambda host calls to start the custom runtime.
#!/bin/sh
/var/task/YourApplicationName

Changes to project file

You need a couple of NuGet packages: Amazon.Lambda.AspNetCoreServer and Amazon.Lambda.RuntimeSupport. AspNetCoreServer converts API Gateway requests and responses to and from ASP.NET Core requests and responses, and RuntimeSupport provides support for running custom .NET Core Lambda runtimes in Lambda.

<PackageReference Include="Amazon.Lambda.AspNetCoreServer" Version="4.1.0" />
<PackageReference Include="Amazon.Lambda.RuntimeSupport" Version="1.1.0" /> 

Apart from that, make sure bootstrap is included in the package and change the project output type to Exe.

<OutputType>Exe</OutputType>

<ItemGroup>
    <Content Include="bootstrap">
      <CopyToOutputDirectory>Always</CopyToOutputDirectory>
    </Content>
</ItemGroup> 

Add Lambda entry point

Add a class, typically called LambdaEntryPoint, that extends APIGatewayProxyFunction, which contains FunctionHandlerAsync, the actual Lambda function entry point. In this class, override the Init method and configure the startup class using the UseStartup<>() method. If you have any special requirements, you can also override FunctionHandlerAsync and write your own handler. One example is a Lambda warmer, where you don't want the actual application code to execute but instead want to respond directly from this method. The warmer snippet further below is for reference only; with provisioned concurrency now supported in AWS Lambda, you can achieve the same result.
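A minimal sketch of the entry point class, assuming your ASP.NET Core startup class is named Startup:

using Amazon.Lambda.AspNetCoreServer;
using Microsoft.AspNetCore.Hosting;

public class LambdaEntryPoint : APIGatewayProxyFunction
{
    // Point the Lambda-hosted ASP.NET Core pipeline at the same Startup class Kestrel uses locally.
    protected override void Init(IWebHostBuilder builder)
    {
        builder.UseStartup<Startup>();
    }
}

The optional warmer override then looks like this: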


private static string containerId; // set once per container, so container reuse is visible in the logs

public override async Task<APIGatewayProxyResponse> FunctionHandlerAsync(APIGatewayProxyRequest request, ILambdaContext lambdaContext)
{
    // Warming requests carry a marker resource; respond without running the ASP.NET Core pipeline.
    if (request.Resource == "WarmingLambda")
    {
        if (string.IsNullOrEmpty(containerId)) containerId = lambdaContext.AwsRequestId;
        Console.WriteLine($"containerId - {containerId}");

        // The request body carries the number of instances still to warm.
        if (!int.TryParse(request.Body, out var concurrencyCount)) concurrencyCount = 1;

        Console.WriteLine($"Warming instance {concurrencyCount}.");
        if (concurrencyCount > 1)
        {
            // Recursively invoke this function to spin up the remaining containers.
            var client = new AmazonLambdaClient();
            await client.InvokeAsync(new Amazon.Lambda.Model.InvokeRequest
            {
                FunctionName = lambdaContext.FunctionName,
                InvocationType = InvocationType.RequestResponse,
                Payload = JsonConvert.SerializeObject(new APIGatewayProxyRequest
                {
                    Resource = request.Resource,
                    Body = (concurrencyCount - 1).ToString()
                })
            });
        }

        return new APIGatewayProxyResponse { };
    }

    return await base.FunctionHandlerAsync(request, lambdaContext);
}

Update Main function

In .NET Core 2.1, which is a natively supported Lambda runtime, the LambdaEntryPoint is loaded by Lambda through reflection (via the handler configuration), but with a custom runtime it has to be wired up in the Main function. To make sure the ASP.NET Core project still works locally using Kestrel, you can check whether the AWS_LAMBDA_FUNCTION_NAME environment variable exists.


// Outside of Lambda, fall back to the normal Kestrel host for local development.
if (string.IsNullOrEmpty(Environment.GetEnvironmentVariable("AWS_LAMBDA_FUNCTION_NAME")))
{
    CreateHostBuilder(args).Build().Run();
}
else
{
    // Inside Lambda, wire the entry point into the custom runtime bootstrap loop.
    var lambdaEntry = new LambdaEntryPoint();
    var functionHandler = (Func<APIGatewayProxyRequest, ILambdaContext, Task<APIGatewayProxyResponse>>)(lambdaEntry.FunctionHandlerAsync);
    using (var handlerWrapper = HandlerWrapper.GetHandlerWrapper(functionHandler, new JsonSerializer()))
    using (var bootstrap = new LambdaBootstrap(handlerWrapper))
    {
        bootstrap.RunAsync().Wait();
    }
}

Add defaults file

The .NET Lambda command-line tools and the Visual Studio deployment wizard use a file called aws-lambda-tools-defaults.json for the settings used to package the Lambda project into a deployment-ready zip file and to deploy it. Deployment under the hood uses CloudFormation. Run the following to explore the tool:
dotnet lambda help
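For reference, a defaults file for a custom runtime project might look roughly like the following; the profile, region, and sizing values are placeholders, and the exact fields depend on your project:

{
  "profile": "default",
  "region": "us-east-1",
  "configuration": "Release",
  "framework": "netcoreapp3.1",
  "function-runtime": "provided",
  "function-memory-size": 256,
  "function-timeout": 30,
  "msbuild-parameters": "--self-contained true"
}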

CLI commands

dotnet lambda package --output-package lambda-build/deploy-package.zip
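To deploy the packaged zip you can upload it yourself, or use the tooling's deploy commands. For example, assuming a serverless.template is present and the S3 bucket and stack name are set in the defaults file or passed as options, something like:

dotnet lambda deploy-serverless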