awslabs/aws-lambda-rust-runtime

A Rust runtime for AWS Lambda

This package makes it easy to run AWS Lambda Functions written in Rust. This workspace includes multiple crates:

  • lambda-runtime is a library that provides a Lambda runtime for applications written in Rust.
  • lambda-http is a library that makes it easy to write API Gateway proxy event focused Lambda functions in Rust.
  • lambda-extension is a library that makes it easy to write Lambda Runtime Extensions in Rust.
  • lambda-events is a library with strongly-typed Lambda event structs in Rust.
  • lambda-runtime-api-client is a shared library between the lambda runtime and lambda extension libraries that includes a common API client to talk with the AWS Lambda Runtime API.

The Rust runtime client is an experimental package. It is subject to change and intended only for evaluation purposes.

Getting started

The easiest way to start writing Lambda functions with Rust is by using Cargo Lambda, a related project. Cargo Lambda is a Cargo plugin, or subcommand, that provides several commands to help you in your journey with Rust on AWS Lambda.

The preferred way to install Cargo Lambda is by using a package manager.

1- Use Homebrew on macOS:

brew tap cargo-lambda/cargo-lambda
brew install cargo-lambda

2- Use Scoop on Windows:

scoop bucket add cargo-lambda https://github.com/cargo-lambda/scoop-cargo-lambda
scoop install cargo-lambda/cargo-lambda

Or use pip on any system with Python 3 installed:

pip3 install cargo-lambda

Alternatively, install the pip package as an executable using uv:

uv tool install cargo-lambda

See other installation options in the Cargo Lambda documentation.

Your first function

To create your first function, run Cargo Lambda with the subcommand new. This command will generate a Rust package with the initial source code for your function:

cargo lambda new YOUR_FUNCTION_NAME

Example function

If you'd like to manually create your first function, the code below shows you a simple function that receives an event with a firstName field and returns a message to the caller.

use lambda_runtime::{service_fn, LambdaEvent, Error};
use serde_json::{json, Value};

#[tokio::main]
async fn main() -> Result<(), Error> {
    let func = service_fn(func);
    lambda_runtime::run(func).await?;
    Ok(())
}

async fn func(event: LambdaEvent<Value>) -> Result<Value, Error> {
    let (event, _context) = event.into_parts();
    let first_name = event["firstName"].as_str().unwrap_or("world");

    Ok(json!({ "message": format!("Hello, {}!", first_name) }))
}

Understanding Lambda errors

When a function invocation fails, AWS Lambda expects you to return an object that can be serialized into a JSON structure with the error information. This structure is represented in the following example:

{"error_type":"the type of error raised","error_message":"a string description of the error"}

The Rust Runtime for Lambda uses a struct called Diagnostic to represent function errors internally. The runtime implements the conversion of several general error types, like std::error::Error, into Diagnostic. For these general implementations, the error_type is the name of the value type returned by your function. For example, if your function returns lambda_runtime::Error, the error_type will be something like alloc::boxed::Box<dyn core::error::Error + core::marker::Send + core::marker::Sync>, which is not very descriptive.

Implement your own Diagnostic

To get more descriptive error_type fields, you can implement From for Diagnostic for your error type. That gives you full control over what the error_type is:

use lambda_runtime::{Diagnostic, Error, LambdaEvent};

#[derive(Debug)]
struct ErrorResponse(&'static str);

impl From<ErrorResponse> for Diagnostic {
    fn from(error: ErrorResponse) -> Diagnostic {
        Diagnostic {
            error_type: "MyErrorType".into(),
            error_message: error.0.to_string(),
        }
    }
}

async fn handler(_event: LambdaEvent<()>) -> Result<(), ErrorResponse> {
    Err(ErrorResponse("this is an error response"))
}

We recommend using the thiserror crate to declare your errors. You can see an example of how to integrate thiserror with the Runtime's diagnostics in our example repository.
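For illustration only (this is a rough sketch, not the repository example), a thiserror-declared error can be converted into a Diagnostic like this; the ExecutionError type and its variants are hypothetical:

use lambda_runtime::Diagnostic;
use thiserror::Error;

// Hypothetical error type declared with thiserror's derive macro.
#[derive(Debug, Error)]
enum ExecutionError {
    #[error("missing field: {0}")]
    MissingField(String),
    #[error("request timed out")]
    Timeout,
}

impl From<ExecutionError> for Diagnostic {
    fn from(error: ExecutionError) -> Diagnostic {
        Diagnostic {
            // A fixed, descriptive error_type instead of the boxed trait object name.
            error_type: "ExecutionError".into(),
            // thiserror generates the Display implementation used for the message here.
            error_message: error.to_string().into(),
        }
    }
}

With a conversion like this in place, a handler returning Result<_, ExecutionError> reports the custom error_type instead of the generic boxed error name.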

Anyhow, Eyre, and Miette

Popular error crates like Anyhow, Eyre, and Miette provide their own error types that encapsulate other errors. There is no direct transformation of those errors into Diagnostic, but we provide feature flags for each one of those crates to help you integrate them with your Lambda functions.

If you enable the features anyhow, eyre, or miette in the lambda_runtime dependency of your package, the error types provided by those crates gain blanket transformations into Diagnostic. These features expose a From<T> for Diagnostic implementation that transforms those error types into a Diagnostic. This is an example that transforms an anyhow::Error into a Diagnostic:

use lambda_runtime::{Diagnostic, LambdaEvent};

async fn handler(_event: LambdaEvent<Request>) -> Result<(), Diagnostic> {
    Err(anyhow::anyhow!("this is an error").into())
}

You can see more examples of how to use these error crates in our example repository.

Graceful shutdown

lambda_runtime offers a helper to simplify configuring graceful shutdown signal handling, spawn_graceful_shutdown_handler(). This requires the graceful-shutdown feature flag and only supports Unix systems.

You can use it by passing a FnOnce closure that returns an async block. That async block will be executed when the function receives a SIGTERM or SIGKILL.

Note that this helper is opinionated in a number of ways. Most notably:

  1. It spawns a task to drive your signal handlers
  2. It registers a 'no-op' extension in order to enable graceful shutdown signals
  3. It panics on unrecoverable errors

If you prefer to fine-tune the behavior, refer to the implementation of spawn_graceful_shutdown_handler() as a starting point for your own.

For more information on graceful shutdown handling in AWS Lambda, see: aws-samples/graceful-shutdown-with-aws-lambda.

Complete example (cleaning up a non-blocking tracing writer):

use lambda_runtime::{service_fn, LambdaEvent, Error};
use serde_json::{json, Value};

#[tokio::main]
async fn main() -> Result<(), Error> {
    let func = service_fn(func);

    let (writer, log_guard) = tracing_appender::non_blocking(std::io::stdout());
    lambda_runtime::tracing::init_default_subscriber_with_writer(writer);

    let shutdown_hook = || async move {
        std::mem::drop(log_guard);
    };
    lambda_runtime::spawn_graceful_shutdown_handler(shutdown_hook).await;

    lambda_runtime::run(func).await?;
    Ok(())
}

async fn func(event: LambdaEvent<Value>) -> Result<Value, Error> {
    let (event, _context) = event.into_parts();
    let first_name = event["firstName"].as_str().unwrap_or("world");

    Ok(json!({ "message": format!("Hello, {}!", first_name) }))
}

Building and deploying your Lambda functions

If you already have Cargo Lambda installed on your machine, run the next command to build your function:

cargo lambda build --release

There are other ways of building your function: manually with the AWS CLI, with AWS SAM, and with the Serverless framework.

1. Cross-compiling your Lambda functions

By default, Cargo Lambda builds your functions to run on x86_64 architectures. If you'd like to use a different architecture, use the options described below.

1.1. Build your Lambda functions

Amazon Linux 2023

We recommend using the Amazon Linux 2023 runtimes (such as provided.al2023) because they include a newer version of GLIBC, which many Rust programs depend on. To build your Lambda functions for Amazon Linux 2023 runtimes, run:

cargo lambda build --release --arm64

2. Deploying the binary to AWS Lambda

For a custom runtime, AWS Lambda looks for an executable called bootstrap in the deployment package zip. Rename the generated executable to bootstrap and add it to a zip archive.

You can find the bootstrap binary for your function under the target/lambda directory.

2.1. Deploying with Cargo Lambda

Once you've built your code with one of the options described earlier, use the deploy subcommand to upload your function to AWS:

cargo lambda deploy

Warning: Make sure to replace the execution role with an existing role in your account!

This command will create a Lambda function with the same name as your Rust package. You can change the name of the function by adding the argument at the end of the command:

cargo lambda deploy my-first-lambda-function

Note: See other deployment options in the Cargo Lambda documentation.

You can test the function with the invoke subcommand:

cargo lambda invoke --remote \
  --data-ascii '{"command": "hi"}' \
  --output-format json \
  my-first-lambda-function

Note: CLI commands in the examples use Linux/macOS syntax. For different shells like Windows CMD or PowerShell, modify the syntax when using nested quotation marks like '{"command": "hi"}'. Escaping with a backslash may be necessary. See the AWS CLI Reference for more information.

2.2. Deploying with the AWS CLI

You can also use the AWS CLI to deploy your Rust functions. First, you will need to create a ZIP archive of your function. Cargo Lambda can do that for you automatically when it builds your binary if you add the output-format flag:

cargo lambda build --release --arm64 --output-format zip

You can find the resulting zip file in target/lambda/YOUR_PACKAGE/bootstrap.zip. Use that file path to deploy your function with the AWS CLI:

$ aws lambda create-function --function-name rustTest \
  --handler bootstrap \
  --zip-file fileb://./target/lambda/basic/bootstrap.zip \
  --runtime provided.al2023 \ # Change this to provided.al2 if you would like to use Amazon Linux 2
  --role arn:aws:iam::XXXXXXXXXXXXX:role/your_lambda_execution_role \
  --environment Variables={RUST_BACKTRACE=1} \
  --tracing-config Mode=Active

Warning: Make sure to replace the execution role with an existing role in your account!

You can now test the function using the AWS CLI or the AWS Lambda console:

$ aws lambda invoke \
  --cli-binary-format raw-in-base64-out \
  --function-name rustTest \
  --payload '{"command": "Say Hi!"}' \
  output.json

$ cat output.json
# Prints: {"msg": "Command Say Hi! executed."}

Note: --cli-binary-format raw-in-base64-out is a required argument when using the AWS CLI version 2. See the AWS CLI documentation for more information.

2.3. AWS Serverless Application Model (SAM)

You can use Lambda functions built in Rust with the AWS Serverless Application Model (SAM). To do so, you will need to install the AWS SAM CLI, which will help you package and deploy your Lambda functions in your AWS account.

You will need to create a template.yaml file containing your desired infrastructure in YAML. Here is an example with a single Lambda function:

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      MemorySize: 128
      Architectures: ["arm64"]
      Handler: bootstrap
      Runtime: provided.al2023
      Timeout: 5
      CodeUri: target/lambda/basic/
Outputs:
  FunctionName:
    Value: !Ref HelloWorldFunction
    Description: Name of the Lambda function

You can then deploy your Lambda function using the AWS SAM CLI:

sam deploy --guided

At the end, sam will output the actual Lambda function name. You can use this name to invoke your function:

$ aws lambda invoke \
  --cli-binary-format raw-in-base64-out \
  --function-name HelloWorldFunction-XXXXXXXX \ # Replace with the actual function name
  --payload '{"command": "Say Hi!"}' \
  output.json

$ cat output.json
# Prints: {"msg": "Command Say Hi! executed."}

Local development and testing

Testing your code with unit and integration tests

AWS Lambda events are plain structures deserialized from JSON objects. If your function handler uses the standard runtime, you can use serde to deserialize your test fixtures into those structures and call your handler directly:

#[tokio::test]
async fn test_my_lambda_handler() {
    let input = serde_json::from_str("{\"command\":\"Say Hi!\"}").expect("failed to parse event");
    let context = lambda_runtime::Context::default();
    let event = lambda_runtime::LambdaEvent::new(input, context);

    my_lambda_handler(event).await.expect("failed to handle event");
}

If you're using lambda_http to receive HTTP events, you can also create lambda_http::Request structures from plain text fixtures:

#[tokio::test]
async fn test_my_lambda_handler() {
    let input = include_str!("apigw_proxy_request.json");

    let request = lambda_http::request::from_str(input).expect("failed to create request");
    let response = my_lambda_handler(request).await.expect("failed to handle request");
}

Local dev server with Cargo Lambda

Cargo Lambda provides a local server that emulates the AWS Lambda control plane. This server works on Windows, Linux, and macOS. In the root of your Lambda project, run the following subcommand to compile your function(s) and start the server:

cargo lambda watch

Now you can use cargo lambda invoke to send requests to your function. For example:

cargo lambda invoke <lambda-function-name> --data-ascii '{ "command": "hi" }'

Running the command against an HTTP function (Function URL, API Gateway, etc.) requires you to use the appropriate scheme. You can find examples of these schemes here. Otherwise, you will be presented with the following error:

Error: serde_json::error::Error

  × data did not match any variant of untagged enum LambdaRequest

A simpler alternative is to cURL the following endpoint, based on the address and port you defined. For example:

curl -v -X POST \
  'http://127.0.0.1:9000/lambda-url/<lambda-function-name>/' \
  -H 'content-type: application/json' \
  -d '{ "command": "hi" }'

Warning: Do not remove the content-type header. It is necessary to instruct the function how to deserialize the request body.

You can read more about how cargo lambda watch and cargo lambda invoke work on the project's documentation page.

Lambda Debug Proxy

Lambdas can be run and debugged locally using a special Lambda debug proxy (a non-AWS repo maintained by @rimutaka), which is a Lambda function that forwards incoming requests to one AWS SQS queue and reads responses from another queue. A local proxy running on your development computer reads the queue, calls your Lambda locally and sends back the response. This approach allows debugging of Lambda functions locally while being part of your AWS workflow. The Lambda handler code does not need to be modified between the local and AWS versions.

Tracing and Logging

The Rust Runtime for Lambda integrates with the Tracing libraries to provide tracing and logging.

By default, the runtime emits tracing events that you can collect via tracing-subscriber. It also provides a feature called tracing that exposes a default subscriber with sensible options to send logging information to AWS CloudWatch. The next example shows how to enable the default subscriber:

use lambda_runtime::{run, service_fn, tracing, Error, LambdaEvent};
use serde_json::Value;

#[tokio::main]
async fn main() -> Result<(), Error> {
    // Enable the default subscriber before the function starts handling events.
    tracing::init_default_subscriber();

    run(service_fn(|event: LambdaEvent<Value>| async move {
        tracing::info!(?event);
        Ok::<(), Error>(())
    }))
    .await
}

The subscriber uses the RUST_LOG environment variable to determine the log level for your function. It also uses Lambda's advanced logging controls, if configured.

By default, the log level to emit events is INFO. Log at the TRACE level for more detail, including a dump of the raw payload.
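As a small, hypothetical sketch (assuming the tracing macros re-exported by lambda_runtime::tracing), a handler can emit events at different levels; which ones are written depends on the configured log level:

use lambda_runtime::{tracing, Error, LambdaEvent};
use serde_json::Value;

async fn handler(event: LambdaEvent<Value>) -> Result<Value, Error> {
    // Only emitted when the effective log level is TRACE.
    tracing::trace!(payload = ?event.payload, "raw payload");
    // Emitted at the default INFO level.
    tracing::info!(request_id = %event.context.request_id, "handling invocation");
    Ok(event.payload)
}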

AWS event objects

This project includes Lambda event struct definitions, aws_lambda_events. This crate can be leveraged to provide strongly-typed Lambda event structs. You can create your own custom event objects and their corresponding structs as well.
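For example, a handler for SQS events could look like the following sketch (assuming aws_lambda_events is added as a dependency and that the SqsEvent/SqsMessage layout matches the crate version you use):

use aws_lambda_events::event::sqs::SqsEvent;
use lambda_runtime::{service_fn, Error, LambdaEvent};

#[tokio::main]
async fn main() -> Result<(), Error> {
    lambda_runtime::run(service_fn(handler)).await
}

// The SQS payload is deserialized into the strongly-typed SqsEvent struct.
async fn handler(event: LambdaEvent<SqsEvent>) -> Result<(), Error> {
    for record in event.payload.records {
        // In aws_lambda_events the message body is an Option<String>.
        println!("received SQS message body: {:?}", record.body);
    }
    Ok(())
}

The same pattern applies to the other event types provided by the crate.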

Custom event objects

To serialize and deserialize events and responses, we suggest using the serde library. To receive custom events, annotate your structure with Serde's macros:

use serde::{Serialize, Deserialize};
use serde_json::json;
use std::error::Error;

#[derive(Serialize, Deserialize)]
pub struct NewIceCreamEvent {
    pub flavors: Vec<String>,
}

#[derive(Serialize, Deserialize)]
pub struct NewIceCreamResponse {
    pub flavors_added_count: usize,
}

fn main() -> Result<(), Box<dyn Error>> {
    let flavors = json!({
        "flavors": ["Nocciola", "抹茶", "आम"]
    });

    let event: NewIceCreamEvent = serde_json::from_value(flavors)?;
    let response = NewIceCreamResponse {
        flavors_added_count: event.flavors.len(),
    };
    serde_json::to_string(&response)?;

    Ok(())
}

Supported Rust Versions (MSRV)

The AWS Lambda Rust Runtime requires a minimum of Rust 1.81.0, and is not guaranteed to build on compiler versions earlier than that.

Security

See CONTRIBUTING for more information.

License

This project is licensed under the Apache-2.0 License.
