Cargo Lambda Watch

The watch subcommand emulates the AWS Lambda control plane API. Run this command at the root of a Rust workspace, and cargo-lambda will use cargo-watch to hot-compile changes in your Lambda functions.

cargo lambda watch

The function is not compiled until the first time you try to execute it. See the invoke command to learn how to execute a function. Cargo runs the command cargo run --bin FUNCTION_NAME to compile the function. FUNCTION_NAME can be either the name of the package, if the package has only one binary, or the binary name in the [[bin]] section if the package includes more than one binary.
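
For example, with the emulator running in one terminal, you can trigger the first compilation by invoking the function from another terminal. The function name my-function and the JSON payload below are placeholders:

cargo lambda invoke my-function --data-ascii '{"command": "hello"}'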

Environment variables

If you need to set environment variables for your function to run, you can specify them in the metadata section of your Cargo.toml file.

Use the section package.metadata.lambda.env to set global variables that will be applied to all functions in your package:

toml
[package]
name = "basic-lambda"

[package.metadata.lambda.env]
RUST_LOG = "debug"
MY_CUSTOM_ENV_VARIABLE = "custom value"

If you have more than one function in the same package, and you want to set specific variables for each one of them, you can use a section named after each one of the binaries in your package, package.metadata.lambda.bin.BINARY_NAME:

toml
[package]
name = "lambda-project"

[package.metadata.lambda.env]
RUST_LOG = "debug"

[package.metadata.lambda.bin.get-product.env]
GET_PRODUCT_ENV_VARIABLE = "custom value"

[package.metadata.lambda.bin.add-product.env]
ADD_PRODUCT_ENV_VARIABLE = "custom value"

[[bin]]
name = "get-product"
path = "src/bin/get-product.rs"

[[bin]]
name = "add-product"
path = "src/bin/add-product.rs"

You can also set environment variables at the workspace level:

toml
[workspace.metadata.lambda.env]
RUST_LOG = "debug"

[workspace.metadata.lambda.bin.get-product.env]
GET_PRODUCT_ENV_VARIABLE = "custom value"

These behave in the same way, but package environment variables override workspace settings. The order of precedence, from highest to lowest, is:

  1. Package Binary
  2. Package Global
  3. Workspace Binary
  4. Workspace Global

You can also use the flag --env-vars to add environment variables. This flag supports a comma-separated list of values:

cargo lambda watch --env-vars FOO=BAR,BAZ=QUX

The flag --env-var allows you to pass individual variables on the command line in KEY=VALUE format, repeating the flag for each variable. This flag overrides the previous one; the two cannot be combined.

cargo lambda watch --env-var FOO=BAR --env-var BAZ=QUX

The flag --env-file reads variables from a file and adds them to the function's environment when it runs locally. Each variable in the file must be on its own line, in the same KEY=VALUE format:

cargo lambda watch --env-file .env
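
For reference, a .env file with two variables might look like this (the variable names are reused from the earlier examples):

RUST_LOG=debug
MY_CUSTOM_ENV_VARIABLE=custom-value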

Function URLs

The emulator server includes support for Lambda function URLs out of the box. Since we're working locally, these URLs are under the /lambda-url path instead of under a subdomain. The function that you're trying to access through a URL must respond to Request events using lambda_http, or raw ApiGatewayV2httpRequest events.

You can create functions compatible with this feature by running cargo lambda new --http FUNCTION_NAME.

To access a function via its HTTP endpoint, start the watch subcommand cargo lambda watch, then send requests to the endpoint http://localhost:9000. You can add any additional path, or any query parameters.
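
For example, with the emulator running, you can send a request with an arbitrary path and query string (both are placeholders here):

curl "http://localhost:9000/hello?name=world"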

WARNING

Your function MUST have the apigw_http feature enabled in the lambda_http dependency for Function URLs to work. The payload that AWS sends is only compatible with the apigw_http format, not with the apigw_rest format.

Multi-function projects

If your project includes several functions under the same package, you can access them using the function's name as the prefix in the request path http://localhost:9000/lambda-url/FUNCTION_NAME. You can also add any additional path after the function name, or any query parameters.
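
For instance, reusing the get-product binary from the earlier Cargo.toml example, a request might look like this (the path and query string are placeholders):

curl "http://localhost:9000/lambda-url/get-product/products/123?currency=usd"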

Lambda response streaming

When you work with function URLs, you can stream responses to the client with Lambda's support for Streaming Responses.

Start the watch command in a function that uses the Response Streaming API, like the example function in the Runtime's repository:

cargo lambda watch

Then use cURL to send requests to the Lambda function. You'll see that the client starts printing the response as soon as it receives the first chunk of data, without waiting to have the complete response:

curl http://localhost:9000
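
If the whole response seems to arrive at once, curl's output buffering may be hiding the streaming behavior; you can disable it with the --no-buffer flag:

curl --no-buffer http://localhost:9000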

Enabling features

You can pass a comma-separated list of features to the watch command to enable them when your function is compiled:

cargo lambda watch --features feature-1,feature-2

Debug with breakpoints

You have two options for debugging your application, setting breakpoints, and stepping through your code with a debugger like GDB or LLDB.

The first option is to let Cargo Lambda start your function and manually attach your debugger to the newly created process that hosts your function. With this option, Cargo Lambda automatically terminates the function's process, rebuilds the executable, and restarts it when your code changes. The debugger must be reattached to the new process every time the function boots.
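
As a sketch of the first option, assuming LLDB and pgrep are installed and your binary is called get-product (a placeholder name), you could attach to the running process like this:

lldb -p $(pgrep -x get-product)

Depending on your operating system, attaching to a running process may require additional permissions.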

The second option is to let Cargo Lambda provide the Lambda runtime APIs for your function by setting the flag --only-lambda-apis, and manually starting the lambda function from your IDE in debug mode. This way, the debugger is attached to the new process automatically by your IDE. When you modify your function's source code, let your IDE rebuild and relaunch the function and reattach the debugger to the new process.
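
In this mode you start only the runtime emulator, and launch the function yourself from your IDE:

cargo lambda watch --only-lambda-apis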

The drawback of the second option is that essential environment variables are not provided automatically to your function by Cargo Lambda, but have to be configured in your IDE's launch configuration. If you provide a function name when you invoke the function, you must replace _ with that name.

These environment variables are also mentioned as info messages in the log output by cargo-lambda.

Ignore changes

If you want to run the emulator without hot reloading the function every time there is a change in the code, you can use the flag --ignore-changes:

cargo lambda watch --ignore-changes

Release mode

If needed, you can also compile and run your code in release mode when the emulator is loaded:

cargo lambda watch --release

Working with extensions

You can boot extensions locally and associate them with a function running under the watch command.

In the terminal where your Lambda function code lives, run Cargo Lambda as usual cargo lambda watch.

In the terminal where your Lambda extension code lives, export the runtime API endpoint as an environment variable, and run your extension with cargo run:
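
As a sketch, assuming the emulator is listening on its default address, localhost:9000, and serves the runtime API under the /.rt path (check the watch command's log output for the exact endpoint), the command could look like this:

AWS_LAMBDA_RUNTIME_API=http://localhost:9000/.rt cargo run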

This will make your extension send requests to the local runtime to register itself and subscribe to events. If your extension subscribes to INVOKE events, it will receive an event every time you invoke your function locally. If your extension subscribes to SHUTDOWN events, it will receive an event every time the function is recompiled after code changes.

WARNING

At the moment, Logs and Telemetry extensions don't receive any data from the local runtime.
