Apex lets you build, deploy, and manage AWS Lambda functions with ease. With Apex you can use languages that are not natively supported by AWS Lambda, such as Golang, through the use of a Node.js shim injected into the build. A variety of workflow-related tooling is provided for testing functions, rolling back deploys, viewing metrics, tailing logs, hooking into the build system, and more.


On macOS, Linux, or OpenBSD run the following:

curl https://raw.githubusercontent.com/apex/apex/master/install.sh | sh

Note that you may need to run the sudo version below, or alternatively chown /usr/local:

curl https://raw.githubusercontent.com/apex/apex/master/install.sh | sudo sh

On Windows, download the binary.

If already installed, upgrade with:

apex upgrade

AWS credentials

Before using Apex you need to first give it your account credentials so that Apex can manage resources. There are a number of ways to do that, which are outlined here.

Via environment variables

Using environment variables only, you must specify at least AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_REGION.
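
For example, in your shell (the values here are placeholders):

$ export AWS_ACCESS_KEY_ID=xxxxxxxx
$ export AWS_SECRET_ACCESS_KEY=xxxxxxxxxxxxxxxxxxxxxxxx
$ export AWS_REGION=us-west-2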

If you have multiple AWS projects you may want to consider using a tool such as direnv to localize and automatically set the variables when you’re working on a project.

Via ~/.aws files

Using the ~/.aws/credentials and ~/.aws/config files you may specify AWS_PROFILE to tell apex which profile to reference. However, if you do not have a ~/.aws/config file, or “region” is not defined in it, you should set it with the AWS_REGION environment variable. To read more on configuring these files see Configuring the AWS CLI.

Here’s an example of ~/.aws/credentials:

[example]
aws_access_key_id = xxxxxxxx
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxx

Here’s an example of ~/.aws/config:

[profile example]
output = json
region = us-west-2
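
You would then point Apex at this profile, for example:

$ export AWS_PROFILE=example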

Via profile flag

If you have both ~/.aws/credentials and ~/.aws/config you may specify the profile directly with apex --profile <name> when issuing commands. This means you do not have to specify any environment variables; however, you must provide the flag with each operation:

$ apex --profile myapp-prod deploy

Via project configuration

You may store the profile name in the project.json file itself as shown in the following snippet. This is ideal since it ensures that you do not accidentally have a different environment set.

  "profile": "myapp-prod"


Precedence for loading the AWS credentials is:

Getting started

Apex can help initialize a basic project to get you started. First specify your AWS credentials as mentioned in the previous section, then run apex init:

$ export AWS_PROFILE=sloths-stage
$ apex init

You’ll be presented with a few prompts, the project’s default Lambda IAM role & policy will be created, then you’re ready to go!

             _    ____  _______  __
            / \  |  _ \| ____\ \/ /
           / _ \ | |_) |  _|  \  /
          / ___ \|  __/| |___ /  \
         /_/   \_\_|   |_____/_/\_\

  Enter the name of your project. It should be machine-friendly, as this
  is used to prefix your functions in Lambda.

    Project name: sloths

  Enter an optional description of your project.

    Project description: My slothy project

  [+] creating IAM sloth_lambda_function role
  [+] creating IAM sloth_lambda_logs policy
  [+] attaching policy to lambda_function role.
  [+] creating ./project.json
  [+] creating ./functions

  Setup complete, deploy those functions!

    $ apex deploy

Now try invoking the sample function:

$ apex invoke hello


Apex supports command and function name autocompletion. To enable this functionality you’ll need to add the following shell snippet to your shell profile or similar:

_apex() {
  local cur="${COMP_WORDS[COMP_CWORD]}"
  local opts="$(apex autocomplete -- ${COMP_WORDS[@]:1})"
  COMPREPLY=( $(compgen -W "${opts}" -- ${cur}) )
  return 0
}

complete -F _apex apex
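
Then start a new shell, or source the file you added it to so completion takes effect, for example if that was ~/.bashrc:

$ source ~/.bashrc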

Structuring projects

A “project” is the largest unit of abstraction in Apex. A project consists of a collection of AWS Lambda functions, and all apex(1) operations have access to these functions.


Projects have a project.json file in the root directory. This file contains details about your project, as well as defaults for functions, if desired. Here’s an example of a project.json file declaring a default AWS IAM “role” and “memory” for all functions.

  "name": "node",
  "description": "Node.js example project",
  "role": "arn:aws:iam::293503197324:role/lambda",
  "memory": 512

Multiple Environments

Multiple environments are supported with the --env flag. By default project.json and function.json are used; when --env is specified, project.ENV.json and function.ENV.json are used instead, falling back on function.json for cases where the staging and production config is the same. For example your directory structure may look something like the following:

├── bar
│   ├── function.stage.json
│   ├── function.prod.json
│   └── index.js
└── foo
    ├── function.stage.json
    ├── function.prod.json
    └── index.js

If you prefer your “dev” or “staging” environment to be the implied default then leave the files as project.json and function.json:

├── bar
│   ├── function.json
│   ├── function.prod.json
│   └── index.js
└── foo
    ├── function.json
    ├── function.prod.json
    └── index.js
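
With either layout you can then pass --env when running commands to select the environment, for example:

$ apex deploy --env prod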

It’s important to note that Apex supports symlinked files and directories. Apex will read the links and pull in these files, even if the links aren’t to files within your function. This enables the use of npm link, shared configuration and so on.



name

Name of the project. This field is used in the default value for “nameTemplate” to prevent collisions between multiple projects.


description

Description of the project. This field is informational.


runtime

Default runtime of function(s) unless specified in their function.json configuration.

Runtimes supported:


memory

Default memory allocation of function(s) unless specified in their function.json configuration.


timeout

Default timeout of function(s) unless specified in their function.json configuration.


role

Default role of function(s) unless specified in their function.json configuration.


profile

Name of the AWS profile to use; this is the name used to locate AWS credentials in ~/.aws/credentials. Use this if you’d prefer not to specify AWS_PROFILE or --profile.


defaultEnvironment

Default infrastructure environment.


environment

Default environment variables of function(s) unless specified in their function.json configuration.


nameTemplate

Template used to compute the function names. By default the template {{.Project.Name}}_{{.Function.Name}} is used; for example, project “api” and ./functions/users becomes “api_users”. To disable prefixing, use {{.Function.Name}}, which would result in “users”.
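
For example, to disable the prefix entirely you could set this in project.json:

{
  "nameTemplate": "{{.Function.Name}}"
}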


Default number of versions of each function retained on AWS Lambda unless specified in their function.json configuration.


Default VPC configuration of function(s) unless specified in their function.json configuration.


List of security group IDs


List of subnet IDs

Structuring functions

A function is the smallest unit in Apex. A function represents an AWS Lambda function.

Functions must include at least one source file (runtime dependent), such as index.js or main.go. Optionally a function.json file may be placed in the function’s directory, specifying details such as the memory allocated or the AWS IAM role. If one or more functions are missing a function.json file you must provide defaults for the required fields in project.json (see “Projects” for an example).
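
For example, a minimal Node.js function can be a single index.js exporting a handle function (a sketch; the exported name must match the handler configured for the function):

// functions/uppercase/index.js
exports.handle = function(event, context) {
  // Uppercase the "value" field of the incoming event and return it.
  context.succeed({ value: String(event.value).toUpperCase() });
};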


  "description": "Node.js example function",
  "runtime": "nodejs",
  "memory": 128,
  "timeout": 5,
  "role": "arn:aws:iam::293503197324:role/lambda"


Fields marked as inherited may be provided in the project.json file instead.


description

Description of the function. This is used as the description in AWS Lambda.


runtime

Runtime of the function. This is used as the runtime in AWS Lambda or, when a shim is required, to determine that the Node.js shim should be used. For example when this field is “golang”, the canonical runtime used is “nodejs” and a shim is injected into the zip file.


handler

Event handler name; this is the function invoked for a given event. Defaults are:


memory

Memory allocated to the function, in megabytes.


timeout

Function timeout in seconds. Note that Lambda currently restricts durations to 5 minutes (300s).


role

AWS Lambda role used.


environment

Environment variables.


Number of versions of the function retained on AWS Lambda. If not specified, the deploy command retains 10 versions.


If your function needs to access resources in a VPC, security groups and subnets have to be provided. You must provide at least one security group and one subnet.


List of security group IDs


List of subnet IDs
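
As an illustration only (the securityGroups and subnets field names and the IDs below are assumptions, not taken from this document), a VPC block might look like:

{
  "vpc": {
    "securityGroups": ["sg-111111"],
    "subnets": ["subnet-aaaaaa", "subnet-bbbbbb"]
  }
}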

Deploying functions

To deploy one or more functions all you need to do is run apex deploy. Apex deploys are idempotent: a build is created for each function, and Apex performs a checksum to see whether the deployed function matches the local build; if it does, the function is not re-deployed.

After a deploy, Apex cleans up old versions of the function stored on AWS Lambda, leaving only a few. The number of retained versions can be specified in the project or function configuration.

If you prefer to be explicit you can pass one or more function names to apex deploy. You may also perform shell-style globbing matches with any command accepting function names, such as deploy, logs, and rollback.


Deploy all functions in the current directory:

$ apex deploy

Deploy all functions in the directory “~/dev/myapp”:

$ apex deploy -C ~/dev/myapp

Deploy specific functions:

$ apex deploy auth
$ apex deploy auth api

Deploy all functions whose names start with “auth”:

$ apex deploy auth*

Deploy all functions ending with “_reporter”:

$ apex deploy *_reporter

Invoking functions

Apex allows you to invoke functions from the command-line, optionally passing a JSON event or stream via STDIN.


Invoke without an event:

$ apex invoke collect-stats

Invoke with JSON event:

$ echo -n '{ "value": "Tobi the ferret" }' | apex invoke uppercase
{ "value": "TOBI THE FERRET" }

Invoke from a file:

$ apex invoke uppercase < event.json

Invoke with clipboard data via stdin:

$ pbpaste | apex invoke auth

Invoke function in a different project:

$ pbpaste | apex -C path/to/project invoke auth

Stream invokes, making multiple requests with data generated by phony:

$ echo -n '{ "user": "{{name}}" }' | phony | apex invoke uppercase
{"user":"DELMER MALONE"}
{"user":"JC REEVES"}
{"user":"LUNA FLETCHER"}

Stream invokes, making multiple requests and outputting the response logs:

$ echo -n '{ "user": "{{name}}" }' | phony | apex invoke uppercase --logs
START RequestId: 30e826a4-a6b5-11e5-9257-c1543e9b73ac Version: $LATEST
END RequestId: 30e826a4-a6b5-11e5-9257-c1543e9b73ac
REPORT RequestId: 30e826a4-a6b5-11e5-9257-c1543e9b73ac	Duration: 0.73 ms	Billed Duration: 100 ms 	Memory Size: 128 MB	Max Memory Used: 10 MB
{"user":"COLTON RHODES"}
START RequestId: 30f0b23c-a6b5-11e5-a034-ad63d48ca53a Version: $LATEST
END RequestId: 30f0b23c-a6b5-11e5-a034-ad63d48ca53a
REPORT RequestId: 30f0b23c-a6b5-11e5-a034-ad63d48ca53a	Duration: 2.56 ms	Billed Duration: 100 ms 	Memory Size: 128 MB	Max Memory Used: 9 MB
{"user":"CAROLA BECK"}
START RequestId: 30f51e67-a6b5-11e5-8929-f53378ef0f47 Version: $LATEST
END RequestId: 30f51e67-a6b5-11e5-8929-f53378ef0f47
REPORT RequestId: 30f51e67-a6b5-11e5-8929-f53378ef0f47	Duration: 0.22 ms	Billed Duration: 100 ms 	Memory Size: 128 MB	Max Memory Used: 9 MB
{"user":"TOBI FERRET"}

Listing functions

Apex supports listing functions in several output formats: currently human-friendly terminal output, and “tfvars” output for integration with Terraform.


List all functions and their configuration:

$ apex list

    runtime: nodejs
    memory: 128mb
    timeout: 5s
    role: arn:aws:iam::293503197324:role/lambda
    handler: index.handle
    aliases: current@v3, foo@v4

    runtime: nodejs
    memory: 512mb
    timeout: 10s
    role: arn:aws:iam::293503197324:role/lambda
    handler: index.handle
    aliases: current@v12

Terraform vars output:

$ apex list --tfvars

Deleting functions

Apex allows you to delete functions, however it prompts for confirmation by default. Use the -f, --force flag to override this behaviour. You may pass zero or more function names.


Delete all with prompt:

$ apex delete
The following will be deleted:

  - bar
  - foo

Are you sure? (yes/no):

Delete all functions without a prompt:

$ apex delete -f

Delete specific functions without a prompt:

$ apex delete -f foo bar

Delete all functions whose names start with “auth”:

$ apex delete auth*

Building functions

Apex generates a zip file for you upon deploy, however sometimes it can be useful to see exactly what’s included in this file for debugging purposes. The apex build command outputs the zip to STDOUT for this purpose.


Output zip to out.zip:

$ apex build foo > out.zip
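
To inspect what was included you can then list the archive contents:

$ unzip -l out.zip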

Rolling back versions

Apex allows you to roll back to the previous version of a function, or to a specific version.


Rollback to the previous release:

$ apex rollback foo

Rollback to specific version:

$ apex rollback foo 5

Preview rollback with --dry-run:

$ apex rollback --dry-run lowercase

~ alias testing_lowercase
 alias: current
 version: 2

$ apex rollback --dry-run uppercase 1

~ alias testing_uppercase
 version: 1
 alias: current

Function hooks

Apex supports the notion of hooks, which allow you to execute shell commands throughout a function’s lifecycle. For example you may use these hooks to run tests or linting before a deploy, or to transpile source code using Babel, CoffeeScript, or similar.

Hooks may be specified in project.json or function.json. Hooks are executed in the function’s directory, not the project’s directory. If a hook exits with a non-zero status, Apex halts and displays the error.

Internally Apex uses these hooks to implement out-of-the-box support for Golang and other compiled languages.

Supported hooks


Here are the hooks used internally for Golang support:

  "hooks": {
    "build": "GOOS=linux GOARCH=amd64 go build -o main main.go",
    "clean": "rm -f main"

Viewing log output

Apex is integrated with CloudWatch Logs to view the output of functions. By default the logs for all functions will be displayed, unless one or more function names are passed to apex logs. You may also specify how far back the history is displayed (defaults to 5 minutes), as well as follow and filter the results.


View all function logs within the last 5 minutes:

$ apex logs

View logs for “uppercase” and “lowercase” functions:

$ apex logs uppercase lowercase

Follow or tail the log output for all functions:

$ apex logs -f

Follow a specific function:

$ apex logs -f foo

Follow filtered by pattern “error”:

$ apex logs -f foo --filter error
$ apex logs -f foo -F error

Output the last hour of logs:

$ apex logs --since 1h
$ apex logs -s 1h

View logs for all functions whose names start with “auth”:

$ apex logs auth*

Viewing metrics

The apex metrics command provides a quick glance at the overall metrics for your functions, displaying the number of invocations, total execution duration, throttling, and errors within a given time period.


View all function metrics within the last 24 hours:

$ apex metrics

  invocations: 1242
  duration: 65234ms
  throttles: 0
  error: 0

  invocations: 1420
  duration: 65234ms
  throttles: 0
  error: 5

View metrics for the last 15 minutes:

$ apex metrics --since 15m

  invocations: 16
  duration: 4212ms
  throttles: 0
  error: 0

  invocations: 23
  duration: 5200ms
  throttles: 0
  error: 5

Managing infrastructure

Apex is integrated with Terraform to provide infrastructure management. Apex currently only manages Lambda functions, so you’ll likely want to use Terraform or CloudFormation to manage additional resources such as Lambda event sources.


The apex infra command is effectively a wrapper around the terraform command. Apex provides several variables and helps provide structure for multiple Terraform environments.

Each environment such as “prod” or “stage” lives in the ./infrastructure directory. For reference it may look something like:

├── prod
│   └── main.tf
├── stage
│   └── main.tf

For example apex infra --env prod plan is effectively equivalent to the following command, with a number of -var flags passed to expose information from Apex.

$ cd infrastructure/prod && terraform plan

The environment is specified via the --env flag, or by default falls back on the defaultEnvironment property of project.json.
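
For example, since apex infra wraps Terraform, a typical flow is to review and then apply the plan for a given environment (a usage sketch):

$ apex infra --env prod plan
$ apex infra --env prod apply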

Terraform variables

Currently the following variables are exposed to Terraform:


You’ll typically need to assign $(apex_function_myfunction):current to specify that the “current” alias is referenced.

Previewing with dry-run

Apex lets you perform a “dry run” of any operation with the --dry-run flag, and no destructive AWS changes will be made.


Dry runs use the following symbols: “+” for resources that will be created, “~” for resources that will be modified, and “-” for resources that will be removed.


For example if you have the functions “foo” and “bar” which have never been deployed, you’ll see the following output. This output represents the final requests made to AWS; notice how the function names are prefixed with the project’s (“testing”) to prevent collisions, and aliases are made to maintain the “current” release alias.

$ apex deploy --dry-run

  + function testing_bar
    handler: _apex_index.handle
    runtime: nodejs
    memory: 128
    timeout: 5

  + alias testing_bar
    alias: current
    version: 1

  + function testing_foo
    memory: 128
    timeout: 5
    handler: _apex_index.handle
    runtime: nodejs

  + alias testing_foo
    alias: current
    version: 1

If you were to run apex deploy foo, then run apex deploy --dry-run again, you’ll see that only “bar” needs deploying:

$ apex deploy --dry-run

  + function testing_bar
    runtime: nodejs
    memory: 128
    timeout: 5
    handler: index.handle

  + alias testing_bar
    alias: current
    version: 1

Similarly this can be used to preview configuration changes:

$ apex deploy --dry-run

  ~ alias testing_foo
    alias: current
    version: $LATEST

  ~ config testing_foo
    memory: 128 -> 512
    timeout: 5 -> 10

As mentioned, this works for all AWS operations; here’s a delete preview:

$ apex delete --dry-run -f

  - function testing_bar

  - function testing_foo

Environment variables

AWS Lambda does not support environment variables out of the box, so this is a feature provided by Apex. There are several ways to set these variables; let’s take a look!

Flag --set

The -s, --set flag allows you to set environment variables which are exposed to the function at runtime. For example in Node.js using process.env.NAME or in Go using os.Getenv("NAME"). Behind the scenes this generates a .env.json file which is injected into your function’s zip file upon deploy. You may use this flag multiple times.

For example, suppose you have a Loggly log collector that needs an API token; you might deploy with:

$ apex deploy -s LOGGLY_TOKEN=token log-collector

Flag --env-file

The -E, --env-file flag allows you to set multiple environment variables using a JSON file.

$ apex deploy --env-file /path/to/env.json
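
The file is expected to be a JSON object of string values, mirroring the JSON config shown below; for example (a hypothetical env.json):

{
  "LOGGLY_TOKEN": "12314212213123",
  "STAGE": "production"
}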

JSON config

Specify environment variables in project.json or function.json; note that the values must be strings.

  "environment": {
    "LOGGLY_TOKEN": "12314212213123"


The precedence is currently as follows:

Omitting files

Optional .apexignore files may be placed in the project’s root, or within specific function directories. It uses the .gitignore pattern format; all patterns defined are relative to the function’s directory, not the project itself. By default both .apexignore and function.json are ignored.


Here’s an example ignoring Go source files and the function.json itself:

*.go
function.json

Understanding the shim

Apex uses a Node.js shim for non-native language support. This is a very small program which executes in a child process, feeds Lambda input to the program through STDIN, and reads program output through STDOUT. Because of this, STDERR must be used for logging, not STDOUT.

Viewing documentation

The apex docs command lets you read this documentation from the command line. By default it is piped into the less(1) pager so that you can perform operations like scrolling or searching from the terminal.


Upgrading Apex

The apex upgrade command will update your installation of apex(1) :).


How do you manage multiple environments?

It’s highly recommended to create separate AWS accounts for staging and production environments. This provides complete isolation of resources, and allows you to easily provide granular access to environments as required. See AWS credentials for supplying an account profile.

AWS IAM roles can be used to provide quick access to each environment using a drop-down in the AWS Console.

Can I test functions locally?

Currently there is no way to run functions locally; emulating AWS would be a very large task. We recommend writing the bulk of your logic as libraries or packages native to your chosen language, using only thin connective layers in the Lambda functions themselves. This makes it easy to unit-test your functions, and makes them more portable if you’re worried about vendor lock-in.
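
For example, keeping the Lambda entry point thin (a sketch, assuming the real logic lives in a local lib.js module that you unit-test directly):

// functions/uppercase/index.js
var lib = require('./lib');

exports.handle = function(event, context) {
  context.succeed(lib.uppercase(event));
};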

How is this different than Serverless?

Serverless uses CloudFormation to bootstrap resources, which can be great for getting started, but is generally less robust than Terraform for managing infrastructure throughout its lifetime. For this reason Apex does not currently provide resource management. This may change in the future for light bootstrapping, likely in an optional form.

At the time of writing Serverless does not support shimming for languages which are not supported natively by Lambda, such as Golang. Apex does this for you out of the box.

The structures imposed by each project are different, as are the features; see the documentation for each project to find out what it supports.

Serverless aims to be provider agnostic, which can be both a pro and a con depending on the level of abstraction you’re comfortable with, and whether you desire a tool modelled closer to a single provider’s capabilities. This is similar to the contention around ORMs vs “raw” queries.

Serverless is written in Node.js, while Apex is written in Go. Apex aims to be a simple and robust solution, while Serverless intends to provide a more feature-rich one; pick your poison.

Is using the Node.js shim slow?

The shim creates a child process, thus creates a few-hundred millisecond delay for the first invocation. Subsequent calls to a function are likely to hit an active container, unless the function sees very little traffic.

Do shimmed languages have any limitations?

The shim currently operates using JSON over stdout; because of this you must use stderr for logging.

Can I manage functions that Apex did not create?

Apex is not designed to handle legacy cases, or functions created by other software. For most operations Apex references the local functions; if a function is not defined locally, you cannot operate against it. This is by design and is unlikely to change.

What is _apex_index.handle?

Apex provides environment variable support to runtimes by “wrapping” the existing user-defined handlers. For example, in the Node.js case an _apex_index.js file is created in order to populate process.env before requiring the original handler such as index.js.

To get a better understanding of how this works take a look at the Node.js plugin implementation.