Awaji-mitop N. Gilbert

Workaround to Deploy Large Lambda Functions with Claudia.js 🕵🏻‍♂️

A common error when deploying to AWS Lambda concerns the size of the function's deployment package. It results from the quotas AWS applies to all the files you upload: 50 MB (zipped, for direct upload) and 250 MB (unzipped).

Deployment package limits

Photo from Lambda quotas.

This size error comes in two forms: RequestEntityTooLargeException for the 50 MB limit and InvalidParameterValueException for the 250 MB limit.

This article shares a workaround for deploying large functions despite these limitations: use an AWS S3 bucket to upload the deployment package, and keep large files in S3 so the function fetches them at runtime.

Note: this will increase startup time and may introduce significant latency for initial request processing.

Although this article uses Claudia.js for deployments, the approach is general and applies irrespective of how you deploy your Lambda function.

Requirements and setup

For this article, the deployment tool of choice is Claudia.js, a tool for deploying Node.js projects to AWS Lambda and API Gateway. It abstracts the deployment configuration and sets up everything required out of the box, making it easy to get started with Lambda so you can focus on the business value.

Follow this quick guide to set up your machine with Claudia.js and AWS.
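In short, the setup amounts to installing Claudia globally and making AWS credentials available to it; a rough sketch (the profile name below is just an example):

npm install claudia -g
# put an access key in ~/.aws/credentials, e.g. under [claudia]
# export AWS_PROFILE=claudia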

Create the problem

To demonstrate the solution, we first need to create the problem, and this section focuses on that. You can jump to the solution if you already have the problem 😉.

Here, we create a Lambda function whose deployment package exceeds the direct-upload limit by installing random npm packages to inflate the zip size.
We will also create a large file in the project to push the unzipped size over the 250MB limit.

With these two setups, we will run into both size-limit errors and solve them one by one as we finally deploy our Lambda function.

step 1:
Create a directory for our Lambda function and initialize it as a Node.js project.

mkdir claudia-large && cd claudia-large && npm init

step 2:
Setup package.json.

  • Install a few random dependencies to push our deployment package size past the direct-upload limit (50MB).

  • Add a deploy:create script. In lambda.handler, lambda is the module (file) name and handler is the exported function to be invoked.

After this, our package.json file should look like this:

{
  "name": "claudia-large",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "deploy:create": "claudia create --region us-east-1 --handler lambda.handler"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "aws-sdk": "^2.814.0",
    "babel-eslint": "^10.1.0",
    "eslint-plugin-prettier": "^3.4.0",
    "faker": "^5.5.3",
    "jest": "^27.3.1",
    "sinon": "^11.1.1",
    "typescript": "^4.2.4"
  }
}

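If you would rather install those same dependencies from the command line:

npm install aws-sdk babel-eslint eslint-plugin-prettier faker jest sinon typescript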

step 3:
Create a 100MB file in the project directory.

This simulates the second problem by pushing the unzipped size beyond the 250MB limit once the first problem is solved.

mkfile -n 100m largefile1.txt 

You should see a file largefile1.txt in the project directory.
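Note that mkfile is macOS-specific; on Linux you can create an equivalent zero-filled file with, for example:

dd if=/dev/zero of=largefile1.txt bs=1M count=100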

step 4:
Create a simple Lambda function that reads a file and reports its size.

First, we create a file lambda.js and add the contents below to it.

const fs = require('fs');

// Report the size of the bundled file; statSync reads metadata only.
exports.handler = function (_, context) {
  const info = fs.statSync('largefile1.txt');

  context.succeed(`file size is: ${info.size}`);
};
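As an aside, context.succeed is the older, callback-style way of returning a result; on current Node.js runtimes an async handler that returns a value behaves the same. A minimal equivalent sketch:

const fs = require('fs');

// Same behaviour in the modern async style: the resolved
// return value becomes the function's response.
exports.handler = async function () {
  const info = fs.statSync('largefile1.txt');
  return `file size is: ${info.size}`;
};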

step 5:
Inspect the size of the deployment package by running claudia pack, which packages the deployment without deploying it.

A file claudia-large-1.0.0.zip should now be created in your directory.

Also, chain ls -l to list the sizes of the files in the directory.

claudia pack && ls -l

You should see that the generated zip is 53411983 bytes (about 53.4 MB), which is beyond the limit for direct upload.

size of package zip file and other files in project directory

Note: be sure to remove the zip file created by claudia pack so it does not get bundled into the Lambda when we deploy.

Attempt a deploy:
Attempt to deploy the function with the deploy:create script.

npm run deploy:create

The deployment should fail because the zip exceeds the direct-upload limit. You should see a message containing RequestEntityTooLargeException: Request must be smaller than 69905067 bytes.

RequestEntityTooLargeException error

Solution #1 (RequestEntityTooLargeException)

When the size of the created zip package exceeds the direct-upload limit, we can upload the function code to an S3 bucket and install it into Lambda from there.

To that end, we will create an S3 bucket in the AWS console, upload our function code to it, and have Claudia.js install it into AWS Lambda via the --use-s3-bucket option.

step 1:
Create an S3 bucket named claudia-large, accepting all defaults (adjust to your specific needs), in the same region as your Lambda function. In our case, the region is us-east-1.
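If you prefer the CLI, creating the bucket looks like this (bucket names are globally unique, so yours may need a different name):

aws s3 mb s3://claudia-large --region us-east-1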

step 2:
Update the deploy:create script to use the bucket created in the previous step via the --use-s3-bucket option.

{
...,
    "deploy:create": "claudia create --region us-east-1 --handler lambda.handler --use-s3-bucket claudia-large",
...
}

step 3:
Deploy again.

Note: since the first deployment failed, we need to delete the default role Claudia created for our Lambda.

Claudia creates a default execution role for your Lambda in AWS IAM; this role can be customised by passing a policy file.
In our case, the role is claudia-large-executor. If you do not delete it before trying again, you will get an EntityAlreadyExists error in the terminal.
This is the issue on Claudia.js.
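You can also delete the role from the terminal. The inline policy name below is an assumption (list the role's policies first to confirm what Claudia attached):

aws iam list-role-policies --role-name claudia-large-executor
aws iam delete-role-policy --role-name claudia-large-executor --policy-name log-writer
aws iam delete-role --role-name claudia-large-executor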

npm run deploy:create

Congratulations 🚀, we have gotten a different error 🥲😀: the InvalidParameterValueException: Unzipped size must be smaller than 262144000 bytes error.

InvalidParameterValueException error

Basically, this means our limit is no longer the 50MB direct-upload cap, since we no longer upload directly, but the 250MB unzipped limit.

That is, had we never created the 100MB largefile1.txt in the directory, the function would have uploaded successfully by now.

Also, you should see the deployment package in the S3 bucket claudia-large we created.

Deployment package created in S3 bucket

Solution #2 (InvalidParameterValueException)

When a Lambda function's unzipped size exceeds the 250MB limit, the fix is to keep all large external dependencies/files in separate storage and download them at execution time.

In other words, to solve this issue, we will upload our large file largefile1.txt to AWS S3 and only download it when the function runs.

step 1:
Upload the file largefile1.txt manually to the claudia-large S3 bucket we created.
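You can do this through the console, or with the AWS CLI:

aws s3 cp largefile1.txt s3://claudia-large/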

step 2:
Refactor our function code to load the file's details from AWS S3 instead of the package directory.

The file lambda.js should be refactored to now have the content below.

const AWS = require("aws-sdk");
const s3 = new AWS.S3();

// Fetch only the object's metadata (size, content type, etc.)
// without downloading its contents.
async function getMetaInfoFromS3(key, bucket) {
  const metaInfo = await s3
    .headObject({ Key: key, Bucket: bucket })
    .promise();

  return metaInfo;
}

exports.handler = async function (_, context) {
  const metaInfo = await getMetaInfoFromS3("largefile1.txt", "claudia-large");

  context.succeed(`file size is: ${metaInfo.ContentLength}`);
};
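Note that headObject only fetches the object's metadata, which is all this demo needs. If your function needs the actual file contents, you would use getObject instead; a minimal sketch using the same s3 client, bucket, and key:

// Download the whole object into memory; for very large files,
// prefer streaming to /tmp rather than buffering.
async function downloadFromS3(key, bucket) {
  const obj = await s3.getObject({ Key: key, Bucket: bucket }).promise();
  return obj.Body; // a Buffer with the file contents
}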

step 3:
Delete the file largefile1.txt from the project since it is now on S3.

rm largefile1.txt

step 4:
Add a policy folder with a JSON file to the project to allow our Lambda function to read from S3. This policy will add the extra permissions to the lambda executor role claudia-large-executor created by Claudia.

mkdir policy && touch policy/policy.json

Add the content below to the policy.json file.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "s3:GetObject"
      ],
      "Effect": "Allow",
      "Resource": "*"
    }
  ]
}

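The wildcard Resource keeps the example simple. In practice you would scope the permission to the bucket we actually read from, e.g.:

"Resource": "arn:aws:s3:::claudia-large/*"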

Folder structure should now look like this
Showing current state of folder structure within the project

step 5:
Update the deploy:create script to use the policy.json created in the previous step by adding --policies policy.

{
...,
    "deploy:create": "claudia create --region us-east-1 --policies policy --handler lambda.handler --use-s3-bucket claudia-large",
...
}

step 6:
Deploy again. Remember to delete the role if it already exists before attempting this, or you will get the EntityAlreadyExists error again.

npm run deploy:create

Success!

You should see a claudia.json file created in the root of the project with the following contents:

{
  "lambda": {
    "role": "claudia-large-executor",
    "name": "claudia-large",
    "region": "us-east-1"
  }
}

Test the Lambda we just created:
Finally, to test the Lambda's response, run the command below.

claudia test-lambda --config claudia.json

You should see a response in the terminal containing the file size, as expected from the function code.
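If you prefer the raw AWS CLI, the equivalent invocation writes the response to a file:

aws lambda invoke --function-name claudia-large out.json && cat out.json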

Success Lambda response

Conclusion:

Overall, we have worked around the 50MB zipped and 250MB unzipped Lambda limits by using an S3 bucket to upload functions larger than 50MB (which also helps over slow networks), and by moving large files that would inflate the unzipped package to S3 (or any other storage).

The technique shared in this article is simply to keep large dependencies/files outside the function package, and it is a general approach irrespective of the deployment tool used.

However, the limit of 250MB still exists and cannot be exceeded. All we did was apply storage techniques to make our function smaller.

If the Lambda function cannot be made smaller, consider container image support for Lambda, which lets you ship Lambda functions as container images of up to 10 GB.

The final code used for this article can be found on my GitHub here.

Top comments (4)

Ahaiwe Emmanuel

Well broken down by the prof. Very insightful read.

Awaji-mitop N. Gilbert

Thank you!

Mayomi Ayandiran

Very insightful article 🙌🏽

Awaji-mitop N. Gilbert

Thank you!