AWS Lambda 250MB limit – how to shrink your Lambda and its node_modules below it?

Have you seen this error when trying to deploy a TypeScript and NPM based Lambda to AWS?

An error occurred (InvalidParameterValueException) when calling the UpdateFunctionCode operation: Unzipped size must be smaller than 262144000 bytes

This comes from an important AWS Lambda limit, related to the “Deployment package size”.

To illustrate what we’ll be doing to solve this, let’s start by creating a very simple Lambda that gets an ID from the Lambda event parameters and reads a DynamoDB record by this ID:

import DynamoDB = require('aws-sdk/clients/dynamodb');
import {GetItemInput} from 'aws-sdk/clients/dynamodb';

const ddb = new DynamoDB({apiVersion: '2012-08-10'});

export const handler = async (event: { id: string }) => {
    const params: GetItemInput = {
        TableName: process.env.TABLE_NAME as string,
        Key: {
            'id': {N: event.id}
        }
    };

    // Read item from DynamoDB
    const result = await ddb.getItem(params).promise();

    // Return the item as Lambda result
    return result.Item;
};
Simple, right?

Let’s try compiling this .ts file to .js and deploying the Lambda folder along with the node_modules dependencies (which mostly contains aws-sdk and a few other sub-dependencies).

We have only written 19 lines of code, yet our deployment package (the .zip file) is already 15MB. The unzipped version is over 100MB. We quickly realize that if we use one or two more dependencies while writing our Lambda code (e.g. Lodash or some useful Math library), we will quickly reach the AWS Lambda deployment size limit of 250MB of unzipped source code (including node_modules).

But what is the solution?


Using Webpack to bundle a TypeScript file and only its required third-party dependencies into a single, minified JavaScript file

First, let’s install Webpack and a few dependencies:

npm i webpack webpack-cli ts-loader
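ts-loader also needs a tsconfig.json at the root of the project to know how to compile your TypeScript. If you don’t have one yet, a minimal sketch could look like the following; the exact compilerOptions values are assumptions, so adjust them to your project:

```json
{
  "compilerOptions": {
    "target": "es2017",
    "module": "commonjs",
    "strict": true,
    "esModuleInterop": true
  }
}
```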

Create a webpack.config.js file at the root of your project:

const path = require('path');
const fs = require('fs');

// Search for Lambdas in this folder
const dir = path.resolve(__dirname, 'lambdas');
const handlers = fs.readdirSync(dir).filter(function (file) {
    // Get only .ts files (ignore .d.ts)
    return file.match(/(^.?|\.[^d]|[^.]d|[^.][^d])\.ts$/);
});

// Collect Webpack suitable entries
// This object will contain a key=>value array of
// filename => file absolute path
// which is what Webpack expects
// @see
const entries = {};
handlers.forEach(handler => {
    // Remove extension from filename
    const filenameWithoutExt = handler.replace('.ts', '');
    entries[filenameWithoutExt] = path.join(dir, handler);
});

module.exports = {
    entry: entries,

    mode: 'production',

    target: 'node',
    module: {
        rules: [
            {
                test: /\.tsx?$/,
                use: {
                    // Pass .ts files through this tool
                    loader: 'ts-loader',
                    options: {
                        // Ignore potential TypeScript compilation errors
                        transpileOnly: true,
                    },
                },
                exclude: /node_modules/,
            },
        ],
    },
    resolve: {
        modules: [
            path.resolve(__dirname, 'node_modules'),
        ],
        extensions: ['.tsx', '.ts', '.js'],
    },
    output: {
        libraryTarget: 'umd',

        // Where to write the resulting compiled .js files
        path: path.resolve(__dirname, 'dist/lambdas'),

        // Define a specific naming scheme if you need one
        filename: "[name].js"
    }
};

The configuration file mostly defines where Webpack will look for files to compile (./lambdas in this case), how it will compile them (using “ts-loader”) and where to store the results (./dist/lambdas).

Now, when we run “npx webpack” and inspect the output folder ./dist/lambdas, we should see our main Lambda handler (the 19 lines of code above) compiled to JavaScript and bundled together with only the code it actually uses from node_modules. This means node_modules is no longer a deployment requirement inside the .zip file to be uploaded to AWS Lambda.

Let’s try deploying only the “artifacts” from the ./dist/lambdas folder now.

The unzipped size is not much bigger either (around 700KB). That is a significant difference between <1MB and the original deployment size of 100MB+.


Using the above approach, you should be able to safely keep developing new Lambdas and start using third-party libraries from node_modules without worrying about AWS Lambda deployment size limits.

Using Webpack compilation as a step between writing code and deploying it will make sure your deployments grow predictably in size, relative to the actual code you write and use (Webpack strips away unused code using a “tree-shaking” mechanism) – not relative to the code you add via “npm install …”.

Happy Lambda coding.

Are you or your company having issues with integrating AWS services, AWS CDK, Terraform or Docker? Drop me a line.

How to display a GitHub Actions status badge image in your repository’s README?

GitHub Actions is one of the latest additions to the sphere of CI/CD tools. It’s quite powerful and integrates directly with the GitHub repository where it runs.

One of the benefits of having a CI system is the ability to quickly understand whether your latest build is passing or not. The so-called “status badges” help with this, especially for open source projects.

Showing such a status badge image in your own repository is quite easy.

You just need to collect 3 pieces of information:

  • Your GitHub organization or username
  • Your GitHub repo’s name (the one visible in the URL)
  • The GitHub Actions workflow name (usually on the first line of /.github/workflows/main.yml)

Based on these 3 things, you can construct a URL to an image that’s automatically generated by GitHub Actions and represents the latest build status: https://github.com/<OWNER>/<REPOSITORY>/workflows/<WORKFLOW_FILE_PATH>/badge.svg

To use it inside a repository’s README, embed the image using Markdown syntax:

![Build passing](https://github.com/<OWNER>/<REPOSITORY>/workflows/<WORKFLOW_FILE_PATH>/badge.svg)
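If you prefer not to assemble the URL by hand, here is a small JavaScript sketch that puts the three pieces of information together. The function name and the example owner/repo values are mine, purely illustrative:

```javascript
// Hypothetical helper – builds the badge image URL from the three
// pieces of information listed above.
function buildBadgeUrl(owner, repository, workflow) {
    const base = 'https://github.com';
    // encodeURIComponent handles workflow names that contain spaces
    return `${base}/${owner}/${repository}/workflows/${encodeURIComponent(workflow)}/badge.svg`;
}

// Example usage:
// buildBadgeUrl('octocat', 'hello-world', 'CI')
//   → 'https://github.com/octocat/hello-world/workflows/CI/badge.svg'
```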

Terraform error “InvalidParameterException: The target group with targetGroupArn arn:aws:xxx does not have an associated load balancer”. How to solve it?

Recently, while trying to create a Terraform stack for the infrastructure of an experimental app, I encountered a cryptic error:

Error: Error applying plan:

1 error(s) occurred:

* module.ecs.aws_ecs_service.current_task: 1 error(s) occurred:

* aws_ecs_service.current_task: InvalidParameterException: The target group with targetGroupArn arn:aws:elasticloadbalancing:eu-west-1:1111111111:targetgroup/xxx/xxxxxxxxxxxxxxx does not have an associated load balancer.
        status code: 400, request id: 5e027278-85ba-asdf-bd5b-cd67eda86a92 "test-apis"

After some research, I discovered that the error appears due to a race condition: attempting to create an “aws_alb_target_group” Terraform resource before the underlying ALB (load balancer) is ready triggers the error.

The solution (workaround) was simple: just enforce the order of resource creation using Terraform’s “depends_on” attribute.

For example:

resource "aws_lb" "this" {
  // ALB attributes here
}

resource "aws_alb_target_group" "frontend" {
  name     = "frontend"
  port     = 80
  protocol = "HTTP"
  vpc_id   = xxxxx
  depends_on = [
    // Important bit is here
    aws_lb.this
  ]
}
Happy Terraforming!

Can modern browsers run TypeScript code, without compiling?

Recently, I bumped into this interesting question as part of a discussion with a fellow developer. As a developer who uses TypeScript often (as part of creating various Angular projects), I was curious to understand what would happen if you try to run raw TypeScript code, such as the one below, directly in your browser without compiling (e.g. by putting it inside the <head> tag).

const sum = (a:number, b:number)=>a+b;

Go ahead… try it.

If you try to execute it, you will likely encounter the same error as I did:

Uncaught SyntaxError: Unexpected token ':'

The reason for this is that all modern browsers follow the ECMAScript scripting-language specification, the same way that TypeScript follows it. However, they each follow it at a different “pace”. Currently, TypeScript has adopted features from far more modern versions of ECMAScript, whereas browsers like Google Chrome or Mozilla Firefox have barely adopted support for ECMAScript 2015 and partial support for ECMAScript 2016.

As a result, several features that are available in TypeScript are still not available in IE or Firefox due to their slower release cycle and slower adoption of ES in general.

How do I run TypeScript code inside the browser then?

Short answer: you can’t. You need to compile the .ts code into plain .js that is understood by all browsers and browser versions (e.g. IE6). You should acknowledge that probably not all of your end users have the latest version of any given browser.

The good news is that the framework you are using (e.g. Angular, React) probably already ships with a good compiler that does the heavy lifting for you (e.g. Babel). A compiler’s job is to take the “modern” TypeScript that you write and convert it into “old school” .js files that work across all browsers and browser versions (optionally including some polyfills).
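For instance, compiling the sum function from above (assuming an ES2015+ target, so the arrow function itself is preserved) simply strips the type annotations, leaving plain JavaScript that the browser can execute:

```javascript
// The TypeScript sum function from above, after compilation:
// the ":number" type annotations are stripped away.
const sum = (a, b) => a + b;

// Example usage:
console.log(sum(2, 3)); // prints 5
```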

A compiler may also create the so-called “source map” files that represent the links/relationship between the compiled .js code and the original .ts source code, allowing for easier debugging and breakpoint usage.

Why VueJS lost me on the first day I tried it?

VueJS is a great micro frontend framework for building single-page applications. However, one serious limitation struck me today, that will probably stop me from using the framework for any serious project from now on.

Specifically, the fact that you cannot use arrow functions (ES syntax) if you declare a VueJS method that reads or writes to one of the Component’s data properties (a very common scenario).

For example, let’s imagine you have a component that does 2 things: it has a form with a single text input and a button that triggers a search based on the text input value (think of it like the Google homepage). The value of the text input is usually stored as a data property within the component, through model binding (v-model="attribute-name-here"). As you press the button, you trigger a method within the component that is supposed to “read” that attribute, make an API call (e.g. using the axios library) and pass that value to the API. The API responds with some values that you may also want to map as data attributes of the component. Neither of these things is possible if you use the arrow syntax for the method.

The reason is that the arrow syntax changes the this context, so it no longer refers to the Vue component itself. This effectively means that the following example is not possible:

import axios from 'axios';

export default {
    name: 'HelloWorld',
    data: () => ({
        url: '',
    }),
    methods: {
        someMethod: () => {
            // Make an API call to retrieve some data by using a component data attribute
            axios
                .post('', {
                    // Try to get a component data variable
                    // This results in a "trying to get a property of undefined" error
                    // because "this" doesn't refer to the HelloWorld component
                    url: this.url
                })
                .then(() => {
                    // Do something with the response
                });
        },
    },
};
The solution is simple – replace the someMethod definition with a regular function(){}. This doesn’t really make sense in the modern frontend development age, though, and is not something Vue should enforce. Arrow functions are the future and I shouldn’t be forced to use a mixture of ES and non-ES syntax just because I rely on some framework (VueJS).
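The difference can be demonstrated outside of Vue with plain JavaScript. The object below is just a stand-in for a component (not real Vue code) and shows how this resolves in each case:

```javascript
const component = {
    url: 'https://example.com',

    // Regular function: "this" is the object the method is called on,
    // which is how Vue binds methods to the component instance
    regularMethod: function () {
        return this.url;
    },

    // Arrow function: "this" is captured from the surrounding scope
    // at definition time, so it never points at the object itself
    arrowMethod: () => {
        return this ? this.url : undefined;
    },
};

// component.regularMethod() returns 'https://example.com'
// component.arrowMethod() returns undefined
```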

You can read more about the reasoning why this happens in the Vue documentation. Here’s an excerpt:

Don’t use arrow functions on an options property or callback, such as created: () => console.log(this.a) or vm.$watch(‘a’, newValue => this.myMethod()). Since an arrow function doesn’t have a this, this will be treated as any other variable and lexically looked up through parent scopes until found, often resulting in errors such as Uncaught TypeError: Cannot read property of undefined or Uncaught TypeError: this.myMethod is not a function.

What’s your thought on that? Have you bumped into the same limitation? Is there an easy workaround or you just “get along” and consider this an acceptable thing?

How to clone a CodeCommit repo inside a Cloud9 environment?

Cloud9 is Amazon’s cloud-based IDE that makes writing and building projects a breeze. There are no licensing fees involved (unlike PHPStorm or other similar IDEs) and the added benefit is that you get tight integration with AWS services like Lambda and CodeCommit.

A common use case for Cloud9 is working collaboratively on a project hosted in Git (version control). And what’s better than using Amazon’s native Git repo hosting, called CodeCommit?

In this short tutorial, you’ll learn how to clone a CodeCommit repo inside a Cloud9 development environment. We will assume that you already have a Cloud9 environment created, you have logged in and there is an existing CodeCommit repo.

Steps to clone a CodeCommit repo inside Cloud9

While in the terminal of your Cloud9 environment, run the following commands:

git config --global credential.helper '!aws codecommit credential-helper $@'
git config --global credential.UseHttpPath true

What this does is tell the Git CLI to use the AWS CLI whenever Git HTTPS authentication is required (when cloning, pulling or pushing code). Additionally, the second line tells Git to also pass the repository path to the credential helper, which CodeCommit needs to authenticate requests.

The end result is that whenever you enter the Cloud9 environment and attempt to clone a CodeCommit repo, the AWS helper will kick in and use temporary AWS credentials that represent YOU (the current Cloud9 user) towards CodeCommit while performing Git operations.

The final step is to clone a Git repo inside Cloud9. Remember to always use the HTTPS clone URL instead of the SSH one, because SSH would fall back to private key-based authentication, which we haven’t configured above, so it won’t work.
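For reference, CodeCommit HTTPS clone URLs follow a predictable pattern. Here is a small JavaScript sketch (the helper name and the example region/repo are mine, purely illustrative) that builds one:

```javascript
// Hypothetical helper – builds the HTTPS clone URL for a CodeCommit repo.
// CodeCommit URLs follow the pattern:
// https://git-codecommit.<region>.amazonaws.com/v1/repos/<repository name>
function codeCommitCloneUrl(region, repoName) {
    return `https://git-codecommit.${region}.amazonaws.com/v1/repos/${repoName}`;
}

// Example usage:
// codeCommitCloneUrl('eu-west-1', 'my-repo')
//   → 'https://git-codecommit.eu-west-1.amazonaws.com/v1/repos/my-repo'
```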

Take the HTTPS clone URL from the CodeCommit console
Clone inside Cloud9 using the same HTTPS clone URL

Happy coding!