Optimize images on S3 with AWS Lambda and Serverless
An image is uploaded to S3, which triggers a Lambda function that optimizes the image and puts it back. How hard can that be?
It turned out to be way trickier than I thought, especially since I'm not used to the Node ecosystem. So here I am,
sharing my research and the result. I hope it can be useful.
There are tons of great Node libraries that help with optimizing images. Node 8 is soon to be deprecated on AWS Lambda,
so why not start using the latest and greatest, Node 12?
I want to optimize the image once and store it back on S3. I don't want to optimize it on demand; I don't need anything
that dynamic, flexible, and resource-heavy.
I also want to use the Serverless framework, because that is what I use when deploying
my PHP applications.
The code is not too complicated. It is only about 100 lines, and if you can stand reading this many anonymous functions
you will be alright. I walk through it from top to bottom, and later you will see the full index.js.
First we require a bunch of stuff. We need ImageMagick and Imagemin plus some plugins. Note that I change the binPath
of ImageMagick… more about that later.
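A sketch of what that require block can look like. The exact imagemin plugins (jpegtran and pngquant) and the use of the `gm` package are my assumptions; the `appPath` pointing at `/opt/bin/` is the binPath change mentioned above:

```javascript
// Sketch of the requires. The plugin choice (jpegtran + pngquant) is an
// assumption; swap in whichever imagemin plugins you prefer.
const AWS = require('aws-sdk');
const async = require('async');
const imagemin = require('imagemin');
const imageminJpegtran = require('imagemin-jpegtran');
const imageminPngquant = require('imagemin-pngquant');

// Tell gm to use ImageMagick, and to look for its binaries in /opt/bin,
// where the Lambda layer puts them (more about that later).
const gm = require('gm').subClass({ imageMagick: true, appPath: '/opt/bin/' });

const s3 = new AWS.S3();
```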
I now define a function that returns 90% of the memory available to the Lambda. Then I start reading the S3 event.
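The memory helper can be a one-liner; a sketch (the 1024 MB fallback for runs outside Lambda is my assumption):

```javascript
// Return ~90% of the memory available to this Lambda, in MB.
// AWS_LAMBDA_FUNCTION_MEMORY_SIZE is set by the Lambda runtime;
// the 1024 fallback only matters when running outside Lambda.
const availableMemory = () =>
  Math.floor(parseInt(process.env.AWS_LAMBDA_FUNCTION_MEMORY_SIZE || '1024', 10) * 0.9);

console.log(availableMemory()); // 921 when the 1024 MB fallback is used
```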
I’ve decided to upload all images to a folder named uploads, and the optimized images will go to the optimized folder.
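A sketch of reading the event and deriving the destination key. The helper names are mine, and the "replace the first folder" logic is an assumption about how the uploads-to-optimized mapping can be done; note that S3 URL-encodes the key in the event, so it has to be decoded first:

```javascript
// Read bucket and key from the S3 event record.
// S3 URL-encodes the object key (spaces become '+'), so decode it.
const sourceFromEvent = (event) => {
  const record = event.Records[0].s3;
  return {
    bucket: record.bucket.name,
    key: decodeURIComponent(record.object.key.replace(/\+/g, ' ')),
  };
};

// uploads/photo.jpg -> optimized/photo.jpg (subdirectories are kept).
const destinationKey = (key) => key.replace(/^uploads\//, 'optimized/');

console.log(destinationKey('uploads/2020/cat.jpg')); // optimized/2020/cat.jpg
```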
Now I start the actual work. I use async.waterfall which will run each anonymous function in order.
I will start by downloading the image from S3 into a buffer.
Next I use ImageMagick to do some small optimizations, like rotating the image the correct way and removing metadata.
I also tell ImageMagick not to use too much memory.
Now I optimize the image with imagemin. Nothing fancy.
Aaand it is time to upload the image again. I also set some Cache-Control headers, saying that this image is good to
cache for 10 years and that it is “immutable” and will never change.
I upload the image to the same bucket but to a different folder.
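The upload parameters can be built as a plain object before handing them to S3. A sketch; the helper name and the exact max-age value are my assumptions (315360000 seconds is roughly 10 years):

```javascript
// Build the S3 putObject parameters for the optimized image.
// max-age=315360000 is ~10 years; "immutable" tells browsers to
// never revalidate this object.
const uploadParams = (bucket, key, body, contentType) => ({
  Bucket: bucket,
  Key: key.replace(/^uploads\//, 'optimized/'),
  Body: body,
  ContentType: contentType,
  CacheControl: 'public, max-age=315360000, immutable',
});

const params = uploadParams('my-image-bucket', 'uploads/cat.jpg', Buffer.from(''), 'image/jpeg');
console.log(params.Key); // optimized/cat.jpg
```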
Here is the full index.js with some error handling:
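A skeleton of how the file can be structured: download, manipulate with ImageMagick, optimize with imagemin, upload. This is a sketch under the assumptions above (the `gm` package with `appPath`, jpegtran and pngquant plugins, and the uploads-to-optimized key mapping), not a definitive listing:

```javascript
const AWS = require('aws-sdk');
const async = require('async');
const imagemin = require('imagemin');
const imageminJpegtran = require('imagemin-jpegtran');
const imageminPngquant = require('imagemin-pngquant');
const gm = require('gm').subClass({ imageMagick: true, appPath: '/opt/bin/' });

const s3 = new AWS.S3();

// ~90% of the memory available to this Lambda, in MB.
const availableMemory = () =>
  Math.floor(parseInt(process.env.AWS_LAMBDA_FUNCTION_MEMORY_SIZE || '1024', 10) * 0.9);

exports.handler = (event, context, callback) => {
  const record = event.Records[0].s3;
  const bucket = record.bucket.name;
  const key = decodeURIComponent(record.object.key.replace(/\+/g, ' '));

  // Only handle files in uploads/, never our own output.
  if (!key.startsWith('uploads/')) {
    return callback(null, 'Not an upload, skipping.');
  }

  async.waterfall(
    [
      // 1. Download the original image into a buffer.
      (next) => s3.getObject({ Bucket: bucket, Key: key }, next),

      // 2. Auto-rotate and strip metadata with ImageMagick, capped memory.
      (response, next) => {
        gm(response.Body)
          .limit('memory', `${availableMemory()}MB`)
          .autoOrient()
          .strip()
          .toBuffer((err, buffer) => next(err, response.ContentType, buffer));
      },

      // 3. Optimize with imagemin. Nothing fancy.
      (contentType, buffer, next) => {
        imagemin
          .buffer(buffer, { plugins: [imageminJpegtran(), imageminPngquant()] })
          .then((optimized) => next(null, contentType, optimized))
          .catch(next);
      },

      // 4. Upload to optimized/ with long-lived cache headers.
      (contentType, buffer, next) => {
        s3.putObject(
          {
            Bucket: bucket,
            Key: key.replace(/^uploads\//, 'optimized/'),
            Body: buffer,
            ContentType: contentType,
            CacheControl: 'public, max-age=315360000, immutable',
          },
          next
        );
      },
    ],
    (err) => {
      if (err) {
        console.error(err);
        return callback(err);
      }
      callback(null, `Optimized ${key}`);
    }
  );
};
```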
My package.json with my dependencies:
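A sketch of the shape, with package names matching the requires above; the version numbers are my assumptions. Note that aws-sdk does not need to be bundled, since the Lambda Node runtime already provides it:

```json
{
  "name": "image-optimizer",
  "private": true,
  "dependencies": {
    "async": "^3.1.0",
    "gm": "^1.23.1",
    "imagemin": "^7.0.0",
    "imagemin-jpegtran": "^6.0.0",
    "imagemin-pngquant": "^8.0.0"
  }
}
```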
This looks good and should work. However, when you run npm install on your local computer you will get dependencies
specific to your system and OS. If we just copy them over to Lambda we will get all kinds of weird errors about
incompatible native binaries.
I solved that issue with Docker. I build my bundle inside a Docker image very similar to the one Lambda uses in
production. I install some dependencies and then run npm install. When everything looks good inside the Docker image,
I use Serverless to deploy.
Here is my Dockerfile:
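A sketch of such a Dockerfile, based on the lambci build image that mirrors the nodejs12.x runtime; the image name and the exact steps are my assumptions:

```dockerfile
# Build image close to the Lambda nodejs12.x runtime (Amazon Linux).
FROM lambci/lambda:build-nodejs12.x

WORKDIR /app
COPY package.json package-lock.json ./

# Install dependencies compiled for Amazon Linux, not for my laptop.
RUN npm install --production

COPY . .

# Deploy from inside the container so the Linux node_modules are bundled.
CMD ["npx", "serverless", "deploy"]
```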
ImageMagick issues on Lambda nodejs10 and nodejs12
On the nodejs8 Lambda runtime Amazon included ImageMagick, but that is not true for nodejs10 and later. So we need to
provide an extra layer to our Lambda. (A “layer” is the AWS way to add libraries to your runtime.) I found
this GitHub repository which provides a layer you
can easily deploy.
I just clicked the “Deploy” button, waited five minutes, and got a new private layer in my AWS account.
The layer adds the required binaries in /opt/bin. That is why we need to tell ImageMagick to look in that folder.
I took my new layer ARN and added it to my serverless.yml:
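A sketch of the relevant parts of serverless.yml. The layer ARN below is a placeholder (use the ARN of your own deployed layer), and the exact event wiring is my assumption; `existing: true` is needed because the bucket was created by hand in the console:

```yaml
service: image-optimizer

provider:
  name: aws
  runtime: nodejs12.x
  memorySize: 1024
  timeout: 30

functions:
  optimize:
    handler: index.handler
    layers:
      # Placeholder ARN: replace with the one from your deployed layer.
      - arn:aws:lambda:us-east-1:000000000000:layer:image-magick:1
    events:
      - s3:
          bucket: my-image-bucket
          event: s3:ObjectCreated:*
          rules:
            - prefix: uploads/
          existing: true  # the bucket already exists; do not create it
```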
That is pretty much it. Now I'm ready for deployment. I created my bucket in the AWS console, and then I run the following:
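The deployment itself is the standard Serverless command, run from inside the Docker container (as described above) so the Linux node_modules end up in the bundle:

```shell
# Package and deploy the function, the layer reference, and the S3 trigger.
serverless deploy
```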
Now when I upload a file to my-image-bucket/uploads, the image will be optimized and added to my-image-bucket/optimized.
It also works for subdirectories.