Our use case
At Ophelos, we needed a fast, efficient, and automated way to handle client-uploaded files in our AWS S3 buckets. These files would be processed in the background within our core application using Sidekiq, which has Redis as a dependency and utilises it to queue jobs.
Summary of our solution
To meet our requirements, we opted to develop an AWS Lambda function using the Serverless Framework. This approach lets us use S3 event triggers to invoke the Lambda, which pushes a job onto our Redis instance. Whenever an event meets our criteria, Sidekiq picks the job up and processes it accordingly.
AWS Lambda functions offer an ideal solution for our problem statement. Although there are drawbacks such as cold start times, execution time limits, and the stateless nature of Lambda, they are negligible for our use case. Cold start times, for instance, will have minimal impact on our solution since we are triggering background tasks that aren’t significantly affected by this performance bottleneck.
It’s AWS Lambda’s scalability, security and event-driven nature that let us put together an effective solution.
Assuming you are familiar with Serverless, Lambda and Sidekiq jobs, let’s begin with an overview of our Serverless configuration.
We use the app and org properties in the config file so that we can utilise the Serverless Dashboard, a tool that lets us monitor and manage our Serverless services and applications from a centralised UI.
Our platform infrastructure takes advantage of VPC endpoints to allow private connections between supported AWS services. We therefore configure the vpc property at the provider level, and since each environment has different IDs and additional config, we define the environment-specific values in separate .yml files (see the snippets below).
The service is a TypeScript Node application, and the serverless-esbuild plugin is a great addition: it gives you zero-config JS and TS bundling with esbuild inside your Serverless setup.
The file(config/${sls:stage}.yml):params reference pulls the config from the environment-specific files.
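To give a feel for how these pieces fit together, here is a trimmed-down sketch of such a configuration. The service name, resource IDs and parameter names below are placeholders rather than our actual setup:

```yaml
# serverless.yml (illustrative sketch; names and values are placeholders)
org: your-org            # org + app enable the Serverless Dashboard
app: file-events
service: s3-file-events
frameworkVersion: '3'

plugins:
  - serverless-esbuild   # zero-config JS/TS bundling with esbuild

params:
  default: ${file(config/${sls:stage}.yml):params}   # environment-specific values

provider:
  name: aws
  runtime: nodejs18.x
  vpc: ${file(config/${sls:stage}.yml):vpc}          # environment-specific VPC config
  environment:
    REDIS_HOST: ${param:redisHost}
    REDIS_PORT: ${param:redisPort}

functions:
  handleS3Upload:
    handler: src/handler.handle
    events:
      - s3:
          bucket: ${param:uploadBucket}
          event: s3:ObjectCreated:*
          existing: true
```

And one of the matching environment-specific files might look like:

```yaml
# config/staging.yml (placeholder IDs)
vpc:
  securityGroupIds:
    - sg-0123456789abcdef0
  subnetIds:
    - subnet-0123456789abcdef0
    - subnet-0fedcba9876543210

params:
  redisHost: staging-redis.internal
  redisPort: '6379'
  uploadBucket: client-uploads-staging
```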
Now let’s detail the handler function. We need the S3 object key and the bucket name to build our Sidekiq job.
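As a rough sketch, a handler along these lines gets the job done. The RedisConnectionService import, queue name and worker class name are illustrative assumptions, and the job is written in Sidekiq's plain Redis job format rather than through a client library:

```typescript
// src/handler.ts (illustrative sketch, not our exact implementation)
import { S3Event } from 'aws-lambda';
import { randomBytes } from 'crypto';
import { RedisConnectionService } from './redisConnectionService'; // assumed helper, sketched below

export const handle = async (event: S3Event): Promise<void> => {
  const redis = await RedisConnectionService.connect();

  try {
    for (const record of event.Records) {
      const bucket = record.s3.bucket.name;
      // S3 keys arrive URL-encoded, with spaces as '+'
      const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));

      // A Sidekiq job is a JSON payload pushed onto the queue's Redis list
      const job = {
        class: 'ClientFileIngestionJob', // assumed worker class name
        queue: 'default',
        args: [{ bucket, key }],
        jid: randomBytes(12).toString('hex'),
        retry: true,
        created_at: Date.now() / 1000,
        enqueued_at: Date.now() / 1000,
      };

      // Register the queue and push the job, mirroring what Sidekiq's own client does
      await redis.sadd('queues', job.queue);
      await redis.lpush(`queue:${job.queue}`, JSON.stringify(job));
    }
  } finally {
    await redis.quit();
  }
};
```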
Our handler has a single responsibility, and we like to abstract any other logic into separate classes or functions. The RedisConnectionService is responsible for setting up the connection to the Redis instance. There are two good libraries that will do a lot of the heavy lifting for you when interacting with Redis.
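For illustration, a connection service built on ioredis (one popular option, node-redis being another) could look roughly like this; the environment variable names and the connect() signature are assumptions:

```typescript
// src/redisConnectionService.ts (illustrative sketch using ioredis)
import Redis from 'ioredis';

export class RedisConnectionService {
  // Build a client from the environment config supplied via our Serverless params
  // and connect explicitly so the handler fails fast if Redis is unreachable.
  static async connect(): Promise<Redis> {
    const client = new Redis({
      host: process.env.REDIS_HOST,
      port: Number(process.env.REDIS_PORT ?? 6379),
      tls: process.env.REDIS_TLS === 'true' ? {} : undefined,
      lazyConnect: true,
      maxRetriesPerRequest: 2,
    });

    await client.connect();
    return client;
  }
}
```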
Note: We have an open PR for sidekiq-client to return the promises from Redis, and the library does not ship any type definitions, so we had to declare the module in our decs.d.ts file. We aim to contribute further by adding type definitions to the library at some point in the future.
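In the meantime, a shorthand ambient declaration is enough to keep the TypeScript compiler happy, along these lines:

```typescript
// decs.d.ts
// Shorthand ambient module declaration: the library is typed as `any` until proper definitions exist
declare module 'sidekiq-client';
```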
So, there you have it, the basic solution to our use case.
We do have some additional complexities, such as having numerous S3 buckets that must trigger the same Lambda function. To address this, we use SNS topics to fan in multiple publishers (the S3 buckets) to a single subscriber, which in our case is our Lambda function.
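In practice, that means each bucket's notification configuration publishes its s3:ObjectCreated events to a shared topic, and the function subscribes to the topic instead of to individual buckets. Roughly, the subscription looks like this (the topic ARN below is a placeholder):

```yaml
functions:
  handleS3Upload:
    handler: src/handler.handle
    events:
      - sns:
          arn: arn:aws:sns:eu-west-2:123456789012:client-file-uploads   # placeholder ARN
```

One thing to bear in mind with this setup is that the Lambda now receives an SNS envelope, so the handler has to parse the S3 notification out of each record's Sns.Message before it can read the bucket and key.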
Alternatively, we could list each bucket individually in our Serverless configuration file. Despite being the simpler option, though, this approach would quickly become unwieldy as the number of buckets grows.
This is our simplified approach to triggering Sidekiq jobs using a Lambda function. I hope this method proves valuable to others!