S3 as a Serverless Service

Link to chapter - https://serverless-stack.com/chapters/s3-as-a-serverless-service.html

Hi All,

I am trying to use this S3 serverless service by modifying the note taking app so that it compiles a user's notes into a CSV file and uploads it to the bucket created when this service was deployed. I have this feature defined as a new service. The Lambda function makes the query call defined in the list function from Part I, iterates over the items in the query result to push them into an array, joins the array with a "\n" separator, and then should pass the resulting string as the body of an s3.upload() call. The only problem is that it is not making that call.
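A rough sketch of the "iterate and join" step I described (a hypothetical helper; the noteId and content field names are my assumption about the item shape, not something from the chapter):

```javascript
// Hypothetical gatherNotes: turn DynamoDB query items into one CSV string.
// Field names (noteId, content) are placeholders for your table's attributes.
function gatherNotes(items) {
    const rows = items.map(function (item) {
        // Double any quotes in the note body so quotes/commas don't break the CSV.
        return item.noteId + ',"' + String(item.content).replace(/"/g, '""') + '"';
    });
    // Header row first, then one line per note, joined with "\n".
    return ["noteId,content"].concat(rows).join("\n");
}

const csv = gatherNotes([
    { noteId: "1", content: "buy milk" },
    { noteId: "2", content: 'say "hi"' },
]);
console.log(csv);
```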

Here is my function:

async function uploadNotes(csvBody) {
    const csvBuffer = Buffer.from(csvBody);
    let params = {
        Bucket: 'notes-app-mono-uploads-dev-mycsvbucket',
        Key: 'private/' + event.requestContext.identity.cognitoIdentityId + '/notes.csv',
        Body: csvBuffer
    };
    console.log("csvBuffer: \n" + csvBuffer);
    s3.upload(params, function(err, data) {
        console.log("s3.upload function should be happening here");
        if (err) console.log(err, err.stack); // an error occurred
        else console.log(data);               // successful response
    });
}

I set up a mock like the list mock from Part I to test this. The output shows the "csvBuffer" log with all the notes from my test user, but the "s3.upload function should be happening here" console.log, the err console.log, and the data console.log do not show up in the output.

My try block:

try {
    const result = await dynamoDbLib.call("query", dynamoParams);
    const csvString = await gatherNotes(result.Items);
    uploadNotes(csvString);
    return success({ status: true });
} catch (e) {
    return failure({ status: false });
}

After running the invoke local command to test this, the output also shows that it returned the { status: true } object, indicating that the Lambda call was successful.

I am somewhat new to serverless but I feel like this should work. My auth service is set up to allow the authorized user to upload to the bucket just like it allows the same user to query the notes table. The query works but the upload doesn’t.
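For context, the relevant statement in my auth role looks roughly like this (the bucket name is a placeholder for yours); it scopes uploads to the user's own private/ prefix the same way the chapter scopes table access:

```json
{
  "Effect": "Allow",
  "Action": "s3:*",
  "Resource": "arn:aws:s3:::notes-app-mono-uploads-dev-mycsvbucket/private/${cognito-identity.amazonaws.com:sub}/*"
}
```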

Anyone have any idea what I am missing? Any help would be much appreciated.

All the best,


I’m not entirely sure what is going on here, but don’t you need an await on the s3.upload call?

Hey Jay!

Sorry about the word soup. Guess I should proofread before I post, amirite? Anyway, you were correct: that was the problem. I needed to add await in front of the s3.upload call, and I also needed to append .promise() to the s3.upload(params) call.

I ended up rewriting the code as follows:

async function uploadNotes(csvBody) {
    let csvBuffer = Buffer.from(csvBody);
    let params = {
        ACL: "public-read",
        Bucket: process.env.notesBucket,
        Key: 'private/' + event.requestContext.identity.cognitoIdentityId + '/notes.csv',
        Body: csvBuffer
    };
    try {
        const result = await s3.upload(params).promise();
        console.log(result);
    } catch (e) {
        console.log(e, e.stack);
    }
}

I was able to find more help at forum.serverless.com.

Anyway, thanks again!


Glad you figured it out!

Are you able to dynamically set environment variables in Serverless? For example, setting an S3_BUCKET variable where the value is Fn::ImportValue: ${self:custom.stage}-mybucket
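Concretely, I mean something like this in serverless.yml (the export name ${self:custom.stage}-mybucket would be whatever the other service exported; names here are placeholders):

```yaml
provider:
  environment:
    S3_BUCKET:
      Fn::ImportValue: ${self:custom.stage}-mybucket
```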