Unit testing: Stubs, Fakes, Mocks

June 24, 2024



Effective communication with your team is crucial. Whatever terms you use, agreeing on them and adapting to your team's preferences is the most important takeaway.

This is how I think of these terms and how they are used on my team.

Stubs

A stub is a minimal unimplemented module.

This is not specific to unit testing. When you begin working on a module, it will have some required interface to allow it to be coupled to the overall system. Generally, you would “stub out” some minimal code that will allow your system to compile. Once you have a stub, you can work on adding tests and filling in the functionality.

Creating a stub example:

interface User {
  name: string;
  email: string;
}

interface UserModuleInterface {
  getUser(): User;
}

class UserModule implements UserModuleInterface {
  getUser(): User {
    throw new Error('Not implemented');
  }
}

test('User module implements interface', () => {
  // Interfaces are erased at runtime, so check the expected shape instead of instanceof.
  let sut: UserModuleInterface = new UserModule();
  expect(typeof sut.getUser).toBe('function');
});

This is just enough to make the project compile and run. The next thing you might do is have it actually return a User, but you don’t have any users yet, so you need to create fake users.

Fakes

A fake is a data object with made-up data.

You could return hard-coded literals, but it’s often useful to have some variety. There are useful generators such as faker to help out.

Creating a fake user example:

import { faker } from '@faker-js/faker';

function createFakeUser() {
  return {
    name: faker.person.fullName(),
    email: faker.internet.email()
  };
}

Great, but we don’t want to create fakes in the real implementation, so how can we use this in our test environment and inject it into the system?

Mocks

A mock is a substitute implementation of a dependency that is injected into the system under test.

Mocks are used in unit testing to replace external dependencies. This allows for predictable/testable behavior. For example, mocking network requests allows you to test positive cases and fault cases without the unpredictable nature of actual network calls.

Mocks will often include some metadata so that you can “Spy” on them and understand how they are used after being injected.
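
If you use a test framework like Jest (which the test blocks in this post assume), jest.fn() can record that metadata for you. Here is a minimal sketch of spying on a mocked network call, covering a positive case and a fault case:

test('spying on a mocked network call', async () => {
  // jest.fn() records every call, so the test can "spy" on how it was used.
  const fetchUser = jest.fn()
    .mockResolvedValueOnce({ name: 'Ada' })            // positive case
    .mockRejectedValueOnce(new Error('Network down')); // fault case

  const user = await fetchUser('user-1');
  expect(user.name).toBe('Ada');
  await expect(fetchUser('user-1')).rejects.toThrow('Network down');

  expect(fetchUser.mock.calls.length).toBe(2);       // how many times it was called
  expect(fetchUser.mock.calls[0][0]).toBe('user-1'); // what it was called with
});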

In an integration test, the real implementation is used.

Creating a mock example:

interface Database {
  find(key: string): any;
}

class UserModule implements UserModuleInterface {
  private db: Database;

  constructor(db: Database) {
    this.db = db;
  }

  getUser() {
    return this.db.find('user');
  }
}

class MockDatabase implements Database {
  private db: { [key: string]: any } = {};
  calls = { find: 0, insert: 0 };

  find(key: string) {
    this.calls['find'] += 1;
    return this.db[key];
  }

  insert(key: string, value: any) {
    this.calls['insert'] += 1;
    this.db[key] = value;
  }
}

test('UserModule returns user from db', () => {
  let dbMock = new MockDatabase();
  let fakeUser = createFakeUser();
  dbMock.insert('user', fakeUser);
  let sut = new UserModule(dbMock);
  let user = sut.getUser();

  expect(dbMock.calls['find']).toBe(1);
  expect(user).toBe(fakeUser);
});
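
For contrast, an integration test would construct the same UserModule with the real dependency instead of MockDatabase. As a rough sketch, assuming a hypothetical PostgresDatabase class that implements the same Database interface:

test('UserModule returns user from the real database (integration)', () => {
  // PostgresDatabase is hypothetical here; substitute your actual implementation.
  let db = new PostgresDatabase('postgres://localhost/test');
  let sut = new UserModule(db);

  // No call counters to spy on; the test exercises the real implementation end to end.
  expect(sut.getUser()).toBeDefined();
});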

That is how stubs, fakes, and mocks work together to build up tests and implementations.


Using Multer with Multer-S3 for File Uploads to AWS S3

June 14, 2024



When building web applications that handle file uploads, one of the most common requirements is the ability to store these files securely and efficiently. AWS S3 (Amazon Simple Storage Service) provides a scalable, high-speed, low-cost web-based service aimed at online backup and storage. Integrating S3 with a Node.js application for file uploads can be streamlined using multer, a Node.js middleware for handling multipart/form-data (a standard way to upload files via HTTP), along with multer-s3, a library that allows multer to upload files directly to AWS S3.

In this blog post, we’ll explore how to implement file uploads in a Node.js application using multer and multer-s3. We’ll also discuss how to unit test this functionality by mocking AWS S3 to ensure our application behaves correctly without actually performing I/O operations against AWS infrastructure.

Setting Up Your Node.js Application

Before diving into the specifics, let’s set up a basic Express.js application. Ensure you have Node.js installed, and then initialize a new project:

mkdir multer-s3-demo
cd multer-s3-demo
npm init -y
npm install express multer @aws-sdk/client-s3 multer-s3 dotenv

Create an app.js file:

const express = require('express')
const app = express()
const port = process.env.PORT || 3000

app.get('/', (req, res) => {
  res.send('Hello World!')
})

app.listen(port, () => {
  console.log(`App listening at http://localhost:${port}`)
})

module.exports = app // for testing purposes

Add a script to start the application in package.json:

"scripts": {
    "start": "node app.js", // <-- Add this line
    "test": "echo \"Error: no test specified\" && exit 1"
  },

Now you can run the app with npm start. To check that things are working, send a request with curl "http://localhost:3000" and you should see Hello World!

Integrating Multer with AWS S3

To handle file uploads to AWS S3, you’ll need to configure multer with multer-s3. Update your project to include environment variables for AWS credentials:

  1. Set up environment variables in a .env file:

    AWS_ACCESS_KEY_ID=your_access_key
    AWS_SECRET_ACCESS_KEY=your_secret_key
    AWS_REGION=your_region
    AWS_S3_BUCKETNAME=your_bucket_name
    
    
  2. Configure multer and multer-s3 in a new file upload.js:

    require('dotenv').config()
    
    const S3Client = require('@aws-sdk/client-s3').S3Client
    const multer = require('multer')
    const multerS3 = require('multer-s3')
    
    const s3Client = new S3Client({
        region: process.env.AWS_REGION,
        credentials: {
          accessKeyId: process.env.AWS_ACCESS_KEY_ID ?? '',
          secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY ?? '',
        },
      })
    
    const upload = multer({
      storage: multerS3({
        s3: s3Client,
        bucket: process.env.AWS_S3_BUCKETNAME,
        metadata: function (req, file, cb) {
          cb(null, {fieldName: file.fieldname})
        },
        key: function (req, file, cb) {
          cb(null, Date.now().toString() + '-' + file.originalname)
        }
      })
    })
    
    module.exports = upload
    
    
  3. Create an endpoint to handle file uploads in app.js:

    const upload = require('./upload')
    
    app.post('/upload', upload.single('file'), (req, res) => {
      if (!req.file) {
        return res.status(400).send('No file uploaded.')
      }
      res.send({
        message: 'File uploaded successfully',
        file: req.file
      })
    })
    
    

At this point we should be able to upload a file to S3 through our API. We can test this by sending any handy/nearby file such as our app.js file.

curl "http://localhost:3000/upload" -F "file=@app.js"

Hopefully you will see a successful response. Double-check in S3 to make sure the file exists.

Example good response (includes file details):

{
 "message":"File uploaded successfully",
 "file":{
   "fieldname":"file",
   "originalname":"app.js",
   "encoding":"7bit",
   "mimetype":"application/octet-stream",
   "size":470,
   "bucket":"YOUR_AWS_S3_BUCKETNAME",
   "key":"2445762390563-app.js",
   "acl":"private",
   "contentType":"application/octet-stream",
   "contentDisposition":null,
   "contentEncoding":null,
   "storageClass":"STANDARD",
   "serverSideEncryption":null,
   "metadata":{"fieldName":"file"},
   "location":"https://BUCKET.s3.REGION.amazonaws.com/2445762390563-app.js",
   "etag":"\"077da556b90169d85ea622402a5137c5\""
 }
}

Error Handling in File Uploads

Handling errors effectively is crucial for maintaining a reliable application, especially when dealing with external services like AWS S3. Here are key strategies for robust error handling:

  • Catch AWS S3 Errors: Ensure that your application gracefully handles AWS errors such as network issues, access permissions, and exceeded storage limits.

    // In app.js, require the S3 exception type used by the error handler below:
    const { S3ServiceException } = require('@aws-sdk/client-s3')

    app.post('/upload',
      upload.single('file'), 
      (req, res) => {
        res.send({
          message: 'File uploaded successfully',
          file: req.file
        })
      }, 
      (err, req, res, next) => {
        if (err instanceof S3ServiceException) {
          console.log(err)
          return res.send('File upload exception')
        }
        res.send('Unexpected error')
      }
    )
    
  • Validate File Types and Sizes: Before a file reaches S3, validate its type and size within the Multer configuration. This prevents unnecessary S3 requests for files that don’t meet your criteria.

    Add a filter function to upload.js above the upload definition:

    const fileFilter = (req, file, cb) => {
      if (file.mimetype === 'image/jpeg' || file.mimetype === 'image/png') {
        cb(null, true)
      } else {
        cb(new Error('Invalid file type, only JPEG and PNG are allowed!'), false)
      }
    }
    

     Then we can use this in our multer configuration, along with setting a file size limit:

    const upload = multer({
      storage: multerS3({
        s3: s3Client,
        bucket: process.env.AWS_S3_BUCKETNAME,
        metadata: function (req, file, cb) {
          cb(null, {fieldName: file.fieldname})
        },
        key: function (req, file, cb) {
          cb(null, Date.now().toString() + '-' + file.originalname)
        }
      }),
      fileFilter,
      limits: { fileSize: 1024 * 1024 * 5 } // 5 MB limit
    })
    
  • Handle Multer-Specific Errors: Listen for errors on the request object. This can catch Multer errors related to file size and field names that do not align with your expectations. Note that we also move S3 error handling here as well.

    const multer = require('multer') // app.js needs multer for the MulterError check below

    app.post('/upload',
      (req, res) => {
        const uploadSingle = upload.single('file')
        uploadSingle(req, res, (err) => {
          if (err instanceof multer.MulterError) {
            // Eg. File too large, Unexpected field
            return res.status(400).send(err.message) 
          } else if (err instanceof S3ServiceException) {
            // Eg. The bucket does not allow ACLs
            return res.status(500).send(err.message)
          } else if (err) {
            // Eg. Invalid file type
            return res.status(400).send(err.message) 
          }
          if (!req.file) {
            // Eg. File field was empty
            return res.status(400).send('No file uploaded.') 
          }
          res.send({
            message: 'File uploaded successfully',
            file: req.file
          })
        })
      }
    )
    

Customizing the Location Based on req and file

To dynamically customize where files are stored based on the request or file attributes, you can modify the key function in your Multer-S3 configuration:

  • Dynamic Paths Based on User, Date, or File: You can use request headers or file metadata to build the file path dynamically. A sketch of populating req.user follows the snippet below.

    key: function (req, file, cb) {
      // Use user id and date to categorize files
      const userId = req.user.id // Assuming user info is available in request
      const date = new Date().toISOString().split('T')[0]
      cb(null, `uploads/${userId}/${date}/${file.originalname}`)
    }
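
The key function above assumes req.user has already been populated. How that happens depends on your auth setup; as a minimal, hypothetical sketch, a middleware registered in app.js before the /upload route could attach it:

// Hypothetical sketch: in a real app this would come from your auth layer
// (session, JWT, etc.) rather than a hard-coded placeholder.
app.use((req, res, next) => {
  req.user = { id: 'demo-user' }
  next()
})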
    

Writing Unit Tests by Mocking AWS S3

Unit testing file uploads can be challenging due to the external dependencies involved (AWS S3 in this case). However, by mocking these dependencies, we can simulate the behavior of our application without actually uploading files to S3.

  1. Install Jest and AWS SDK mocks:

    npm install --save-dev jest supertest aws-sdk-client-mock
    
    
  2. Set up Jest by configuring the package.json:

    "scripts": {
      "test": "jest"
    },
    "jest": {
      "testEnvironment": "node"
    }
    
    
  3. Tell the app not to listen when testing change in app.js:

    if (process.env.NODE_ENV !== 'test') {
      app.listen(port, () => {
        console.log(`App listening at http://localhost:${port}`)
      })
    }
    
  4. Create a test file upload.test.js:

    const request = require('supertest')
    const { S3Client, S3ServiceException } = require('@aws-sdk/client-s3')
    const { mockClient } = require('aws-sdk-client-mock')
    const app = require('./app')
    
    const s3Mock = mockClient(S3Client)
    
    describe('File upload', () => {
    
      beforeEach(() => {
        s3Mock.reset()
      })
    
      it('should upload a file to S3', async () => {
        const res = await request(app)
          .post('/upload')
          .attach('file', Buffer.from('test file'), 'test.jpeg')
        
        expect(s3Mock.calls().length).toBe(1)
        expect(res.status).toBe(200)
        expect(res.body.file).toBeDefined()
      })
    })
    

Advanced Testing Techniques

Going beyond basic tests, you can incorporate more complex scenarios and integrate testing with continuous integration pipelines:

  • Test for Different File Types and Sizes: Ensure your application only accepts files that meet specific criteria by testing with files of various sizes and types. A companion test for invalid file types is sketched after this list.

    it('should reject oversized files', async () => {
      s3Mock.restore()
      const res = await request(app)
        .post('/upload')
        .attach('file', Buffer.alloc(1024 * 1024 * 10), 'test-big-file.png')
    
      expect(res.status).toBe(400)
      expect(res.text).toContain('File too large')
    })
    
  • Mock Different S3 Responses: Use different mock implementations to simulate how your application handles S3 errors like ServiceUnavailable or AccessDenied.

    it('should error on S3 failure', async () => {
      const error = new S3ServiceException({
        name: 'ServiceUnavailable',
        message: 'ServiceUnavailable', // populate the message so the response body assertion below has text to match
        $fault: 'server',
        $metadata: {},
      })
      s3Mock.rejects(error)
    
      const res = await request(app)
        .post('/upload')
        .attach('file', Buffer.from('test file'), 'test.jpeg')
      
      expect(s3Mock.calls().length).toBe(1)
      expect(res.status).toBe(500)
      expect(res.text).toContain('ServiceUnavailable')
    })
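
As mentioned in the first bullet above, file types are worth covering too. A companion test, sketched here for the same describe block, reuses the mocked client and relies on the JPEG/PNG fileFilter added earlier:

it('should reject invalid file types', async () => {
  const res = await request(app)
    .post('/upload')
    .attach('file', Buffer.from('not an image'), 'notes.txt')

  // The fileFilter rejects non-JPEG/PNG uploads before anything is sent to S3.
  expect(s3Mock.calls().length).toBe(0)
  expect(res.status).toBe(400)
  expect(res.text).toContain('Invalid file type')
})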
    

By enhancing your application with sophisticated error handling, advanced testing, dynamic file path configuration, and rigorous pre-upload checks, you create a more robust, secure, and user-friendly file upload feature. These improvements not only optimize the operational aspects but also enhance the security and efficiency of handling user-generated content.

Conclusion

By integrating multer with multer-s3, Node.js applications can efficiently handle file uploads directly to AWS S3. The combination of these tools simplifies the file handling process, allowing developers to focus on core application logic without worrying about the details of file storage and security.

Mocking AWS S3 in unit tests is crucial for ensuring that our application behaves as expected without the overhead of real network I/O. This approach not only saves time during testing but also reduces costs associated with using AWS resources. With proper testing in place, we can confidently deploy our file upload features, knowing they’ll work correctly in production.


