I am confused about mocking aws-sdk with jest: I am trying to mock the `putObject` method on the S3 prototype, but I cannot find the method there even though it seems like an instance method. I guess this is because a custom factory is used to build the client.
I tried reading the source code of aws-sdk-mock, but I could not figure out how it mocks individual SDK methods.
How can I mock `S3.putObject` with jest? My code is listed in this Stack Overflow question:
https://stackoverflow.com/questions/49187464/mocking-aws-sdk-s3putobject-instance-method-using-jest
Hi @vamsiampolu
The S3 client and all other service clients are generated dynamically at runtime, which is why you cannot find the operation on the constructor's prototype. You can, however, mock the operation after the S3 client is instantiated. Here is a simplified example:
```js
var AWS = require('aws-sdk');

describe('mock a method', () => {
  it('putObject', () => {
    var s3 = new AWS.S3({ region: 'us-west-2' });
    s3.putObject = jest.fn((params, cb) => {
      cb(null, 'data');
    });
    s3.putObject({}, function (err, data) {
      expect(data).toEqual('data');
    });
  });
});
```
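The same instance-level override also works for promise-style calls. Here is a hedged sketch using a plain stand-in object instead of a real `AWS.S3` instance, so it runs without the SDK installed (`FakeS3` and the response shape are illustrative, not from the SDK):

```js
// FakeS3 stands in for AWS.S3: operations live on instances, so we
// construct one and then replace the operation we care about.
function FakeS3() {}

const s3 = new FakeS3();

// Stub putObject to mimic AWS.Request#promise() returning a fake response.
s3.putObject = (params) => ({
  promise: () => Promise.resolve({ ETag: 'fake-etag' }),
});

s3.putObject({ Bucket: 'b', Key: 'k', Body: 'x' })
  .promise()
  .then((data) => console.log(data.ETag)); // prints "fake-etag"
```

With a real client, the assignment `s3.putObject = ...` would replace the dynamically generated operation in exactly the same way.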
Sorry to bring this back from the dead, but how do I mock a client that lives in a completely different file? In the example above, the S3 instance is created, modified, and then used in the same place, but I would like to do something like this:
```js
// test.js
const AWS = require('aws-sdk')
const DynamoDB = new AWS.DynamoDB.DocumentClient()
DynamoDB.get = jest.fn((params) => { }) // gets it from the local store
```

```js
// file.js
const AWS = require('aws-sdk')
const DynamoDB = new AWS.DynamoDB.DocumentClient()
DynamoDB.get({ TableName: 'test', ...etc }) // this would call the mock from test.js
```
I am not sure this is possible since I am loading the lib twice, but perhaps the module cache would make it work?
You can give it a try, but I don't think this will work. The module cache only helps by caching the service client constructors under the AWS namespace; when you construct the DynamoDB client, new operation objects are attached to that client object. The exception would be to export the already constructed client from one module and import it in both test.js and file.js, but that would require changing your code, so I don't think it will be easy.
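The point about operations being attached per client can be illustrated with a plain stand-in constructor (`FakeDynamoDB` is a hypothetical name, used only to mimic how the SDK attaches fresh operation methods on each `new` call):

```js
// Hypothetical stand-in for an AWS service constructor: each `new` call
// attaches fresh operation methods, like the SDK's dynamic clients do.
function FakeDynamoDB() {
  this.get = function (params) { return 'real call'; };
}

const clientA = new FakeDynamoDB();
const clientB = new FakeDynamoDB();

// Overriding the method on one instance...
clientA.get = function () { return 'mocked'; };

// ...does not affect a separately constructed instance.
console.log(clientA.get({})); // 'mocked'
console.log(clientB.get({})); // 'real call'
```

This is why mocking the instance in test.js would not affect the separate instance constructed in file.js, even with the module cache in play.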
I've made it work like this:
```js
// updateQueues.js
import AWS from 'aws-sdk'

// exported so tests can mock its methods
export const sqs = new AWS.SQS()

export const main = async event => {
  // ...
  sqs.deleteQueue({ aaa: 1234 })
}
```

```js
// updateQueues.test.js
import { main as updateQueues, sqs } from './updateQueues'

const addEvent = require('../../mocks/tableAddEvent.json')

describe('updateQueues', () => {
  it('deletes queue for removed object', async () => {
    sqs.deleteQueue = jest.fn()
    await updateQueues(addEvent)
    expect(sqs.deleteQueue).toHaveBeenCalledWith({ aaa: 1234 })
  })
})
```
@esbenvb, I guess that could actually work, since the value is exported; it simply has to be replaced in the test before the code under test runs. Thanks for that 👍
Looks like this was resolved.
Although it looks resolved, exporting a variable like that purely for test purposes breaks the principle of encapsulation.
In other words, if an export has no meaning to consumers of the module and serves only as a workaround for mocking in tests, encapsulation is broken.
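One way to avoid the export-for-test workaround is dependency injection: pass the client into the handler instead of constructing it at module scope. A minimal sketch, where `createHandler` and the stub are hypothetical names rather than code from this thread:

```js
// The handler receives its SQS client as an argument, so tests can pass
// a stub and production code can pass a real AWS.SQS instance.
function createHandler(sqs) {
  return async function main(event) {
    // ... derive queue params from the event ...
    return sqs.deleteQueue({ aaa: 1234 });
  };
}

// In a test, a plain stub records the calls; no exported instance needed.
const calls = [];
const stubSqs = { deleteQueue: (params) => { calls.push(params); } };

const main = createHandler(stubSqs);
main({}).then(() => {
  console.log(JSON.stringify(calls)); // prints [{"aaa":1234}]
});
```

Nothing is exported for the test's sake, so the module's public surface stays honest.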
@AllanFly120, I don't think your example is very useful because you're mocking a specific instance of AWS.S3: unless this is the exact instance you're using in the code under test (which would typically be in another file), your mock is not going to be used.
I had luck with the following pattern to mock out the behavior of the S3 getObject method, for instance:
```js
const AWS = require('aws-sdk');
const mockS3GetObject = jest.fn();

jest.mock('aws-sdk', () => {
  return {
    S3: jest.fn(() => ({
      getObject: mockS3GetObject
    }))
  };
});

describe('...', () => {
  beforeEach(() => {
    mockS3GetObject.mockReset();
  });

  test('...', async () => {
    mockS3GetObject.mockImplementation((params) => {
      return {
        promise() {
          return Promise.resolve({ Body: 'test document' });
        }
      };
    });
    expect(await functionUnderTest()).toEqual('test document');
  });
});
```

```js
// E.g., in another module
const AWS = require('aws-sdk');

async function functionUnderTest() {
  const resp = await new AWS.S3(...).getObject(...).promise();
  return resp.Body;
}
```
@Otavioensa Could you please elaborate a little more? I'm struggling to figure this out and want to do it right.
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs and link to relevant comments in this thread.