Moto Mock Testing with AWS Lambda: Local, Fast & Cost-Effective

Testing AWS Lambdas in a real AWS environment can be slow, costly and risky, especially if you’re spinning up DynamoDB tables or S3 buckets just to run a few tests. Every deployment takes time and mistakes can easily lead to unexpected bills.

That’s where the moto Python library comes in. It lets you mock AWS services locally, so you can test your Lambda functions without touching the cloud, saving both time and money. In this post, find out how to:

  • Set up moto for AWS Lambda testing.
  • Create mocked DynamoDB and S3 resources.
  • Run tests that feel like production – only faster and safer.

Why use moto for AWS Lambda testing?

In this blog, we are going to look into ways of testing your AWS Lambdas – this time with mock testing and a Python library called moto. Moto mocks AWS resources inside your tests, so you do not need to deploy anything to AWS or risk spending money on additional resources – everything runs locally, in a safe environment.

In the code example below, you can find a base test class that can serve as a starting point for all your Lambdas – it mocks a DynamoDB table and an S3 bucket, so your boto3 calls behave like they do in production. The Lambda code under test comes from a project where a Car Marketplace was built with a Next.js and AWS stack.

Setting up the base test class

We’re going to go snippet by snippet here and explain how to create the necessary resources for executing the tests with mocked resources.

import os
import unittest
from unittest.mock import patch

from moto import mock_aws


@mock_aws
class BaseCarListingTest(unittest.TestCase):
    def setUp(self):
        self.table_name = 'test-table'
        self.bucket_name = 'test-bucket'
        self.jwt_secret = 'secret'
        self.env_vars = {
            'CAR_LISTINGS_TABLE': self.table_name,
            'CAR_LISTINGS_BUCKET': self.bucket_name,
            'SECRET_KEY': self.jwt_secret
        }
        patch.dict(os.environ, self.env_vars).start()

A few things are important here:

  • The @mock_aws decorator – this marks the class for the moto library, so moto knows to mock every AWS call made inside it. You can choose to mock specific methods or the whole test class – in this case, I mocked the whole class.
  • The setUp method – this runs before each test, so the resources are recreated for every test. You could use the setUpClass method here as well.
  • The env_vars dictionary and the patch.dict method – these create the environment variables for the tests which are going to be run. Remember – one of the best practices for writing AWS Lambdas is keeping the names of resources (table names, S3 bucket names, etc.) in environment variables.
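The patch.dict call is worth a closer look. Below is a small, hedged demonstration of how it sets and later restores environment variables (the variable name matches the one used in this post; note that calling .start() without an eventual .stop() leaves the variables set after the test):

```python
import os
from unittest.mock import patch

# patch.dict: start() applies the variables, stop() restores the original
# environment. In a unittest setUp you can register the cleanup with
# self.addCleanup(patcher.stop) so it runs even when a test fails.
patcher = patch.dict(os.environ, {'CAR_LISTINGS_TABLE': 'test-table'})
patcher.start()
print(os.environ['CAR_LISTINGS_TABLE'])  # the variable is now visible

patcher.stop()  # the original environment is restored
```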

Defining Lambda resources – DynamoDB and S3

In the following code block, we define the global variables for the Lambda – _LAMBDA_S3_CLIENT and _LAMBDA_DYNAMODB_RESOURCE. We do it this way because it follows the best practices for writing AWS Lambdas: making these resources global means they are initialised once per container, which makes the Lambda faster and more efficient.

_LAMBDA_DYNAMODB_RESOURCE = {
    "resource": boto3.resource('dynamodb'),
    "table_name": 'test-table'
}
_LAMBDA_S3_CLIENT = {
    "client": boto3.client('s3',
        endpoint_url=f'https://s3.{environ.get("AWS_REGION", "eu-central-1")}.amazonaws.com'
    ),
    "bucket_name": 'test-bucket'
}
self.dynamodb_class = LambdaDynamoDBClass(_LAMBDA_DYNAMODB_RESOURCE)
self.s3_class = LambdaS3Class(_LAMBDA_S3_CLIENT)

Objects self.dynamodb_class and self.s3_class will be used later in the tests themselves.
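The LambdaDynamoDBClass and LambdaS3Class wrappers come from the project's common.common module, which is not shown in this post. Here is a minimal sketch of what they might look like, based purely on how they are used here – the real implementation may differ:

```python
# Hypothetical sketch of the wrapper classes imported from common.common.
# Each simply unpacks the global dictionary passed to it.
class LambdaDynamoDBClass:
    def __init__(self, lambda_dynamodb_resource):
        self.resource = lambda_dynamodb_resource["resource"]
        self.table_name = lambda_dynamodb_resource["table_name"]
        # Pre-resolving the Table object lets handler code call
        # dynamodb.table.put_item(...) directly
        self.table = self.resource.Table(self.table_name)


class LambdaS3Class:
    def __init__(self, lambda_s3_client):
        self.client = lambda_s3_client["client"]
        self.bucket_name = lambda_s3_client["bucket_name"]
```

Because the wrappers take the resource objects as plain arguments, the tests can hand them mocked resources without the Lambda code noticing any difference.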

Creation of the mocked DynamoDB table and S3 bucket

This code is using resource and client objects for DynamoDB and S3, respectively, which we created in the code above, to create the mocked DynamoDB table and S3 bucket.

The code creates the DynamoDB table, makes the field uuid the partition key and then waits until the table exists, so we eliminate the chance of a test starting before the table is created. The same goes for the S3 bucket, only a bit simpler.

self.dynamodb_table = self.dynamodb_class.resource.create_table(
    TableName=environ.get('CAR_LISTINGS_TABLE'),
    KeySchema=[
        {
            'AttributeName': 'uuid', # Defining the DynamoDB’s partition key
            'KeyType': 'HASH'
        }
    ],
    AttributeDefinitions=[
        {
            'AttributeName': 'uuid',
            'AttributeType': 'S'
        }
    ],
    ProvisionedThroughput={
        'ReadCapacityUnits': 1,
        'WriteCapacityUnits': 1
    }
)
self.dynamodb_table.meta.client.get_waiter('table_exists').wait(TableName=self.dynamodb_class.table_name) # Waiting for the table to be created

# Creating our mocked S3 bucket
self.s3_bucket = self.s3_class.client.create_bucket(
    Bucket=self.s3_class.bucket_name,
    CreateBucketConfiguration={
        'LocationConstraint': 'eu-central-1'
    })
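If you prefer explicit cleanup between tests, a tearDown counterpart to setUp can remove the mocked resources again. This is a sketch under the assumptions of the base class above (moto's mocks are in-memory, so this is mainly about keeping each test's starting state predictable):

```python
# Hedged sketch: a tearDown method for BaseCarListingTest that removes the
# mocked table and bucket created in setUp.
def tearDown(self):
    # Delete the mocked DynamoDB table
    self.dynamodb_table.delete()

    # An S3 bucket must be empty before it can be deleted
    contents = self.s3_class.client.list_objects_v2(
        Bucket=self.s3_class.bucket_name
    ).get('Contents', [])
    for obj in contents:
        self.s3_class.client.delete_object(
            Bucket=self.s3_class.bucket_name, Key=obj['Key']
        )
    self.s3_class.client.delete_bucket(Bucket=self.s3_class.bucket_name)
```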

The Lambda Function under test

Before we start our test, here is the Lambda code we’re testing:

# Importing code
import json
import uuid

from common.common import (
    LambdaDynamoDBClass,
    LambdaS3Class,
    _LAMBDA_DYNAMODB_RESOURCE,
    _LAMBDA_S3_CLIENT,
    lambda_middleware,
    get_email_from_jwt_token
)

@lambda_middleware
def lambda_handler(event, context):
    # Loading the received event and creating the variables
    event_body = json.loads(event.get('body')) if 'body' in event else event
    print(f"Received event in lambda_handler: {event_body}")
    
    listing_id = str(uuid.uuid4())
    creator_email = get_email_from_jwt_token(event.get('headers').get('authorization'))
    listing_description = event_body.get('description')
    listing_price = event_body.get('price')
    listing_city = event_body.get('city')
    listing_country = event_body.get('country')
    listing_bedrooms = event_body.get('bedrooms')
    listing_rooms = event_body.get('rooms')
    listing_size = event_body.get('size')
    listing_type = event_body.get('type')
    listing_coordinates = event_body.get('coordinates')
    listing_number_of_images = event_body.get('listing_number_of_images')
    
    # Loading our global variables and calling the method which does our Lambda logic
    global _LAMBDA_DYNAMODB_RESOURCE
    global _LAMBDA_S3_CLIENT

    dynamodb_class = LambdaDynamoDBClass(_LAMBDA_DYNAMODB_RESOURCE)
    s3_class = LambdaS3Class(_LAMBDA_S3_CLIENT)
    
    # Calling the method with our Lambda logic
    return create_new_listing(
        creator_email,
        listing_id,
        listing_description,
        listing_price,
        listing_city,
        listing_country,
        listing_bedrooms,
        listing_rooms,
        listing_size,
        listing_type,
        listing_coordinates,
        listing_number_of_images,
        dynamodb_class,
        s3_class
    )

def create_new_listing(
    creator_email, listing_id, listing_description,
    listing_price, listing_city,
    listing_country, listing_bedrooms,
    listing_rooms, listing_size,
    listing_type, listing_coordinates,
    listing_number_of_images,
    dynamodb, s3
):
    try:
        # Save the listing to DynamoDB
        dynamodb.table.put_item(
            Item={
                'uuid': listing_id,
                'listing_author': creator_email,
                'listing_description': listing_description,
                'listing_price': listing_price,
                'listing_city': listing_city,
                'listing_country': listing_country,
                'listing_bedrooms': listing_bedrooms,
                'listing_rooms': listing_rooms,
                'listing_size': listing_size,
                'listing_type': listing_type,
                'listing_coordinates': listing_coordinates,
                'listing_images_location': listing_id
            }
        )

        # Provide presigned URLs for each image
        presigned_urls = []
        for i in range(listing_number_of_images):
            # Generate a presigned URL for each image
            presigned_url = s3.client.generate_presigned_url(
                'put_object',
                Params={'Bucket': s3.bucket_name, 'Key': f"{listing_id}/{i}"},
                ExpiresIn=3600  # URL expires in 3600 seconds (1 hour)
            )
            presigned_urls.append(presigned_url)

        return json.dumps({
            'statusCode': 200,
            'upload_car_image_url': presigned_urls
        })

    except Exception as e:
        print(f"Error processing the event: {e}")
        return json.dumps({
            'statusCode': 500,
            'error_message': 'An error occurred while processing the request.'
        })

It may look complex, but it’s just a Lambda that saves a new car listing when a user requests to create one.
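For reference, here is a hypothetical API Gateway-style event shaped the way lambda_handler reads it – all values are illustrative, and the authorization header is a placeholder rather than a real JWT:

```python
import json

# Illustrative event: the handler expects a JSON string under "body" and the
# JWT in the lowercase "authorization" header.
sample_event = {
    "headers": {"authorization": "<jwt-token>"},
    "body": json.dumps({
        "description": "Spacious apartment near the city centre",
        "price": 150000,
        "city": "Berlin",
        "country": "Germany",
        "bedrooms": 3,
        "rooms": 5,
        "size": 120,
        "type": "apartment",
        "coordinates": {"lat": 52.52, "lng": 13.405},
        "listing_number_of_images": 3
    })
}

# Mirrors the handler's first line: parse "body" if present
event_body = json.loads(sample_event["body"]) if "body" in sample_event else sample_event
```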

Writing the tests

In this simple test suite, we are only going to test the create_new_listing method, which takes the user’s input – the listing description, price, location and other information – and tries to put it into the DynamoDB table and generate S3 upload URLs.

We’re going to test that the response status code is 200, that the Lambda returns upload URLs, and check in the database whether the Lambda created the entry.

# Imports
import unittest
from moto import mock_aws
import json
import uuid

from base_listing_test import BaseCarListingTest
from CreateRealEstateListing.lambda_handler import create_new_listing

# Marking our test class with the mock_aws decorator to make our test class use our mocked DynamoDB table and S3 bucket
@mock_aws
class TestCreateCarListing(BaseCarListingTest):
    
    # Test for creating a new listing which calls the Lambda logic method and verifies the response of the Lambda
    def test_create_new_listing_success(self):
        listing_id = str(uuid.uuid4())
        response = create_new_listing(
            'test@test.com', listing_id, 'Beautiful apartment',
            150000, 'Berlin', 'Germany', 3, 5, 120,
            'apartment', {'lat': 52.52, 'lng': 13.405}, 3,
            self.dynamodb_class, self.s3_class
        )

        response_body = json.loads(response)
        self.assertEqual(response_body['statusCode'], 200)
        self.assertIn('upload_car_image_url', response_body)
        self.assertEqual(len(response_body['upload_car_image_url']), 3)
    
    # Test for creating the new listing, however this time we are expecting the Lambda to return that information is missing
    def test_create_new_listing_failure(self):
        response = create_new_listing(
            None, str(uuid.uuid4()), 'Missing email test',
            100000, 'Paris', 'France', 2, 4, 100,
            'house', {'lat': 48.8566, 'lng': 2.3522}, 2,
            self.dynamodb_class, self.s3_class
        )

        response_body = json.loads(response)
        self.assertEqual(response_body['statusCode'], 500)
        self.assertIn('error_message', response_body)

if __name__ == '__main__':
    unittest.main()

Notice how we call the method containing the business logic directly. That’s why one of the best practices for writing AWS Lambdas is to separate the core business logic from the handler method: it makes mock testing easier, because we can inject the mocked S3 and DynamoDB objects straight into the business logic.
In this code, we have two simple tests – one with a successful execution, and one where we expect an error message.
This makes a huge difference in test automation – you don’t have to deploy your resources to test them; you can mock them and test them in the pipeline to confirm that your Lambdas are working as expected.

You can run your tests by opening your terminal and running python -m unittest <path_to_test_file>.

Conclusion and next steps

Mock testing AWS Lambdas with the Moto library enables developers to work faster by removing the delays associated with AWS deployments, while also reducing costs by avoiding the creation of real cloud resources.

It allows you to validate functionality in a controlled, local environment, making it safer and more efficient to iterate on your code. By integrating moto into your testing workflow, you can confidently ensure your Lambda functions work as expected before deploying to production.

To explore further how Lambda can help, don’t hesitate to reach out!
