How to mock AWS calls when using Boto3 (version 1.8 or higher)

Posted 2019-08-17 14:37

I have an API written in Python that makes calls to AWS services, specifically SQS, S3, and DynamoDB. I am trying to write unit tests for the API, and I want to mock all calls to AWS. I have done a lot of research into moto as a way to mock these services; however, every implementation I have tried fails to mock my calls and sends real requests to AWS. Looking into this problem, I found people discussing incompatibilities between boto and moto when using boto3>=1.8. Is there any way around this? My ultimate question is: is there an easy way to mock boto3 calls to SQS, S3, and DynamoDB, using either moto or some other library, when using boto3>=1.8?

Here are my current versions of boto3 and moto I am using:

boto3 == 1.9.314
moto == 1.3.11

Below is my latest attempt at using moto to mock calls to sqs. I defined a pytest fixture where I create a mock_sqs session and a (hopefully fake) queue. I use this fixture to unit test my get_queue_item function.

SQS Script

# ptr_api.aws.sqs
import boto3

REGION = 'us-east-1'

sqs_r = boto3.resource('sqs', REGION)
sqs_c = boto3.client('sqs', REGION)

def get_queue_item(queue_name):
    queue = sqs_r.get_queue_by_name(QueueName=queue_name)
    queue_url = queue.url

    response = sqs_c.receive_message(
        QueueUrl=queue_url,
        MaxNumberOfMessages=1,    
        VisibilityTimeout=10,
        WaitTimeSeconds=3
    )

    try:
        message = response['Messages'][0]
        receipt_handle = message['ReceiptHandle']
        delete_response = sqs_c.delete_message(
            QueueUrl=queue_url,
            ReceiptHandle=receipt_handle
        )
        return message['Body']
    except Exception as e:
        print("error in get_queue_item: ")
        print(e)
        return False

Test SQS Script

# test_sqs.py
import pytest
from moto import mock_sqs
import boto3
from ptr_api.aws.sqs import get_queue_item

@pytest.fixture(scope='session', autouse=True)
def sqs_mocker():
    mock = mock_sqs()
    mock.start()

    sqs_r = boto3.resource('sqs', 'us-east-1')
    sqs_c = boto3.client('sqs', 'us-east-1')

    queue_name = 'test_queue_please_dont_actually_exist'

    queue_url = sqs_c.create_queue(
        QueueName=queue_name
    )['QueueUrl']

    yield (sqs_c, queue_url, queue_name)
    mock.stop()

def test_get_queue_item(sqs_mocker):
    sqs_c, queue_url, queue_name = sqs_mocker

    message_body = 'why hello there' # Create dummy message
    sqs_c.send_message(              # Send message to fake queue
        QueueUrl=queue_url,
        MessageBody=message_body,
    )

    res = get_queue_item(queue_name) # Test get_queue_item function

    assert res == message_body

When I go to check the AWS console, however, I see the queue has actually been created. I have also tried moving around the order of my imports, but nothing seemed to work. I tried using the mock decorators, and I even briefly played around with moto's stand-alone server mode. Am I doing something wrong, or is it really just the boto3/moto incompatibility I have been hearing about with newer versions of boto3? Unfortunately, downgrading my version of boto3 is not an option. Is there another way to get the results I want with a different library? I have looked a little into localstack, but I want to make sure that is my only option before I give up on moto entirely.

1 Answer

不美不萌又怎样 · answered 2019-08-17 15:06

I figured out a way to mock all my AWS calls! I am confident now that moto and boto3>=1.8 currently have serious incompatibility issues. It turns out the problem is with botocore>=1.11.0, which no longer uses requests and instead uses urllib3 directly. This means moto cannot intercept HTTP calls with the responses library the way it did before, hence the incompatibility. To get around this, I instead created stand-alone moto servers for each of the AWS services I wanted to mock, which worked like a charm! Because the mock servers receive real HTTP requests rather than relying on patched requests, there were no issues with moto and responses.

I set these mock servers running in the background using a separate start_local.py script. Next, I made sure to change my unit tests' boto3 resource and client objects to reference these mock endpoints. Now I can run my pytests without any calls being made to AWS, and with no need to mock AWS credentials!

Below is the new start_local.py script and my updated sqs unit test:

Start local AWS services

# start_local.py
import boto3
import threading, subprocess

def start_sqs(port=5002):
    subprocess.call(["moto_server", "sqs", f"-p{port}"])

sqs = threading.Thread(target=start_sqs)

sqs.start()
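One wrinkle with launching moto_server in a background thread is that the test process can start sending requests before the server is actually listening. A small stdlib helper can poll the port first; this is a sketch, and the host and port are assumptions matching the start_local.py script above:

```python
import socket
import time

def wait_for_port(host, port, timeout=10.0):
    """Poll until a TCP port accepts connections; return False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # create_connection raises OSError until something is listening
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.1)
    return False
```

Calling something like `wait_for_port('localhost', 5002)` after `sqs.start()` (and raising if it returns False) makes the startup deterministic instead of racing the server.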

New Test SQS Script

import pytest
import boto3
import os
from ptr_api.aws import sqs

@pytest.fixture(scope='session', autouse=True)
def sqs_mocker():

    sqs_r_mock = boto3.resource('sqs', region_name='us-east-1', endpoint_url='http://localhost:5002')
    sqs_c_mock = boto3.client('sqs', region_name='us-east-1', endpoint_url='http://localhost:5002')

    queue_name = 'test_queue'

    queue_url = sqs_c_mock.create_queue(
        QueueName=queue_name
    )['QueueUrl']

    yield (sqs_r_mock, sqs_c_mock, queue_url, queue_name)

def test_get_queue_item(sqs_mocker):

    sqs_r_mock, sqs_c_mock, queue_url, queue_name = sqs_mocker

    message_body = 'why hello there' # Create dummy message
    sqs_c_mock.send_message(         # Send message to fake queue
        QueueUrl=queue_url,
        MessageBody=message_body,
    )

    sqs.sqs_r = sqs_r_mock # VERY IMPORTANT - Override boto3 resource global variable within imported module with mock resource
    sqs.sqs_c = sqs_c_mock # VERY IMPORTANT - Override boto3 client global variable within imported module with mock client
    res = sqs.get_queue_item(queue_name) # Test get_queue_item function

    assert res == message_body
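Assigning to `sqs.sqs_r` and `sqs.sqs_c` directly works, but the mocks then stay in place for every test that runs afterwards. pytest's `monkeypatch` fixture, or the stdlib equivalent sketched below, restores the originals automatically when the patch exits. Here a `SimpleNamespace` with hypothetical string values stands in for the real ptr_api.aws.sqs module, purely to illustrate the pattern:

```python
import types
from unittest import mock

# Hypothetical stand-in for the imported ptr_api.aws.sqs module
sqs = types.SimpleNamespace(sqs_r='real_resource', sqs_c='real_client')

def run_with_mocked_clients():
    # patch.object swaps the module attributes and restores them on exit,
    # so tests that run later still see the real boto3 objects
    with mock.patch.object(sqs, 'sqs_r', 'mock_resource'), \
         mock.patch.object(sqs, 'sqs_c', 'mock_client'):
        return (sqs.sqs_r, sqs.sqs_c)
```

Inside the `with` block the module sees the mocks; after it, `sqs.sqs_r` and `sqs.sqs_c` are back to their original values with no manual cleanup.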