Use AWS Services in a local environment
Q: Can we use AWS services without a subscription, without paying for services, and in completely offline mode?
Ans: Yes, we can, by using LocalStack. We can use almost all AWS services without paying and in completely offline mode. Sounds crazy, right? Let's get into the details of how to set up and use these services.
Keywords: LocalStack, AWS, SNS, SQS, Lambda, AWS CLI, free AWS services
Introduction: After development is completed, we deploy applications to AWS or another cloud provider such as Azure, GCP, or Alibaba. We usually take a subscription and use the required services until the free tier ends; after that we pay on a per-use basis. If we look carefully at this pipeline, we don't really have a mechanism to test AWS cloud services before deploying to the actual AWS cloud. If we don't test prior to deployment, we may hit errors and have to resolve them while paying for those services, and we can end up spending a lot of money just building, testing, and experimenting with pipelines in the AWS cloud. To avoid this, we can use LocalStack, which lets us set up AWS services locally and offline, so we can build, test, and experiment with pipelines before deploying to the actual AWS cloud.
LocalStack setup: LocalStack is a Python-based, dockerized application designed to run as an HTTP request processor listening on specific ports. Because it is dockerized, anyone can install and use it without prior knowledge of Python.
There are two ways to use this application:
1. Install LocalStack as a Python package
2. Use docker-compose
1. Install LocalStack as a Python package: Create a virtual environment; it is recommended to create one for every project where possible. We can create a virtual environment using, for example, Anaconda or virtualenv/venv. Then install the LocalStack package using the command below.
pip install localstack
We then start LocalStack with the "start" command as shown below. This will start LocalStack inside a Docker container.
localstack start
2. Use docker-compose: Download the docker-compose.yml file from the LocalStack GitHub repository and run it with the command "docker-compose up".
A minimal docker-compose.yml for LocalStack looks roughly like the sketch below (the actual file in the repository may differ in image tag, ports, environment variables, and volumes):
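version: "3.8"
services:
  localstack:
    image: localstack/localstack
    ports:
      - "4566:4566"            # edge port; all services are routed through it
    environment:
      - DEBUG=1
    volumes:
      - "/var/run/docker.sock:/var/run/docker.sock"   # lets LocalStack spin up containers (e.g. for Lambda)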
When we start the application, all available services will run; we can also select specific services if we want. Read the documentation in the GitHub repository to learn more about the available services. The default port is 4566, and all services are routed through it.
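For example, when LocalStack is installed as a Python package, the SERVICES environment variable can be used to start only the services needed for this article (a sketch; the exact variable and syntax may vary between LocalStack versions):
SERVICES=sns,sqs,lambda localstack start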
Configure the AWS CLI: We need to create a fake profile, and we can use the AWS CLI to do this. All the AWS CLI commands work with LocalStack too; only minor changes are needed to point the commands at LocalStack instead of the real AWS. Create a profile using the command below.
aws configure --profile localstack
This will prompt for the AWS Access Key ID, the Secret Access Key, and an AWS region. We can provide any dummy value for the credentials and a valid region name like us-east-1, but we can't leave any of the values blank. Unlike AWS, LocalStack does not validate these credentials, but it complains if no profile is set. So far, it's just like any other AWS profile, and it is the one we will use to work with LocalStack.
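For example, the prompts can be answered with dummy credentials and a real region name, something like this:
AWS Access Key ID [None]: test
AWS Secret Access Key [None]: test
Default region name [None]: us-east-1
Default output format [None]: json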
So far, we have discussed LocalStack and its setup. Now let's create a sample project using services like Lambda, SNS, and SQS.
A Sample Project: I've created a project that uses SNS (Simple Notification Service), SQS (Simple Queue Service), and the Lambda service.

SNS is a notification service that can be used to publish messages to receivers such as email, SMS, SQS queues, applications, mobile endpoints (EndpointArn), and Lambda functions. I've used SQS, a queue service, to receive the messages and store them in a queue, and then a Lambda function to pull the messages and write them to an output file. For detailed documentation of SNS, SQS, or any other service, go through the AWS documentation.
Steps to create the project: We are going to use the AWS CLI to execute commands. The usual AWS CLI commands work with slight modifications: we need to provide the LocalStack endpoint URL and a region with every command.
Step-1: Create an SNS Topic.
AWS_DEFAULT_REGION=us-east-1 aws --endpoint-url=http://localhost:4566 sns create-topic --name sns-publish-msgs
Step-2: List SNS topics
AWS_DEFAULT_REGION=us-east-1 aws --endpoint-url=http://localhost:4566 sns list-topics
Step-3: Create an SQS queue
AWS_DEFAULT_REGION=us-east-1 aws --endpoint-url=http://localhost:4566 sqs create-queue --queue-name sqs_queue
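Optionally, we can double-check the queue and its URL (the URL is needed in the following steps) with the standard list-queues command:
AWS_DEFAULT_REGION=us-east-1 aws --endpoint-url=http://localhost:4566 sqs list-queues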
Step-4: Create a subscription from the SNS topic to the SQS queue, so that whenever SNS publishes a message it reaches the SQS queue.
AWS_DEFAULT_REGION=us-east-1 aws --endpoint-url=http://localhost:4566 sns subscribe --topic-arn arn:aws:sns:us-east-1:000000000000:sns-publish-msgs --protocol sqs --notification-endpoint http://localhost:4566/000000000000/sqs_queue
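If the subscription is wrong, published messages never reach the queue, so it can be worth verifying it with the standard list-subscriptions command:
AWS_DEFAULT_REGION=us-east-1 aws --endpoint-url=http://localhost:4566 sns list-subscriptions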
Step-5: Publish a message to the SNS topic
AWS_DEFAULT_REGION=us-east-1 aws --endpoint-url=http://localhost:4566 sns publish --message "Test_Message" --subject Test_Subject --topic-arn arn:aws:sns:us-east-1:000000000000:sns-publish-msgs
Step-6: View messages in the SQS queue
AWS_DEFAULT_REGION=us-east-1 aws --endpoint-url=http://localhost:4566 sqs receive-message --queue-url http://localhost:4566/000000000000/sqs_queue --max-number-of-messages 10
Step-7: Create a Lambda function to pull the messages and delete them afterwards.
AWS_DEFAULT_REGION=us-east-1 aws --endpoint-url=http://localhost:4566 lambda create-function --function-name test_lambda --zip-file fileb://python-new.zip --handler lambda-fn.lambda_handler --runtime python3.7 --role arn:aws:iam::000000000000:role/lambda-s3 --timeout 900
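Note: assuming lambda-fn.py sits in the current directory, the deployment package python-new.zip referenced above could be built with something like:
zip python-new.zip lambda-fn.py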
Here, I've used python-new.zip to create the Lambda function. Inside the zip, I've placed a Python file named lambda-fn.py. The code inside lambda-fn.py is shown below.
from __future__ import print_function
import json
import os

import boto3

# Create an SQS client that talks to LocalStack instead of the real AWS.
# LOCALSTACK_HOSTNAME is provided by LocalStack inside the Lambda container.
sqs = boto3.client(
    'sqs',
    endpoint_url="http://{}:4566".format(os.environ['LOCALSTACK_HOSTNAME']),
    use_ssl=False
)

queue_url = 'http://{}:4566/000000000000/sqs_queue'.format(os.environ['LOCALSTACK_HOSTNAME'])


def lambda_handler(event, context):
    # Receive messages from the SQS queue
    response = sqs.receive_message(
        QueueUrl=queue_url,
        AttributeNames=['SentTimestamp'],
        MaxNumberOfMessages=7,
        MessageAttributeNames=['All'],
        VisibilityTimeout=0,
        WaitTimeSeconds=0
    )

    res_list = []
    if "Messages" in response:
        for each_msg in response['Messages']:
            # The SQS body carries the SNS envelope as JSON, so parse it
            # to extract the original subject and message.
            body = json.loads(each_msg['Body'])
            res_dict = {}
            res_dict['Message Id'] = each_msg['MessageId']
            res_dict['Subject'] = body['Subject']
            res_dict['Message Body'] = body['Message']
            res_list.append(res_dict)

            # Delete the received message from the queue
            receipt_handle = each_msg['ReceiptHandle']
            sqs.delete_message(
                QueueUrl=queue_url,
                ReceiptHandle=receipt_handle
            )
            print('Message deleted', each_msg['MessageId'])
    else:
        return 'No message available in sqs service'

    # Return all collected messages, not just the last one
    return res_list
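Optionally, before invoking, we can confirm that the function was registered with the standard list-functions command:
AWS_DEFAULT_REGION=us-east-1 aws --endpoint-url=http://localhost:4566 lambda list-functions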
Step-8: Invoke the Lambda function
AWS_DEFAULT_REGION=us-east-1 aws --endpoint-url=http://localhost:4566 lambda invoke --function-name arn:aws:lambda:us-east-1:000000000000:function:test_lambda --invocation-type RequestResponse outfile.txt
Step-9: View the output. The output of the Lambda function is stored in outfile.txt. View it using the command below.
vi outfile.txt
Conclusion: LocalStack is very helpful for learning, testing, and experimenting with AWS services before deployment, at a time when many organizations and professionals pay a lot of money just to learn, test, and experiment. LocalStack also has a very active development community, and new services are being added continuously.
Happy learning😊…