Over the last 10 years, applications have steadily migrated to the public cloud, the private cloud, or a hybrid of both. Cloud usage has reached a tipping point, and the development of new custom applications on infrastructure-as-a-service (IaaS) platforms – like Amazon Web Services (AWS) – has increased significantly.
Imagine what would happen if an online fashion retailer’s application went down, or a car company’s call centre stopped working. Enterprises can’t afford to have their AWS environment – or the custom applications running in AWS – compromised; that would greatly impact the organisation’s ability to operate.
“Through 2020, 95% of cloud security failures will be the customer’s fault.” Gartner, “Top Predictions for IT Organizations and Users for 2016 and Beyond”

Naturally, we want to do everything in our power to protect our sensitive data, operating systems, platforms, networks, and so on. So the main question is: “Where should we start, and how should we build an automated security engine that flags all the possible failures: missing authentication and authorisation, insecure EC2 ports, or insufficient encryption?”
This is where Prowler and Scout Suite come in: two great command-line security auditing tools that give a clear view of potential attacks and highlight the risk areas. The only thing you need is your AWS credentials – the tools take it from there.
Now that we have chosen our security auditing tools, let’s create our engine.
What we will do:
· Python lambda (Serverless)
· 2 lambda layers (Prowler and Scout Suite) + Terraform scripts
· Scheduler lambda with cron inside it for refreshing on a daily basis
Python Lambda (Serverless)
The Python version used in the article is 3.7.
Since both Prowler and Scout Suite are written in Python, let’s develop our lambda in Python as well.
Make sure you have a virtual environment active in CLI.
$ source venv/bin/activate
Next, install the serverless-python-requirements plugin, and make sure Docker is installed.
yarn add serverless-python-requirements
Then add the following to your serverless.yml.
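The original snippet is not reproduced here, but a minimal serverless.yml for the plugin might look like the sketch below. The service name, region, and the `dockerizePip`/`layer` settings are assumptions – check the serverless-python-requirements documentation for your setup:

```yaml
service: security-audit-engine      # assumed service name

provider:
  name: aws
  runtime: python3.7
  region: eu-west-1                 # assumed region

plugins:
  - serverless-python-requirements

custom:
  pythonRequirements:
    dockerizePip: true              # build Linux wheels inside Docker (needed on macOS)
    layer: true                     # package the requirements as a Lambda layer
```

With `layer: true`, the plugin exposes the requirements layer under the CloudFormation reference `PythonRequirementsLambdaLayer`, which is used further below.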
Note: This example is done on macOS, but we need to deploy Linux Python modules – so serverless-python-requirements will create a Docker container with a Linux image to fetch the Linux Python dependencies. This all happens behind the scenes, but it requires the Docker daemon to be up and running.
Install the required dependencies with pip.
(venv) $ pip install
(venv) $ pip freeze > requirements.txt
(venv) $ serverless deploy
As a result, serverless-python-requirements will build the Python packages in Docker and zip them up so they can be uploaded with the rest of our lambda code. This automatically creates an additional lambda layer, giving us easy access to all the packages needed to run our security audit commands.
Put the layer in your serverless file in the lambda definition.
layers:
- { Ref: PythonRequirementsLambdaLayer }
Note: If you want to audit an external AWS account, you have to set that account’s credentials as local environment variables.
When running the Scout Suite command, we have to specify the exact path where it is located, since its main.py script is not at the top level of its file structure. Put this variable in your config.yml.
SCOUT_SUITE_PATH: /opt/scout-suite/ScoutSuite
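The audit code itself is not shown in the article. A minimal sketch of a handler that shells out to Scout Suite might look like this – the entry-point file name, report directory, and CLI flags are assumptions, so check the tool’s `--help` output for the exact options:

```python
import os
import subprocess


def build_scout_command(output_dir="/tmp/scout-report"):
    """Build the Scout Suite invocation (entry-point name and flags are assumed)."""
    scout_path = os.environ.get("SCOUT_SUITE_PATH", "/opt/scout-suite/ScoutSuite")
    return [
        "python", os.path.join(scout_path, "scout.py"),  # entry-point name assumed
        "aws",
        "--no-browser",              # don't try to open the HTML report in a browser
        "--report-dir", output_dir,  # /tmp is the only writable path in Lambda
    ]


def handler(event, context):
    """Lambda entry point: run the audit and return the exit code."""
    result = subprocess.run(build_scout_command(), capture_output=True, text=True)
    return {"returncode": result.returncode, "stderr": result.stderr[-1000:]}
```

Writing the report to /tmp matters because the Lambda filesystem is read-only everywhere else, including the /opt layer mount.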
…and we have our code for audit actions. Now, let’s test it!
We notice that an error occurs: “asyncio-throttle couldn’t be found; python-dateutil couldn’t be found; CherryPy couldn’t be found…”. None of the packages required to run the lambda can be found. What are we missing…?
We forgot to put the PYTHONPATH as a variable in our config.yml file.
PYTHONPATH: /opt/python
Now, we have a fully running security audit engine in our organisation.
Two lambda layers (Prowler and Scout Suite) + Terraform scripts
The Terraform version used in the article is 0.12.16.
Let’s create the layers for the auditing tools and add them to our infrastructure.
Make sure you have zip files with the latest code for Prowler and Scout Suite. Add an AWS provider to your Terraform script.
Use an aws_lambda_layer_version resource to define the layers.
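The Terraform code is not reproduced in the article; a sketch of the provider and the two layer resources might look like this (the region, zip file names, and runtime list are assumptions):

```hcl
provider "aws" {
  region = "eu-west-1" # assumed region
}

resource "aws_lambda_layer_version" "prowler" {
  layer_name          = "prowler"
  filename            = "prowler.zip" # zip with the latest Prowler code
  source_code_hash    = filebase64sha256("prowler.zip")
  compatible_runtimes = ["python3.7"]
}

resource "aws_lambda_layer_version" "scout_suite" {
  layer_name          = "scout-suite"
  filename            = "scout-suite.zip" # zip with the latest Scout Suite code
  source_code_hash    = filebase64sha256("scout-suite.zip")
  compatible_runtimes = ["python3.7"]
}
```

The `source_code_hash` argument makes Terraform publish a new layer version whenever the zip contents change.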
Scheduler lambda with cron inside it for refreshing on a daily basis
In addition to the Python lambda, we added a “Scheduler” lambda that is invoked once a day and triggers our security engine. This way we have the latest information about the AWS accounts connected to our system every day.
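One way to wire this up is a scheduled event in serverless.yml; the handler name and cron expression below are assumptions (the expression fires every day at 06:00 UTC, using the six-field CloudWatch Events cron syntax):

```yaml
functions:
  scheduler:
    handler: scheduler.handler
    events:
      - schedule: cron(0 6 * * ? *)   # assumed time: every day at 06:00 UTC
```

Inside `scheduler.handler`, the audit lambda can then be invoked asynchronously, for example with boto3’s `lambda` client and `InvocationType="Event"`.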
EXTRA: Discover your resources
After analysing the output data from Scout Suite, we have noticed that all the AWS resources that were examined have been returned with detailed information about their ARN, region, name, description etc.
So while performing the security audit, we’ve also collected resources that can be later used in other parts of our system.
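As an illustration of that idea, a small helper could walk the Scout Suite JSON report and collect resource metadata. The report structure below is an assumption (a minimal, hypothetical fragment), so adjust the keys to the actual output:

```python
def collect_resources(report):
    """Recursively collect entries that look like AWS resources.

    Assumes a resource is any dict carrying an 'arn' key; real Scout
    Suite reports nest these under services and regions.
    """
    found = []

    def walk(node):
        if isinstance(node, dict):
            if "arn" in node:
                found.append({
                    "arn": node.get("arn"),
                    "name": node.get("name"),
                    "region": node.get("region"),
                    "description": node.get("description"),
                })
            for value in node.values():
                walk(value)
        elif isinstance(node, list):
            for item in node:
                walk(item)

    walk(report)
    return found


# Hypothetical, minimal report fragment for illustration only:
sample = {
    "services": {
        "s3": {
            "buckets": [
                {"arn": "arn:aws:s3:::my-bucket", "name": "my-bucket",
                 "region": "eu-west-1", "description": "audit logs"},
            ]
        }
    }
}
```

The collected ARNs, names, and regions can then be stored alongside the audit findings and reused by other parts of the system.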
If you have any questions, feel free to send us an email at: info@alite-international.com.