
Overview

Security is an essential element of any application, especially at the RESTful API layer. Thousands of calls are made every day to exchange information over REST APIs, making security a top concern for organizations at every stage: designing, testing, and deploying APIs. We live in an era where private information is more vulnerable than ever, so it is important to protect your APIs against an ever-growing set of threats and vulnerabilities.

In addition to all the guidelines available for building a secure API, one important step is to make your API private. Attackers cannot launch an attack against your API if they cannot find it. Exposing your APIs to the public adds a range of security and management challenges that you can avoid entirely.

While it’s easy to spin up simple cloud architectures, mistakes are easily made when provisioning complex ones. Human error will always be present, especially when cloud infrastructure can be launched by clicking buttons in a web console.

The most reliable way to avoid these kinds of errors is automation, and Infrastructure as Code helps engineers launch cloud environments quickly and consistently, without manual mistakes.

Why Amazon Web Services?

AWS Lambda and AWS API Gateway have made creating serverless APIs extremely easy. You can simply upload your decision service to AWS Lambda, configure an API Gateway, and start responding to RESTful endpoint calls. However, the way you build and deploy your Lambda Decision Service is usually not as straightforward as it seems on the surface.

Why Terraform?

Terraform is a tool for building, changing, and versioning infrastructure safely and efficiently. It can manage existing, popular service providers as well as custom in-house solutions. Introduced by HashiCorp in 2014 and written in Go, Terraform was the first multi-cloud immutable-infrastructure tool.

Terraform’s speed and operations are exceptional. One notable feature is its plan command, which shows you exactly what changes you are about to apply before you apply them. Combined with its support for code reuse, this tends to make most changes faster than with similar tools such as CloudFormation.

Proposed Architecture

Please refer to the IBM ODM installation on AWS Lambda case study for a detailed explanation of the AWS architecture.

Proposed Solution

What Should Be Pre-Installed

In order to follow this case study you will need an AWS account and to have Terraform installed. Configure your credentials so that Terraform is able to act on your behalf.

For simplicity here we will assume you are already using a set of IAM credentials with suitable access to create Lambda functions and work with API Gateway. If you aren’t sure and are working in an AWS account used only for development, the simplest approach to get started is to use credentials with full administrative access to the target AWS account.
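If you prefer not to rely on environment variables, the provider can instead be pointed at a named profile from your shared AWS credentials file. A minimal sketch (the profile name is illustrative, and this variant would replace a plain region-only provider block, since only one provider configuration is needed):

```hcl
# Hypothetical alternative to environment variables: reference a named
# profile from ~/.aws/credentials (the profile name is illustrative).
provider "aws" {
  region  = "us-west-2"
  profile = "dev-admin"
}
```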

◊ Note: following this case study will create objects in your AWS account that will incur charges on your AWS bill.

What You Will Do

This case study will show how to deploy IBM ODM on AWS Lambda using Terraform. The guide assumes some basic familiarity with Lambda and API Gateway but does not assume any pre-existing deployment. It also assumes that you are familiar with the usual Terraform plan/apply workflow; if you’re new to Terraform itself, refer first to the Getting Started guide.

Building the Lambda Function Package

AWS Lambda expects a function’s implementation to be provided as an archive containing the function source code and any other static files needed to execute the function.

Terraform is not a build tool, so the jar archive must be prepared by a separate build process before it is deployed with Terraform. For a real application we recommend automating your build in a CI system whose job is to run any necessary build actions (library installation, compilation, etc.), produce the deployment archive as a build artifact, and then upload that artifact to an Amazon S3 bucket, from which it will be ready for deployment.
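As a sketch of what such a pipeline might target, the artifact bucket itself could also be managed in Terraform, with versioning enabled so each build is retained. The bucket name below is illustrative, and this block is not part of the case study's configuration (the artifact is already hosted for you):

```hcl
# Hypothetical artifact bucket for CI builds (name is illustrative).
# Versioning keeps every uploaded build available for rollback.
resource "aws_s3_bucket" "build_artifacts" {
  bucket = "example-odm-build-artifacts"

  versioning {
    enabled = true
  }
}
```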

In the IBM ODM installation on AWS Lambda case study, we performed these build steps manually, in detail, to build the ODM Lambda function.

For the sake of this case study, we’ve uploaded the artifact to the following link:

Creating the Lambda Function

With the source code artifact built and uploaded to S3, we can now write our Terraform configuration to deploy it. In a new directory, create a file named lambda.tf containing the following configuration:

provider "aws" {
  region = "us-west-2"
}

resource "aws_lambda_function" "example" {
  function_name = "hello_world"
  s3_bucket     = "ibm-odm-dependencies"
  s3_key        = "greetings-0.0.1-SNAPSHOT-shaded.jar"
  handler       = "com.sb.greetings.SimpleDecisionEngineRunner::handleRequest"
  runtime       = "java8"
  timeout       = 10

  role = aws_iam_role.lambda_exec.arn
}

# IAM role which dictates what other AWS services the Lambda function
# may access.
resource "aws_iam_role" "lambda_exec" {
  name = "hello_world_lambda"

  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "sts:AssumeRole",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Effect": "Allow",
      "Sid": ""
    }
  ]
}
EOF
}

Each Lambda function must have an associated IAM role which dictates what access it has to other AWS services. The above configuration specifies a role with no access policy, effectively giving the function no access to any AWS services since our example application requires no such access.
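In practice, most Lambda functions need at least permission to write their logs to CloudWatch. As an optional sketch (not required for this example), the AWS-managed basic execution policy could be attached to the role defined above:

```hcl
# Optional: let the function write logs to CloudWatch by attaching the
# AWS-managed basic execution policy to the role defined above.
resource "aws_iam_role_policy_attachment" "lambda_logs" {
  role       = aws_iam_role.lambda_exec.name
  policy_arn = "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
}
```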

Before you can work with a new configuration directory, it must be initialized using terraform init, which in this case will install the AWS provider:

$ terraform init

Initializing the backend...

Initializing provider plugins...

* provider.aws: version = "~> 2.52"

Terraform has been successfully initialized!

Now apply the configuration, as usual, using terraform apply.

After the function is created successfully, try invoking it using the AWS CLI:

$ aws lambda invoke --region=us-west-2 --function-name=hello_world output.txt
{
  "StatusCode": 200,
  "ExecutedVersion": "$LATEST"
}
$ cat output.txt
"Good morning!\n"

With the function working as expected, the next step is to create the API Gateway REST API that will provide access to it!

Configuring API Gateway

API Gateway’s name reflects its original purpose as a public-facing frontend for REST APIs, but it was later extended with features that make it easy to expose an entire web application based on AWS Lambda. These later features will be used in this tutorial. The term “REST API” is thus used loosely here, since API Gateway is serving as a generic HTTP frontend rather than necessarily serving an API.

Create a new file api_gateway.tf in the same directory as our lambda.tf from the previous step. First, configure the root “REST API” object, as follows:

resource "aws_api_gateway_rest_api" "example" {
  name        = "hello_world_api"
  description = "Terraform Serverless Application Example"
}

The “REST API” is the container for all of the other API Gateway objects we will create.

All incoming requests to API Gateway must match a configured resource and method in order to be handled. Append the following to the api_gateway.tf file to define a single proxy resource:

resource "aws_api_gateway_resource" "proxy" {
  rest_api_id = aws_api_gateway_rest_api.example.id
  parent_id   = aws_api_gateway_rest_api.example.root_resource_id
  path_part   = "{proxy+}"
}

resource "aws_api_gateway_method" "proxy" {
  rest_api_id   = aws_api_gateway_rest_api.example.id
  resource_id   = aws_api_gateway_resource.proxy.id
  http_method   = "ANY"
  authorization = "NONE"
}

The special path_part value “{proxy+}” activates proxy behavior, which means that this resource will match any request path. Similarly, the aws_api_gateway_method block uses an http_method of “ANY”, which allows any request method to be used. Taken together, this means that all incoming requests will match this resource.
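For contrast, a hypothetical non-proxy resource would match only a single literal path segment. This block is illustrative only and is not part of the case study's configuration:

```hcl
# Illustrative only: a resource matching just the literal path /greetings,
# unlike {proxy+}, which matches any request path.
resource "aws_api_gateway_resource" "greetings" {
  rest_api_id = aws_api_gateway_rest_api.example.id
  parent_id   = aws_api_gateway_rest_api.example.root_resource_id
  path_part   = "greetings"
}
```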

Each method on an API gateway resource has an integration which specifies where incoming requests are routed. Add the following configuration to specify that requests to this method should be sent to the Lambda function defined earlier:

resource "aws_api_gateway_integration" "lambda" {
  rest_api_id = aws_api_gateway_rest_api.example.id
  resource_id = aws_api_gateway_method.proxy.resource_id
  http_method = aws_api_gateway_method.proxy.http_method

  integration_http_method = "POST"
  type                    = "AWS"
  uri                     = aws_lambda_function.example.invoke_arn
}

The AWS integration type lets API Gateway expose AWS service actions directly. With an AWS integration, you must configure both the integration request and the integration response, setting up the necessary data mappings from the method request to the integration request and from the integration response to the method response. To implement that, add the following configuration:

resource "aws_api_gateway_method_response" "proxy_response_200" {
  rest_api_id = aws_api_gateway_rest_api.example.id
  resource_id = aws_api_gateway_method.proxy.resource_id
  http_method = aws_api_gateway_method.proxy.http_method
  status_code = "200"
}

resource "aws_api_gateway_integration_response" "proxy" {
  rest_api_id = aws_api_gateway_rest_api.example.id
  resource_id = aws_api_gateway_method.proxy.resource_id
  http_method = aws_api_gateway_method.proxy.http_method
  status_code = aws_api_gateway_method_response.proxy_response_200.status_code
}

Unfortunately, the proxy resource cannot match an empty path at the root of the API. To handle that, a similar configuration must be applied to the root resource that is built into the REST API object:

resource "aws_api_gateway_method" "proxy_root" {
  rest_api_id   = aws_api_gateway_rest_api.example.id
  resource_id   = aws_api_gateway_rest_api.example.root_resource_id
  http_method   = "ANY"
  authorization = "NONE"
}

resource "aws_api_gateway_integration" "lambda_root" {
  rest_api_id = aws_api_gateway_rest_api.example.id
  resource_id = aws_api_gateway_method.proxy_root.resource_id
  http_method = aws_api_gateway_method.proxy_root.http_method

  integration_http_method = "POST"
  type                    = "AWS"
  uri                     = aws_lambda_function.example.invoke_arn
}

resource "aws_api_gateway_method_response" "root_response_200" {
  rest_api_id = aws_api_gateway_rest_api.example.id
  resource_id = aws_api_gateway_method.proxy_root.resource_id
  http_method = aws_api_gateway_method.proxy_root.http_method
  status_code = "200"
}

resource "aws_api_gateway_integration_response" "proxy_root" {
  rest_api_id = aws_api_gateway_rest_api.example.id
  resource_id = aws_api_gateway_method.proxy_root.resource_id
  http_method = aws_api_gateway_method.proxy_root.http_method
  status_code = aws_api_gateway_method_response.root_response_200.status_code
}

Finally, you need to create an API Gateway “deployment” in order to activate the configuration and expose the API at a URL that can be used for testing:

resource "aws_api_gateway_deployment" "example" {
  depends_on = [
    aws_api_gateway_integration.lambda,
    aws_api_gateway_integration.lambda_root,
    aws_api_gateway_integration_response.proxy,
    aws_api_gateway_integration_response.proxy_root
  ]

  rest_api_id = aws_api_gateway_rest_api.example.id
  stage_name  = "test"
}

With all of the above configuration changes in place, run terraform apply again to create these new objects!

After the creation steps are complete, the new objects will be visible in the API Gateway console.

The integration with the Lambda function is not functional yet because API Gateway does not have the necessary access to invoke it. The next step will address this, making the application fully functional.

Allowing API Gateway to Access Lambda

By default, any two AWS services have no access to one another until access is explicitly granted. For Lambda functions, access is granted using the aws_lambda_permission resource, which should be added to the lambda.tf file created in an earlier step:

resource "aws_lambda_permission" "apigw" {
  statement_id  = "AllowAPIGatewayInvoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.example.function_name
  principal     = "apigateway.amazonaws.com"

  # The "/*/*" portion grants access from any method on any resource
  # within the API Gateway REST API.
  source_arn = "${aws_api_gateway_rest_api.example.execution_arn}/*/*"
}

In order to test the created API, you will need to access its test URL. To make this easier to access, add the following output to api_gateway.tf:

output "base_url" {
  value = aws_api_gateway_deployment.example.invoke_url
}
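Other attributes can be exported the same way if you find them useful; for example, the function's ARN (this extra output is optional and not part of the case study's configuration):

```hcl
# Optional extra output: the deployed Lambda function's ARN.
output "lambda_arn" {
  value = aws_lambda_function.example.arn
}
```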

Apply the latest changes with terraform apply:

Apply complete! Resources: 3 added, 0 changed, 0 destroyed.

Outputs:

base_url = https://z94if2am56.execute-api.us-west-2.amazonaws.com/test

Load the URL given in the output from your run in your favorite web browser. If everything has worked, you will see the text “Good morning!” or “Good afternoon!”, depending on your system's time of day.

Cleaning Up

Once you are finished with this case study, you can destroy the example objects with Terraform:

$ terraform destroy

Conclusion

In this case study, you created a serverless IBM ODM solution whose output is compatible with Amazon API Gateway proxy resources, and then configured API Gateway to expose it.

When combined with an automated build process running in a CI system, Terraform can help to deploy applications as AWS Lambda functions, with suitable IAM policies to connect with other AWS services for persistent storage, access to secrets, etc.
