Lab 3: Serverless - FaaS Service targeting DB service

Opened: Wednesday 22 November 2023, 16:45
Due: Tuesday 28 November 2023, 23:59

EXERCISE: SERVERLESS - FAAS SERVICE TARGETING DB SERVICE

INTRODUCTION AND PREREQUISITES

The goals of this exercise are to:

  • Become familiar with a Function-as-a-Service platform,
  • Learn how to deploy and monitor a function,
  • Compare latencies in the case of warm start and cold start.

In this exercise you will use three different public cloud providers to deploy a function in a FaaS service. The function is triggered by HTTP requests and returns an HTTP response. The function reads data from a DB and returns it in its response.

You will then run load tests using the JMeter tool on your local machine and measure the latencies of function execution.

The following resources and tools are required for this exercise session:

  • Account on AWS Amazon
  • Account on Azure Cloud
  • Account on Google Cloud Platform

Please respect the case of the DB table names given, or the functions may not work.

TASK 1 - USING AWS

Replace "[YOUR_NAME]" with your name to easily identify your ressources. Please interact only with your ressources.

Please work only in the eu-central-1 region (Frankfurt).

TASK 1.1 - CREATE DYNAMODB TABLE

Using the AWS Management Console, create a table in DynamoDB named Music_[YOUR_NAME]. Use the guide Getting Started with DynamoDB and follow the instructions of Steps 1 and 2.

Add the following items to your database:

{
    "Artist": "No One You Know",
    "SongTitle": "Call Me Today",
    "AlbumTitle": "Somewhat Famous",
    "Awards": 1
}
{
    "Artist": "Acme Band",
    "SongTitle": "Happy Day",
    "AlbumTitle": "Songs About Life",
    "Awards": 10
}

There are two options for creating items: Form and JSON. We recommend the JSON option. Items must be created one by one.
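
If you prefer scripting the inserts instead of clicking through the console, here is a minimal boto3 sketch (assumptions: your AWS credentials are configured locally and the table from Task 1.1 exists; replace the table name placeholder):

import boto3

# Assumes local AWS credentials and the table created in Task 1.1.
dynamo = boto3.client('dynamodb', region_name='eu-central-1')

items = [
    {'Artist': 'No One You Know', 'SongTitle': 'Call Me Today',
     'AlbumTitle': 'Somewhat Famous', 'Awards': 1},
    {'Artist': 'Acme Band', 'SongTitle': 'Happy Day',
     'AlbumTitle': 'Songs About Life', 'Awards': 10},
]

for item in items:
    # The low-level DynamoDB client expects typed attribute values.
    dynamo.put_item(
        TableName='Music_[YOUR_NAME]',  # replace with your table name
        Item={
            'Artist': {'S': item['Artist']},
            'SongTitle': {'S': item['SongTitle']},
            'AlbumTitle': {'S': item['AlbumTitle']},
            'Awards': {'N': str(item['Awards'])},
        },
    )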

TASK 1.2 - CREATE A FUNCTION LAMBDA ON AWS

We will create a function that accesses DynamoDB to read data.

To create and deploy the function we will use an AWS-provided blueprint that will not only create and deploy a ready-made function, but also:

  • configure the necessary access control rights so that the function is able to access DynamoDB,
  • deploy an API on the API Gateway service so that an HTTP request to the API gateway triggers the execution of the function, and
  • configure the necessary access control rights for API Gateway.

Create a Lambda function (from the Lambda service) with an API Gateway trigger, using the blueprint called microservice-http-endpoint-python. Fill in the following fields and leave the rest at their default settings:

  • Basic information

    • Function name: myfunc-[YOUR_NAME]
    • Check Create a new role from AWS policy templates.
    • Role name: role_[YOUR_NAME]
    • Policy templates: Simple microservice permissions DynamoDB
  • API Gateway trigger

    • Create a new API
    • API type: HTTP API
    • Security: Open

Create the function.

In the function panel, in the code editor, replace the function code with the following:

import boto3
import json

print('Loading function')
dynamo = boto3.client('dynamodb')


def respond(err, res=None):
    return {
        'statusCode': '400' if err else '200',
        'body': str(err) if err else json.dumps(res),  # Python 3 exceptions have no .message attribute
        'headers': {
            'Content-Type': 'application/json',
        },
    }


def lambda_handler(event, context):
    '''Demonstrates a simple HTTP endpoint using API Gateway. You have full
    access to the request and response payload, including headers and
    status code.

    To scan a DynamoDB table, make a GET request with the TableName as a
    query string parameter. To put, update, or delete an item, make a POST,
    PUT, or DELETE request respectively, passing in the payload to the
    DynamoDB API as a JSON body.
    '''
    print("Received event: " + json.dumps(event, indent=2))

    operations = {
        'DELETE': lambda dynamo, x: dynamo.delete_item(**x),
        'GET': lambda dynamo, x: dynamo.scan(**x),
        'POST': lambda dynamo, x: dynamo.put_item(**x),
        'PUT': lambda dynamo, x: dynamo.update_item(**x),
    }

    operation = event['requestContext']['http']['method']
    if operation in operations:
        payload = event['queryStringParameters'] if operation == 'GET' else json.loads(event['body'])
        return respond(None, operations[operation](dynamo, payload))
    else:
        return respond(ValueError('Unsupported method "{}"'.format(operation)))

Click on the Deploy button.

  1. Click on the Configuration tab of the Lambda function you just created.
  2. Click on Function URL in the left menu, then click the Create function URL button.
  3. Choose NONE as the auth type (no authentication).
  4. Copy the Function URL.

Trigger your Lambda function with a URL of the following form:

https://[Your Function URL]/[nameOfTheFunction]?TableName=[YOUR_DB_NAME]

by entering the complete URL in the address bar of your web browser. You are now experiencing a function cold start. After some seconds, you should see a response containing the data from the database.

TASK 1.3 - AWS: USE ENVIRONMENT VARIABLES

Change the deployed code to read the name of the database table from an environment variable instead of taking it from the URL.
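
As a hint, not the full solution: in Python, environment variables are read with os.environ. The variable name TABLE_NAME below is our assumption; define it in the Lambda console under Configuration > Environment variables:

import os

# TABLE_NAME is an assumed variable name; set it under
# Configuration > Environment variables in the Lambda console.
TABLE_NAME = os.environ['TABLE_NAME']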

Deliverables (D1.1):

Submit your code that supports the environment variable.


TASK 2 - USING GOOGLE CLOUD PLATFORM

TASK 2.1 - CREATE A DATASTORE

Go to the Datastore service.

Create a database in Datastore on GCP as the equivalent of the DynamoDB table created in Task 1.1, and add the same entries with the kind "Music".
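
If you prefer scripting the entries rather than using the console, here is a minimal sketch with the google-cloud-datastore client (assuming your gcloud credentials and project are configured locally):

from google.cloud import datastore

client = datastore.Client()

for props in [
    {'Artist': 'No One You Know', 'SongTitle': 'Call Me Today',
     'AlbumTitle': 'Somewhat Famous', 'Awards': 1},
    {'Artist': 'Acme Band', 'SongTitle': 'Happy Day',
     'AlbumTitle': 'Songs About Life', 'Awards': 10},
]:
    # An incomplete key of kind "Music"; Datastore assigns the numeric ID.
    entity = datastore.Entity(key=client.key('Music'))
    entity.update(props)
    client.put(entity)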

TASK 2.2 - CREATE A CLOUD FUNCTION ON GCP

We will create a function that accesses Datastore to read data.

Go to the Cloud Functions service.

Create a Cloud Function on GCP, in a Europe region, 1st gen, with an HTTP trigger, allowing unauthenticated invocations. Select the Python 3.9 runtime, paste the following code in the editor and change the entry point to the correct function name:

main.py:

from google.cloud import datastore
import json

def query(request):
    """Responds to any HTTP request.
    Args:
        request (flask.Request): HTTP request object.
    Returns:
        The response text or any set of values that can be turned into a
        Response object using
        `make_response <http://flask.pocoo.org/docs/1.0/api/#flask.Flask.make_response>`.
    """
    # Query all entities of kind "Music" and return them as JSON.
    client = datastore.Client()
    results = list(client.query(kind='Music').fetch())
    return json.dumps(results)

Add google-cloud-datastore to the requirements.txt file.
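
After this step, requirements.txt should contain at least the following line (pinning a version is optional):

google-cloud-datastore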

Retrieve the URL of your function from the Trigger view and test it in your browser.

TASK 2.3 - GCP: USE ENVIRONMENT VARIABLES

Change the deployed code to read the name of the table through an environment variable instead of hard-coding it.

Deliverables (D1.2):

Submit your code that supports the environment variable.


TASK 3 - USING AZURE CLOUD

TASK 3.1 - CREATE COSMOSDB DATABASE

Go to the Azure Cosmos DB service.

Create a Cosmos DB account, choosing the Core (SQL) API option, in the Switzerland North region.

When the account is created, go to the resource and open Data Explorer in the left menu. Then create a new database named serverless-db and a container named Music. Finally, add the two items from Task 1.1 in the Items section of the Music container.

In the left panel, go to Settings > Keys and copy the PRIMARY CONNECTION STRING into a text file. You will need it in the next task.
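
As an alternative to the Data Explorer, the two items can also be inserted with the azure-cosmos Python package. A minimal sketch, assuming the connection string copied above and the database/container names from this task (note that Cosmos DB requires every item to have a string id):

import uuid
from azure.cosmos import CosmosClient

# Assumes the PRIMARY CONNECTION STRING copied in the previous step.
client = CosmosClient.from_connection_string('INSERT HERE PRIMARY CONNECTION STRING')
container = (client.get_database_client('serverless-db')
                   .get_container_client('Music'))

for item in [
    {'Artist': 'No One You Know', 'SongTitle': 'Call Me Today',
     'AlbumTitle': 'Somewhat Famous', 'Awards': 1},
    {'Artist': 'Acme Band', 'SongTitle': 'Happy Day',
     'AlbumTitle': 'Songs About Life', 'Awards': 10},
]:
    item['id'] = str(uuid.uuid4())  # Cosmos DB requires an 'id' per item
    container.upsert_item(item)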

TASK 3.2 - CREATE A FUNCTION APP ON AZURE CLOUD

Go to the Function App service.

Create a new Function App with Python 3.9 in the Switzerland North region. Go to the newly created resource.

Create a new function: Functions > Functions > Create, and follow the Azure tutorial for creating the function in your code editor. Tip: use the Visual Studio Code extension; it will make things simpler.

For this part, create an HTTP Trigger function without any authentication.

In your editor, once the function is created, fill in the following files:

__init__.py:

import logging
import json
import azure.functions as func

def main(req: func.HttpRequest, doc:func.DocumentList) -> func.HttpResponse:
    
    logging.info('Python HTTP trigger function processed a request.')
 
    entries_json = []

    for entry in doc:
        entry_json = {
            "id": entry['id'],
            "AlbumTitle": entry['AlbumTitle'],
            "Artist": entry['Artist'],
            "SongTitle": entry['SongTitle'],
            "Awards": entry['Awards'],
        }
        entries_json.append(entry_json)

    return func.HttpResponse(
            json.dumps(entries_json),
            status_code=200,
            mimetype="application/json"            
    )

function.json:

{
    "scriptFile": "__init__.py",
    "bindings": [{
            "authLevel": "anonymous",
            "type": "httpTrigger",
            "direction": "in",
            "name": "req",
            "methods": [
                "get",
                "post"
            ],
            "route": "music/list"
        },
        {
            "type": "cosmosDB",
            "direction": "in",
            "name": "doc",
            "databaseName": "serverless-db",
            "collectionName": "Music",
            "createIfNotExists": "true",
            "connectionStringSetting": "AzureCosmosDBConnectionString",
            "sqlQuery": "SELECT * from c"
        },
        {
            "type": "http",
            "direction": "out",
            "name": "$return"
        }
    ]
}

For local testing in VS Code, local.settings.json needs to look like this. Replace "INSERT HERE PRIMARY CONNECTION STRING" with the PRIMARY CONNECTION STRING you saved earlier:

{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "",
        "FUNCTIONS_WORKER_RUNTIME": "python"
    },
    "ConnectionStrings": {
        "AzureCosmosDBConnectionString": "INSERT HERE PRIMARY CONNECTION STRING"
    }
}

After deploying your function on Azure Cloud, it won't work right away because the Cosmos DB service is not yet linked to your Function App. To fix this, go to your Function App resource panel > Settings > Configuration.

Then add a "New connection string" > Name : AzureCosmosDBConnectionString > Value : Your PRIMARY CONNECTION STRING > Type : Custom.

Retrieve the URL from the HTTP Trigger view and test it in your browser.

TASK 3.3 - AZURE: USE ENVIRONMENT VARIABLES

Change the deployed code to read the name of the database through an environment variable instead of hard-coding it.

Deliverables (D1.3):

Submit your code that supports the environment variable.


TASK 4 - MEASURE LATENCY OF WARM SANDBOXES AND COLD START

You need to do this task for each provider.

In this task you will performance-test each FaaS platform with a load generator. You will compare the performance of request processing when the function is already deployed in sandboxes ("warm sandboxes") with request processing when the function code hasn't been deployed yet ("cold start").

  1. Download and install JMeter from http://jmeter.apache.org/ on your local machine.

  2. First test the performance of normal request processing with warm sandboxes.

    • Open JMeter and create a new test plan. Add a Thread Group under the test plan and add an HTTP Request sampler to this Thread Group. In the HTTP Request sampler, configure the function's invoke URL (plus the query string). You have to split it into the Server Name or IP part and the Path part.
    • Add the appropriate listener(s) to see your test results (e.g., Graph Results).
    • Open your function's metrics dashboard at the provider: it is called Monitor on AWS and Metrics on Google and Azure.
    • Run a test with 1 user and a loop count of 100.
    • Observe the requests in the dashboard and in JMeter. What do you see? What is the function latency in the case of warm sandboxes?
  3. To test cold-start latency, modify the function code slightly to create a new version (e.g., add a print statement) and deploy it. This invalidates the version previously deployed in the sandboxes and forces the platform to load the new version. Repeat the JMeter test and observe the latency of the first request and of the subsequent requests. What do you see?
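
If you want a quick sanity check outside JMeter, the following minimal Python sketch (the URL is a placeholder for your own function URL) times successive requests, which makes the first, cold request stand out:

import time
import urllib.request

# Placeholder: replace with your function's invoke URL and query string.
URL = 'https://<your-function-url>?TableName=<your-table>'

for i in range(5):
    start = time.perf_counter()
    with urllib.request.urlopen(URL) as resp:
        resp.read()
    print(f'request {i}: {(time.perf_counter() - start) * 1000:.0f} ms')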

Deliverables (D2):

  1. For each performance test, copy a screenshot of the JMeter Graph Results listener and of the dashboard's Metrics graphs into the report. Please show only the relevant criteria in the screenshots.
  2. Compare the response times shown by JMeter and the Management Console. Explain the difference.
  3. How many resources have you used running these tests? How many invocations and how many gigabyte-seconds of execution? Does your resource usage remain within the free tier?
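
As a reminder of how gigabyte-seconds are computed: a function configured with 128 MB of memory that runs for 200 ms consumes 0.125 GB × 0.2 s = 0.025 GB-s per invocation, so 100 such invocations amount to about 2.5 GB-s (for comparison, the AWS Lambda free tier includes 400,000 GB-s per month).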

TASK 5 - EXPERIMENTING WITH COLD STARTS ON AN OPEN-SOURCE FAAS PLATFORM: APACHE OPENWHISK

SETUP

This part was designed and tested on Ubuntu 22.04.

To do this part, you will need several packages installed on your machine. If you don't want to install them directly on your computer, you can use a VM (pre-installed VM images: OS Boxes Ubuntu images). There is no need to use JMeter during this task.

List of required packages (see the installation procedures at the end if you don't have them yet):

  • NPM
  • NodeJS
  • Docker
  • Java (minimum 8)

LAUNCH THE OPENWHISK SERVER

git clone https://github.com/apache/openwhisk.git
cd openwhisk
./gradlew core:standalone:bootRun

If you are on an OS with a GUI, a browser will open with the OpenWhisk playground (usually at the local address 172.17.0.1:3232).

You can use this playground for the exercise or use the CLI.

SETUP OPENWHISK CLI (OPTIONAL FOR GUI USERS)

Download and extract the OpenWhisk CLI archive:

wget https://github.com/apache/openwhisk-cli/releases/download/1.2.0/OpenWhisk_CLI-1.2.0-linux-386.tgz
tar zxvf OpenWhisk_CLI-1.2.0-linux-386.tgz

Set up the CLI:

sudo mv wsk /usr/local/bin/wsk
wsk property set --apihost 'http://172.17.0.1:3233' --auth '23bc46b1-71f6-4ed5-8c54-816aa4f8c502:123zO3xZCLrMN6v2BKK1dXYFpXlPkccOFqm12CdAsMgRU4VrNZ9lyGVCGuMDGIwP'

EXERCISE

Create an action (a function, in OpenWhisk terms) through the CLI (Creating and Invoking actions with CLI) or the playground, using JavaScript or Python.
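
For reference, the conventional shape of a Python action: the entry point is a function named main that receives the invocation parameters as a dictionary and returns a dictionary. A minimal example (the greeting logic is ours, not prescribed by the exercise):

# hello.py - a minimal OpenWhisk Python action
def main(args):
    name = args.get('name', 'world')
    return {'greeting': 'Hello ' + name}

With the CLI, you would create and invoke it with wsk action create hello hello.py and wsk action invoke hello --result (add --param name Alice to pass a parameter).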

Deliverables (D3):

  1. How can you observe a cold start? You are not being asked to compare the execution times of different requests. Show all the steps of your answer and add screenshots to illustrate it.
  2. How does OpenWhisk avoid a cold start when you invoke the same action (function) several times in a short period of time? Show all the steps of your answer and add screenshots to illustrate it.
  3. Where is all your action code stored in OpenWhisk when the function is not running at all?

Combine D2 and D3 into one document.

PROCEDURE TO INSTALL PACKAGES (ON UBUNTU)

NPM

sudo apt install npm

NODEJS

sudo apt install nodejs

DOCKER

Follow the Docker installation guide and the post-installation steps for Linux.

JAVA 8

sudo apt-get install openjdk-8-jdk
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-amd64


CLEANUP

After you have finished this exercise, for each provider:

  • Delete the databases
  • Delete the function
  • Delete the API Gateway (AWS)

A penalty may be given if a resource is not properly deleted on AWS.

Remember that any resource left running may cost you money and drain your credit.

If you have any questions related to AWS Amazon, contact francisco(dot)mendonca(at)hesge(dot)ch



