Scaleway serverless

Disclaimer: This walkthrough worked at the time of writing and may break in the future.

I was very happy when Scaleway announced their "function as a service" offering. There is a need for more competition for AWS, Google and Azure.

To get a feeling for the complexity, I want to describe how I deployed a function on Scaleway. Their API documentation is better than I expected, but their walkthrough uses the Serverless framework and Node.js.

I wanted to try this with Python 3, PyTorch and curl only. I failed with PyTorch because of an issue with a shared library in Alpine Linux.

So this walkthrough uses only numpy as an example, but it should work for any Python library that runs on Alpine Linux.

First you need your organization_id and a new token from your Scaleway settings. This is described in the documentation.

Then you need a namespace:

curl -X POST "https://api.scaleway.com/functions/v1alpha2/regions/fr-par/namespaces" \
-H "accept: application/json" -H "X-Auth-Token: $TOKEN" -H "Content-Type: application/json" \
-d "{\"name\": \"hello-ns\", \"organization_id\": \"$ORGANIZATION_ID\"}"

To be sure everything worked, we list the namespaces:

curl "https://api.scaleway.com/functions/v1alpha2/regions/fr-par/namespaces" -H "accept: application/json" \
-H "X-Auth-Token: $TOKEN" -H "Content-Type: application/json"

Now the interesting part. Using the namespace_id created before, create a function:

curl -X POST -H "X-Auth-Token: $TOKEN" \
"https://api.scaleway.com/functions/v1alpha2/regions/fr-par/functions" \
-d "{\"name\": \"hello\", \"namespace_id\": \"$NAMESPACE_ID\", \"memory_limit\": 128, \"min_scale\": 0, \"max_scale\": 1, \"runtime\": \"python3\"}"

The response contains an id we need next. To upload the function code we need to create a zip file named function-$FUNCTION_ID.zip.

The files we want to add to our zip file are:

  • handler.py

import numpy

def handle(event, context):
    return {
        "body": {
            "numpy": {
                "version": numpy.__version__,
            },
        },
        "statusCode": 200,
    }
  • package

mkdir package
# install packages using the alpine image from scaleway for this
docker run --rm -v $(pwd):/home/app/function --workdir /home/app/function rg.fr-par.scw.cloud/scwfunctionsruntimes/python-dep3:v2.1.0 pip install numpy --target ./package

Generate zip and get size of zip:

# generate zip
zip -r function-$FUNCTION_ID.zip handler.py package
# get size of zip for upload calls
ls -l *zip

For my test the size of the zip file was 12003113 bytes.
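The zip-and-measure step can also be done from Python with the standard library. A sketch: it reads FUNCTION_ID from the environment (falling back to a placeholder) and simply skips entries that do not exist.

```python
import os
import zipfile

# FUNCTION_ID from the create-function response; placeholder fallback for illustration
function_id = os.environ.get("FUNCTION_ID", "example")
archive = "function-%s.zip" % function_id

with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
    for entry in ("handler.py", "package"):
        if os.path.isfile(entry):
            zf.write(entry)
        elif os.path.isdir(entry):
            # add the package directory recursively
            for root, _dirs, files in os.walk(entry):
                for name in files:
                    zf.write(os.path.join(root, name))

# size in bytes, needed for the upload calls below
print(archive, os.path.getsize(archive), "bytes")
```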

Get the upload URL (SIZE_OF_ZIP is the size of the zip in bytes):

curl -H "X-Auth-Token: $TOKEN" "https://api.scaleway.com/functions/v1alpha2/regions/fr-par/functions/$FUNCTION_ID/upload-url?content_length=$SIZE_OF_ZIP"

Upload using the URL, zip filename and zip size from before (quote the presigned URL, it contains query parameters):

export FUNCTION_ARCHIVE_URL="<upload-url-from-before>"
curl -H "Content-Type: application/octet-stream" -H "Content-Length: $SIZE_OF_ZIP" --upload-file function-$FUNCTION_ID.zip "$FUNCTION_ARCHIVE_URL"

Now deploy the function from the uploaded storage:

curl -X POST -H "X-Auth-Token: $TOKEN" "https://api.scaleway.com/functions/v1alpha2/regions/fr-par/functions/$FUNCTION_ID/deploy" -d "{}"

The last call returns the URL to invoke the function. Use this URL like this:

curl "<url-given-by-deploy>"

This should result in something like this:

{"numpy":{"version":"1.16.4"}}

inotify

Sometimes it would be nice to do something whenever a file is written to disk.

For example: a file is saved in the editor, and once it has been written to disk we want to run a script or some code.

Two ways to solve this:

inotify-tools

Install inotify-tools via your package manager.

Example usage:

# watch all files in this folder
while inotifywait -e close_write *; do
  bash process.sh
done

inotify in Python

There are several Python packages for inotify. One is aionotify, which is Python 3 only.

Source of aionotify: https://github.com/rbarrois/aionotify

Because inotify is kernel-based, it can be used to trigger events when files are uploaded to a system. When a file is closed, a function is called to process it.

Example usage:

import asyncio
import aionotify
from pathlib import Path

PATH = Path("/tmp/uploads")

# Setup the watcher
watcher = aionotify.Watcher()
watcher.watch(alias="uploads", path=str(PATH), flags=aionotify.Flags.CLOSE_WRITE)

# Prepare the loop
loop = asyncio.get_event_loop()

def process(filename):
    # print first line; could do more useful stuff
    with open(PATH / filename) as fp:
        print(fp.readline())

async def work():
    await watcher.setup(loop)
    try:
        while True:
            event = await watcher.get_event()
            print(event)
            process(event.name)
    finally:
        watcher.close()

loop.run_until_complete(work())
loop.stop()
loop.close()

This example monitors the folder /tmp/uploads for files that are written and closed. Each closed file is opened and its first line printed.

An example call of the script above:

echo "foo\nbar" > /tmp/uploads/xxx

results in

Event(flags=8, cookie=0, name='xxx', alias='uploads')
foo

Allowed flags are listed here: https://github.com/rbarrois/aionotify/blob/master/aionotify/enums.py

SSH config notes

Some examples from my .ssh/config file.

All my automatically installed Raspberry Pis are on the same tinc network, in a group of ten with IP addresses from .20 to .29. I use Arch Linux ARM, so I want alarm as the default user.

Host 10.10.10.2?
  User alarm

On the rare occasion that I have to create an AWS server manually, I need this.

Host manual-aws-server
  HostName ec2-xx-xx-xxx-xxx.eu-west-1.compute.amazonaws.com
  User ubuntu
  IdentityFile ~/.ssh/given-certificate.pem

Log in with a custom user, a custom port and a specific identity file, only for this system:

Host custom
  HostName custom.local
  User username
  Port 8022
  IdentityFile ~/.ssh/custom-local-id_ed25519