Use the Pi camera on a Raspberry Pi with Arch Linux ARM

On Raspbian there is raspi-config to enable the camera on a Raspberry Pi. On a system without raspi-config the changes are:

  • Changes in /boot/config.txt:

initramfs initramfs-linux.img followkernel
cma_lwm= cma_hwm= cma_offline_start=

Now reboot for the change to get loaded.

  • To check if the camera is working:

/opt/vc/bin/vcgencmd get_camera

This should result in: supported=1 detected=1
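If you want to check this from a script, the status line is trivial to parse. A small stdlib-only sketch (the function name is mine; on the Pi you would feed it the real output of running vcgencmd via subprocess):

```python
def parse_camera_status(output):
    """Turn vcgencmd's 'supported=1 detected=1' line into a dict."""
    return {key: int(value)
            for key, value in (pair.split("=") for pair in output.split())}

print(parse_camera_status("supported=1 detected=1"))
# {'supported': 1, 'detected': 1}
```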

Finally shoot an image:

/opt/vc/bin/raspistill -o image.jpg --nopreview
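To automate captures, the same call can be scripted. A sketch that only builds the command line (the helper name is mine; actually running it of course requires the Pi and its camera):

```python
import subprocess  # on the Pi: subprocess.run(raspistill_cmd(), check=True)

def raspistill_cmd(path="image.jpg"):
    """Build the raspistill invocation; --nopreview skips the preview overlay."""
    return ["/opt/vc/bin/raspistill", "-o", path, "--nopreview"]

print(" ".join(raspistill_cmd()))
# /opt/vc/bin/raspistill -o image.jpg --nopreview
```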

Using Google Cloud Datastore with Google Cloud Run

Google Cloud Datastore (nowadays Firestore in Datastore mode) seems a good alternative for small document-based storage, for when PostgreSQL (my preferred database) is too big (and expensive) and SQL is not really needed.

I use the Google Cloud SDK installed on Arch Linux from the community package.


All source code of this demo is on GitHub:


I chose "Datastore mode" and not "Native mode". The Datastore location for me is europe-west3. More on that in the Google documentation:


For development we need a credentials.json file to access the Datastore. A service account that can only read/write the Datastore seems the official way to do this. So create one as described in

Save this file as app/credentials.json. It should never be deployed or committed to version control, so add it to .dockerignore and .gitignore.
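A single line in each ignore file is enough. A minimal sketch (same line for .dockerignore and .gitignore, path relative to the project root):

```
# keep the service-account key out of the image and out of git
app/credentials.json
```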

In production the Cloud Run instance runs inside the same Google Cloud project and doesn't need additional authentication.


The code will run in Docker on Google Cloud Run. The Dockerfile is very similar to the one in my previous blog post.

FROM tiangolo/uvicorn-gunicorn:python3.8-slim
COPY requirements.txt /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt
COPY ./app /app

For development we use a docker-compose.yml like this one:

version: "2"
services:
  web:
    build: .
    command: "/start-reload.sh"
    volumes:
      - ./app:/app
    ports:
      - "8001:80"
    environment:
      - GOOGLE_APPLICATION_CREDENTIALS=/app/credentials.json
Here the start command is replaced with one that reloads on every code change,
and the code is mounted as a volume into the container, so every change in app is picked up.
Additionally credentials.json is set in an environment variable that the google-cloud-datastore client uses automatically.
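The variable in question is GOOGLE_APPLICATION_CREDENTIALS, which the Google client libraries read on startup. A stdlib-only sketch of the lookup the compose file relies on (the helper name is mine):

```python
import os

def credentials_path():
    """Return the service-account key file the Google client libraries
    would pick up; on Cloud Run the variable is unset and the client
    falls back to the service's built-in identity."""
    return os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")

# What docker-compose sets for local development:
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/app/credentials.json"
print(credentials_path())  # /app/credentials.json
```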


We need fastapi and of course the google-cloud-datastore library for Datastore access. This is our requirements.txt:

fastapi
google-cloud-datastore

The App

The main Python code is placed in the app directory:

import uuid

from google.cloud import datastore
from fastapi import FastAPI

app = FastAPI()
datastore_client = datastore.Client()


@app.get("/")
async def hello():
    return {"message": "Hello World"}


@app.get("/demo/")
async def demo():
    kind = "Task"
    uid = str(uuid.uuid4())
    task_key = datastore_client.key(kind, uid)

    # create and store a new entity
    task = datastore.Entity(key=task_key)
    task["description"] = f"{uid} needs a description"
    datastore_client.put(task)

    # retrieve the element back and return it
    key = datastore_client.key("Task", uid)
    return datastore_client.get(key)
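The key name used for each entity is just a random version-4 UUID rendered as a string. A quick stdlib sketch of what those key names look like:

```python
import uuid

# Each entity gets a random, effectively collision-free key name
uid = str(uuid.uuid4())
print(uid)  # e.g. '1b9d6bcd-bbfd-4b2d-9b5d-ab8dfbbd4bed'

# 36 characters in the canonical 8-4-4-4-12 grouping
assert len(uid) == 36 and uid.count("-") == 4
```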

Run locally

To run the code locally using docker-compose:

docker-compose up web

And surf to http://localhost:8001/ to see the status page. The demo is located at http://localhost:8001/demo/.

Now you should see a new entity in the Google Cloud Datastore.

Run on Google Cloud Run

Build the container on the Google Cloud. (Replace <PROJECT_ID> with your Google Cloud project ID)

gcloud builds submit --tag gcr.io/<PROJECT_ID>/datastore-demo

And run the build artifact on Google Cloud Run.

gcloud run deploy datastore-demo --image gcr.io/<PROJECT_ID>/datastore-demo --allow-unauthenticated

The last command returns the URL where the code is deployed, on a *.run.app domain.

Opening /demo/ on that https:// URL should add a new document to the datastore.
(XXXXXXXX is a random string generated by Google Cloud)

Use Google Cloud Run with FastAPI

For a web-service project idea that should only serve an Atom feed, I looked for an alternative to running a virtual server. Google Cloud Run seemed a good fit, so I created a free account and tried it.

The easiest way to use the Google Cloud is through its SDK. On Arch Linux install the community package; the SDK documentation shows how to do this on other Linux distributions and operating systems.

First step is to login:

gcloud init

And to configure a bit:

gcloud config set run/platform managed
gcloud config set run/region europe-west4

For cloud run we need a Dockerfile to run the code:

FROM tiangolo/uvicorn-gunicorn:python3.8-slim
RUN pip install --no-cache-dir fastapi
COPY ./app /app

The base Docker container used is tiangolo/uvicorn-gunicorn. And of course we need a bit of Python code to serve the API. The framework used is FastAPI, which is similar to Flask but built on Python 3 concepts. The code is placed in an app directory:

from fastapi import FastAPI

app = FastAPI()


@app.get("/")
async def hello():
    return {"message": "Hello World"}

Now build the container on the Google Cloud. (Replace <PROJECT_ID> with your Google Cloud project ID)

gcloud builds submit --tag gcr.io/<PROJECT_ID>/helloworld

And run the build artifact on Google Cloud Run.

gcloud run deploy hello1 --image gcr.io/<PROJECT_ID>/helloworld --allow-unauthenticated

The last command returns the URL where the code is deployed, on a *.run.app domain. Of course we want to run the code on our own domain. The Google Cloud can handle this, even with automatic SSL certificates.

First verify your domain:

gcloud domains verify <DOMAIN>

Then add the TXT entry shown to your DNS configuration. Additionally add a CNAME for the domain to use, for example:
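The hostname is yours to pick; assuming the (hypothetical) subdomain hello.example.com, the documented mapping target for Cloud Run is ghs.googlehosted.com:

```
hello.example.com.  CNAME  ghs.googlehosted.com.
```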


DNS may take a few minutes for the verification and CNAME registration to propagate. When this has happened, register the domain mapping:

gcloud beta run domain-mappings create --service hello1 --domain <DOMAIN>

Now the minimal FastAPI API is running there until I delete the Google Cloud Run service for it. 🙂