Raspberry Pi - use blinkt shield for status LEDs

A while ago I bought a Pimoroni blinkt shield. Today I started using it as a status LED.

First step is to install the library. For me - using archlinuxarm - the "simple" installer from the tutorial doesn't work, so I installed blinkt via pip:

pip install blinkt

Now test if everything works:

import time
import blinkt

while True:
    for i in range(8):
        blinkt.set_pixel(i, 0, 255, 0)
    # pixels are only updated on the strip after show()
    blinkt.show()
    time.sleep(0.1)

And finally a real example - showing the status of a github-action build on the first LED:

import time
import requests
import blinkt

while True:
    r = requests.get("https://github.com/mlugs/website/workflows/github%20pages/badge.svg")

    # r, g, b
    led_color = (255, 255, 0)  # yellow: API call failed

    if r.status_code == 200:
        if "passing" in r.text:
            led_color = (0, 2, 0)  # dim green: build passing
        else:
            led_color = (255, 0, 0)  # red: build failing

    blinkt.set_pixel(0, *led_color)
    blinkt.show()
    time.sleep(60)

If the build passes, the first LED is only a little green glow.
On API failure the first LED is yellow, and on build failure the LED is red.
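The status-to-color decision can also be pulled out into a small helper function, which makes it easy to test without hardware attached (a sketch; the name status_color is made up for this example):

```python
def status_color(status_code, badge_text):
    """Map the badge request result to an (r, g, b) tuple."""
    if status_code != 200:
        return (255, 255, 0)  # yellow: API call failed
    if "passing" in badge_text:
        return (0, 2, 0)      # dim green: build passing
    return (255, 0, 0)        # red: build failing


print(status_color(200, "passing"))  # (0, 2, 0)
print(status_color(404, ""))         # (255, 255, 0)
```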

To run this script on boot in the background we use systemd.
Save the systemd config file as: /etc/systemd/system/blinkt-status.service:

[Unit]
Description=blinkt LED status

[Service]
ExecStart=python /root/blinkt_github_actions.py

[Install]
WantedBy=multi-user.target


Enable the systemd service with: systemctl enable blinkt-status

Use pi camera on Raspberry pi with archlinux-arm

For Raspbian there is raspi-config to enable the camera on a raspberry pi. On a system without raspi-config the changes are:

  • Changes in /boot/config.txt:

initramfs initramfs-linux.img followkernel
cma_lwm= cma_hwm= cma_offline_start=

Now reboot for the change to get loaded.

  • To check if the camera is working:

/opt/vc/bin/vcgencmd get_camera

This should result in: supported=1 detected=1
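If you want to check this from a script, the vcgencmd output is easy to parse. A minimal sketch (camera_available is a made-up helper name):

```python
def camera_available(output: str) -> bool:
    """Parse "supported=1 detected=1" style output from vcgencmd."""
    fields = dict(part.split("=", 1) for part in output.split())
    return fields.get("supported") == "1" and fields.get("detected") == "1"


# in a real script the output would come from something like:
#   subprocess.check_output(["/opt/vc/bin/vcgencmd", "get_camera"], text=True)
print(camera_available("supported=1 detected=1"))  # True
print(camera_available("supported=1 detected=0"))  # False
```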

Finally shoot an image:

/opt/vc/bin/raspistill -o image.jpg --nopreview


This post is from 2020 and the camera configuration has changed a bit since then; see https://www.raspberrypi.com/documentation/computers/config_txt.html

Using Google Cloud Datastore with Google Cloud Run

Google Cloud Datastore (now Firestore in Datastore mode) seems a good alternative for a small document-based storage, for when PostgreSQL (my preferred database) is too big (and expensive) and SQL is not really needed.

I use the Google Cloud SDK installed on ArchLinux with the community package.


All sourcecode of this demo is on Github: https://github.com/mfa/google-cloud-datastore-demo.


I chose "Datastore mode" and not "Native mode". The Datastore location for me is europe-west3. More on the two modes in the Google documentation: https://cloud.google.com/datastore/docs/firestore-or-datastore.


For development we need a credentials.json file to access the Datastore. A service account that can only read/write the Datastore seems the official way to do this. So create one as described in https://cloud.google.com/docs/authentication/production#auth-cloud-implicit-python.

Save this file in app/credentials.json. This file should never be deployed or version controlled. For this add the file to .dockerignore and .gitignore.
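For example, both ignore files get the same entry:

```
app/credentials.json
```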

On production the Cloud Run instance is inside the same Google Cloud Project and doesn't need additional authentication.


The code will run in Docker on Google Cloud Run. The Dockerfile is very similar to the one in my previous blogpost.

FROM tiangolo/uvicorn-gunicorn:python3.8-slim
COPY requirements.txt /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt
COPY ./app /app

For development we use a docker-compose.yml like this one:

version: "2"
services:
  web:
    build: .
    command: "/start-reload.sh"
    volumes:
      - ./app:/app
    ports:
      - "8001:80"
    environment:
      - GOOGLE_APPLICATION_CREDENTIALS=/app/credentials.json
Here the start command is replaced with one that reloads on every code change.
And the code is mounted as volume into the container, so every change in app is seen.
Additionally, the path to credentials.json is set in an environment variable (GOOGLE_APPLICATION_CREDENTIALS) that the google-cloud-datastore client picks up automatically.


We need fastapi and of course the Google client library for Datastore access. This is our requirements.txt:

fastapi
google-cloud-datastore

The App

The main python code is placed in that app directory and named main.py:

import uuid

from fastapi import FastAPI
from google.cloud import datastore

app = FastAPI()
datastore_client = datastore.Client()


@app.get("/")
async def hello():
    return {"message": "Hello World"}


@app.get("/demo/")
async def demo():
    kind = "Task"
    uid = str(uuid.uuid4())
    task_key = datastore_client.key(kind, uid)

    # create and store a new entity
    task = datastore.Entity(key=task_key)
    task["description"] = f"{uid} needs a description"
    datastore_client.put(task)

    # retrieve the entity back and return it
    key = datastore_client.key(kind, uid)
    return datastore_client.get(key)

Run locally

To run the code locally using docker-compose:

docker-compose up web

And surf to http://localhost:8001/ to see the status page. The demo is located at http://localhost:8001/demo/.

Now you should see a new entity in the Google Cloud Datastore.

Run on Google Cloud Run

Build the container on the Google Cloud. (Replace <PROJECT_ID> with your Google Cloud project ID)

gcloud builds submit --tag eu.gcr.io/<PROJECT_ID>/datastore-demo

And run the build artifact on Google Cloud Run.

gcloud run deploy datastore-demo --image eu.gcr.io/<PROJECT_ID>/datastore-demo --allow-unauthenticated

The last command returns the URL where the code is deployed, with a *.run.app domain.

On https://datastore-demo-XXXXXXXX.a.run.app/demo/ a new document should be added to the Datastore.
(XXXXXXXX is a random string generated by Google Cloud)