Sensor API with Django (Part 1)

Disclaimer: This post is not a Django tutorial. The Django documentation has a really good one.

About 9 years ago we built an API to save particulate sensor data using Django and PostgreSQL. I was asked to build something similar, but this time with a few learnings from the previous iteration. Back then we never thought it would get that big, so the database table structure couldn't cope with the load later. Today I would choose a much freer data scheme that needs far fewer joins. This may require (async) postprocessing for displaying on a map, but removes joins on the database. For authentication the ESP internal id together with an (optional) pin was used. This limits us to Espressif devices and may not be unique with other manufacturers or microcontrollers. This time I plan to generate a ULID per Sensor, not per Node. For authentication we will generate a random string per Sensor. Multiple secrets will be allowed, plus a comment field to make it easier to replace a secret later.
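The plan sketched above (a ULID per Sensor plus random per-Sensor secrets) could look roughly like this. The helper names and the stdlib-only ULID encoding are my assumptions; in a real project a library such as python-ulid would normally be used:

```python
import os
import secrets
import time

# Crockford base32 alphabet used by ULIDs (no I, L, O, U)
CROCKFORD = "0123456789ABCDEFGHJKMNPQRSTVWXYZ"

def new_ulid() -> str:
    """26-char ULID: 48-bit millisecond timestamp + 80 random bits."""
    value = (int(time.time() * 1000) << 80) | int.from_bytes(os.urandom(10), "big")
    chars = []
    for _ in range(26):
        chars.append(CROCKFORD[value & 0x1F])
        value >>= 5
    return "".join(reversed(chars))

def new_sensor_secret() -> str:
    """Random, URL-safe secret stored per Sensor (multiple allowed)."""
    return secrets.token_urlsafe(32)
```

Because the timestamp sits in the high bits, ULIDs sort by creation time, which is convenient for primary keys.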

On the other hand I would do a lot of things the same way: Django is still the most boring choice. PostgreSQL (or maybe TimescaleDB) is still my choice for the database. The hosting will be on Fly.io and the PostgreSQL instance managed by Supabase. This should be enough for quite a while and always leaves the option to scale either with Fly/Supabase or to move everything to a Hetzner VPS and pay with your spare time instead of money.

This post describes the initial setup of the Django service on Fly.io, using Supabase for PostgreSQL and Mailgun for sending emails (i.e. password reset, or later password-less login). A second post will describe the models and views used.

The full code is in

The first step is to set up a Django project. This is already well documented, for example in the docs. I followed those steps and deployed a first version using SQLite as the database. Without a volume this database gets replaced on every deployment, so this clearly needs to be solved. Rather than adding a volume and sticking with SQLite, and then having to migrate the SQLite database to PostgreSQL later, we will use Supabase from the start.

But some important steps first. The SECRET_KEY in the settings should never be in the repository, so we deploy a new one with fly secrets set SECRET_KEY=insert-some-secret-value-here and read it from the environment in the settings. For static files (i.e. the css/js of the admin) I added whitenoise. We use a script as CMD at the end of the Dockerfile to start gunicorn. Before that we add the collectstatic command to prepare the static files.
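In the settings this can look like the following sketch (assumptions: whitenoise is wired in via its middleware as its docs suggest; the fallback key is for local development only):

```python
import os

# never commit the real key; `fly secrets set SECRET_KEY=...` provides it in production
SECRET_KEY = os.environ.get("SECRET_KEY", "dev-only-insecure-key")

# whitenoise serves the files collected by `collectstatic` directly from the app
MIDDLEWARE = [
    "django.middleware.security.SecurityMiddleware",
    "whitenoise.middleware.WhiteNoiseMiddleware",
    # ... the remaining default Django middleware ...
]
STATIC_ROOT = "staticfiles"
```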

To get a PostgreSQL database we create a new project in the Supabase dashboard. The password set on creation is the password used to log into the database, so choose a long one! Additionally we need the HOST and the USER, which can be found in the database settings. On this database settings page we additionally need to download the client certificate and enable SSL enforcing. Before we can use the host, password and user we need to hand them to the app by setting the secrets, i.e. fly secrets set SUPABASE_HOST=insert-your-host-here, and the same for SUPABASE_PASSWORD and SUPABASE_USER.

In the Django settings file we set the database like this:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'HOST': os.environ.get('SUPABASE_HOST'),
        'PASSWORD': os.environ.get('SUPABASE_PASSWORD'),
        'PORT': 5432,
        'USER': os.environ.get('SUPABASE_USER'),
        'OPTIONS': {
            'sslmode': 'verify-ca',
            # path to the client certificate from supabase: database/settings
            'sslrootcert': '',
        },
    },
}

We add the migrate command to the start script, the same way we did for collectstatic.

The current start script looks like this:


python manage.py collectstatic --no-input
python manage.py migrate --no-input

gunicorn --bind :8000 --workers 2 sensor_api.asgi:application -k uvicorn.workers.UvicornWorker

And the database with all tables created at Supabase looks like this:


To get the admin login to work we need to add a user and a setting to fix CSRF for the setup. The CSRF issue is fixed by adding SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https') to the settings. As described in the Django documentation, we need this because otherwise Django's is_secure() check fails because of the proxy.
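As a settings sketch (the CSRF_TRUSTED_ORIGINS entry is my assumption for an app served under a fly.dev domain; replace it with your real domain):

```python
# trust the proxy's forwarded-proto header so request.is_secure()
# returns True behind the fly proxy
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")

# assumption: your app's https origin must be listed for CSRF checks
CSRF_TRUSTED_ORIGINS = ["https://your-app.fly.dev"]
```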

The other thing we need is an admin user. This is described in the Django getting started guide and can be done via

fly ssh console --pty -C 'python /code/manage.py createsuperuser'

Again choose a long password here!

To get password reset emails we want to enable sending emails. In 2024 email is no fun anymore, so don't host your own mailserver unless you know what you are doing. I use Mailgun because it is easy to use and the free plan should be enough for a while. For email we again need some secrets; this time not only the password, but the username too. So the same fly secrets set as described above for EMAIL_HOST_PASSWORD and EMAIL_HOST_USER, and these lines in the settings:

if os.environ.get('EMAIL_HOST_PASSWORD'):
    EMAIL_HOST = 'smtp.mailgun.org'  # or smtp.eu.mailgun.org for EU domains
    EMAIL_PORT = 587
    EMAIL_HOST_USER = os.environ.get('EMAIL_HOST_USER')
    EMAIL_HOST_PASSWORD = os.environ.get('EMAIL_HOST_PASSWORD')
    EMAIL_USE_TLS = True

The page to reset the password is still missing and we will add it later, but we want to check now whether email sending works. The Django shell is started in a similar way as we added the superuser before: fly ssh console --pty -C 'python /code/manage.py shell'. In the shell we send an email with Django's send_mail:

from django.core.mail import send_mail
send_mail("Testsubject", "Here is the message.", "from@your-domain.example", ["you@example.com"], fail_silently=False)

Of course choose the email addresses ("from" and "to") to something you can receive and own.

This concludes Part 1.

Get a Counties List

I need a list of counties of Germany (Landkreise), of France (Arrondissements) and of some other countries. Such lists are in Wikipedia in form of a table, i.e. Liste der Landkreise in Deutschland (no English version available) or List of arrondissements of France.

To extract the table, Convert Wiki Tables to CSV works really well. Of course every country has its special cases, i.e. in Germany there is "Aachen" and "Städteregion Aachen". I manually removed all cities that are part of a larger region for the German Landkreise.

But is there a way to get a better list out of Wikidata? I built a minimal SPARQL query using the query builder:


Which results in this SPARQL query:

SELECT DISTINCT ?item ?itemLabel WHERE {
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE]". }
  ?item p:P31 ?statement0.
  ?statement0 (ps:P31/(wdt:P279*)) wd:Q106658.
}

The resulting list of districts has 690 elements, which is clearly too many: Germany currently has 294 rural districts. There are too many because historical districts are included too, i.e. Borken or Zeitz district (GDR). A lot of them have properties marking them as former districts, i.e. "dissolved, abolished or demolished date" or "replaced by", but a SPARQL query filtering all of them feels like too much trouble. Another way would be to download the list of 690 entries and filter in code on the fields that mark former districts.
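A minimal sketch of that filter-in-code idea (assumptions: the entries were downloaded from the Wikidata API together with their claims; the property ids P576 "dissolved, abolished or demolished date" and P1366 "replaced by" mark former districts):

```python
# Wikidata properties that mark an item as a former district
FORMER_DISTRICT_PROPS = {"P576", "P1366"}

def is_current_district(claims: dict) -> bool:
    # an item counts as current if none of the "former" properties are present
    return not (FORMER_DISTRICT_PROPS & claims.keys())

def filter_current(items: list) -> list:
    # keep only items without any "former district" marker
    return [item for item in items if is_current_district(item.get("claims", {}))]
```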

I will revisit the wikidata-query-and-filtering-with-code idea, if the manually edited wikipedia table extract has too many issues. For now I will keep going with the filtered wikitable extract.

Compare the temperature measurement of the Pico

The Raspberry Pi Pico has an internal temperature sensor. The measurement works by reading ADC channel 4, which is connected to the internal sensor, and converting the measured voltage to a temperature. I want to compare the internal sensor with a BME280 connected via I2C. To record the data I will poll the temperature of both sensors with Homeassistant.

First we need to read the internal sensor on the Pico with MicroPython. For instructions on how to get MicroPython onto the Pico and use the REPL see a previous post. Important: because we want to use WiFi, use the RPI_PICO_W version.

Using the REPL to see the internal temperature:

import machine
adc_voltage = machine.ADC(4).read_u16() * (3.3 / (65536))
temp = 27 - (adc_voltage - 0.706) / 0.001721
print(f"Temperature: {temp:2.1f} °C")

Next, the same for an I2C sensor. I connected the BME280 to GP20 and GP21. There are other options, i.e. GP0 and GP1. The code to see which I2C devices are connected:

import machine
i2c = machine.I2C(0, sda=machine.Pin(20), scl=machine.Pin(21), freq=100000)
print([hex(i) for i in i2c.scan()])

This returns for the BME280: ['0x76'].

We don't want to implement the protocol for a BME280 ourselves, so we use a library that does that. I chose the one from robert-hh on GitHub. Using rshell, copy the file to the Pico with cp bme280_float.py /pyboard. Now we check whether this worked, again via the REPL:

import machine
import bme280_float as bme280
i2c = machine.I2C(0, sda=machine.Pin(20), scl=machine.Pin(21))
bme = bme280.BME280(i2c=i2c)
print(bme.read_compensated_data()[0])

This should return a reasonable temperature value in Celsius as float.

Before adding a webserver to the Pico we need a network connection via WiFi. This is pretty standard and well documented (the tutorial is for the ESP8266, but this works for the Pico, too):

import network
nic = network.WLAN(network.STA_IF)
nic.active(True)
nic.connect('your-ssid', 'your-key')
# check if this worked:
print(nic.ifconfig())
# returns a tuple of:
# ('<ip-address>', '<netmask>', '<gateway>', '<dns>')

Next is a webserver running on the Pico. There are lots of alternatives to writing our own minimal webserver, so we can choose from the more or less feature-complete ones out there. I chose microdot because it feels like Flask/FastAPI. It clearly has too many features and is quite big for such a small problem, but I like the interface and the future possibilities. We only need the microdot.py file, so we copy it the same way as the bme280 module in rshell: cp src/microdot/microdot.py /pyboard.

Now a simple example in the REPL (connect to the WiFi first!):

from microdot import Microdot
app = Microdot()

@app.route('/')
async def index(request):
    return "Hello, world!"

app.run(port=80)

Copy this to the REPL in 3 steps to get the indentation correct for the index function. Then point your web browser at http://<ip-address-of-your-pico>/ and you should see a "Hello, world!".

We now have all parts needed to run a webserver on the Pico with two temperature sensors polled by Homeassistant. Save the following code as main.py and copy it via rshell to the Pico: cp main.py /pyboard. After a reboot the Pico should answer with two temperature values as json on http://<ip-address-of-your-pico>/.

import bme280_float as bme280
import machine
import network
from microdot import Microdot

app = Microdot()

def connect_wifi():
    nic = network.WLAN(network.STA_IF)
    nic.active(True)
    # set your WiFi here:
    nic.connect('your-ssid', 'your-key')

def get_internal_temp():
    adc_voltage = machine.ADC(4).read_u16() * (3.3 / 65536)
    return 27 - (adc_voltage - 0.706) / 0.001721

def get_bme280_temp():
    i2c = machine.I2C(0, sda=machine.Pin(20), scl=machine.Pin(21))
    bme = bme280.BME280(i2c=i2c)
    return bme.read_compensated_data()[0]

@app.route('/')
async def index(request):
    return {
        "internal_temperature": get_internal_temp(),
        "bme280_temperature": get_bme280_temp(),
    }

connect_wifi()
app.run(port=80)

Adding debug=True to the app.run call didn't work for me when running outside of rshell. So be aware that printing may block standalone startup.

And finally the Homeassistant setting to poll this new sensor. This is added to the configuration.yaml:

rest:
  - resource: http://<ip-address-of-your-pico>/
    scan_interval: 60
    sensor:
      - name: "Pico internal temperature"
        value_template: "{{ value_json.internal_temperature }}"
      - name: "Pico BME280 temperature"
        value_template: "{{ value_json.bme280_temperature }}"

In my Homeassistant dashboard the two sensors are shown like this:


A few days of both sensors shown using InfluxDB/Grafana:


The internal sensor graph jitters a lot and is clearly a few degrees off. The jittering could be solved by windowing the query in InfluxDB, and the temperature error by adjusting the formula. But on the positive side, both curves behave the same. So if only a temperature delta is needed, the internal sensor may be good enough.

The Flux code to get the graph:

from(bucket: "home_assistant")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["entity_id"] == "pico_bme280_temperature" or r["entity_id"] == "pico_internal_temperature")
  |> filter(fn: (r) => r._field == "value")
  |> keep(columns: ["_time","entity_id", "_field", "_value"])
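The windowing mentioned above could look like this: a sketch of the same query with a mean over 10 minute windows appended (the window size is my assumption, pick one that matches your dashboard range):

```
from(bucket: "home_assistant")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["entity_id"] == "pico_internal_temperature")
  |> filter(fn: (r) => r._field == "value")
  |> aggregateWindow(every: 10m, fn: mean, createEmpty: false)
```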