Raspberry Pi Pico with MicroPython

After using ESPHome, I want to experiment more with the Picos without going back to programming in C. So it is MicroPython or CircuitPython. I chose MicroPython because it seems to have more features (e.g. Bluetooth).

First, download a current firmware image from https://micropython.org/download/RPI_PICO/ and install it by holding the BOOTSEL button on the Pico while connecting it to USB. The Pico then shows up as a USB drive; copy the UF2 file onto it and the Pico reboots into MicroPython when done.
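On Linux the Pico in boot mode typically mounts as a drive called RPI-RP2, so the copy can also be done from the shell. The mount point and the exact file name below are assumptions, adjust them to your system and download:

# file name and mount point are assumptions, adjust to your setup
cp RPI_PICO-*.uf2 /run/media/$USER/RPI-RP2/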

On my Arch Linux machine I need sudo to write to /dev/ttyACM0. To fix this, add your user to the uucp group:

sudo usermod -a -G uucp $USER

This is Arch Linux specific; on Ubuntu/Debian the group is typically dialout. The group change only takes effect after logging out and back in.
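For completeness, the Debian/Ubuntu equivalent would be the same command with the other group name:

sudo usermod -a -G dialout $USER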

The next step is to get a REPL and type in some Python. For this I used screen, out of old habit from years ago:

screen /dev/ttyACM0

This small snippet, pasted into the REPL, toggles the onboard LED once per second:

from machine import Pin, Timer
led = Pin(25, Pin.OUT)  # GPIO 25 is the onboard LED on the (non-W) Pico
timer = Timer()
timer.init(freq=1, mode=Timer.PERIODIC, callback=lambda timer: led.toggle())  # toggle at 1 Hz

More on timers can be found in the MicroPython docs.
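As a small variation, the same Timer class also supports a one-shot mode. This is only a sketch I have not run on my Pico; it should turn the LED on and switch it off again after five seconds:

from machine import Pin, Timer

led = Pin(25, Pin.OUT)
led.on()
# ONE_SHOT fires the callback exactly once, period is in milliseconds
Timer().init(period=5000, mode=Timer.ONE_SHOT, callback=lambda t: led.off())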

Working in the REPL is nice, but actual files are the way to go. But how do the files get onto the Pico? Because I am late here and the MicroPython hype was years ago, the Internet is full of now-dead instructions. One tool that still seems to be maintained is rshell. It can be installed with pip, but I installed the Arch Linux package.

When started, rshell connects automatically to /dev/ttyACM0. To get the same REPL as before with screen, type repl inside rshell. A plus here: Ctrl-X exits the REPL, which was more trouble in screen.
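If the automatic detection does not pick the right device, the port can be given explicitly with rshell's -p/--port option:

rshell -p /dev/ttyACM0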

After the REPL success, let's use a file to do the same. Copy the code into a file on your PC and name it blink.py. In rshell, copy the file to the Pico:

cp blink.py /pyboard/main.py

The file main.py runs automatically when the Pico starts. A boot.py would run even earlier, but it may interfere with the REPL, so I chose not to add my own boot.py.

Verify that the file is there (still in rshell):

ls /pyboard

After disconnecting and reconnecting, the Pico now blinks the LED. Once rshell is connected again, the blinking stops and the filesystem can be changed again.

Thanks to https://blog.martinfitzpatrick.com/using-micropython-raspberry-pico/ for a good rshell/Pico introduction.

Home Assistant, InfluxDB 2.x and Grafana

I installed Grafana the same way I installed InfluxDB (described in a post last year): via the Arch Linux package, followed by activating the systemd service. With default settings Grafana listens on http://ip-address:3000; the initial login is admin/admin and the password has to be changed on first login.
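For reference, the install boils down to roughly this (package and service name as I remember them, verify on your system):

sudo pacman -S grafana
sudo systemctl enable --now grafana.service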

First the database connection. I of course want to use InfluxDB, and I chose "Flux" as the query language. The only things in the dialog that need to be filled in are the URL (http://localhost:8086) and the InfluxDB Details.

To fill these in we need the org, the bucket and a token. For this I used the influx CLI:

# get the orgs
influx org list

# get the buckets
influx bucket list

# create a readonly token for grafana
influx auth create --org hass --read-bucket 62302b4f139a4971 --description grafana-ro

Next is creating a dashboard with visualizations. The Flux language is a bit strange and quite different from InfluxQL.

So we explore what data we have by activating the table view and, for example, listing the available measurements:

import "influxdata/influxdb/schema"
schema.measurements(bucket: "home_assistant")

A lot of sensors are stored under their unit of measurement, e.g. "°C" or "%", others under their name, e.g. "binary_sensor.madflex_de" (uptime monitor), "weather.forecast_home" or "zone.home". For more on schema exploration see https://docs.influxdata.com/influxdb/cloud/query-data/flux/explore-schema/.
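Along the same lines, the schema package can list the values of a tag, for example all entity_id values in the bucket (note that schema.tagValues() only looks at the last 30 days by default):

import "influxdata/influxdb/schema"
schema.tagValues(bucket: "home_assistant", tag: "entity_id")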

I deployed a second sensor in the kitchen that uses ESPHome (see the previous blog post) and now I want to see how much they differ. I expect a bit of difference because of the location, but otherwise a similar curve. This query shows the temperature of the sensors, filtered by entity_id so that exactly the two sensors in the kitchen are selected:

from(bucket: "home_assistant")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "°C")
  |> filter(fn: (r) => r._field == "value")
  |> filter(fn: (r) => (r.entity_id == "zero6_temperature" or r.entity_id == "bme280_temperature_2"))
  |> keep(columns: ["_time","entity_id", "_field","_value"])

img1

Another example is a graph of the upload/download speed from the Speedtest integration:

from(bucket: "home_assistant")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "Mbit/s")
  |> filter(fn: (r) => r._field == "value")
  // add windowing
  |> window(every: 1d)
  |> mean()
  // duplicate _time so we can plot again
  |> duplicate(column: "_stop", as: "_time")
  |> keep(columns: ["_time","entity_id", "_field", "_value"])

The plot shows all speedtest data I currently have, averaged per day with mean() to remove the jitter over the very long timeframe.

img2
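The window()/mean()/duplicate() dance can probably be written more compactly with aggregateWindow(), which does the windowing, the aggregation and the _time handling in one step. This is a sketch I have not verified against my data:

from(bucket: "home_assistant")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "Mbit/s")
  |> filter(fn: (r) => r._field == "value")
  // windows per day, applies mean() and sets _time to the window boundary
  |> aggregateWindow(every: 1d, fn: mean, createEmpty: false)
  |> keep(columns: ["_time","entity_id", "_field", "_value"])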

And a third one, the status of a binary sensor -- choosing one that actually changes: binary_sensor.home_assistant_versions_update_available:

from(bucket: "home_assistant")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "binary_sensor.home_assistant_versions_update_available")
  |> filter(fn: (r) => r._field == "value")
  |> keep(columns: ["_time","entity_id", "_field","_value"])

A time series plot didn't work here, but a bar plot does:

img3

The value of update_available seems to be pushed irregularly, so this plot is probably not very helpful.

Overall, Grafana is a cool add-on to InfluxDB. The "new" InfluxDB query language Flux is very much not SQL and I don't really like it, probably because of my 25+ years of SQL experience.

ESPHome: Raspberry Pi Pico W

Another device I have lying around is a Raspberry Pi Pico W, which is the version with WiFi and Bluetooth. Running ESPHome on this small and cheap microcontroller is not mature yet, but it works.

I followed some of the instructions from Koen Vervloesem in his post about the Pico and ESPHome. The wizard still doesn't work, and I didn't want to use the dashboard, so I started copy/pasting a YAML for my Pico W, which looks like this:

esphome:
  name: pico2

rp2040:
  board: rpipicow
  framework:
    platform_version: https://github.com/maxgerhardt/platform-raspberrypi.git

api:
  password: ""

wifi:
  ssid: !secret wifi_ssid
  password: !secret wifi_password

i2c:
  sda: 20
  scl: 21

sensor:
  - platform: bme280
    temperature:
      name: "BME280 Temperature"
    pressure:
      name: "BME280 Pressure"
    humidity:
      name: "BME280 Humidity"
    address: 0x76

Most of the setup is the same as in the previous post for an ESP32. The differences are the board and the I2C pins. There is no automatic upload yet because the Pico works a bit differently: in boot mode (hold the button while connecting to the PC) the Pico shows up as a drive. To change the firmware, a UF2 file needs to be copied onto the device, and when finished the Pico reboots. To get this file I used this Docker command and copied the resulting file to the drive:

docker compose run --rm esphome compile pico2.yaml
cp config/.esphome/build/pico2/.pioenvs/pico2/firmware.uf2 /run/media/mfa/RPI-RP2/

After the automatic reboot the Pico is discovered by Home Assistant and everything works the same way as for an ESP32.

img1