Meinsack with Datasette

At the end of 2015 we built a Django-based website to generate a calendar of our local recycling dates. We used Django, DjangoRestFramework and PostgreSQL because that was our hammer (and still is for a lot of things). Every year I imported the dates for the new year, but I never updated the core code of the project. Since then the router internals of DjangoRestFramework have changed, and we would have had to rewrite our code to use a newer version. Instead I decided to switch to Datasette.

Datasette fits this use case perfectly:

  • the API is read-only

  • the database is small and can be shipped as a file

  • ical is just a custom output renderer (see the sketch after this list)

  • the site can be hosted on Google Cloud Run without hassle
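
Such a renderer stays small. Here is a minimal sketch using Datasette's register_output_renderer plugin hook; the column names ("date", "fraction") and the calendar fields are assumptions, not the project's actual code:

```python
from datasette import hookimpl
from datasette.utils.asgi import Response


def render_ical(rows, columns):
    # Build a bare-bones VCALENDAR from the rows of the current query.
    # The column names used here are placeholders, not the real schema.
    lines = ["BEGIN:VCALENDAR", "VERSION:2.0", "PRODID:-//meinsack//EN"]
    for row in rows:
        lines += [
            "BEGIN:VEVENT",
            "DTSTART;VALUE=DATE:" + str(row["date"]).replace("-", ""),
            "SUMMARY:" + str(row["fraction"]),
            "END:VEVENT",
        ]
    lines.append("END:VCALENDAR")
    return Response("\r\n".join(lines), content_type="text/calendar; charset=utf-8")


@hookimpl
def register_output_renderer(datasette):
    # Registers ".ical" as an additional export format next to .json and .csv.
    return {"extension": "ical", "render": render_ical}
```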

Migration

For the migration I exported the PostgreSQL database using db-to-sqlite. Then I removed all tables that don't belong to the main app and stripped the main_ prefix from the remaining tables. To have one view with all the necessary data joined, I added a database view (see readme.md).
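
After an export along the lines of `db-to-sqlite "postgresql://localhost/meinsack" meinsack.db --all` (connection string and filename assumed), the cleanup and the view only take a few statements. A sketch; the dropped tables, the main_-prefixed table names and the columns of the view are placeholders, not the real schema:

```python
import sqlite3

conn = sqlite3.connect("meinsack.db")

# Drop Django bookkeeping tables that are not part of the main app.
for table in ("django_migrations", "django_session", "auth_user"):
    conn.execute(f"DROP TABLE IF EXISTS {table}")

# Strip the main_ prefix from the app's remaining tables.
for table in ("main_street", "main_district", "main_pickupdate"):
    conn.execute(f"ALTER TABLE {table} RENAME TO {table[len('main_'):]}")

# One view that joins everything the calendar endpoints need.
conn.execute(
    """
    CREATE VIEW IF NOT EXISTS dates_view AS
    SELECT street.name AS street,
           district.name AS district,
           pickupdate.date AS date
    FROM pickupdate
    JOIN district ON district.id = pickupdate.district_id
    JOIN street ON street.district_id = district.id
    """
)
conn.commit()
conn.close()
```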

One important part of the migration was that the URLs had to stay the same. This is achieved by redirect rules in the form of a Datasette plugin. The plugin also contains an additional renderer that provides the custom JSON export format used by the old API. The code of the plugin: old_api_compatibility.py
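
A sketch of how such a redirect can be wired up with the register_routes hook; the old and new URL patterns shown here are placeholders, not the project's actual routes:

```python
from datasette import hookimpl
from datasette.utils.asgi import Response


async def old_ical_redirect(request):
    # Redirect an old Django URL to the new Datasette-backed one.
    street = request.url_vars["street"]
    return Response.redirect(f"/meinsack/dates_view.ical?street={street}", status=301)


@hookimpl
def register_routes():
    # Each entry maps a URL regex (with named groups) to a view function.
    return [
        (r"^/api/v1/ical/(?P<street>[^/]+)/$", old_ical_redirect),
    ]
```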

The new frontend is based on Bulma CSS and doesn't use any jQuery. For autocompletion I used an HTML5 datalist with a bit of JavaScript to autofill it.

To keep my sanity for future updates I added some tests to ensure that the important endpoints stay the same. These tests are run via GitHub Actions on every push.
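
The tests can talk to the Datasette instance directly via datasette.client, without starting a server. A minimal sketch using pytest and pytest-asyncio; the database filename, the path and the query string are assumptions about the project layout:

```python
import pytest
from datasette.app import Datasette


@pytest.mark.asyncio
async def test_ical_endpoint_unchanged():
    # Spin up Datasette in-process and hit the ical endpoint (paths assumed).
    datasette = Datasette(["meinsack.db"])
    response = await datasette.client.get("/meinsack/dates_view.ical?street=Musterweg")
    assert response.status_code == 200
    assert "BEGIN:VCALENDAR" in response.text
```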

Running everything on Google Cloud Run takes only the publish command shown in the readme. No server to keep updated or monitored anymore.
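
That command is something along the lines of `datasette publish cloudrun meinsack.db --service meinsack`; the database filename and service name here are placeholders, the readme has the exact invocation.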

Next Steps

The data import still has to be ported to the new codebase, something I need to do before the end of 2020.

A nice addition would be to port the code that adds streets, districts and areas to the database. This is a prerequisite for adding more cities.