Google Cloud Storage cleanup

My storage usage has grown quite a bit since I started using Google Cloud Run for a growing number of applications over the last few months.
So I need to clean up a bit. Regularly.

If you have only a few applications, this can be done manually on the Google Container Registry website: https://console.cloud.google.com/gcr/.
But I want to automate this.

The first step is to find out what is there:

gcloud container images list --repository eu.gcr.io/$PROJECT_ID

This lists all services with container images in my European registry folder (eu.gcr.io).
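
If you want to script this later, those repository names can be collected in a shell loop. A minimal sketch, assuming the name field of the list output:

for image in $(gcloud container images list --repository "eu.gcr.io/$PROJECT_ID" --format='get(name)'); do
  echo "$image"
done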

Next list all images in one of the folders:

gcloud container images list-tags $IMAGE --limit=unlimited

$IMAGE is the full name of the folder, e.g. eu.gcr.io/$PROJECT_ID/helloworld

This returns a list of images with tags and timestamps, e.g.:

DIGEST        TAGS    TIMESTAMP
101211eecef5  latest  2021-01-19T14:01:04
83e3f7541cf5          2021-01-19T13:55:24
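
To see the oldest images first, the standard --sort-by list flag should work here as well. A sketch, assuming timestamp.datetime as the sort key:

gcloud container images list-tags $IMAGE --limit=unlimited --sort-by=timestamp.datetime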

To delete an image we need its SHA256 digest.

The same command as before, but with a --format flag at the end:

gcloud container images list-tags $IMAGE --limit=unlimited --format='get(digest)'

The result is only the sha256 digests, without tags or timestamps. That makes it quite hard to decide whether I want to delete an image. :(
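
To keep the timestamp next to each digest, both fields can be combined in one --format expression. A small sketch, assuming the timestamp.datetime field that list-tags exposes:

gcloud container images list-tags $IMAGE --limit=unlimited --format='table(digest, timestamp.datetime)'

For scripting, value(...) instead of table(...) drops the header row.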

Now delete one of the images; for safety reasons I chose the oldest one:

gcloud container images delete -q --force-delete-tags \
  "eu.gcr.io/$PROJECT_ID/helloworld@sha256:b280f4f858492e50176470c26edb9cd4903cf69dc78070c806340edc1a1c84bc"

Doing this manually feels wrong and is quite time-consuming.

But others have had this problem before.
A good starting point for me was this gist: https://gist.github.com/ahmetb/7ce6d741bd5baa194a3fac6b1fec8bb7.
I made it a bit more verbose, but kept the approach of deleting everything older than a given date.
Some forks improved the gist to keep a given number of images instead.
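
I won't walk through my whole script here, but a minimal sketch in the spirit of that gist could look like this. It assumes PROJECT_ID is set, uses eu.gcr.io as the registry, and CUTOFF_DATE is a made-up variable for the date before which images get deleted:

#!/usr/bin/env bash
set -euo pipefail

CUTOFF_DATE="2021-01-01"
REGISTRY="eu.gcr.io/${PROJECT_ID}"

# Loop over all repositories ("folders") in the registry.
for image in $(gcloud container images list \
    --repository "${REGISTRY}" --format='get(name)'); do
  echo "Checking ${image}"
  # Collect the digests of all images older than the cutoff date.
  for digest in $(gcloud container images list-tags "${image}" \
      --limit=unlimited \
      --filter="timestamp.datetime < '${CUTOFF_DATE}'" \
      --format='get(digest)'); do
    echo "Deleting ${image}@${digest}"
    gcloud container images delete -q --force-delete-tags "${image}@${digest}"
  done
done

Filtering on timestamp.datetime lets gcloud do the date comparison, so the script itself only has to loop and delete.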