Using Google Cloud Datastore with Google Cloud Run
Google Cloud Datastore (now part of Firestore, as "Datastore mode") seems like a good alternative for small document-based storage, for cases where PostgreSQL (my preferred database) is too big (and expensive) and SQL is not really needed.
I use the Google Cloud SDK installed on Arch Linux via the community package.
Source code
All source code for this demo is on GitHub: https://github.com/mfa/google-cloud-datastore-demo.
Setup
I chose "Datastore mode" rather than the newer "Native mode". The Datastore location for me is europe-west3. More on the difference between the two modes in the Google documentation: https://cloud.google.com/datastore/docs/firestore-or-datastore.
Authentication
For development we need a credentials.json file to access the Datastore. A service account that can only read/write to the Datastore seems to be the official way to do this. Create one as described in https://cloud.google.com/docs/authentication/production#auth-cloud-implicit-python and save the file as app/credentials.json.
This file should never be deployed or version controlled, so add it to .dockerignore and .gitignore.
In production, the Cloud Run instance runs inside the same Google Cloud project and doesn't need additional authentication.
Docker
The code will run in Docker on Google Cloud Run. The Dockerfile is very similar to the one in my previous blog post.
```dockerfile
FROM tiangolo/uvicorn-gunicorn:python3.8-slim
COPY requirements.txt /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt
COPY ./app /app
```
For development we use a docker-compose.yml like this one:

```yaml
version: "2"
services:
  main:
    build: .
    command: "/start-reload.sh"
    environment:
      PYTHONUNBUFFERED: 0
      GOOGLE_APPLICATION_CREDENTIALS: credentials.json
    volumes:
      - ./app:/app
    ports:
      - "8001:80"
```
The volume mount makes the local app directory visible inside the container, so code changes are picked up by the reload server. GOOGLE_APPLICATION_CREDENTIALS is set to credentials.json, an environment variable the google-cloud-datastore client automatically uses.
Requirements
We need fastapi and of course the Google client library for Datastore access. This is our requirements.txt:
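A minimal requirements.txt covering those two dependencies (unpinned here; the demo repository may pin exact versions) looks like:

```
fastapi
google-cloud-datastore
```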
The App
The main Python code is placed in the app directory and named main.py:

```python
import uuid

from google.cloud import datastore
from fastapi import FastAPI

app = FastAPI()
datastore_client = datastore.Client()


@app.get("/")
async def hello():
    return {"message": "Hello World"}


@app.get("/demo")
async def demo():
    kind = "Task"
    uid = str(uuid.uuid4())
    task_key = datastore_client.key(kind, uid)
    task = datastore.Entity(key=task_key)
    task["description"] = f"{uid} needs a description"
    datastore_client.put(task)
    # retrieve the element back and return it
    key = datastore_client.key("Task", uid)
    return datastore_client.get(key)
```
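Beyond the single-entity round trip, listing stored entities only needs a query. A hypothetical extra endpoint (not part of the demo repository; it reuses the app and datastore_client objects from main.py and needs the same credentials) could look like:

```python
# Hypothetical addition to app/main.py (not in the demo repo):
# list up to ten stored Task entities via a Datastore query.
@app.get("/tasks")
async def list_tasks():
    query = datastore_client.query(kind="Task")
    # Entities behave like dicts, so FastAPI can serialize them directly.
    return list(query.fetch(limit=10))
```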
Run locally
To run the code locally using docker-compose:
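The command itself is not shown here; assuming the docker-compose.yml from above, the standard invocation is:

```shell
# Build the image (if needed) and start the reload server on port 8001.
docker-compose up --build
```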
Then surf to http://localhost:8001/ to see the status page. The demo endpoint is located at http://localhost:8001/demo.
Now you should see a new entity in the Google Cloud Datastore.
Run on Google Cloud Run
Build the container on Google Cloud. (Replace <PROJECT_ID> with your Google Cloud project ID.)
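The build command is not shown here; the usual Cloud Build invocation for the image name used in the deploy step would be:

```shell
# Submit the current directory to Cloud Build and tag the resulting image.
gcloud builds submit --tag eu.gcr.io/<PROJECT_ID>/datastore-demo
```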
And run the build artifact on Google Cloud Run.
```shell
gcloud run deploy datastore-demo \
  --image eu.gcr.io/<PROJECT_ID>/datastore-demo \
  --allow-unauthenticated
```
The last command returns the URL where the service is deployed, on a *.run.app domain. Open

https://datastore-demo-XXXXXXXX.a.run.app/demo

and a new document should be added to the Datastore. (XXXXXXXX is a random string generated by Google Cloud.)