How to develop and test with Google Cloud Datastore running locally?
If you have projects running on Google Cloud Datastore, you’ve probably wondered whether you could run the whole stack locally, so you can develop without any extra setup or external connections.
This is possible: Google provides an emulator for Datastore, and it’s really handy for local development and testing environments.
The problem we had
We’re working on several projects, mostly hosted on GCP, and we version our code on GitHub. Developers can easily set up projects because we try to make things easy for everyone with a Docker Compose stack.
What concerns me a bit is needing to connect to external services during development.
I saw it a bit as a defeat…
The most painful aspect is that developers cloning the repository will need credentials files to connect to these services.
Of course we don’t version our credentials, and you probably don’t either!
Several solutions exist for sharing credentials across a team, but that’s, once again, more steps in the process, and it makes the journey worse.
Luckily for us, there is always a solution! That solution is the Google Datastore emulator; you can read the official documentation here.
A complete Docker Compose stack you say?
As you can imagine, installing the service locally is not satisfying enough. It requires following the documentation (sadly, nobody reads docs D: and even less Google’s…) and working through all the potential dependency issues can become long and tedious.
As I said, we wanted a simple way to start working on projects: clone the repository -> start Docker Compose -> enjoy!
To keep it simple, I managed to build a Docker image (joemugen/datastore-dsui) bundling the Google Datastore emulator service with a handy Web interface, @streamrail/dsui.
With this image, we can simply add the following to our docker-compose.yml (in fact our docker-compose.development.yml, where we add the services required for development, but that could be the subject of a future post... let's keep it simple for now):
version: "3"
services:
  datastore-dsui:
    container_name: datastore-dsui
    image: joemugen/datastore-dsui
    environment:
      - DATASTORE_PROJECT_ID=project-test
      - DATASTORE_LISTEN_ADDRESS=0.0.0.0:8081
    ports:
      - "8081:8081"
      - "3000:3000"
Environment variables
DATASTORE_PROJECT_ID can be any string: here it's project-test, but it could be my-uber-development-app.
DATASTORE_LISTEN_ADDRESS is the address the emulator listens on, which means 0.0.0.0 can be used. The address must follow the HOST:PORT syntax, like 0.0.0.0:8081 in our example.
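The HOST:PORT convention above can be sketched with a tiny parser; parseListenAddress is a hypothetical helper, only here to illustrate the format the emulator expects:

```javascript
// A minimal sketch of the HOST:PORT syntax expected by
// DATASTORE_LISTEN_ADDRESS. parseListenAddress is a hypothetical
// helper, not part of the emulator or its client libraries.
function parseListenAddress(address) {
  const idx = address.lastIndexOf(':');
  if (idx === -1) {
    throw new Error(`Invalid listen address: ${address}`);
  }
  const host = address.slice(0, idx);
  const port = Number(address.slice(idx + 1));
  if (!host || Number.isNaN(port)) {
    throw new Error(`Invalid listen address: ${address}`);
  }
  return { host, port };
}

console.log(parseListenAddress('0.0.0.0:8081')); // { host: '0.0.0.0', port: 8081 }
```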
Ports:
8081 is the default port for the Datastore emulator. If it's working, you should see Ok with an HTTP 200 when accessing http://localhost:8081
3000 is the default port for the DSUI interface. It's a Web interface, meaning you can access it here: http://localhost:3000
The Docker image is around 400 MB. That’s a lot, but it doesn’t require installing the Node dependencies (needed for the @streamrail/dsui package): they are already baked in.
You might say it’s probably not the best solution, but after all we need to download those ~400 MB anyway. I will probably add a volume for node_modules
in the near future to share these files between datastore-dsui containers... If you have better ideas, I'll be glad to hear them :)
Please give me code that I can test quickly!
Below is an example based on the official Google code snippet, with some modifications, so you can give it a quick try.
In a new folder, run this command:
npm install @google-cloud/datastore
It will install the only dependency we need for this test.
Create a docker-compose.yml file
And paste the following configuration:
version: "3"
services:
  datastore-dsui:
    container_name: datastore-dsui
    image: joemugen/datastore-dsui
    environment:
      - DATASTORE_PROJECT_ID=project-test
      - DATASTORE_LISTEN_ADDRESS=0.0.0.0:8081
    ports:
      - "8081:8081"
      - "3000:3000"
Create an index.js file
And add the following code. It’s the official Google code snippet, which you can find here, but with some modifications.
// Imports the Google Cloud client library
const {Datastore} = require('@google-cloud/datastore');

// Creates a client
const datastore = new Datastore({
  namespace: 'ProjectTest'
});

async function quickstart() {
  // The kind for the new entity
  const kind = 'Task';
  // The name/ID for the new entity
  const name = 'aaa-bbbb-cccc-dddd';
  // The Cloud Datastore key for the new entity
  const taskKey = datastore.key([kind, name]);

  // Prepares the new entity
  const task = {
    key: taskKey,
    data: {
      description: 'Buy milk',
    },
  };

  // Saves the entity
  await datastore.save(task);
  console.log(`Saved ${task.key.name}: ${task.data.description}`);
}

quickstart();
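To double-check that the entity really landed in the emulator, a small read-back sketch can help. It assumes the Docker Compose stack above is running and that DATASTORE_EMULATOR_HOST and DATASTORE_PROJECT_ID are set (as in the npm script below); the env-var guard and messages are my own additions, not part of Google's snippet.

```javascript
// Reads back the Task entity saved by quickstart(), using the same
// key. Assumes the emulator from the Docker Compose stack is running;
// the env-var guard below is just a convenience so the script fails
// gracefully when the emulator address is not configured.
async function verify() {
  if (!process.env.DATASTORE_EMULATOR_HOST) {
    console.log('DATASTORE_EMULATOR_HOST is not set; start the emulator first.');
    return;
  }
  const {Datastore} = require('@google-cloud/datastore');
  const datastore = new Datastore({namespace: 'ProjectTest'});
  const taskKey = datastore.key(['Task', 'aaa-bbbb-cccc-dddd']);
  // Fetched entities expose their properties directly (no `data` wrapper)
  const [task] = await datastore.get(taskKey);
  console.log(task ? `Found: ${task.description}` : 'Entity not found');
}

verify();
```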
Add a run script in your package.json
To make it easier to test, we’ll create a quick script with the 2 mandatory environment variables (DATASTORE_EMULATOR_HOST and DATASTORE_PROJECT_ID). Of course, on your real projects, you won't go that way: it's a quick and dirty proof of concept.
{
  "dependencies": {
    "@google-cloud/datastore": "^6.3.1"
  },
  "scripts": {
    "start": "DATASTORE_EMULATOR_HOST=localhost:8081 DATASTORE_PROJECT_ID=project-test node index.js"
  }
}
Start Docker compose
When everything is done, first run docker-compose up (without the -d flag, so you can see what's happening the first time you run it), then try to access both http://localhost:8081 and http://localhost:3000. If everything is working well, you should see these two Web pages respectively:
Time to run the JavaScript code:
npm run start
You should see this result: Saved aaa-bbbb-cccc-dddd: Buy milk
That’s it! You can now implement this in your own development and testing projects :)