Setting up a new local Fauna cluster using Docker
Every Oracle DBA has a RAC setup story to tell, whether it's about the time it took, the patches they had to install midway, or the database that refused connections for no apparent reason. Having lived through these experiences over the years, the very idea of setting up a database cluster makes me assume it cannot be straightforward. So when I embarked on setting up my first Fauna cluster during my second week on the job, I was a bit apprehensive and expected it to take hours. Oracle RAC aside, even Cassandra and Mongo gave me a hard time. But it turned out that setting up Fauna is really simple and can be done within a few minutes. My DBA friends, are you listening?!
As with most new databases these days, I decided to set up a cluster on my laptop using Docker. Docker provides a nice way to keep things very simple and clean. My first step was to pull the Fauna container image from Docker Hub.
$ docker pull fauna/faunadb:latest
latest: Pulling from fauna/faunadb
be8881be8156: Pull complete
60f08eedb1d2: Pull complete
d4f58360b842: Pull complete
6a391283a674: Pull complete
e5fae5985ac7: Pull complete
eb00faac30ba: Pull complete
90e011c54f88: Pull complete
6529b57b5cf9: Pull complete
71a47929723d: Pull complete
3e67826a23e5: Pull complete
c62cc3207452: Pull complete
Digest: sha256:15fcf6e1daf31447fd8762c4d925d268f7138623f3e448602a33a3ba9efb9168
Status: Downloaded newer image for fauna/faunadb:latest

$ docker image list | grep -i fauna
fauna/faunadb    latest    d2f23397fcce    9 days ago    327MB
Now that the image is available, we can use it straight away to start the first node of the cluster.
$ docker run -d --rm --name faunadb -p 8443:8443 fauna/faunadb
47a75e91096149d9607d660eb81b29f87ed32659cc473dc70466976f2f590c4e
If you intend to write a lot of data and want it to persist between container shutdowns, you will want to map a local disk as a volume inside the container. Further, if you want to access the logs from your host OS, you will want to map the log folder. You can do all of this using the command below.
$ docker run --rm --name faunadb -p 8443:8443 \
    -v <host-directory-or-named-volume>:/var/lib/faunadb \
    -v <host-directory>:/var/log/faunadb \
    fauna/faunadb:<version>
Once the container is up and running, you can check the status of this node. We will log into the container and check the status with the admin tool:
$ docker exec -it 47a /bin/bash
root@47a75e910961:/faunadb# cd /faunadb/enterprise/
root@47a75e910961:/faunadb/enterprise# bin/faunadb-admin --key secret status
No configuration file specified; loading defaults...
Datacenter: NoDc
================
Status  State  WorkerID  Address     Owns    Goal    HostID
up      live   512       172.17.0.2  100.0%  100.0%  6bc66c3b-8a67-40a4-a06c-fdbe3bd281ad
Once the first node is up and running, open another terminal window and join two new nodes to it. Make sure to note the IP address assigned to the first node (172.17.0.2 in the status output above).
# Add the 2nd node
$ docker run -d --rm --name faunadb2 -p 8444:8443 fauna/faunadb --join 172.17.0.2
1e56362fdfede5f884abec8a5b9bc8050db013b483498a0fed107120f7458d71

# Add the 3rd node
$ docker run -d --rm --name faunadb3 -p 8445:8443 fauna/faunadb --join 172.17.0.2
921594173d98af5042b1f77312d321543cd3484926f754c22397452f433d8dea

$ docker ps
CONTAINER ID  IMAGE          COMMAND                  CREATED        STATUS        PORTS                                  NAMES
921594173d98  fauna/faunadb  "faunadb-entrypoint.…"   4 minutes ago  Up 4 minutes  7500-7501/tcp, 0.0.0.0:8445->8443/tcp  faunadb3
1e56362fdfed  fauna/faunadb  "faunadb-entrypoint.…"   5 minutes ago  Up 5 minutes  7500-7501/tcp, 0.0.0.0:8444->8443/tcp  faunadb2
47a75e910961  fauna/faunadb  "faunadb-entrypoint.…"   6 minutes ago  Up 6 minutes  7500-7501/tcp, 0.0.0.0:8443->8443/tcp  faunadb
After starting the two new nodes, we can check the status of the cluster again.
root@47a75e910961:/faunadb/enterprise# bin/faunadb-admin --key secret status
No configuration file specified; loading defaults...
Datacenter: NoDc
================
Status  State  WorkerID  Address     Owns   Goal   HostID
up      live   512       172.17.0.2  40.2%  33.6%  6bc66c3b-8a67-40a4-a06c-fdbe3bd281ad
up      live   513       172.17.0.3  42.6%  35.4%  cb2f3ff4-dda7-47c5-ade4-fef9e96ff146
up      live   514       172.17.0.4  17.2%  31.1%  44ecee55-b1a4-4eba-af40-0dda3e1daadc
So the cluster, a single replica with multiple nodes, is up and running. But keep in mind this has only been installed on my laptop to play with the database. Real Fauna clusters are installed across globally distributed data centers. If you want to get a feel for that architecture, here is a great post by John Miller.
Once the database is set up, we want to spin up the dashboard tool. To set it up, you can refer to its Git repository here.
I had already cloned the repository and done the install, so all I had to do was start the dashboard.
$ npm start

You can now view dashboard-base in the browser.

  Local:            http://localhost:3000/
  On Your Network:  http://10.0.1.158:3000/

Note that the development build is not optimized.
To create a production build, use npm run build.
The dashboard will open in a browser window and prompt for the secret that is specified as part of the image (the same secret we passed to faunadb-admin above).
Once you click the “Use Secret” button, you can access the dashboard to create databases, classes, etc.
As you can see, in a matter of minutes we were able to set up a cluster, access the dashboard, and start creating data in Fauna. If you want even faster access, you can try the always-on Fauna Serverless Cloud. It is super simple to get started, and you don’t have to worry about running or managing a database. We will also provide a Docker Compose file soon that will make your local setup even easier. Keep an eye on our Git repo for Docker.
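In the meantime, here is a rough, unofficial sketch of what such a Compose file might look like, built purely from the docker run commands used above. The service names are my own, and it assumes --join can resolve a Compose service hostname the same way it accepted an IP address earlier; the official file may well differ.

```yaml
# docker-compose.yml -- unofficial sketch mirroring the `docker run` steps
# above. Assumes `--join` resolves the faunadb1 service hostname.
version: "3"
services:
  faunadb1:
    image: fauna/faunadb:latest
    ports:
      - "8443:8443"
  faunadb2:
    image: fauna/faunadb:latest
    command: --join faunadb1
    ports:
      - "8444:8443"
    depends_on:
      - faunadb1
  faunadb3:
    image: fauna/faunadb:latest
    command: --join faunadb1
    ports:
      - "8445:8443"
    depends_on:
      - faunadb1
```

With a file like this, `docker-compose up -d` would replace the three separate `docker run` commands.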
In the coming blog posts, I will talk about how to get started with Fauna using Python, and also how to set up a multi-DC cluster with Docker.
If you enjoyed our blog, and want to work on systems and challenges related to globally distributed systems, serverless databases, GraphQL, and Jamstack, Fauna is hiring!