Docker on Linux, macOS
Docker is the de facto industry standard for packaging applications into containers. A container bundles the application with all of its dependencies, such as the language runtime, operating system libraries, and third-party packages.
This guide covers setting up a development environment with Docker. If you are looking for production-ready ARK Core Docker images, they are available on Docker Hub, but they are not meant for development purposes.
Step 1: Clone Core Repository
Let's clone the core repository, check out the latest develop branch, and run the initial `yarn setup` command. The `yarn setup` command leverages Lerna to clean, bootstrap, and build the core packages (including transpiling TypeScript). For more information, look into core's `package.json` file in the root folder.
```bash
git clone https://github.com/arkecosystem/core
cd core
git checkout develop
yarn setup # run Lerna to clean, bootstrap and build the core packages
```
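Before running the commands above, it can help to confirm the required tools are installed. A minimal sketch, assuming a POSIX shell; the `check_tools` helper name is hypothetical and not part of ARK Core:

```shell
#!/usr/bin/env sh
# Hypothetical helper: verify that required CLI tools are on PATH
# before cloning and building. Not part of ARK Core itself.
check_tools() {
  for tool in "$@"; do
    if ! command -v "$tool" >/dev/null 2>&1; then
      echo "missing required tool: $tool" >&2
      return 1
    fi
  done
  echo "all prerequisites found"
}

# The setup step needs git, node, and yarn:
# check_tools git node yarn
```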
Step 2: Generate the Docker Configurations
ARK Core includes several `docker-compose.yml` templates to ease development. They can be used to generate different configurations, depending on the network and token.
For instance, you could use this command (to be run from the core root folder):

```bash
yarn docker ark
```
This command creates a new directory (`docker`) that contains one folder per network. You can read more about generating Docker configurations here.
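As a rough sketch of how you might select the right generated folder in a script, assuming only the networks mentioned in this guide; the `config_dir` helper is illustrative, not part of the `yarn docker` output:

```shell
#!/usr/bin/env sh
# Illustrative helper: map a network name to its generated
# configuration folder. Only testnet and devnet appear in this guide.
config_dir() {
  case "$1" in
    testnet|devnet)
      echo "docker/development/$1"
      ;;
    *)
      echo "unsupported network: $1" >&2
      return 1
      ;;
  esac
}

config_dir testnet
```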
Once your basic Docker configurations are generated, the next sections explain two approaches to setting up your development environment with Docker:
- Run a PostgreSQL container while using NodeJS from your local environment
- Run a PostgreSQL container, build and run ARK-Core using a mounted volume
Approach 1: Containerize Only the Persistent Store
This approach runs a PostgreSQL container while using NodeJS from your local environment. This configuration is well suited when you are not developing ARK Core itself, but instead working with the API. Tearing down the PostgreSQL container resets the node's blockchain.
PostgreSQL runs in a separate container and its port is mapped to your localhost, so you should NOT have PostgreSQL running locally.
```bash
cd docker/development/$NETWORK # (NETWORK = testnet || devnet)
docker-compose up postgres
```
To run the Postgres container in the background, execute the following command:

```bash
docker-compose up -d postgres
```
In case you need to start with a clean database:

```bash
docker-compose down -v
docker-compose up -d postgres
```
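When scripting against the database, you may want to wait until Postgres actually accepts connections before starting Core. A minimal sketch; the `wait_for_postgres` helper and the use of `pg_isready` inside the container are assumptions, not part of ARK Core:

```shell
#!/usr/bin/env sh
# Hypothetical helper: poll the postgres service until it accepts
# connections, or give up after a number of retries. Assumes the
# compose service is named "postgres" and its image ships pg_isready.
wait_for_postgres() {
  retries="${1:-30}"
  i=0
  until docker-compose exec -T postgres pg_isready >/dev/null 2>&1; do
    i=$((i + 1))
    if [ "$i" -ge "$retries" ]; then
      echo "postgres did not become ready in time" >&2
      return 1
    fi
    sleep 1
  done
  echo "postgres is ready"
}
```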
Approach 2: Serve Core Node and Postgres as a Collection of Containers
This approach runs a PostgreSQL container and builds and runs ARK Core using a mounted volume. When a container is built, all files are copied inside it; it cannot interact with the host's filesystem unless a directory is explicitly mounted when the container starts. This configuration works well when developing ARK Core itself, because you do not need to rebuild the container to test your changes.
Along with the PostgreSQL container, you now also have a NodeJS container that mounts your local ark-core Git folder inside the container and installs all NPM prerequisites.
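The mounted-volume arrangement can be pictured with a compose fragment like the one below. This is an illustration only: the actual service names, image, and paths in the generated `docker-compose.yml` may differ.

```yaml
# Illustrative docker-compose fragment (not the generated file):
# the host's ark-core checkout is mounted into the NodeJS container,
# so edits on the host are visible inside the container immediately.
services:
  core:
    image: node:latest        # assumed base image
    volumes:
      - ../../../:/ark-core   # assumed mount of the host repo
    depends_on:
      - postgres
```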
```bash
cd docker/development/$NETWORK # (NETWORK = testnet || devnet)
docker-compose up -d
```
You can now enter your ark-core container and use NodeJS in a Docker container (Linux environment):

```bash
docker exec -it ark-$NETWORK-core bash
```
Need to start everything from scratch and make sure there are no cached containers, images, or volumes left? Just use the `purge_all.sh` script.
Development files/presets are not production ready. Official production ARK Core Docker images are available at Docker Hub.
Start core and play with Public API
You can jump to the Spinning Up Your First Testnet section and test your local Core server by following the link below: