Dip Dapp Doe — Anatomy of an Ethereum distributed fair game (part 3)

Yet, Test Driven Development

In the last article we dived into the architecture, design and the building blocks of our dapp’s frontend. For educational purposes, we even showed the integration of one of the blockchain transactions, but let’s not lose perspective. In TDD, we need to spec first and code later.

There are very nice tools that let us automate UI tests and even record them visually across different browsers. However, in dapp testing we are limited by two important caveats:

Only a few browsers support Web3
Browser support may broaden as new MetaMask plugins are released, but for now we mainly revolve around the Chrome engine and Gecko.

We can’t get programmatic access to control MetaMask/Web3
Allowing Javascript code to accept Ethereum transactions would be a huge security flaw, because any web site could steal our funds at once. However, that is what we need to do in order to test our code.

The last issue would have been a major drawback for any serious project’s workflow. Until now.

Dappeteer

Puppeteer is an official package from Google that allows us to programmatically control a Chromium instance from NodeJS on Linux, Windows and MacOS. However, how do we add the MetaMask plugin and tell it to accept transactions, if the plugin runs outside of our window?

That’s where Dappeteer comes into play! It is another NPM package that features a bundled version of MetaMask, tells Puppeteer to run with the plugin enabled and provides some wrapper methods to import accounts, accept transactions and even switch to a different network.

In our web folder:

$ npm i -D puppeteer dappeteer

Local blockchain

If you recall, in part 1 we developed our smart contracts by deploying and testing them in a local blockchain. Test cases waiting for every public transaction to be mined would take ages to complete.

However, in part 2 we demonstrated the integration with the public blockchain from the browser. What happens now? How can we use a local blockchain so that transactions are mined as fast as they were with Truffle?

The tool for this is Ganache CLI. It is another NPM package, which is part of the Truffle Framework and it is what we actually used under the hood in part 1.

$ npm i -D ganache-cli

If you run it now, you should see something like this:

Ganache CLI output

As you see, it generates random wallets with 100 ether, but it can be fully customized. Now we can mine immediate transactions without polluting the public blockchain with junk.

Workflow scripts

In normal web projects, you may be used to working with Webpack, started by a simple NPM script. However, in the current project we need to combine several components running at the same time.

What needs to happen when we run our E2E tests?

  • Start the Ganache local blockchain (in the background)
  • Recompile the contracts
  • Deploy them to the local blockchain
  • Write the contract instance’s address so that the frontend knows where to attach
  • Bundle the frontend files with Parcel
  • Start a local HTTP server for the static files (in the background, too)
  • Launch Chromium+Dappeteer and run the tests
  • Kill Ganache and the HTTP server
  • Forward Mocha’s exit code to the parent process, so that it can determine whether all tests passed

You are free to use any task runner that you like, but to me this is clearly a job for a shell script. To get the best of both worlds, I’d suggest using runner-cli along with a Taskfile.

$ [sudo] npm i -g runner-cli

Let’s create one:

$ run --new
? What template do you want to use?
Gulp file
NPM package
Makefile
? Shell script

Now edit the taskfile and add a function called test with the following set of commands (commented inline).

function test {
  echo "Starting ganache"
  ganache-cli --mnemonic "$(cat ./dev/mnemonic.txt)" > /dev/null &
  ganache_pid=$!

# ...

Here we start the server in the background (with the & at the end) and capture the process PID by assigning $! to ganache_pid. Also note that "$(cat ./dev/mnemonic.txt)" reads the contents of the mnemonic.txt file and passes them as a Ganache parameter. With that, everyone can import the same account.

  echo "Recompiling the contracts"
  cd ../blockchain
  ./taskfile build
  cd ../web

Here we go to the contracts folder and run another script that launches Solc to compile the contracts. Compilation can run concurrently with Ganache.

  echo "Deploying to ganache"
  node ./dev/local-deploy.js

This script is quite similar to blockchain/deploy/lib.js. Instead of deploying to the Ropsten network, it deploys the contracts to Ganache. It also stores the instance address into .env.test.local (we will see it later).

  echo "Bundling the web with NODE_ENV=test"
  NODE_ENV=test parcel build -d ./build --log-level 2 --no-source-maps src/index.html &
  parcel_pid=$!

Now that we know what address to attach to, we can tell Parcel to bundle from src to build with the appropriate environment variables in place. This can run in parallel with our next step:

  echo "Starting local web server"
  serve build -p 1234 &
  serve_pid=$!

This will simply start an HTTP server, leave it in the background and take note of its PID. Run npm install -D serve to add it to the project.

  echo "Running the tests"
  wait $parcel_pid
  mocha ./test/frontend.spec.js
  mocha_result=$?
  sleep 1

Here we wait for the Parcel process to complete, and when it does, we finally start our Mocha test cases. We keep Mocha’s exit code by reading $?, and a bit later we start to clean things up:

  echo "Stopping the servers"
  kill $ganache_pid
  kill $serve_pid
  exit $mocha_result
}

We kill the two background processes and finally exit with the status code returned by Mocha.

Ta da!

Environment data

At this point, if we run parcel -d ./build src/index.html, we will start a dev server on port 1234 with Web3 pointing to the Ropsten (test) network. But if we launch run test, then we expect a web site that connects to Ganache. How can we achieve that without touching any code?

Parcel allows us to use .env files and map the KEY=value lines into process.env.* variables. Let’s create a couple of files for our environments. In web/.env:

CONTRACT_ADDRESS=0xf42F14d2cE796fec7Cd8a2D575dDCe402F2f3F8F
WEBSOCKET_WEB3_PROVIDER=wss://ropsten.infura.io/ws
EXPECTED_NETWORK_ID=ropsten

These are the environment variables that will be used by default. That is, when compiling the web, we will connect to the public Ropsten network, expect MetaMask to be on this network too, and use the contract address where it is deployed.

However, when we are testing, we want those variables to look like below in web/.env.test.local:

CONTRACT_ADDRESS="--- LOCAL CONTRACT ADDRESS GOES HERE ---"
WEBSOCKET_WEB3_PROVIDER=ws://localhost:8545/ws
EXPECTED_NETWORK_ID=private

When NODE_ENV is set, Parcel will look for .env.$(NODE_ENV).local and inject those values instead of the default ones. So process.env.EXPECTED_NETWORK_ID will evaluate to private in testing and to ropsten otherwise.
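To see what this mapping amounts to, here is a simplified sketch of the KEY=value parsing (our own illustration, not Parcel’s actual implementation; the contract address is a made-up placeholder):

```javascript
// Simplified sketch of what Parcel's .env support does for us:
// every KEY=value line of the selected .env file becomes an entry
// that is substituted into process.env.* at bundle time.
const envText = [
  "CONTRACT_ADDRESS=0x0000000000000000000000000000000000001234", // hypothetical
  "WEBSOCKET_WEB3_PROVIDER=ws://localhost:8545/ws",
  "EXPECTED_NETWORK_ID=private"
].join("\n")

const env = {}
for (const line of envText.split("\n")) {
  const idx = line.indexOf("=")
  if (idx > 0) env[line.slice(0, idx)] = line.slice(idx + 1)
}

// env.EXPECTED_NETWORK_ID now holds "private", just like
// process.env.EXPECTED_NETWORK_ID will in the test bundle
```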

As we already mentioned, we need web/dev/local-deploy.js to replace the CONTRACT_ADDRESS placeholder by the contract’s local address. The main difference with the deployment script we already wrote in blockchain/deploy/lib.js is the following function:

const fs = require("fs")
const path = require("path")

function setContractAddressToEnv(contractAddress) {
    if (!contractAddress) {
        throw new Error("Invalid contract address")
    }
    const filePath = path.resolve(__dirname, "..", ".env.test.local")
    let data = fs.readFileSync(filePath).toString()
    const line = /CONTRACT_ADDRESS=[^\n]+/
    data = data.replace(line, `CONTRACT_ADDRESS=${contractAddress}`)
    fs.writeFileSync(filePath, data)
}
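We can sanity-check that replacement logic against an in-memory copy of the file (the address below is a made-up placeholder, standing in for whatever the local deployment returns):

```javascript
// Same regex replacement as in setContractAddressToEnv, applied to an
// in-memory copy of .env.test.local instead of the real file
const line = /CONTRACT_ADDRESS=[^\n]+/
let data = 'CONTRACT_ADDRESS="--- LOCAL CONTRACT ADDRESS GOES HERE ---"\n' +
  "WEBSOCKET_WEB3_PROVIDER=ws://localhost:8545/ws\n"

// hypothetical address, as a local Ganache deployment might return
data = data.replace(line, "CONTRACT_ADDRESS=0x0000000000000000000000000000000000001234")

// the placeholder line is rewritten; every other line is left untouched
```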

Every time we run test, the .env.test.local file is updated, and there is no code to modify.

What if I want to just develop on a version of the dapp using the local blockchain?

Two versions of the dev task are available on the web folder’s taskfile on GitHub.

  • run dev will provide an environment identical to the one used to run the tests, but leaving the browser open for you
  • run dev ropsten will simply run Parcel’s dev server and rely on Chrome/Firefox’s MetaMask as any user would do

Time for specs

Create the web/test/frontend.spec.js file and copy the spec content into it (the full file is in the GitHub repository).

Ready? Type run test and see the magic happen 🙂

Everything we need is ready. To keep the article readable, we will not elaborate on every use case. Feel free to check the spec file on GitHub.

What happens next?

We could approach the specs by starting a game, switching to another account, accepting the game, switching back again, and so on. However, this could lead to overcomplex specs that check a behaviour users will never experience in that way. We’d rather focus on one player’s experience and make sure that all relevant use cases are checked.

To simulate the actions of the opponent, we will launch the corresponding transactions from the NodeJS testing script. So the approach we will follow looks like this:

  • We tell Chromium to create a game
  • We launch a transaction from web/test/frontend.spec.js to accept the game from accounts[1]
  • Chromium confirms
  • We tell Chromium to mark one position
  • We make a transaction from the opponent’s account to mark another position
  • Repeat the process until we reach a draw
  • We check that the cells have the appropriate state and that the game ends in a draw
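Stripped of the browser plumbing, the alternating loop above boils down to something like this (the move sequence is made up for illustration and is not the one from the actual spec):

```javascript
// Board positions: 0 1 2 / 3 4 5 / 6 7 8.
// Even turns belong to the Chromium-driven player, odd turns to the
// opponent playing via direct transactions from the spec file.
const moves = [0, 1, 2, 4, 3, 5, 7, 6, 8] // hypothetical sequence ending in a draw
const board = Array(9).fill(null)

moves.forEach((pos, turn) => {
  board[pos] = turn % 2 === 0 ? "X" : "O"
})

// Once every cell is filled and nobody has three in a row,
// the game ends in a draw
const isDraw = board.every(cell => cell !== null)
```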

So what would such a use-case test look like?

Writing UI specs like this can be slow at the beginning, but the effort pays off as soon as you have simulated 5 complete games in less than a minute.

A few things to note:

  • Some assertions need to be delayed a bit, so that the frontend receives events and UI components respond
  • The amount of time to delay may vary, depending on the environment speed
  • We have added HDWalletProvider to reuse the same mnemonic, get the second account available and let the opponent play from it
  • We have created a couple of helper functions to encapsulate repetitive tests, and will probably add more as we test more use cases
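To illustrate the first two points, a generic retry helper could look like this (a sketch with our own names and defaults, not the exact helper from the repository):

```javascript
// Wait between retries so the frontend has time to receive contract
// events and update the UI before we assert on it
const delay = ms => new Promise(resolve => setTimeout(resolve, ms))

// Re-run a check until it passes or we run out of attempts;
// slower environments simply consume more attempts
async function eventually(check, { attempts = 10, interval = 200 } = {}) {
  for (let i = 0; i < attempts; i++) {
    if (await check()) return true
    await delay(interval)
  }
  return false
}
```

A spec can then await eventually(...) on a UI condition instead of hardcoding a single sleep, which makes the tests less sensitive to the environment’s speed.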

Given a spec like this, we code the behaviour of the frontend accordingly.

Let’s watch the movie of our test case playing against itself.

