
Test Design Best Practices Using Mocha, Chai, Page Objects, and Linters


“We need automation.” It’s a phrase often stated by management. What does that mean exactly? Does it mean, “Write test code however you want as long as it automates the product’s functionality”? No. I think not! There are many aspects of writing test automation that are often overlooked and can cause issues later on. I have found that a majority of problems can be addressed by focusing on three areas: living documentation, simplification, and return on investment (ROI). Of course, there are other areas to consider, but with these suggestions I believe you will get the most out of your test framework and your testing best practices.

Whether you take a behavior driven development (BDD) or test driven development (TDD) approach, testing can become an arduous task. I have worked with many developers who prefer to write tests after the application code is written, and there is nothing wrong with that approach, as long as the tests are written well. In both TDD and BDD, the tests are written before the product code, so to write them effectively you must understand the behaviors of the feature under development. Many times there are few or no written requirements, and it is left to the developer to investigate and ask the product owner for details. Regardless of the testing approach your team takes, using Mocha as your test framework is a great choice.

Before we get into the details, it is important to understand how Mocha works. Why Mocha? I like frameworks that are simple to use, open source, flexible, and backed by a great support community. The standard out-of-the-box configuration of Mocha is built around describes, its, and hooks. `describe()` is merely for grouping, and you can nest describes as deep as necessary to clearly group your tests into suites. `it()` is a test case; there are usually several `it()` statements within a `describe()`. `before()`, `beforeEach()`, `after()`, and `afterEach()` are hooks that run at specific points, e.g. `before()` runs once before all of the tests in its `describe()`.
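For example, a minimal spec file might look like this (a quick sketch using a generic array example, not part of the PrizmDoc suites discussed below):

const assert = require('assert');

describe('Array#indexOf', function() {
  let list;

  before(() => { /* runs once, before all tests in this describe */ });
  beforeEach(() => { list = [1, 2, 3]; });  // runs before every test
  afterEach(() => { list = null; });        // runs after every test
  after(() => { /* runs once, after all tests in this describe */ });

  it('should return -1 when the value is not present', () => {
    assert.strictEqual(list.indexOf(4), -1);
  });
});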

 

Living Documentation

Test plans and test strategies are often written, read once, and then forgotten. A better approach is to transform your test cases into living documentation. Simply put, living documentation is the definition of your features, providing a common dialogue between your business and technical teams.

Take this basic example of using nested describes for consideration:

php.spec.js

describe("PrizmDoc - PHP Sample Viewer", function() { // main describe
  before(() => alert("Testing started – before all tests"));
  after(() => alert("Testing finished – after all tests"));
  beforeEach(() => alert("Before a test – enter a test"));
  afterEach(() => alert("After a test – exit a test"));

  describe("Redactions", function() { // nested describe
    it('should allow text redactions', () => { /* test code */ });
    it('should allow rectangular redactions', () => { /* test code */ });
  });

  describe("Annotations", function() { // nested describe
    it('should allow drawing of a circle annotation', () => { /* test code */ });
    it('should allow drawing of a rectangular annotation', () => { /* test code */ });
  });
});

This example covers a small number of features. However, if you know your product, simply reading it should make clear where testing is inadequate. I have nested describes within a main describe statement, which groups the tests and gives them clear organization. Since there are multiple samples (PHP, JSP, etc.), I will create separate JavaScript spec files accordingly: php.spec.js, jsp.spec.js, and so on.

For each sample, there are many test scenarios that can be performed on content, e.g. redactions, annotations, search, etc. Grouping or nesting gives the developer an easy way to locate a test suite, which makes troubleshooting and adding new tests easier. From a reporting perspective, it gives stakeholders enough information to see that the feature has appropriate test coverage.

Test Best Practices

Use nested describe statements, and write describe/it descriptions that are human readable. The test case should be understandable to everyone, regardless of how well they understand code. For example, if you write a single describe statement like this:

describe("sum", function()

It’s ambiguous, leaving the question “the sum of what?”, which forces the reader to dive into the details of the test. That is potentially a waste of time, as it might not even be the test case the person is looking for. Regardless of the skillset of the person looking at the test case, they should come away with a basic understanding of the ‘what’ and the ‘why’, but not necessarily the ‘how’.
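To illustrate (the names here are hypothetical and only for comparison), compare an ambiguous description with one that reads like a specification:

// Ambiguous: the sum of what?
describe("sum", function() {
  it("returns 3", () => { /* test code */ });
});

// Clear: readable without opening the test code
describe("Invoice totals", function() {
  it("should sum line item prices into a grand total", () => { /* test code */ });
});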

Chai is an easily readable assertion library that pairs nicely with the Mocha test framework. It offers three assertion styles: should, expect, and assert. I particularly like expect.

var chai = require('chai');
var expect = chai.expect;
// sample values, for illustration only
var foo = 'bar';
var tea = { flavors: ['chai', 'green', 'mint'] };
expect(foo).to.be.a('string');
expect(foo).to.equal('bar');
expect(foo).to.have.lengthOf(3);
expect(tea).to.have.property('flavors')
  .with.lengthOf(3);

For me, these examples are very easy to read; the intent is clear. The choice between expect, should, and assert is really a matter of preference, but I feel expect’s chaining of words is closest to natural language and promotes the most readability.
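For comparison, here is the same check written in each of the three styles (a small sketch; the variable is just an illustration):

var chai = require('chai');
var assert = chai.assert;
var expect = chai.expect;
chai.should();

var foo = 'bar';

assert.typeOf(foo, 'string');   // assert style
expect(foo).to.be.a('string');  // expect style
foo.should.be.a('string');      // should style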

Using Mocha with Chai is a great way to provide living documentation for the specifications of your product’s features. Remember, test cases should be easy to read and understand without knowing the specifics of the test code. Next, we’ll discuss simplification and ROI. Learn more about the rest of my findings in my full article here.

Neal Armstrong, Sr. Software Development Engineer in Test

Neal Armstrong (not the astronaut), Senior Software Development Engineer in Test (SDET), began his career at Accusoft in 2018. Neal has a bachelor’s degree in Information Technology and has written frameworks in several languages: C#, Java, JavaScript, and Python. He has worked with several PrizmDoc teams and especially thrives in new product development. He is currently assisting teams working on PrizmDoc Viewer. Neal’s career interests include architecting test frameworks and contributing to test automation. In his spare time, he enjoys playing guitar, fishing, and riding bicycles, but not at the same time.