How PrizmDoc Editor Uses GitLab CI
Here at Accusoft, each product has its own solution for continuous integration and, in some cases, continuous deployment. One of our newer products, PrizmDoc Editor, takes a different approach and uses the GitLab CI tooling built into GitLab. This approach has dramatically shortened our turnaround time for fixing product bugs and issues and getting a better version to customers. GitLab CI was an obvious choice for this product because it is developed in a monorepo and our repositories are already hosted in GitLab. As our first product to use GitLab CI, PrizmDoc Editor's pipeline structure has also served as a guideline for some of our other PrizmDoc products.
GitLab CI Background Information
For those unfamiliar with GitLab CI, it is a tool fully integrated into GitLab for continuous integration, continuous delivery, and continuous deployment. The premise is that every time code is checked in, a pipeline of scripts runs automatically to build, test, and validate the code changes before they are merged into master. For more information, check out the GitLab marketing website and documentation.
Essentially, what you need to know is that in order to run tests through GitLab CI, you need a GitLab instance and at least one GitLab runner. A pipeline is split into multiple stages, and each stage has one or more jobs. Jobs are the collections of instructions, outlined in a .gitlab-ci.yml file, that the GitLab runner executes to build, test, or deploy code. When a pipeline runs automatically, stages are gated: the next stage does not start executing until the previous one has finished and passed, unless otherwise specified. All jobs that belong to a single stage execute in parallel while that stage is running.
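As an illustration of those concepts (a hypothetical sketch, not taken from the PrizmDoc Editor repository), a minimal .gitlab-ci.yml with two stages might look like this:

```yaml
# Hypothetical minimal .gitlab-ci.yml. The two build jobs belong to the
# same stage, so they run in parallel; the test stage is gated and does
# not start until both build jobs have finished and passed.
stages:
  - build
  - test

build-app:
  stage: build
  script:
    - echo "compile the application"

build-docs:
  stage: build
  script:
    - echo "generate the documentation"

run-tests:
  stage: test
  script:
    - echo "run the test suite"
```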
PrizmDoc Editor GitLab CI Pipeline Structure
Our GitLab runner is shared between multiple projects and lives on an always-on Ubuntu 18.04 virtual machine that uses docker-machine to spawn up to ten Ubuntu 18.04 OpenStack instances with Docker installed on them.
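As a rough sketch of what that setup involves (every value below is a placeholder, including the driver options, which vary by cloud), the autoscaling portion of a GitLab Runner config.toml using the docker+machine executor looks something like this:

```toml
# Hypothetical excerpt from a GitLab Runner config.toml using the
# docker+machine executor to autoscale build machines on demand.
# All names and values are placeholders, not our production settings.
concurrent = 10

[[runners]]
  name = "shared-autoscale-runner"
  executor = "docker+machine"
  [runners.docker]
    image = "ubuntu:18.04"
  [runners.machine]
    IdleCount = 1            # keep one warm machine ready for builds
    MaxBuilds = 10           # recycle a machine after this many builds
    MachineDriver = "openstack"
    MachineName = "runner-%s"
```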
There are two different pipelines for the product: the manual pipeline and the master pipeline. The manual pipeline is the one available when running pipelines against individual commits to non-master branches or against merge requests. The master pipeline starts automatically every time a merge request is accepted and merged to master.
The manual pipeline is associated with all commits that are not on the master branch. The two pipelines are separated from each other using the `except` and `only` keywords in the .gitlab-ci.yml file: the jobs that belong to the manual pipeline get the keyword `except: - master`, and the jobs that belong to the master pipeline get the keyword `only: - master`. This tells the runner not to add the manual jobs to the master pipeline and that the master jobs belong only to the master pipeline:
```yaml
test-manual:
  <<: *test-job
  when: manual
  dependencies:
    - build-manual
  except:
    - master

test-master:
  <<: *test-job
  dependencies:
    - build-master
  only:
    - master
```
There are six stages in the manual pipeline: Test, Build, Fail-fast-test, Full-test, Docker-publish, and Deploy-dev.
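In .gitlab-ci.yml terms, that ordering would be declared with a `stages` list along these lines (a sketch only; the lowercased stage names are assumptions, and the real file also defines the jobs belonging to each stage):

```yaml
# Stage ordering for the manual pipeline as described above.
# Stages execute in this order; jobs within a stage run in parallel.
stages:
  - test
  - build
  - fail-fast-test
  - full-test
  - docker-publish
  - deploy-dev
```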
When code is pushed to a branch or merge request in the remote repository, the Test stage runs automatically. The Test stage is the only stage in the manual pipeline that runs automatically; it is also included in the master pipeline, so it does not get the keyword `except: - master`. All of the other manual pipeline jobs have `when: manual` specified in the .gitlab-ci.yml file to let the runner know that they do not run automatically. Because of this, a developer can run any test job in the manual pipeline in any order they wish, barring dependencies on other jobs:
```yaml
test-manual:
  <<: *test-job
  when: manual
  dependencies:
    - build-manual
  except:
    - master
```
In the rest of this article, we will go over the six stages of the manual pipeline and how the manual pipeline differs from the master pipeline. Download the rest of the article here to continue reading.