Delivering stable services to clients, and reliable, secure products to customers, is critical, and so is delivering them rapidly. That is why we implement DevOps practices into the SDLC and continuously improve them. But how do we know whether those DevOps practices are implemented properly? The answer is a standard assessment process and a set of metrics that show whether your DevOps ecosystem is mature, efficient, and secure. A few of these are DORA metrics, Flow metrics, and DevSecOps KPIs. In this article, we will discuss how to measure and visualize real-time DORA metrics with Apache DevLake.
Let’s look at DORA metrics briefly.
DORA (DevOps Research and Assessment) metrics can be used by enterprises to measure performance and efficiency by calculating four key metrics:
- Mean Lead Time for Changes – the length of time between a code change being committed to the VCS and its deployment to production.
- Change Failure Rate – the percentage of deployments to production that result in a failure requiring a hotfix, rollback, or other remediation.
- Deployment Frequency – how often code is deployed to production.
- Mean Time to Recovery – how long it takes to recover from a failure in production.
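To make two of these definitions concrete, here is a minimal sketch of how Deployment Frequency and Mean Lead Time for Changes could be computed from raw timestamps. The CSV data below (commit time, deploy time) is invented for illustration; in practice, DevLake derives these numbers from your VCS and CI/CD tools. The script assumes GNU `date`.

```shell
# Hypothetical sample data: one line per change, "commit-time,deploy-time".
cat > changes.csv <<'EOF'
2024-01-01T10:00:00Z,2024-01-01T16:00:00Z
2024-01-02T09:00:00Z,2024-01-03T09:00:00Z
EOF

total=0; n=0
while IFS=, read -r committed deployed; do
  c=$(date -d "$committed" +%s)   # commit time as epoch seconds (GNU date)
  d=$(date -d "$deployed" +%s)    # deployment time as epoch seconds
  total=$((total + d - c))        # accumulate lead time of this change
  n=$((n + 1))                    # count deployments
done < changes.csv

echo "Deployments: $n"
echo "Mean lead time: $(( total / n / 3600 )) hours"
# With the sample data: 2 deployments, (6h + 24h) / 2 = 15 hours.
```

Change Failure Rate and Mean Time to Recovery follow the same idea, but need incident data (failed deployments and their resolution times) rather than commit/deploy pairs.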
Apache DevLake is an open-source Apache project that lets engineers and organizations pull data from various DevOps systems (version control systems, CI/CD tools, and ALM, i.e. Application Lifecycle Management, tools) and analyze and visualize it to capture important DevOps metrics such as DORA metrics, Flow metrics, and DevSecOps KPIs.
DevLake consists of six major components.
- Config UI – helps us create data connections to tools like GitHub, GitLab, and Jira, and create Blueprints, which define a workflow tying together a data connection, transformation rules, and a sync frequency.
- API Server – sits between the Config UI, the Runner, and the database, handling requests across the components.
- Database – stores user data and the pipeline data collected through the data connections in Blueprints.
- Runner – executes the logic defined in Blueprints through the Config UI, driving the plugins and writing results to the database.
- Plugins – let DevLake connect to DevOps tools such as GitHub and GitLab to collect data.
- Dashboard – visualizes the collected data and presents the desired metrics for the DevOps ecosystem. Any dashboard or BI tool can be used.
Apache DevLake Installation
So, in this article, we will see how data collection, connection to one of the DevOps tools, and visualization are done. For that, we are going to install the Apache DevLake components with the Docker Compose files provided by Apache.
Installing Apache DevLake with Docker Compose is straightforward; just follow the steps below. Before you start, make sure Docker and Docker Compose are installed on your machine.
- Download `docker-compose.yml` and `env.example` from here (https://github.com/apache/incubator-devlake/releases/latest).
- Then rename the example file with `mv env.example .env` in the terminal. This file holds the environment configuration that is used when DevLake is running.
- Then run `docker-compose up -d`, and your Apache DevLake is up and running.
- You can visit http://localhost:4000 in your browser to confirm that Apache DevLake is running.
- You can visit http://localhost:3002, or click the Dashboard button in Apache DevLake, to see the Grafana dashboard; use `admin` as the username and `admin` as the password.
Now, let’s see how Apache DevLake is configured.
Configure a Data Connection
In this article, we will connect GitLab data to Apache DevLake as a data connection. For that, I am using my personal GitLab account and a repository with a CI/CD pipeline.
- From the left-hand side of the application, select Data Connections, select GitLab, and then select “Add Connection”.
- This opens a new page; on it, fill out the Connection Name, Endpoint URL (the API endpoint), and Access Token (I use my personal user-level token), plus, optionally, a proxy URL and a rate limit.
- Then you can test your connection and save it. (Refer to the screenshot.)
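Before pasting the token into DevLake, it can be worth verifying it directly against the GitLab API. The snippet below is a quick sanity check; `GITLAB_TOKEN` is a placeholder for your own personal access token, and the endpoint shown is GitLab.com's REST API (for self-hosted GitLab, replace the host with your instance's URL).

```shell
GITLAB_TOKEN="glpat-REPLACE-ME"   # placeholder: substitute your own token

# GET /user returns the profile of the token's owner if the token is valid.
curl -s -H "PRIVATE-TOKEN: ${GITLAB_TOKEN}" \
  "https://gitlab.com/api/v4/user" || echo "request failed (no network?)"
# A JSON body containing your username means the token works;
# {"message":"401 Unauthorized"} means it does not.
```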
As we discussed, a DevLake Blueprint is simply a workflow combining a data connection and its transformation configuration. Now let’s see how to configure a Blueprint in Apache DevLake.
- From the left-hand side of the Apache DevLake app, select Blueprints and select “New Blueprint”.
- This opens a new page with a Blueprint Name field and an Add Data Connection option. Select the data connection you created in the previous step, then select Next.
- On the next page, select the project you have in GitLab, and choose the data entities to collect, such as SCM, Issues, Code Review, and CI/CD.
- On the next page, select “Add Transformation”; on the page that opens, you can choose transformation rules such as “Detect Deployment from Jobs in GitLab CI” and pick the job to be detected as the deployment job. Otherwise, DevLake will use the Production environment as the default transformation rule. Then select Finish, and select Next Step.
- Last, on the Set Sync Frequency page, select how often the applications should sync. For the purposes of this article, I select Manual. Then select Save and Run.
- Then, on your Blueprint’s page, you can see the current run, a “Run Now” button for manual triggers, the historical runs, and so on. Refer to the screenshots in order.
For the dashboard, we can use any dashboard or BI tool, but in this article we use Grafana, as it is the default component of Apache DevLake.
- Now, in the Apache DevLake application, select Dashboard on the left-hand side; this opens the Grafana dashboard. You will be prompted for credentials: enter admin as the username and admin as the password. Once Grafana opens, search for “DORA” in the search box, and you will see the DORA dashboard as shown in the screenshot.
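As an alternative to clicking through the UI, you can ask Grafana's HTTP API whether the DORA dashboard is present. This assumes DevLake's bundled Grafana is running on localhost:3002 with the default admin/admin credentials, as in the installation above.

```shell
# /api/search returns a JSON array of dashboards matching the query
# (an empty "[]" means no dashboard named DORA was found).
curl -s -u admin:admin "http://localhost:3002/api/search?query=DORA" \
  || echo "Grafana is not reachable on localhost:3002"
```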
In this article, we have seen what DORA metrics are, what other DevOps and DevSecOps metrics exist, what Apache DevLake is and how it is architected, and how to configure the Apache DevLake components. In our upcoming articles, we will discuss more DevOps metrics and how to measure DevOps performance and its direct impact on the business. Stay tuned and subscribe to DigitalVarys for more articles and study materials on DevOps, Agile, DevSecOps, and app development.
Experienced DevSecOps practitioner and tech blogger, with expertise in designing solutions in public and private clouds. Open-source community contributor.