Value Streams Concepts

A CloudBees DevOptics Value Stream models a complex continuous delivery process. It can be assembled from multiple pipelines. A Value Stream shows a series of interconnected gates or steps that deliver value to a customer. Those steps are presented in the "Value Stream view". The Value Stream view allows you to:

  • Track changes

  • Detect stalled and delayed changes

  • Identify failures and blockages

  • Find components ready for testing

  • View contributing components

00 delivery view overview

Value Streams

A CloudBees DevOptics Value Stream is a visual model of the software delivery process. Value Streams are defined using phases and gates. A gate shows one or more tickets. Value Stream progress is tracked based on changes to tickets, commits, and artifacts.

Phases

Phases represent the flow of changes from initial implementation to delivery (promotion/deployment). The names of the phases and the number of phases can be customized for each Value Stream.

In simpler applications, phase definitions might reference software lifecycle terms like "Development", "Testing", "Deployment", and "Maintenance". In more complex applications, phase definitions might reference architectural elements like "Libraries", "Services", "Integration", and "Deploy".

00 delivery view phases

Gates

Gates are the Jenkins projects (Jobs) that create and package software artifacts. The condition of the gate is indicated by:

  • Number of tickets currently in that gate

  • Solid green for success

  • Pulsing blue for running

  • Red for failed

The circle that represents the gate is drawn in colors segmented to show the status of tickets in the gate. Tickets are assigned to a gate when they are mentioned in a commit processed by the Jenkins project of that gate. If a project has built the ticket successfully, a green segment is shown. If a later Jenkins project builds the same ticket and fails, a red segment is added to show the failure. When the build of a project succeeds, all tickets for that project appear as green segments of the circle.

Gate status and ticket status are key parts of the Value Stream view.

00 delivery view gates

Tickets

Tickets are the Jira® issues that describe work to deliver the software. Tickets are assigned to a gate when they are mentioned in a commit.

Tickets move from gate to gate as development progresses. Tickets:

  • Move from one gate to another as artifacts created in that gate are delivered to later gates.

  • Remain in a gate until the artifact created in that gate is used in a later gate.

00 delivery view tickets

Commits

Commits are the changes applied to the software and committed to your SCM. Commits are the basic "unit of value" tracked by CloudBees DevOptics. A commit may reference one or more tickets by mentioning the ticket ID in the commit message. When a commit references a ticket, the ticket is assigned to the gate that processes the commit. CloudBees DevOptics supports only Git commits; other source control systems (Mercurial, Subversion, etc.) are not supported.
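As a sketch, the following creates a throwaway Git repository and records a commit whose message references a hypothetical ticket EXAMPLE-12345; the ticket ID appearing anywhere in the message text is what links the commit to the ticket:

```shell
#!/bin/sh
# Sketch: reference a Jira ticket from a Git commit message.
# EXAMPLE-12345 is a hypothetical ticket ID; the repository is throwaway.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
# Identity is set inline so the commit works in a clean environment.
git -c user.email=dev@example.com -c user.name=Dev \
    commit -q --allow-empty -m "EXAMPLE-12345 Validate redirect URL before login"
# Print the commit subject; it contains the ticket ID DevOptics would detect.
git log -1 --format=%s
```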

Artifacts

Artifacts are reusable, shareable components generated by Jenkins Jobs (gates): for example, binary files (JAR files, scripts), NPM packages, Docker images, and RPMs.

Artifacts are related back to SCM commits because artifacts are built from source checked out of the SCM. For that reason, artifacts are also considered a unit of "value" that can be tracked.

To enable tracking of the artifacts produced and consumed by CI jobs, DevOptics provides Value Stream Artifact Tracking.

Warning
Use of Jenkins fingerprinting for artifact tracking in DevOptics is deprecated. Support for it will soon be withdrawn.

Define Sample Projects

To help you explore CloudBees DevOptics, define a sample Value Stream using two projects. The two projects are:

Application Build

In this example, a Jenkins Pipeline builds the application. Freestyle projects are also fully supported with CloudBees DevOptics. See Tracking the production/consumption of artifacts for how to track artifacts in Freestyle Jobs.

When this example Jenkins Pipeline runs, a file 'application.sh' is created. The Job then tells CloudBees DevOptics that it has "produced" this artifact by using the gateProducesArtifact pipeline step. See Tracking the production/consumption of artifacts.

  • Create a new GitHub repository for the sample application

  • Create a Jenkinsfile in the GitHub repository

The Jenkinsfile in the GitHub repository should contain:

pipeline {
  agent any
  stages {
    stage('Build') {
      steps {
        writeFile file: "application.sh", text: "echo Built ${BUILD_ID} of ${JOB_NAME}"
        gateProducesArtifact file: 'application.sh', label: "application.sh:${BUILD_ID}"
      }
    }
  }
}

Create a Jenkins project with the GitHub sample application repository. From Jenkins, click Jenkins > New Item.

24 new item

Create a Pipeline

25 new pipeline

Add the GitHub repository definition and save the Pipeline by pressing Save.

26 add scm to pipeline

Run the Pipeline by pressing Build Now.

27 build now

Build results show the Pipeline completed successfully and generated an artifact, application.sh. The contents of application.sh change each time a build is performed.

28 build results

Application Deploy

A Jenkins Pipeline deploys the application. The project uses the Copy Artifact plugin to copy the results from the build project.

Install the Copy Artifact plugin by opening Jenkins > Manage Jenkins > Manage Plugins.

04 manage plugins

  • Click the "Available" tab and enter a "Filter" value of "copy artifact".

  • Select the checkbox for "Copy Artifact Plugin".

  • Press Download now and install after restart.

31 install copy artifact plugin

Restart Jenkins when the install is complete. The "Copy Artifact plugin" is ready to use.

32 copy artifact installed

Create a Jenkins Pipeline to copy the artifact from the Application Build. From Jenkins, click Jenkins > New Item.

33 new item

Create a Pipeline

34 new pipeline

Add the Pipeline definition by pressing Save after inserting the Pipeline script into the editor. The Pipeline script should be:

pipeline {
  agent any
  stages {
    stage('Build') {
      steps {
        step([$class: 'CopyArtifact', projectName: 'application-build'])
        gateConsumesArtifact file: 'application.sh'
      }
    }
  }
}

The Pipeline copies the 'application.sh' artifact from the most recent successful Application Build. The build tells CloudBees DevOptics that it consumes the 'application.sh' file by calling the gateConsumesArtifact pipeline step. See Tracking the production/consumption of artifacts.

Run the Pipeline by pressing Build Now.

36 copy artifact build now

Build results show the Pipeline completed successfully and copied the artifact, application.sh, from the earlier Pipeline. Each run of the deploy Pipeline copies the most recent successfully built application.sh from the Application Build Pipeline.

37 copy artifact complete

Create a Value Stream

Open the CloudBees DevOptics editor with Create a Value Stream. That opens a new value stream in editor mode with default phases and gates in place.

Phases

the milestones or stages used to deliver the software

Gates

the Jenkins projects that create and package artifacts

By default, CloudBees DevOptics creates three commonly used phases ("build", "test", and "release") with one unconfigured gate in each phase. All gates and phases can be changed in the visual editor or through a JSON representation of your value stream.

01 visual create

The "Untitled Gates" represent placeholder gates that can be instrumented to see tickets and commits as part of these gates and get devops metrics for a gate or the whole value stream.

Don’t forget to name the Value Stream in the top left corner so you can easily find it again.

What’s next?

Model your actual value stream with the visual editor or start instrumenting your value stream by tracking artifacts within your existing gates.

Model a Value Stream

In order to model a value stream, you need to configure the phases and gates so that value can be realized at the end. Think of your whole software delivery system, not just individual jobs or pipelines.

Value Stream Visual Editor

To enter editing mode for a value stream, click the three-dot icon in the top right of the screen.

Click Edit Value Stream to use the Visual Editor.

00 visual editor

The DevOptics visual editor lets you model the different phases and gates of your value stream.

Phases

the milestones or stages used to deliver the software

Gates

the Jenkins pipeline that is run to move the code change forward (e.g. build, test, deploy, …)

Once modeled, you can instrument the Jenkins pipelines so DevOptics is able to track tickets and commits flowing through the value stream end to end.

Manage Phases

Phases represent the large milestones or stages needed to deliver the software.

Create new phases

You can create as many phases as you need. To do so, hover between two existing phases, or at the outer edge of the first or last phase; a blue dot appears and turns into a '+' sign. Clicking the '+' sign adds a new phase at that position.

05 visual editor add new phase

Make sure to give the newly created phase a meaningful name.

Edit phases

To edit the name of a phase, click the phase header and adjust the name. Make sure to save your value stream in order to persist the changes.

05 visual editor edit phase name

Delete phases

You can only delete phases that do not contain any gates. (To delete a gate, see Manage Gates below.) If you have an empty phase that needs deleting, click the phase header to enter edit mode. That brings up the delete icon on the right.

After confirming the deletion of this phase, make sure to persist the changes by clicking save on the value stream.

05 visual editor delete phase

Manage Gates

Gates represent the processes that are part of your software delivery system. They define the Jenkins pipelines that create and package artifacts, and they surface important metrics on the efficiency of each gate.

A gate must be placed in an existing phase. A phase can contain multiple gates (as in the templates below), but each gate belongs to exactly one phase.

Create new connected gates

New gates can be created as a new connection from an existing gate. Hovering over an existing gate shows possible new connections that can be made.

From there, click a connection endpoint and drag it into an empty phase, onto the drop target (dashed circle), until you see "Click to add gate" below the plus icon. Clicking the drop target in the empty phase creates a new gate at that position.

Note
Keep in mind that only a downstream gate (a gate to the right) can create a connected gate to its left. Only the last gate can create new gates downstream.

05 visual editor create new gate

Make sure to name the gate and save the changes. In order to see work flow through them, newly created gates need to be configured and the connected Jenkins pipelines instrumented; see below.

Create non-connected gates or sub-streams

To create gates that are not connected to existing gates, first create a gate that is connected to your rightmost gate, then delete the connection. This results in a single gate in that phase. From there you can create new gates downstream and upstream of this initial gate and model independent gates or sub-streams.

05 visual editor model independent gates

This is important when modeling a microservices system and you want to see all your services in one place, to understand where tickets and features are across all the services that deliver software to the end user. See the microservice value stream template below.
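As a minimal sketch, two unconnected gates in the same phase, each the start of its own sub-stream, look like this in the JSON representation (ids and names here are illustrative):

```json
{
  "phases": [
    {
      "id": "build",
      "name": "Build",
      "gates": [
        { "id": "service_a_build", "name": "Service A", "master": "", "job": "" },
        { "id": "service_b_build", "name": "Service B", "master": "", "job": "" }
      ]
    }
  ]
}
```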

Note
This can also be easily modeled with the JSON editor.

Configure gates

Gates can be created first to model the current software delivery system. In order to see work flow through the value stream, these gates need to be configured and connected to Jenkins jobs. To configure a gate, click the gate, then click the settings item that appears next to it.

05 visual editor gate settings

Once the gate configuration screen opens, establish the connection to the corresponding Jenkins job. DevOptics auto-completes existing connected masters and the jobs on those masters.

Note
If the master does not appear in the list, make sure the plugin is installed on that master and that it is connected properly.

Including a gate in the deployment frequency of the value stream:

If the gate is a deployment gate, meaning you want it to count toward the deployment frequency computation for the value stream, make sure to check 'This is a deployment job'.

05 visual editor gate configuration
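If you prefer the JSON representation, the same setting is the gate's type field (see the JSON editor section); a deployment gate looks like this, with illustrative values:

```json
{
  "id": "production_deploy",
  "name": "Release",
  "master": "",
  "job": "",
  "type": "deployment"
}
```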

Note
Additionally, the Jenkins pipeline associated with the job needs to track the artifacts it consumes or produces, so DevOptics can track the tickets and commits. See the artifact tracking section on how to instrument Jenkins pipelines.

Delete gates

To delete a gate, ensure you are in editing mode and click the gate you want to delete. That shows the settings and deletion icons.

A gate can be deleted directly from the deletion icon or from within the gate configuration; the deletion takes effect after confirmation.

Note
Make sure to save the changes on the value stream.

Manage connections between gates

Create new connection

A connection is usually created when creating a new gate. However, connections can also be created between existing downstream and upstream gates.

To connect two existing gates, click and drag a connection between them.

05 visual editor new connection

Delete existing connection

In editor mode, click an existing connection. This shows a solid line and red scissors. Clicking the red scissors removes the connection between the gates.

05 visual editor cut connection

Persist changes to value stream

Changes in the Visual Editor are not saved until you press Save in the top right of the screen.

Pressing Cancel will undo any changes you have made and revert the Value Stream to its previously saved state.

06 visual editor

Value Stream JSON Editor

Value streams can also be defined as JSON entities. To enter JSON editing mode, click the three-dot menu in the top right of your value stream and select 'Edit JSON'.

The JSON representation makes it easy to share templates and scaffolds, or to insert generated value streams based on your software delivery system.

A value stream definition requires a list of phases. Each phase can have multiple gates.

{
  "phases": [
    {
      "id": "<custom_id_of_phase>",
      "name": "<name_of_phase>",
      "gates": [
        {
          ...
        }
      ]
    }
  ]
}

Defining a phase:

{
  "id": "<custom_id_of_phase>",
  "name": "<name_of_phase>",
  "gates": [<gate>]
}
id

Identifier for that phase.

name

(Optional) The name of the phase

gates

(Optional) List of gates within that phase

Defining a gate:

{
  "id": "<custom_id_of_gate>",
  "name": "<name_of_gate>",
  "master": "<master_connected_to_gate>",
  "job": "<job_connected_to_gate>",
  "feeds": "<id_of_gate_this_gate_feeds_into>",
  "type": "<deployment_gate>"
}
id

Identifier for that gate.

name

(Optional) The name of the gate

master

(Optional) Master that connects to this gate. (required to see tickets and commits within the gate)

job

(Optional) Job within master that connects to this gate. (required to see tickets and commits within the gate)

feeds

(Optional) ID of the gate this gate feeds into. (Not needed for the rightmost gate.)

type

(Optional) Set type to deployment if this gate represents a deployment job

See below for a simple example:

{
  "phases": [
    {
      "id": "phase1",
      "name": "Build",
      "gates": [
        {
          "id": "gate1",
          "name": "Untitled Gate",
          "master": "",
          "job": "",
          "feeds": "gate2"
        }
      ]
    },
    {
      "id": "phase2",
      "name": "Test",
      "gates": [
        {
          "id": "gate2",
          "name": "Integration Tests",
          "master": "",
          "job": "",
          "feeds": "gate3"
        }
      ]
    },
    {
      "id": "phase3",
      "name": "Release",
      "gates": [
        {
          "id": "gate3",
          "name": "Untitled Gate",
          "master": "",
          "job": "",
          "type": "deployment"
        }
      ]
    }
  ]
}

Value Stream Templates

Template: Large monolithic system

The software delivery systems of large, complex applications usually contain many components that must pass rigorous testing and security checks before a release can be built and deployed. Value stream modeling visualizes the dependencies in these processes and surfaces the tickets and commits within the software delivery pipeline. That makes it possible to spot bottlenecks and blockers early, act quickly to remove them, and improve the overall system.

DevOptics lets you map all the dependencies of your software delivery processes from build to production.

05 visual editor use case large app

Here is a JSON representation of the above value stream template. Copy and paste it into the JSON editor of your value stream to get started.

{
  "phases": [
    {
      "id": "dev",
      "name": "Dev (Build/Test)",
      "gates": [
        {
          "id": "component_a",
          "name": "Component A",
          "master": "",
          "job": "",
          "feeds": "component_test_a"
        },
        {
          "id": "component_b",
          "name": "Component B",
          "master": "",
          "job": "",
          "feeds": "component_test_b"
        },
        {
          "id": "component_c",
          "name": "Component C",
          "master": "",
          "job": "",
          "feeds": "component_test_c"
        },
        {
          "id": "component_d",
          "name": "Component D",
          "master": "",
          "job": "",
          "feeds": "component_test_d"
        }
      ]
    },
    {
      "id": "component_tests",
      "name": "Component Tests",
      "gates": [
        {
          "id": "component_test_a",
          "name": "Component A",
          "master": "",
          "job": "",
          "feeds": "integration"
        },
        {
          "id": "component_test_b",
          "name": "Component B",
          "master": "",
          "job": "",
          "feeds": "integration"
        },
        {
          "id": "component_test_c",
          "name": "Component C",
          "master": "",
          "job": "",
          "feeds": "integration"
        },
        {
          "id": "component_test_d",
          "name": "Component D",
          "master": "",
          "job": "",
          "feeds": "integration"
        }
      ]
    },
    {
      "id": "system_integration",
      "name": "System Integration",
      "gates": [
        {
          "id": "integration",
          "name": "Integration",
          "master": "",
          "job": "",
          "feeds": "integration_tests"
        }
      ]
    },
    {
      "id": "system_tests",
      "name": "System Tests",
      "gates": [
        {
          "id": "integration_tests",
          "name": "Integration Tests",
          "master": "",
          "job": "",
          "feeds": "staging_deploy"
        }
      ]
    },
    {
      "id": "staging",
      "name": "Staging",
      "gates": [
        {
          "id": "staging_deploy",
          "name": "Staging",
          "master": "",
          "job": "",
          "feeds": "production_deploy"
        }
      ]
    },
    {
      "id": "release-promotion",
      "name": "Production",
      "gates": [
        {
          "id": "production_deploy",
          "name": "Release",
          "master": "",
          "job": "",
          "type": "deployment"
        }
      ]
    }
  ]
}

Template: Microservice system

When delivering your application through multiple loosely coupled microservices, the delivery process of each service becomes simpler but the overall system becomes more complex. It is important to understand how these services deliver features and where blockers and bottlenecks exist, if any.

DevOptics lets you map sub-streams of your overall value stream with multiple endpoints and visualize everything in one value stream.

05 visual editor use case microservices

Here is a JSON representation of the above value stream template. Copy and paste it into the JSON editor of your value stream to get started.

{
  "phases": [
    {
      "name": "Build Services",
      "id": "build_services",
      "gates": [
        {
          "id": "service_a_build",
          "name": "Service A - Build",
          "master": "",
          "job": "",
          "feeds": "service_a_test"
        },
        {
          "id": "service_b_build",
          "name": "Service B - Build",
          "master": "",
          "job": "",
          "feeds": "service_b_test"
        },
        {
          "id": "service_c_build",
          "name": "Service C - Build",
          "master": "",
          "job": "",
          "feeds": "service_c_test"
        }
      ]
    },
    {
      "id": "tests",
      "name": "Tests",
      "gates": [
        {
          "id": "service_a_test",
          "name": "Service A - Test",
          "master": "",
          "job": "",
          "feeds": "service_a_staging"
        },
        {
          "id": "service_b_test",
          "name": "Service B - Test",
          "master": "",
          "job": "",
          "feeds": "service_b_staging"
        },
        {
          "id": "service_c_test",
          "name": "Service C - Test",
          "master": "",
          "job": "",
          "feeds": "service_c_staging"
        }
      ]
    },
    {
      "id": "staging_deploy",
      "name": "Staging Deploy",
      "gates": [
        {
          "id": "service_a_staging",
          "name": "Service A - Staging Deploy",
          "master": "",
          "job": "",
          "feeds": "service_a_verification"
        },
        {
          "id": "service_b_staging",
          "name": "Service B - Staging Deploy",
          "master": "",
          "job": "",
          "feeds": "service_b_verification"
        },
        {
          "id": "service_c_staging",
          "name": "Service C - Staging Deploy",
          "master": "",
          "job": "",
          "feeds": "service_c_verification"
        }
      ]
    },
    {
      "id": "verification",
      "name": "Verification",
      "gates": [
        {
          "id": "service_a_verification",
          "name": "Service A - Staging Verification",
          "master": "",
          "job": "",
          "feeds": "service_a_prod"
        },
        {
          "id": "service_b_verification",
          "name": "Service B - Staging Verification",
          "master": "",
          "job": "",
          "feeds": "service_b_prod"
        },
        {
          "id": "service_c_verification",
          "name": "Service C - Staging Verification",
          "master": "",
          "job": "",
          "feeds": "service_c_prod"
        }
      ]
    },
    {
      "name": "Production Deploy",
      "id": "production_deploy",
      "gates": [
        {
          "id": "service_a_prod",
          "name": "Service A - Production Deploy",
          "master": "",
          "job": "",
          "feeds": null,
          "type": "deployment"
        },
        {
          "id": "service_b_prod",
          "name": "Service B - Production Deploy",
          "master": "",
          "job": "",
          "feeds": null,
          "type": "deployment"
        },
        {
          "id": "service_c_prod",
          "name": "Service C - Production Deploy",
          "master": "",
          "job": "",
          "feeds": null,
          "type": "deployment"
        }
      ]
    }
  ]
}

View tickets and commits

CloudBees DevOptics combines Git commit messages, Jira® tickets, and project results in the Value Stream view. Git commit message text refers to Jira® tickets by ticket ID. For example, a commit message referring to Jira® ticket EXAMPLE-12345 must include EXAMPLE-12345 in the text. Multiple Jira® tickets may be referenced in a single Git commit.

Using the Git repository created earlier, commit and push a change whose commit message references a Jira® ticket (for example, EXAMPLE-12345) so that the ticket reference is detected.

Return to the gate job page and click Build Now to start a build.

47 Build Now

The Value Stream view is updated to show the ticket progress. Click on the Build Gate to open the ticket panel on the right. The ticket panel shows a summary of ticket status and gate job status. Click the ticket in the ticket panel to see more details.

48 see completed gate and ticket

The ticket panel detail view shows:

  • Description of the ticket (expandable if the description is large)

  • Gate status (success or failure)

  • Number of commits referencing this ticket

  • Summary of each commit

The Find commits field in the ticket panel detail view filters commits based on the text you enter.

49 ticket details and commit

View the Value Stream

CloudBees DevOptics shows progress of changes through the software delivery process. When a commit is built that references a Jira® ticket, that ticket is included in the Value Stream view for the gate associated with that project. When a build succeeds that includes an artifact from a preceding gate, tickets from the preceding gate move to the successful gate.

Gate Status

The condition of the gate is indicated by:

  • Number of tickets currently in that gate

  • Solid green for success

  • Pulsing blue for running

  • Red for failed

Gate Job

The job associated with a gate can be opened by clicking on the "View gate job" drop down in the gate status pane.

Each gate is associated with a Jenkins project.

  • View the associated Jenkins project by clicking the gate in the Value Stream view.

  • When the "Build Gate" panel appears, click the three dots in the right hand pane.

Click View gate job… to open the project associated with this gate in a new browser tab.

42 view gate job

The gate project is visible with its artifacts and build history.

43 gate job view

Ticket Status

Clicking a gate shows the tickets for that gate. Tickets can be opened by clicking the ticket in the gate status pane.

The Value Stream view time period can be adjusted from the drop down menu in the top right of the page. Multiple time periods are available, including:

  • 24 hours

  • 48 hours

  • 7 days

  • 14 days

  • 30 days

  • 90 days

00 delivery view overview

Instrument a Value Stream to track tickets and commits

Tracking the production/consumption of artifacts

Warning
Use of Jenkins fingerprinting for artifact tracking in DevOptics is deprecated. Support for it will soon be withdrawn. Instead, please use the Value Stream Artifact Tracking features described in this section.

DevOptics pipeline steps

The CloudBees DevOptics pipeline steps are an extension to the Jenkins Pipeline. They allow a Jenkins pipeline to explicitly declare that it produces artifacts, or consumes the changes made by other Jenkins jobs.

DevOptics gateProducesArtifact step

The CloudBees DevOptics produces step is a pipeline extension that allows a Jenkins pipeline to explicitly declare that it produces artifacts that can be consumed by the gateConsumesArtifact step.

This step allows your pipeline to explicitly declare the artifacts it produces that you want CloudBees DevOptics to track. Explicitly defining these artifacts allows you to more accurately follow work as it moves across your Value Streams.

Produce a specific artifact with a known ID

Using the step in this form is as simple as follows:

gateProducesArtifact id: '<id>', type: '<type>', label: '<label>'
id

The ID you have assigned to the artifact you want to produce. This ID should match the ID used in a gateConsumesArtifact step call. This ID can be whatever identification scheme you use for artifacts. The only requirement is that the ID is unique within the context of your CloudBees DevOptics Organization.

type

The type of artifact you are producing. Common values are file, docker, rpm. This type value should match the type value in a gateConsumesArtifact step call. This type can be whatever name you use for classifying artifact types.

label

(Optional) A readable label, providing contextual information about the artifact produced. This label should be human readable as it will be used in the DevOptics UI.
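For example, a job that builds and pushes a Docker image might record it like this; the registry path and tag are illustrative, not part of the step's API:

```groovy
// Illustrative values: any unique ID scheme works, as long as the
// consuming gate passes the same id and type to gateConsumesArtifact.
gateProducesArtifact id: "registry.example.com/webapp:${env.BUILD_ID}",
                     type: 'docker',
                     label: "webapp image, build ${env.BUILD_ID}"
```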

Produce a specific file artifact

In order to notify CloudBees DevOptics that this run produces a file:

gateProducesArtifact file: '<file>', type: '<type>', label: '<label>'
file

The file within the workspace that you want to notify CloudBees DevOptics about. This will hash the file to produce an ID.

type

(Optional) The type of artifact you are producing. Common values are file, docker, rpm. This type value should match the type value in a gateConsumesArtifact step call. This type can be whatever name you use for classifying artifact types. If not defined, it defaults to file.

label

(Optional) A readable label, providing contextual information about the artifact produced. This label should be human readable as it will be used in the DevOptics UI.

Example

Here is an example Jenkinsfile scripted pipeline that produces a plugin-a.txt and notifies CloudBees DevOptics about it.

// Jenkinsfile scripted pipeline
node {
    stage ('checkout') {
        checkout scm
    }
    stage ('build') {
        // Creates a file called plugin-a.txt. Using git rev-parse HEAD
        // here because it will generate a new artifact when the HEAD ref
        // commit changes. You could also just echo a timestamp, or something else.
        sh "git rev-parse HEAD > plugin-a.txt"

        // Archives plugin-a.txt with the build.
        archiveArtifacts artifacts: 'plugin-a.txt'
    }
    stage ('produce') {
        // Notify DevOptics that this run produced plugin-a.txt.
        gateProducesArtifact file: 'plugin-a.txt'
    }
}

DevOptics consumes step

The CloudBees DevOptics consumes step is a pipeline extension that allows a Jenkins pipeline to explicitly declare that it consumes artifacts that have been marked as produced by the gateProducesArtifact step.

This step allows your pipeline to explicitly declare the artifacts it consumes that you want CloudBees DevOptics to track. Explicitly defining these artifacts allows you to more accurately follow work as it moves across your Value Streams.

Consume a specific artifact with a known ID

Using the step in this form is as simple as follows:

gateConsumesArtifact id: '<id>', type: '<type>'
id

The ID you have assigned to the artifact you want to consume. This ID should match the ID used in a gateProducesArtifact step. This ID can be whatever identification scheme you use for artifacts. The only requirement is that the ID is unique within the context of your CloudBees DevOptics Organization.

type

The type of artifact you are consuming. Common values are file, docker, rpm. This type value should match the type value in a gateProducesArtifact step call. This type can be whatever name you use for classifying artifact types.
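For example, a downstream job consuming the Docker image recorded by a producing gate might declare it like this; the values are illustrative, and the id and type must match what the producer declared:

```groovy
// The id/type pair must match the producing gate's gateProducesArtifact call.
gateConsumesArtifact id: 'registry.example.com/webapp:42', type: 'docker'
```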

Consume a specific file artifact

In order to consume a file within the workspace:

gateConsumesArtifact file: '<file>', type: '<type>'
file

The file within the workspace you want to consume. This will hash the file to produce an ID.

type

(Optional) The type of artifact you are consuming. Common values are file, docker, rpm. This type value should match the type value in a gateProducesArtifact step call. This type can be whatever name you use for classifying artifact types. If not defined, it defaults to file.

Example

Here is an example Jenkinsfile scripted pipeline that consumes a plugin-a.txt artifact, notifies CloudBees DevOptics about it, then produces a plugin-b.txt artifact and notifies CloudBees DevOptics about it.

// Jenkinsfile scripted pipeline
node {
    stage ('checkout') {
        checkout scm
    }
    stage ('build') {
        // Copies the artifacts of plugin-a/master (plugin-a.txt) into this workspace.
        copyArtifacts projectName: 'plugin-a/master'

        // Notify DevOptics that this run consumed plugin-a.txt.
        gateConsumesArtifact file: 'plugin-a.txt'

        // Creates a file called plugin-b.txt. Using git rev-parse HEAD
        // here because it will generate a new artifact when the HEAD ref
        // commit changes. You could also just echo a timestamp, or something else.
        sh "git rev-parse HEAD > plugin-b.txt"

        // Records plugin-b.txt as a produced artifact.
        archiveArtifacts artifacts: 'plugin-b.txt'
    }
    stage ('produce') {
        // Notify DevOptics that this run produced plugin-b.txt.
        gateProducesArtifact file: 'plugin-b.txt'
    }
}
DevOptics consumes run step

This step is a CloudBees DevOptics pipeline extension that allows a Jenkins pipeline to explicitly declare that it consumes the changes (commits and Issue Tracker tickets) made by another Jenkins Job upstream from it in a CD pipeline process.

Before using this step, consider using the gateConsumesArtifact and gateProducesArtifact steps to track artifacts instead. This step is intended for the edge cases where artifact tracking is not easy or possible.

This step allows your pipeline to explicitly define a run of an upstream Jenkins job via the job name, run ID and master URL.

Consume a specific upstream job run

In its simplest form, the step is used as follows:

gateConsumesRun masterUrl: '<master-url>', jobName: '<job-name>', runId: '<run-id>'
masterUrl

(Optional) The exact URL of the Jenkins master hosting the upstream job. This is the same URL used in the upstream gate configuration in the CloudBees DevOptics application. If not defined, it defaults to the URL of the master running the pipeline; that is, the upstream job is assumed to be on the same Jenkins master.

jobName

The exact name of the upstream job you want to consume from.

runId

(Optional) The ID of the upstream job run to be consumed. This can come from a job parameter or from a job trigger. If not defined, it defaults to the runId of the last successful run of the upstream job.
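For example (a sketch with hypothetical job names and URL), a downstream deploy pipeline might consume the changes from an upstream build job hosted on another master, passing the run ID in via a job parameter:

```groovy
// Hypothetical sketch: consume the changes made by an upstream 'app-build' job.
node {
    stage ('deploy') {
        gateConsumesRun masterUrl: 'https://upstream-jenkins.example.com/',
                        jobName: 'app-build',
                        runId: params.UPSTREAM_RUN_ID // e.g. supplied by a job parameter
    }
}
```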

withMaven pipeline step

The CloudBees DevOptics plugin includes an integration with the Pipeline Maven plugin. This integration allows the CloudBees DevOptics plugin to automatically notify CloudBees DevOptics about the dependencies used and the artifacts produced by a Maven build, which is executed from within a withMaven pipeline step.

Important
To use this feature, Jenkins will need to have both the CloudBees DevOptics plugin and the Pipeline Maven plugin installed.
Example

Consider the following pom files. There is a plugin-a and a plugin-b. plugin-b uses plugin-a as a dependency:

<project xmlns="http://maven.apache.org/POM/4.0.0"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">

	<modelVersion>4.0.0</modelVersion>
	<groupId>com.cloudbees.devoptics</groupId>
	<artifactId>plugin-a</artifactId>
	<packaging>jar</packaging>
	<version>1.0-SNAPSHOT</version>
	<name>plugin-a</name>

</project>
<project xmlns="http://maven.apache.org/POM/4.0.0"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">

	<modelVersion>4.0.0</modelVersion>
	<groupId>com.cloudbees.devoptics</groupId>
	<artifactId>plugin-b</artifactId>
	<packaging>jar</packaging>
	<version>1.0-SNAPSHOT</version>
	<name>plugin-b</name>

	<dependencies>
		<dependency>
			<groupId>com.cloudbees.devoptics</groupId>
			<artifactId>plugin-a</artifactId>
			<version>1.0-SNAPSHOT</version>
		</dependency>
	</dependencies>

</project>

Plugin A can have a Jenkinsfile scripted pipeline like the following. Notice that no explicit calls to CloudBees DevOptics are needed:

// Plugin A Jenkinsfile scripted pipeline
node {
    stage ('checkout') {
        checkout scm
    }
    stage ('build') {
        withMaven() {
            sh "mvn clean install"
        }
    }
}

Running this will result in CloudBees DevOptics being notified about two events:

  1. plugin-a pom file as a produced artifact.

  2. plugin-a jar file as a produced artifact.

Plugin B can also have a Jenkinsfile scripted pipeline like the following. Again, notice that no explicit calls to CloudBees DevOptics are needed:

// Plugin B Jenkinsfile scripted pipeline
node {
    stage ('checkout') {
        checkout scm
    }
    stage ('build') {
        withMaven() {
            sh "mvn clean install"
        }
    }
}

This will result in CloudBees DevOptics being notified about three events:

  1. plugin-a jar file as a consumed artifact.

  2. plugin-b pom file as a produced artifact.

  3. plugin-b jar file as a produced artifact.

Disabling the integration

The integration with the Pipeline Maven plugin can be disabled in two ways:

  1. Disable the integration just for a specific withMaven pipeline step:

    // Pipeline Maven plugin integration disabled
    node {
        stage ('checkout') {
            checkout scm
        }
        stage ('build') {
            withMaven(options: [ gateArtifactPublisher(disabled: true) ]) {
                sh "mvn clean install"
            }
        }
    }
  2. Disable the integration globally by going to Jenkins → Manage Jenkins → Global Tool Configuration → Pipeline Maven Configuration → Options.

    If the CloudBees DevOptics Gate Artifact Publisher is already listed, tick the Disabled tickbox. If it is not already listed, first add it using the Add Publisher Options dropdown, then tick the Disabled tickbox.

    67 cloudbees publisher disable

Freestyle job build steps

These CloudBees DevOptics build steps allow a Freestyle job to explicitly declare that it consumes artifacts produced by other Jenkins jobs, or that it produces artifacts for other jobs to consume.

Freestyle job build step

The CloudBees DevOptics build step for Freestyle jobs is called Inform DevOptics of consumed artifact.

This step is a CloudBees DevOptics extension that allows a Jenkins job to explicitly declare that it consumes artifacts that have been marked as produced by the gateProducesArtifact step.

This step allows your job to explicitly define which artifacts it consumes that you want CloudBees DevOptics to track. Explicitly defining these artifacts allows you to more accurately follow work as it moves across your Value Streams.

68 consumed artifact build step

69 consumed artifact build step

Consume a specific artifact with a known ID

Using the step in this form requires filling out the following fields:

id

The ID you have assigned to the artifact you want to consume. This ID should match the ID used in a gateProducesArtifact step call. This ID can be whatever identification scheme you use for artifacts. The only requirement is that the ID is unique within the context of your CloudBees DevOptics Organization.

type

The type of artifact you are consuming. Common values are file, docker, rpm. This type value should match the type value in a gateProducesArtifact step call. This type can be whatever name you use for classifying artifact types.

Consume a specific file artifact

In order to consume a file within the workspace:

file

The file within the workspace you want to consume. This will hash the file to produce an ID. Note that the artifact must be in the workspace for this to work; that is, the job may need to "get" the artifact first, for example by copying it from another job run or pulling it from an artifact repository.

type

(Optional) The type of artifact you are consuming. Common values are file, docker, rpm. This type value should match the type value in a gateProducesArtifact step call. This type can be whatever name you use for classifying artifact types. If not defined, it defaults to file.

Freestyle job post-build action

The CloudBees DevOptics post-build action for Freestyle jobs is called Inform DevOptics of produced artifact.

This step is a CloudBees DevOptics extension that allows a Jenkins job to explicitly declare that it produces artifacts that can be consumed by the gateConsumesArtifact step.

This step allows your job to explicitly define which artifacts it produces that you want CloudBees DevOptics to track. Explicitly defining these artifacts allows you to more accurately follow work as it moves across your Value Streams.

70 produced artifact post build action

71 produced artifact post build action

Produce a specific artifact with a known ID

Using the step in this form requires filling out the following fields:

id

The ID you have assigned to the artifact you are producing. This ID should match the ID used in a gateConsumesArtifact step call. This ID can be whatever identification scheme you use for artifacts. The only requirement is that the ID is unique within the context of your CloudBees DevOptics Organization.

type

The type of artifact you are producing. Common values are file, docker, rpm. This type value should match the type value in a gateConsumesArtifact step call. This type can be whatever name you use for classifying artifact types.

label

(Optional) A human-readable label providing contextual information about the produced artifact. It is displayed in the DevOptics UI.

Produce a specific file artifact

In order to notify CloudBees DevOptics that this run produces a file:

file

The file within the workspace that you want to notify CloudBees DevOptics about. This will hash the file to produce an ID.

type

(Optional) The type of artifact you are producing. Common values are file, docker, rpm. This type value should match the type value in a gateConsumesArtifact step call. This type can be whatever name you use for classifying artifact types. If not defined, it defaults to file.

label

(Optional) A human-readable label providing contextual information about the produced artifact. It is displayed in the DevOptics UI.

DevOps Performance metrics

CloudBees DevOptics calculates and displays a set of four key metrics, as popularised in the Annual State of DevOps Report, a widely cited industry study of CD and DevOps adoption and its correlation with improved organisational performance.

These DevOps performance metrics allow you to objectively and reliably measure and monitor improvements to your software delivery capability.

The availability of these metrics within CloudBees DevOptics brings several important benefits:

  • Continuously updated data to drive improvement across teams

  • Trustworthy dashboards to guide informed decisions for better business outcomes

  • Data-driven discovery and use of best practices across teams

Four Key Metrics

The following four metrics are calculated on a continual basis for Value Streams defined in CloudBees DevOptics.

Deployment Frequency (DF)

  • The frequency of successful runs of any gates identified (in the Value Stream definition) as deploy gates. Where multiple deploy gates exist in a Value Stream, the Value Stream metric is an aggregation across those gates

  • High performers deploy more often

A gate has to be marked as a Deploy Gate before deployment frequency can be calculated for it. You can achieve this by editing the gate and checking the This is a deployment job option.

Deploy Gate

Computation

Deployment Frequency (DF) of a deploy gate = Count of successful deploys / number of days

Deployment Frequency (DF) of a Value Stream = Count of successful deploys of all deploy gates / number of days
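As a hypothetical worked example (the figures are illustrative, not from the product): a deploy gate with 12 successful deploys over a 30-day window has

```latex
\mathrm{DF}_{\text{gate}} = \frac{12\ \text{successful deploys}}{30\ \text{days}} = 0.4\ \text{deploys per day}
```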

Mean Lead Time (MLT)

  • The mean time for a ticket and associated commits to successfully flow through a gate. At the Value Stream level it is the mean time for a ticket and associated commits to successfully flow from their entry point in the Value Stream to their final gate(s)

  • High performers have lower mean lead times

Computation

For an individual gate, we compute the Lead Time (LT) of commits in that gate. Lead Time is computed as follows:

Lead Time (LT) = Time when the commit exited the gate - Time when the commit entered the gate

Mean Lead Time (MLT) of a Value Stream = Mean of all Lead Times in a Value Stream
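For instance, with illustrative figures: if three commits flow through a Value Stream with lead times of 2 hours, 4 hours, and 6 hours, then

```latex
\mathrm{MLT} = \frac{2\,\text{h} + 4\,\text{h} + 6\,\text{h}}{3} = 4\,\text{h}
```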

Mean Time To Recover (MTTR)

  • The mean time it takes for a gate to return to a successful state from when it enters an unsuccessful state. Also aggregated to the Value Stream level

  • High performers recover from failure faster

Computation

For an individual gate, we compute the Time to Recovery (TTR) of failures in that gate. TTR is computed as follows:

Time to Recover (TTR) = End time of the most recent successful build - End time of the first of a consecutive sequence of recent failures

A gate is considered in the failed state if the underlying job is in one of the following states:

  • Failure

  • Unstable

Mean Time To Recover (MTTR) of a gate = Mean of all TTRs of the gate

Mean Time To Recover (MTTR) of a Value Stream = Mean of the MTTRs of all gates in the Value Stream
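As an illustrative example: suppose a gate's first failed build ends at 09:00, a second consecutive failure ends at 10:00, and the next successful build ends at 11:30. The whole failure sequence counts as one recovery, so

```latex
\mathrm{TTR} = 11{:}30 - 09{:}00 = 2.5\,\text{h}
```

The gate's MTTR is then the mean of this TTR and the TTRs of its other recoveries over the period.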

Change Failure Rate (CFR)

  • The percentage of unsuccessful runs of a gate caused by new changes. Also aggregated to the Value Stream level.

  • High performers are less likely to introduce a failure with any change.

Computation

For an individual gate, we compute the CFR as follows:

Change Failure Rate (CFR) = Total number of unsuccessful runs of a gate caused by new changes, as a percentage of total number of runs of the gate

Change Failure Rate (CFR) of a Value Stream = Total number of unsuccessful runs of the Value Stream as a percentage of the total number of runs of the Value Stream
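As an illustrative example: if a gate ran 50 times over the period and 5 of those runs were unsuccessful because of new changes, then

```latex
\mathrm{CFR} = \frac{5\ \text{change-caused failures}}{50\ \text{runs}} \times 100\% = 10\%
```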

Gate Metrics

Open the details of a gate to view metrics specific to that gate.

Gate Metrics

Value Stream Metrics

Press Metrics to view Value Stream-level metrics.

Value Stream metrics

Export Metrics to CSV

DevOptics lets you export Value Stream and gate-level metrics as a .csv file for additional analysis and reporting. Metrics can be exported from the context menu of the Value Stream or the gate. The dialog that opens lets you specify which metrics to export and the timeframe.

The export groups metrics per day so you can analyze how these metrics change over time.

Metrics Export

Metrics for All Value Streams

You can view metrics for all Value Streams by navigating via the Value Streams menu. The list shows all Value Streams with their metrics next to each, and it can be sorted by each metric column.

All Value Stream metrics