
DevSecOps: Pipeline Development


By: Philip Kulp

July 20, 2020

Over the coming weeks, Cybrary will be posting a series of blogs that correspond with the newly released course, DevSecOps Fundamentals. This post is the fourth in a series covering core principles to assist in the automation of a secure pipeline.

  1. Securing the Development Cycle
  2. What Are We Defending?
  3. Pipeline: Planning and Awareness
  4. Pipeline: Development
  5. Pipeline: Delivery
  6. Pipeline: Deployment
  7. Pipeline: Operation & Monitor
  8. Summarize the learned concepts

Please follow the posts and provide feedback or questions to the author via LinkedIn.

Overview

In the previous post, we discussed the requirements for the planning phase and the metrics needed for a modern DevSecOps pipeline. In this article, we begin integrating security into the development and build phases.


The article will cover the following topics:

  1. Pipeline Orchestration
  2. Static Application Security Testing (SAST)
  3. Software Composition Analysis (SCA)
  4. Using Jenkins for SAST/SCA
  5. OWASP DevSecOps Maturity Model

Pipeline Orchestration

The development phase involves turning the requirements into code, and the build phase combines the source code into the final product. Some code, such as Java and C, requires a compiler in the build phase to create bytecode or a binary that can be executed. Other code, such as Python or JavaScript, is interpreted and does not require compiling, but it is still packaged into the build for deployment. A later post will discuss Infrastructure as Code (IaC), which includes scripts that build or configure the infrastructure. An orchestration tool such as Jenkins, which we have referenced throughout these posts, handles building the application.

The orchestration tool also handles the execution of the security toolchain, which is the focus of our DevSecOps activities. Some analysis tools can run on uncompiled or interpreted code, while other security tools require compiled code. The pipeline orchestrator is configured to prepare the code in the required format before executing the security tools. The orchestrator can also fail the process at this stage if the code contains errors and cannot be built.
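
As a minimal sketch, a declarative Jenkins pipeline for a Maven-based Java project (the project type, stage names, and Maven goals here are illustrative assumptions, not part of the course material) might separate the build from the security toolchain so that a failed build stops the pipeline before any security tools run:

  // Minimal Jenkinsfile sketch; assumes a Maven-based Java project.
  pipeline {
      agent any
      stages {
          stage('Build') {
              steps {
                  // A compile or packaging failure returns a non-zero exit code,
                  // which fails this stage and stops the pipeline early.
                  sh 'mvn -B clean package'
              }
          }
          stage('Security Toolchain') {
              steps {
                  // SAST, lint, and SCA steps (sketched in the following sections) plug in here.
                  echo 'Run SAST, lint, and SCA tools against the prepared code'
              }
          }
      }
  }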

Static Application Security Testing (SAST)

Most security tools perform SAST checks before the code is compiled. Unlike a penetration test, there are no firewalls, access control lists, or other security controls to block probes, so SAST can evaluate the raw code; most vulnerabilities are identified at this stage. Static analysis tools can follow variables across multiple source code files and evaluate how the program sanitizes input and output. The tools understand SQL injection, cross-site scripting (XSS), and other attack vectors, so they scan the source code for insecure handling of data. SAST tools usually map findings to standards such as the Common Weakness Enumeration (CWE), which includes a Top 25 list of the most common software errors.

Many commercial and open-source SAST tools exist for scanning code to identify vulnerabilities. Some open-source tools specialize in a single programming language, while most commercial tools can scan a majority of the common languages. SpotBugs and PMD are open-source tools for scanning Java and have plugins for the Eclipse Integrated Development Environment (IDE). IDE integration allows developers to identify bugs and fix them during the coding phase before they propagate into later stages. Puma Scan is an example of an open-source tool that can scan .NET source code directly within the Visual Studio IDE.

Other security evaluation tools exist, such as lint-based checks. Lint-based tools focus on the structure of the code to evaluate its readability. While poorly formatted code may not present an immediate risk, it can cause issues when future developers have to debug or modify the code. Lint can check for long lines of code, complex logic, or other issues that may cause difficulties. For example, an unclear logic check may confuse a developer, who could then make an error that introduces a bug with security implications.
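
As an illustration, a lint or style check can be wired into the same pipeline. The stage below assumes a Java project with the Maven Checkstyle plugin configured and uses the Warnings Next Generation checkStyle parser; the goal, report path, and parser symbol depend on your project and plugin versions:

  stage('Lint') {
      steps {
          // checkstyle:checkstyle writes an XML report (by default under target/)
          // describing style and readability issues in the source code.
          sh 'mvn -B checkstyle:checkstyle'
          // Publish the report so reviewers can browse the findings in Jenkins.
          recordIssues tools: [checkStyle(pattern: '**/checkstyle-result.xml')]
      }
  }

After the custom software is scanned, a DevSecOps pipeline must also evaluate third-party software and libraries, which introduce external code into the product.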

Software Composition Analysis (SCA)

Developers use third-party libraries to reduce development time and improve the software by reusing robust implementations. The evaluation of external libraries is called Software Composition Analysis (SCA). All software contains vulnerabilities that are discovered over time and require patches, and third-party libraries are no exception. DevSecOps orchestration should include tools that can evaluate the versions of third-party libraries to identify known vulnerabilities.

Open-source tools, such as OWASP Dependency-Check, pull the latest Common Vulnerabilities and Exposures (CVE) entries from the National Vulnerability Database (NVD) to identify known vulnerabilities. An SCA tool must support the programming language in order to evaluate the included third-party libraries, especially when there are cascading (transitive) dependencies. Each language has a file for defining external libraries, for example:

  • Java: pom.xml
  • .NET: .nuspec
  • Python: requirements.txt
  • Node: package.json

OWASP Dependency-Check produces both human-readable reports and XML versions, which the orchestration tool can consume for gate decisions that determine the success of a build.
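
As a sketch, an SCA stage might run the Dependency-Check command-line scanner and publish its report through the Warnings Next Generation plugin. The project name, output folder, and parser symbol below are assumptions; adjust the report pattern to the format your plugin version expects:

  stage('SCA') {
      steps {
          // Scan the workspace and write the reports to the odc-reports folder;
          // 'dependency-check.sh' is assumed to be available on the agent's PATH.
          sh 'dependency-check.sh --project my-app --scan . --format ALL --out odc-reports'
          // Publish the report so it can drive gate decisions in Jenkins
          // (the parser symbol and expected report format are assumptions).
          recordIssues tools: [owaspDependencyCheck(pattern: 'odc-reports/dependency-check-report.json')]
      }
  }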


Modern applications may also build the underlying operating system or microservices from container images such as Docker images; therefore, those images should also be scanned to avoid supply chain attacks. The NIST Secure Software Development Framework (SSDF) includes practice PW.4, which defines the need for an internal trusted repository. The repository should hold external libraries, Docker images, and other pre-made artifacts to create a trusted store. Developers would then select only from the pre-evaluated artifacts, and the orchestration tool would use the latest versions of those artifacts to maintain patching.
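
As one illustration, a pipeline using the Jenkins Docker Pipeline plugin could pull base images only from such an internal registry rather than from a public one; the registry URL, credentials ID, and image name below are placeholders:

  stage('Pull Trusted Base Image') {
      steps {
          script {
              // Pull the base image from the organization's internal, pre-vetted registry;
              // the URL, credentials ID, and image name are placeholders.
              docker.withRegistry('https://registry.internal.example', 'internal-registry-creds') {
                  docker.image('base/app-runtime:latest').pull()
              }
          }
      }
  }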

Jenkins for SAST/SCA

Throughout this series of blog posts, we have demonstrated the use of Jenkins as the orchestration tool for a DevSecOps pipeline and will continue to use it for the development phase. The Jenkins Warnings Next Generation plugin is used to consume the reports generated by the SAST, Lint, and SCA tools. An example pipeline script for executing the SAST stage would look like the following:

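The sketch below assumes a Maven project with the SpotBugs and PMD plugins configured and uses the Warnings Next Generation recordIssues step with a quality gate; the tool symbols, report paths, and gate types depend on the plugin versions in use:

  stage('SAST') {
      steps {
          // Generate the SpotBugs and PMD XML reports (paths below are common Maven defaults).
          sh 'mvn -B spotbugs:spotbugs pmd:pmd'
          // Publish the reports; the quality gate fails the stage when any error-severity
          // finding is present. The threshold and gate type are illustrative assumptions.
          recordIssues(
              tools: [
                  spotBugs(pattern: '**/spotbugsXml.xml'),
                  pmdParser(pattern: '**/pmd.xml')
              ],
              qualityGates: [[threshold: 1, type: 'TOTAL_ERROR', unstable: false]]
          )
      }
  }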

Based on the results of the tool execution and the requirements for the stage, Jenkins passes or fails the stage according to the exit criteria. Exit criteria for the SAST and SCA tools define maximums, such as the number of Critical or High vulnerabilities allowed, beyond which the stage fails. The following image shows an example of the stage view in Jenkins after each tool has run and the results have been processed.

(Image: Jenkins pipeline stage view after the security tools have run.)

The Jenkins Warnings Next Generation plugin consumes the reports and generates a synopsis within the stage for each tool.


The results for each tool can be clicked to drill down into the actual findings, which the security staff can review. The security staff would then create issues for developers in the tracking tool.

OWASP DevSecOps Maturity Model

The goal of creating a robust DevSecOps pipeline is to continue to grow and mature in order to produce secure software. Vulnerabilities are continuously discovered, and secure programming practices evolve, so organizations need to create a roadmap for continuous improvement. The OWASP DevSecOps Maturity Model [1] presents a structure for improving your organization's implementation. The model defines four levels of maturity and 16 dimensions that categorize DevSecOps activities.


Each level defines a more robust implementation, with complexity increasing as the organization approaches level 4. Not all organizations need to implement the most complex levels, and some may not have the budget to do so. The purpose of a maturity model is to help the organization identify its goals and then create a roadmap for reaching those levels. The 16 dimensions include categories such as:

  • Test and Verification
  • Static Depth of Applications
  • Build and Deployment

Visit the OWASP DevSecOps Maturity Model page to review all of the defined dimensions. Not every dimension includes all four levels of maturity, but most dimensions define multiple levels. An organization can use the structure to define additional requirements at each level and customize a solution that fits its goals.

Cybrary offers courses that cut across the entire DevSecOps topic. You can take classes on secure programming, hacker fundamentals, system administration, cloud certifications, and much more. This knowledge is fundamental to implementing a secure application across the DevSecOps lifecycle. Sign up for Cybrary to learn more about the topic and stay informed of the continuous release of cybersecurity training at Cybrary.


About the Author

Dr. Philip Kulp is a Cybrary Fellow and instructor of several courses on the platform. In his current role as a cybersecurity architect and incident responder, he combines his passion for IT and cybersecurity to develop realistic approaches to securing the enterprise. He performs the roles of an independent assessor, incident responder, and secure code reviewer. Philip seeks opportunities to balance his cybersecurity skills between academic, technical, and compliance roles. He holds the CISSP certification and two Offensive Security certifications, OSCP and OSCE. In his educational capacity, Philip serves as a chair, committee member, and mentor for doctoral students in the Ph.D. and D.Sc. programs at Capitol Technology University. Visit his author page on Cybrary or contact him via LinkedIn.

References

[1] OWASP DevSecOps Maturity Model. Retrieved from: https://owasp.org/www-project-devsecops-maturity-model/
