DevSecOps: a software factory for secure software in mission critical capabilities
The Device Chronicle interviews Alan Hohn, Director of Software Strategy at Lockheed Martin, to learn about the latest developments in DevOps, DevSecOps, and chains of automation in developing highly secure digital products for use in defense.
Alan has held various roles in software engineering, architecture development and team management at Lockheed Martin over the past 25 years of his career. He is also an esteemed Lockheed Martin Fellow Emeritus. His primary areas of interest and strategic expertise are DevOps and DevSecOps. Currently, Alan directs software strategy for Lockheed Martin’s Corporate Engineering organization, supporting the different business areas through their programs and, ultimately, Lockheed Martin’s customers.
Alan has a razor-sharp focus on improving the ways in which Lockheed Martin develops and delivers software. This involves ensuring that all programs have what they need to do so, and the goal, Alan says, is to get better and better at it over time.
Frequent delivery of secure software and DevSecOps
Alan leads the software factory initiative which is transforming DevSecOps and cybersecurity at Lockheed Martin. Alan explains that Lockheed’s customers are seeking frequent delivery of high quality, secure software. Speed is of the essence in this regard, and Alan’s group is responding by using infrastructure automation to build systems in minutes, not days. Containerisation is used to update systems continuously, so there is no waiting for full system integration and delivery events. New technologies such as container orchestration with Kubernetes are being used in the software factory to drive modularity through containerized microservice architecture all the way into the deployed system. This serves to simplify test and deployment, and enables a hardware agnostic solution that moves seamlessly between cloud providers and into embedded environments.
Software is central to everything now
Alan says “It is a cliche that software is in everything, and it is a cliche because it is true.” Alan further reflects that when he started out in his career, the focus was on custom real time operating systems. Most of the software development on top of these systems was custom. The transition since has been to off the shelf software and in particular commercial off the shelf software, which has become highly important in the aerospace and defense industries. Alan says that Lockheed Martin is now using an open source operating system such as Linux where they would have traditionally used a bespoke real time operating system, but also is going up the stack to introduce containers and Kubernetes into those kinds of environments. “There is an effort to find places where we can raise to the platform level the amount of off the shelf software we are doing, so that we are really only focusing on that last few percent of mission critical capabilities.”
Education in coding best practices
In DevSecOps, Alan believes that secure coding starts with education: ensuring that Lockheed Martin developers understand the best practices to follow and the pitfalls to avoid. Experts in information assurance should provide this training, but also act as a “second pair of eyes” for developers as they write their code, ensuring that secure coding practices are adhered to. This extends to tooling, and it follows on from DevOps, which is mostly about the cultural change to continuous delivery. Tools are important enablers, and secure coding must include making it as easy as possible for teams to apply static code analysis, automated testing, automated vulnerability scanning, and software composition analysis. These tools should become part of the DevSecOps pipelines automatically. Lockheed Martin has done tremendous work to establish these tools in CI/CD workflows at the enterprise level.
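The idea of pipelines that automatically block on failed security checks can be sketched as a simple gate runner. This is an illustrative sketch, not Lockheed Martin's actual tooling: the gate names and the lambda stand-ins are invented placeholders for real invocations of static analyzers, test suites, and composition analysis scanners.

```python
# Minimal sketch of a DevSecOps pipeline gate runner. The check
# functions here are hypothetical stand-ins for real security tools.
from typing import Callable, List, Tuple

def run_gates(gates: List[Tuple[str, Callable[[], bool]]]) -> List[str]:
    """Run each named security gate; return the names of any failed gates."""
    failures = []
    for name, check in gates:
        if not check():
            failures.append(name)
    return failures

# Illustrative stand-ins (assumptions, not real tool output):
gates = [
    ("static-analysis", lambda: True),       # e.g. no high-severity findings
    ("automated-tests", lambda: True),       # e.g. test suite passed
    ("vulnerability-scan", lambda: False),   # e.g. a known CVE was detected
    ("composition-analysis", lambda: True),  # e.g. licenses approved
]

failed = run_gates(gates)
if failed:
    print("Pipeline blocked by:", ", ".join(failed))
```

In a real pipeline, each check would wrap a tool invocation, and a non-empty failure list would fail the build before anything is delivered.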
Focus on software supply chain security and control of SBOM
As off the shelf software is becoming such a large part of what Lockheed Martin does, it becomes critically important to control and get a handle on software supply chain security. “We need to understand just what is in our products. This can be challenging as we move to containers.” Alan explains that the traditional mechanism for ensuring control of off the shelf software – taking a body of existing source code and recompiling it – is no longer an efficient practice as they move to a place where most capabilities are built with off the shelf software and only a small percentage remains for mission critical capabilities. There is a need now to take existing, pre-compiled binary packages, operating systems, and container images and use them, so the right analysis tools must be used to know what really went into those software versions and builds. “We need to have a complete handle on the bill of materials.”
The defense industry is characterized by complex systems with many different suppliers and parts. Alan says this is challenging when trying to build up a software bill of materials. All it takes is for one supplier to miss an item, and a software dependency you may not be aware of trickles all the way through your system and gets delivered to the customer.
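The supplier-gap problem above can be made concrete with a small check over a CycloneDX-style SBOM: any dependency that is referenced in the dependency graph but never declared as a component is exactly the kind of item a supplier missed. The SBOM below is invented for illustration; real CycloneDX documents carry much more metadata (versions, hashes, licenses, suppliers) and use `bom-ref` identifiers.

```python
# Sketch: detect dependencies referenced in a CycloneDX-style SBOM's
# dependency graph that were never declared as components.
import json

sbom_json = """
{
  "components": [
    {"name": "app", "version": "1.0"},
    {"name": "libcrypto", "version": "3.0"}
  ],
  "dependencies": [
    {"ref": "app", "dependsOn": ["libcrypto", "libjson"]},
    {"ref": "libcrypto", "dependsOn": []}
  ]
}
"""

sbom = json.loads(sbom_json)
declared = {c["name"] for c in sbom["components"]}
referenced = {ref for d in sbom["dependencies"] for ref in d["dependsOn"]}

# Anything referenced but never declared is a gap a supplier missed.
undeclared = referenced - declared
print("Undeclared dependencies:", sorted(undeclared))  # ['libjson']
```

Here `libjson` is used by `app` but absent from the component list, so it would ship to the customer without appearing in the bill of materials.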
In order to improve software supply chain management, Alan says his group is having conversations on software supply chain security with professional groups in the community, such as the OWASP CycloneDX project and the OpenChain team working on SPDX in the Linux Foundation. He says “We want to contribute to the open source community on this very important and challenging topic.”
Software image delivery
Secure software image delivery from the CI/CD pipeline to target devices is a hard challenge for Lockheed Martin for several reasons. Alan says one reason is that edge or embedded devices with software on board tend to be doing something tactical as part of a mission. “In addition to the obvious concerns of connectivity to the other side of the world, sometimes these devices are in emissions control (EMCON) and they cannot talk to us. Also, we wouldn’t be thanked for rebooting a device in the middle of a mission. To a large extent, the device belongs to the customer once it is handed over.” So it is a case of building true partnerships with customers, working with them to figure out the right level of independent validation and verification needed before they accept a software update. This means Lockheed Martin is focused on how best to build an infrastructure for over the air software delivery of new capabilities that the customer can operate.
From a pipeline perspective, Alan explains that Lockheed Martin programs need to have infrastructure with multiple pipelines, with some handoffs in between. “Our software factory includes DevSecOps pipelines, continuous integration, and continuous delivery to an integration facility or to a customer, and then the government or customer software factory has to take that over. They have to have their own pipelines to do their own verification and validation before the software upgrade can be pushed out to production. It is exciting that the government is looking to create facilities where multiple different parties can collaborate together to get mission critical solutions out there faster.”
Growth in Yocto adoption and 5G gets attention
Alan says there is a lot of take-up for the Yocto Project, and many more programs are exploring it, as it provides a high level of customization while still producing an embedded product with a tiny, efficient footprint.
The history of defense connectivity includes new satellite technologies, high speed data links, and a move away from message based communication to IP-based communication to exchange data and update software. Under the aegis of 5G.MIL®, Lockheed Martin is investing in applications for 5G and looking at the opportunities the large bandwidth offers for edge nodes. Alan says of 5G, “We are working through what the potential implications are and enjoying the collaborations to work this out. It is an exciting future.”
Balance capabilities against risk
Alan reflects that the traditional approach to cybersecurity was checklist-based, using Security Technical Implementation Guides (STIGs), but that there is now a move to a more continuous approach, and Lockheed’s customers now use the Risk Management Framework (RMF). RMF allows you to identify the most critical risks associated with a system and then balance the mission and operational capabilities you get in exchange for that risk. Alan believes that with OTA software updates, the operational capability of changing a deployed system to react more quickly to new mission needs is so significant that the approach to cybersecurity will focus on making updates as secure as possible, with virtualized test environments, gates on the pipelines, and test verification and validation that the OTA will perform as expected. Proactive monitoring on the platforms is key to detecting whether something that has been deployed is nefarious or not quite right, and rolling it back if necessary. These are compensations that reduce risk, as opposed to the older checklist and waiver approach. The technology updates are important, the security updates are important, but the cultural piece of managing the risk that comes with new capabilities is most important.
OTA software deployment case by case
Alan explains that a software update might be delivered OTA to a ship at sea, but that update would first go to an authorized person acting as a gatekeeper on board, who would then roll it out locally on the systems. Alan explains that typically in the defense industry, OTA software updating is planned step by step, and you have to plan it out for each specific scenario. He concludes, “It’s unlikely we will get to the stage where one person will push one central button and an update will be pushed out globally across systems.”
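The gatekeeper-approved rollout described above, together with the monitor-and-roll-back compensation Alan mentions, can be sketched as a small state machine. This is an illustrative sketch only, not Lockheed Martin's actual process; the class, states, and method names are all invented for the example.

```python
# Illustrative OTA update lifecycle: an update delivered to a platform
# must be approved by an on-board gatekeeper before it is applied, and
# proactive monitoring can roll it back if the system misbehaves.
class OtaUpdate:
    def __init__(self, version: str):
        self.version = version
        self.state = "delivered"  # delivered -> approved -> applied

    def approve(self, gatekeeper_ok: bool) -> None:
        """An authorized person on board signs off before anything changes."""
        if self.state == "delivered" and gatekeeper_ok:
            self.state = "approved"

    def apply(self) -> None:
        """Applying without approval is a no-op: the gate holds."""
        if self.state == "approved":
            self.state = "applied"

    def monitor(self, healthy: bool) -> None:
        """Proactive monitoring rolls back a misbehaving deployment."""
        if self.state == "applied" and not healthy:
            self.state = "rolled_back"

update = OtaUpdate("2.1")
update.apply()                   # no effect: gatekeeper has not approved
update.approve(gatekeeper_ok=True)
update.apply()
update.monitor(healthy=True)
print(update.state)              # -> "applied"
```

The point of the sketch is the ordering constraint: `apply` does nothing until the gatekeeper approves, and a bad health signal after deployment moves the system to `rolled_back` rather than leaving a suspect image in place.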
We wish Alan and the Software Factory group well as they continue to improve the ways they develop and deliver software to their customers.