Rapidly Patching Legacy Software Vulnerabilities in Mission-Critical Systems
October 16, 2019 | DARPA
There are a vast number of diverse computing devices used to run the critical infrastructure our national security depends on—from transportation systems to electric grids to industrial equipment. Much like commercial or personal computing devices, these systems utilize embedded software to execute and manage their operations. To fix certain security vulnerabilities, commercial and personal devices must undergo frequent updates, and are replaced every few years—or on occasion, more frequently when an update fails. Mission-critical systems are built to last for decades, and rarely have the same short upgrade cycles. These systems are expensive to develop and hard to replace, and as they become increasingly connected for the purposes of maintenance diagnostics and data collection, this proliferation of connected software is opening them to compromise. While the amount of deployed vulnerable software is growing exponentially, the effective means of addressing known vulnerabilities at scale are limited.
“Patching vulnerabilities in legacy software used by mission-critical systems is a challenge that is only growing in importance and complexity,” said Dr. Sergey Bratus, a program manager in DARPA’s Information Innovation Office (I2O). “Even after a particular flaw is fully understood, and a remediation approach has been developed and expressed as a source code change in the software, a vendor’s ability to produce patches for all of their deployed devices in a timely, assuredly safe, and scalable manner is limited. This results in mission-critical software going unpatched for months to years, increasing the opportunity for attackers.”
Today, identifying and remediating software vulnerabilities in legacy binaries requires highly skilled software engineers who must make expert assumptions based on whatever source code samples and knowledge of the original development environment may be available. The engineers are responsible for understanding the structure of the binary, developing and applying a patch by hand, and then manually analyzing and testing the binary to ensure it works properly. The process is arduous and time-consuming, with minimal assurance that the system will continue working as intended after the fix is applied. Further, this approach is becoming increasingly untenable as the amount of deployed software within mission-critical systems continues to grow.
The Assured Micropatching (AMP) program seeks to address these challenges and accelerate the process of patching legacy binaries in mission-critical systems and infrastructure. AMP aims to develop tools and methodologies for analyzing, modifying, and fixing legacy software in binary form with the assistance of assured, targeted “micropatches.” Micropatches are small patches that change the binary as little as possible in order to achieve an intended objective while also minimizing the potential side effects of the fix. AMP aims to create breakthrough technologies to reason about these small software fixes and, perhaps most importantly, provide proofs to assure that the system’s original baseline functionality is not lost or altered by the fix.
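To make the "change the binary as little as possible" idea concrete, here is a minimal sketch of how a micropatch might be expressed and applied: as a byte-level edit at a known offset that refuses to apply if the binary does not match what the patch was built against. All offsets, opcodes, and the toy firmware image are hypothetical, chosen for illustration; they are not drawn from any real device or from the AMP program itself.

```python
# Hypothetical sketch: a "micropatch" as a minimal byte-level edit to a
# firmware image, assuming the offset of the flawed instruction is already
# known (here, a signed jump that should have been unsigned).

def apply_micropatch(image: bytes, offset: int, expected: bytes, replacement: bytes) -> bytes:
    """Replace `expected` with `replacement` at `offset`, refusing to patch
    if the bytes found there differ from what the patch was built against."""
    if len(expected) != len(replacement):
        raise ValueError("micropatch must not change the binary's size or layout")
    found = image[offset:offset + len(expected)]
    if found != expected:
        raise ValueError(f"binary mismatch at {offset:#x}: {found.hex()} != {expected.hex()}")
    return image[:offset] + replacement + image[offset + len(expected):]

# Toy 64-byte firmware image with a hypothetical x86 `jl` (0x7C) at offset
# 0x10 that should have been an unsigned `jb` (0x72) -- a one-byte fix.
firmware = bytearray(64)
firmware[0x10:0x12] = bytes([0x7C, 0x05])  # jl +5  (signed compare, the bug)

patched = apply_micropatch(bytes(firmware), 0x10, bytes([0x7C]), bytes([0x72]))

changed = sum(a != b for a, b in zip(bytes(firmware), patched))
print(f"bytes changed: {changed} of {len(patched)}")  # the patch touches a single byte
```

Keeping the edit size-preserving and verifying the expected bytes first is what limits side effects: nothing else in the image moves, so code addresses, data layout, and checksummed regions outside the patched bytes are untouched.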
“Think of how many times you have updated software on your personal device and the update inadvertently caused some of the software to stop working, or worse, ‘bricked’ the device. With current patching approaches, we are not given the assurance that the system will continue working as intended after the fix is applied. Assured Micropatching aims to create and apply fixes in an automated and assured way, giving us a means to expedite the time to test and deploy the patched system from months and years to just days,” said Bratus.
To enable the creation and rapid implementation of assured micropatches, the AMP program will explore novel breakthroughs in binary decompilation and analysis, compiler techniques, and program verification. Today, engineers utilize software decompilers to understand the executable binary, which is a key step in the process of patching legacy software. While helpful, today’s decompilers are largely heuristic and only able to generate a “best guess” at what the original source code may have been like. AMP seeks to develop goal-driven decompilation, which would use existing source code samples, any available knowledge of the original build process, and other historic software artifacts to improve decompilation and direct it towards a specific goal, such as situating a known source code patch. By guiding decompilation toward that goal, an engineer developing a binary micropatch can better translate knowledge of flaws from the source code to the binary, accelerating the identification, analysis, and repair of vulnerabilities in the binary.
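One small piece of "situating a known source code patch" can be sketched as a matching problem: given the decompiler's best-guess output and the source context the patch applies to, find the decompiled function the patch most plausibly belongs to. Real goal-driven decompilation works on binaries and build artifacts; here, text similarity over toy pseudo-source stands in for that step, and all function names and snippets are invented for illustration.

```python
# Hypothetical sketch: locate the decompiled function that best matches a
# known source-level patch context, using simple text similarity as a
# stand-in for real binary-aware matching.
import difflib

# Decompiler's "best guess" output for three functions (illustrative only).
decompiled = {
    "sub_401000": "int v1 = read_len(pkt); if (v1 < 64) memcpy(dst, pkt, v1);",
    "sub_401200": "for (int i = 0; i < n; i++) crc = update_crc(crc, buf[i]);",
    "sub_401400": "if (state == IDLE) { start_motor(); state = RUNNING; }",
}

# The known source patch fixes a length check guarding a memcpy.
patch_context = "len = read_len(pkt); if (len < 64) memcpy(dst, pkt, len);"

def situate_patch(patch: str, functions: dict) -> str:
    """Return the name of the decompiled function most similar to the patch context."""
    score = lambda body: difflib.SequenceMatcher(None, patch, body).ratio()
    return max(functions, key=lambda name: score(functions[name]))

print(situate_patch(patch_context, decompiled))  # sub_401000
```

The point of the sketch is the direction of the workflow: instead of decompiling the whole binary and hoping the output is readable, the known fix acts as the goal that drives where decompilation effort and matching are focused.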
In addition to goal-driven decompilation, AMP aims to develop “recompilers” that compile the desired source-level change against the existing binary and provide assurances that the intended functionality of the software is maintained after it is patched. Today, it is difficult to analyze changes in the binary because compilers take a clean-sheet approach, throwing out the existing binary and starting from scratch with each analysis. The AMP program will work to develop recompilers that preserve the binary as much as possible when the patch is applied and analyzed. Once a fix is applied, the novel recompilers will analyze the effects to ensure the fix does not disrupt the baseline functionality of the software.
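The kind of assurance described above can be illustrated with a differential check: run the baseline and patched behavior side by side over a corpus of inputs and confirm they diverge only where the vulnerability fired. The two Python functions below are stand-ins for the original and micropatched binaries; the flawed bounds check and the input corpus are invented for illustration, not taken from the program.

```python
# Hypothetical sketch of a differential assurance check: baseline and
# patched behavior should agree everywhere except on inputs that
# triggered the flaw being fixed.

def baseline(n: int) -> int:
    # Flawed bounds check: negative values slip past the comparison.
    if n < 16:          # bug: admits negative n
        return n * 2
    return -1

def patched(n: int) -> int:
    # Micropatched check: same logic, but negatives are now rejected.
    if 0 <= n < 16:
        return n * 2
    return -1

corpus = list(range(-4, 20))
divergent = [n for n in corpus if baseline(n) != patched(n)]
print(divergent)  # only the inputs that triggered the flaw: [-4, -3, -2, -1]
```

A recompiler in the sense described above would aim to establish this property by construction and proof rather than by testing alone, but the property being assured is the same: baseline functionality is untouched outside the fix.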
To ensure the tools and techniques in development work as intended, AMP will run a number of challenges throughout the life of the program. The challenges will explore various cyber-physical mission-critical system use cases, and assess how effective the technologies are at rapidly patching legacy systems.