Regardless of how long we’ve been building software, and despite all of our security and audit activities, it appears that we’re still treading water. Cybercriminals continue to wreak havoc, as seen in high-profile incidents such as Kaseya and SolarWinds, as well as in less visible attacks that inflict tremendous damage on organizations and their customers, often through the supply chain. So why do these situations keep occurring, despite all of the excellent tools and enhanced automation at our disposal? Until we understand this better, every software developer on the planet should be thinking, “I might have been the next Kaseya, but I was lucky.”

To begin with, implementing DevSecOps is part of the solution to producing secure software, but it is not sufficient on its own. Even firms already on the DevSecOps path must add more components to their CI/CD (continuous integration and continuous deployment) pipelines to mitigate supply chain vulnerabilities. That means adding new capabilities, adopting capability maturity models, and implementing faster processes.

When new features are added to software, organizations employ regression tests to ensure that the software continues to work as intended despite the changes. Unfortunately, full testing, including regression testing, can take weeks or even months to complete. As a result, one of the most significant challenges businesses face is deploying patches for detected vulnerabilities quickly enough. This puts practitioners in an awful position: do they run their entire test suite and risk the vulnerability being exploited in the meantime, or do they “patch and pray” without running the entire test suite? Regrettably, hoping is not a good tactic. Those who take the latter path wind up playing whack-a-mole with new problems that arise because the update was issued without adequate testing.
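To make the regression-testing idea concrete, here is a minimal sketch in Python. The `slugify` function and its test cases are hypothetical, not from the article; the point is that a regression suite pins down existing behavior so that later changes cannot silently break it.

```python
# Minimal regression-test sketch (function and cases are illustrative).
# A regression suite captures current behavior; any new feature or patch
# must keep these assertions passing.

def slugify(title: str) -> str:
    """Turn an article title into a lowercase, hyphen-separated URL slug."""
    return "-".join(title.lower().split())

def test_slugify_existing_behavior():
    # Behavior the rest of the system already depends on.
    assert slugify("Hello World") == "hello-world"
    assert slugify("DevSecOps Pipeline") == "devsecops-pipeline"

if __name__ == "__main__":
    test_slugify_existing_behavior()
    print("regression tests passed")
```

The trade-off the paragraph describes is simply this suite scaled up: thousands of such tests protect existing behavior, but running all of them delays the very patch that closes the vulnerability.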

To solve these problems, static application security testing (SAST) and dynamic application security testing (DAST) should be considered standard practice. Methodologies that can test faster than vast suites of traditional regression and other tests, such as network comparison application security testing (NCAST), are also required. NCAST allows rapid, side-by-side comparison of network requests and responses using either test or production traffic. Any differences found are either expected or flagged as requiring further investigation. When no unexplained discrepancies remain, the product can be released with confidence.
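A sketch of the comparison idea, assuming simulated endpoint handlers rather than live traffic (the routes, handlers, and `compare_traffic` helper are all hypothetical). The same requests are replayed against the old and new versions, and any response that differs and is not on the expected-changes list is flagged for investigation.

```python
# NCAST-style comparison sketch: replay identical requests against two
# versions and diff the responses. Handlers stand in for real services.

def old_version(path):
    routes = {"/health": (200, "ok"), "/items": (200, '["a","b"]')}
    return routes.get(path, (404, "not found"))

def new_version(path):
    # "/items" changed intentionally in this release.
    routes = {"/health": (200, "ok"), "/items": (200, '["a","b","c"]')}
    return routes.get(path, (404, "not found"))

def compare_traffic(paths, expected_changes=frozenset()):
    """Return paths whose responses differ unexpectedly between versions."""
    findings = []
    for path in paths:
        if old_version(path) != new_version(path) and path not in expected_changes:
            findings.append(path)  # unexpected drift: investigate before release
    return findings

if __name__ == "__main__":
    # "/items" differs but is expected, so nothing is flagged.
    print(compare_traffic(["/health", "/items", "/missing"],
                          expected_changes={"/items"}))  # -> []
```

An empty findings list corresponds to the article’s release criterion: no unexplained discrepancies remain, so the build can ship with confidence.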

As a professional community, we have embraced the use of a large number of vendors and open-source packages, resulting in extensive supply chain exposure. Software composition analysis tools are widely used to identify the third-party components in your code and warn of potential issues, particularly with regard to licensing. They are, however, less effective at surfacing software security concerns.
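At its core, a software composition analysis check matches declared dependencies against an advisory database. A toy sketch of that matching step, with made-up package names, versions, and advisory data (real tools such as `pip-audit` query live vulnerability feeds instead):

```python
# Toy SCA check: compare pinned dependencies against an advisory list.
# Package names, versions, and advisories here are illustrative only.

ADVISORIES = {
    # package -> versions with known vulnerabilities (fabricated example data)
    "examplelib": {"1.0.0", "1.0.1"},
}

def parse_requirements(lines):
    """Parse 'name==version' pins into a dict."""
    deps = {}
    for line in lines:
        name, _, version = line.strip().partition("==")
        if name:
            deps[name] = version
    return deps

def vulnerable(deps):
    """Return (name, version) pairs that appear in the advisory list."""
    return [(n, v) for n, v in deps.items() if v in ADVISORIES.get(n, set())]

if __name__ == "__main__":
    reqs = ["examplelib==1.0.1", "otherlib==2.3.4"]
    print(vulnerable(parse_requirements(reqs)))  # -> [('examplelib', '1.0.1')]
```

This version matching is why such tools are good at flagging known licensing and CVE issues, but it also shows their limit: a component with no published advisory passes silently, which is exactly the security gap the article points out.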