In 2016, the then Chancellor of the Exchequer, Rt Hon Philip Hammond MP, wrote in the National Cyber Security Strategy 2016: “The UK is one of the world’s leading digital nations. Much of our prosperity now depends on our ability to secure our technology, data and networks from the many threats we face”. Since then, various activities, including the creation of the National Cyber Security Centre, have significantly advanced broader awareness of, and engagement in, cyber security.
Unfortunately, the sheer number of vulnerabilities reported has rocketed, as has the damage they cause.
Need for Investment in Hardware
So why has the significant growth and investment in tackling cyber-crime done so little to stem the rising number of security vulnerabilities, and the increasing harm they cause?
To understand this better, you need to understand what a cyber vulnerability is, and how they are often caused. Though analysing the entire CVE database would be an impossible task, Microsoft has looked at the vulnerabilities it has published since 2006 and found that around 70% are caused by a software “bug” related to something known as a memory safety issue. What can be even more distressing is that, as early as the 1970s, these issues were already being discussed, along with various proposals on how to address them.
So why is this still the case, with billions being spent to try and make systems cyber secure while trillions are still being lost, all because we keep seeing the same class of vulnerability after 50 years? What I think is fair to say is that cyber security today is the science of protecting what is a fundamentally insecure system. Only in the investments against world hunger and disease could we see examples at a similar level of cost against potential benefit.
A big difference here is that the issue is both caused by us and found in something we have built: software running on computers. So why haven’t all those techies out there fixed it decades ago? One potential answer is legacy, and the market dynamics between building hardware and building software. Bringing a new feature from concept to deployed hardware can often take ten years, whereas today, with software patches delivered over the Internet, updated software can ship in just a few months. When you then consider the amount of software out there, and that, in any practical measure, only a couple of small teams across Arm and Intel/AMD define the central architecture of all the world’s computer processors, you can hopefully start to see the sort of challenge that has left the computer processor unable to stop memory safety issues in software.
When the UK government reached out to British industry about any significant challenges it saw regarding the implementation of its Industrial Strategy, a consortium led by Arm proposed a project for funding to address this cyber failure in the market. Leveraging research from a programme known as CHERI at the University of Cambridge, the consortium proposed a programme of activities that would build a technology prototype and create a pathway between the introduction of these technologies within the architecture of the central processor and the billions of devices across the future’s digital infrastructure.
A Step Change in Digital Security
The resulting Digital Security by Design (DSbD) programme has just had its first birthday, and the first of its milestones are starting to deliver. Other projects aim to better understand the various DSbD technologies, and the activities required to bring about a step change in the way cyber security can tackle threats, using digital security designed into the heart of the computer.
- You can find out more about the Industrial Strategy Challenge Fund here
- Sign up for email notifications on funding, connections & support opportunities
- Find out more about our diversity and inclusion work here
Author: John Goodacre, DSbD Challenge Director