The home of cyber security best practice: public or private sector?
Dr Bernard Parsons, Chief Executive Officer, Becrypt
Whilst parts of the public sector are not generally held up as shining beacons of security best practice, there are areas where the private and public sectors can take a leaf out of each other's books as the security challenges facing both continue to escalate. The Chancellor of the Exchequer, Philip Hammond, recently reinforced a £1.9 billion investment in bolstering the UK's cyber defences, highlighting the increasing need for cooperation between government, academia and industry to confront the growing menace of cybercrime.
Over the last decade, one could argue that parts of the private sector have demonstrated more examples of best practice in cyber security. That doesn't mean all businesses are adequately secure – on the contrary. By the same token, however, those businesses whose very existence in a globally competitive market depends on good security offer a good blueprint for protecting sensitive data. One fundamental principle such organisations have embraced is the importance of balancing security against the competing demands of usability and cost. An inability to address all three will result in failure: users will find ways to sidestep security measures if they prove too onerous, and managers will continue to weigh cyber risk and the cost of compromise against the corresponding cost of investing in cyber security. Only relatively recently has this triple imperative been widely recognised by government, a recognition hampered in the past by outdated practices, including slow and cumbersome certification and accreditation processes.
Cost, risk and usability – the triple imperative
In the past three to four years there has been a cultural shift within government as the term 'commercial best practice' became pervasive. This has had a profound effect on the way systems are architected, procured and deployed, with government looking to the private sector for both inspiration and guidance in the introduction of technology and practices. The recently introduced Government Security Classifications (GSC) policy reflects this approach to security, seeking in part to redress the balance between cost, risk and usability. For example, processes such as Commercial Product Assurance (CPA), run by the National Cyber Security Centre, which governs how new products are certified for government use, are now much more flexible and efficient than their past equivalents. There has also been a real drive to give responsibility for informed risk management to the data owner rather than using process to obscure responsibility. The nature and scale of the threats government faces in the cyber domain today are nonetheless unprecedented, which means some differences will continue to exist between the public and private sectors; however, the principle of balancing cost, risk and usability is now well established.
The UK has a world-leading reputation for security expertise, but arguably this has not yet translated into a vibrant home-grown cyber security industry of a scale that fulfils the national potential. Cyber security is recognised by the British government as a tier one national threat, attracting substantial government funding and an increased need for collaboration between government, academia and industry, which in turn is driving innovation in the security ecosystem.
Both private and public sectors face a fundamental challenge: to address the asymmetry between the capabilities most businesses present to the world and the huge number of adversaries wishing to exploit them, given the cost and effort required to detect and respond effectively to today's threats. One area in which government is arguably ahead of industry is in gaining confidence in the identity and state of end user devices. Most high-profile data breaches involve the exploitation of vulnerabilities on end user devices. In the field of identity and access management, technologies exist not only to authenticate users but also to determine the level of trust that can and should be conferred on devices. By increasing the level of trust in both devices and users, businesses significantly reduce their attack surface.
A move towards secure desktop
Many of the building blocks in use today in government have evolved out of the commercial space. One such example is the Trusted Platform Module (TPM), a cryptographic chip that ships with most Intel-based devices (TrustZone being a similar technology for ARM-based devices). These 'trust anchors', as they are known, are hardware standards increasingly adopted in government circles to establish a level of trust in the state of a device by taking cryptographic measurements of the systems and patches deployed on it. Initiatives such as these are leading to the widespread deployment of secure desktops in government, with systems for accessing cloud-based platforms incorporating some of these trust-supporting features to offer secure browser-based access to virtual applications across varying form factors.
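The measurement process these trust anchors perform can be illustrated with a short sketch of the TPM's "extend" operation on a Platform Configuration Register (PCR), where each boot component's hash is folded into a running digest. The component names below are purely illustrative; a real TPM performs this in hardware and the measurements are taken by the boot firmware.

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """PCR-style extend: new value = SHA-256(old value || measurement)."""
    return hashlib.sha256(pcr + measurement).digest()

# A PCR starts zeroed at boot.
pcr = bytes(32)

# Measure each boot component in order. The final value depends on the
# contents AND the order of every measurement, so any tampered or
# reordered component yields a different PCR value.
for component in [b"firmware", b"bootloader", b"kernel", b"config"]:
    pcr = extend(pcr, hashlib.sha256(component).digest())

print(pcr.hex())
```

Because the chain is one-way, a verifier comparing the final PCR value against a known-good value can gain confidence that the device booted the expected software stack, without needing to inspect each component directly.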
This move towards secure desktops makes such systems an order of magnitude more difficult for attackers to exploit than common desktop systems. Typically built on open-source operating systems, they are mature enough to address cyber threats through a robust architecture, whilst balancing the triple challenge of security, usability and cost efficiency that is critical for success.