Why do we need new ultra-high-assurance IT paradigms, a new certification body, and a new computing base?

Why do we need a new standard-setting and certification body, and a related open target architecture, that achieve levels of trustworthiness radically beyond the state of the art, while increasing public safety and cyber-investigation capabilities?

WikiLeaks' recent revelations about the widespread availability of CIA hacking tools on the deep web have made it clear that large corporate, financial and public institutions – and of course ordinary citizens – are much more exposed to scalable and targeted endpoint attacks, by an ever larger number of competitors, criminals, and abusive states, than previously thought.

What is often unreported – but well known in top boardrooms and governments – is the impressively low cost and high scalability of carrying out such attacks. State tools like NSA Turbine and NSA FoxAcid, or their private equivalents like Hacking Team RCS, are capable of automated or semi-automated exploitation and remote management of hundreds of thousands of compromised mobile devices.

Today's commercially available IT technologies – even those meant for the most societally critical use cases – fall radically below the level of trustworthiness that is desired, demanded or required by their users for sensitive or critical scenarios. Current standards and certifications are neither strong nor comprehensive enough to deliver such levels of trustworthiness. This produces enormous societal costs and risks of hampered economic and social progress, especially given their impact on our democratic institutions and on the future of artificial intelligence.

Current IT security standards, standard-setting and certification processes like those of NIST, ISO and ETSI – even those at the highest levels of security, such as Common Criteria, FIPS, SOG-IS, EU Top Secret and NATO Top Secret – have one or more of the following severe shortcomings:

  • they do not certify a complete end-to-end computing experience, device, service and lifecycle, but only parts of devices, server-side service stacks or components;

  • they cover critical hardware design and fabrication phases only partially, if at all, and where these are covered, the requirements are far too inadequate and incomplete to resist a determined attacker;

  • they are developed opaquely, through standards-organization processes that are only very indirectly (and inadequately) accountable to users or citizens, and that are subject to various undue pressures;

  • they impose dubious cryptographic requirements, such as "national crypto standards" including custom elliptic curves, that leave substantial doubts about whether certain national agencies (and potentially others) are able to bypass them;

  • they certify devices that are embedded into, or critically connected to, other devices not subject to the same certification processes;

  • they have very slow and costly certification processes, due to various organizational inefficiencies and to the fact that they mostly certify large (and often new) proprietary target architectures, rather than extensions of certified, open ones.