Q&A on Trustless Computing and Backdoors

Q: Are you creating a backdoor system?

A: NO, we are not. On the contrary, we are creating the world's first IT client device that can be comprehensively and transparently assessed by users and their experts for its likelihood of containing critical vulnerabilities or back-doors.

Our proposal for extreme-safeguards certification of voluntary compliance with lawful access requests, by ultra-high assurance IT providers, falls outside the declared scope of the foundational report “The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption”, written by some of the world’s top IT security experts, because it is:

  • Not mandated by the state. Instead, it is a voluntary practice – i.e. in addition to current legal requirements – by ultra-high assurance IT providers, certified by an international certification body, and only in selected jurisdictions where laws and practice allow the provider, or others they delegate, to safely exercise discretion on the basis of the constitutionality of the lawful access request;

  • Not regulated, designed, standardized or certified by the state. Such functions would be managed by a trusted third party – an extremely technically proficient and citizen-accountable international standard-setting and certification body, the Certification Body – and by temporary organizational entities made up of groups of randomly sampled citizen-jurors and citizen-witnesses, tightly regulated by that body;

  • Not universal for all IT systems. It is reserved only for ultra-high assurance IT devices and services, such as Trustless Computing for wide market use, which can truly be expected to be beyond the targeted or large-scale exploitation capability of a large number of democratic nation states’ cyber-investigation agencies.

For a full answer, refer to our 1-page summary of the 46-page:
Position Paper – Case for a Trustless Computing Certification Body

Why state-mandated back-doors are a very bad idea. Let’s see why state-mandated back-doors would be useless for public safety and extremely dangerous for citizens’ security and privacy.

In recent times, several state authorities and intelligence agencies have proposed to solve the “going dark” problem by mandating some kind of back-door into all IT systems. The FBI has more specifically proposed “legislation that will assure that when we get the appropriate court order . . . companies . . . served . . . have the capability and the capacity to respond”, while the NSA has been generically referring to organizational or technical safeguards ensuring that back-door access authorization is approved by multiple state agencies.

  Since the 1990s, in the legitimate pursuit of extending to IP systems the lawful access that states had to all other means of communication, many national legislative proposals for exceptional access (or state back-doors) have aimed to mandate technical systems that enable covert remote access by lawful agencies into all IT server-side services or user-side devices sold or introduced in the country. These proposals, if enacted in laws or treaties, would have a decisive negative impact on both citizens’ privacy and public safety, for the following main reasons:

  • The most advanced public security agencies have had, and likely will continue to have, the continuous capability to break into nearly all endpoints, at nearly all times, for targeted surveillance; they have in fact needed to resort to such comprehensive capability precisely from the 1990s when unbreakable encryption became popularly available.

  • Given the enormous complexity and diversity of IT systems and providers, it would be both highly expensive and practically impossible to verify and certify implementations that are sufficiently trustworthy.

  • Legislative and public security branches of government have deeply and repeatedly proven their lack of competence in constructing technical standards and oversight processes that reasonably limit their abuse.

  • Criminals could still surreptitiously fabricate, modify or import – or use while abroad – IT systems without such built-in access, and could still pre-encrypt messages externally to the device or use other means, such as steganography, to communicate covertly over such IT systems.

For a full answer, refer to our 1-page summary of the 46-page:
Position Paper – Case for a Trustless Computing Certification Body

Or else see the blog posts we have written about this over the last 3 years: