
May 04, 2020

Build Trust in Silicon: A Myth or a Reality?

By Albert Jeng (PUFsecurity, a subsidiary of eMemory)

Abstract:

Currently, there is a strong belief among cyber security experts that hardware security is imperative, since it is more efficient, effective, reliable, and tamper-resistant than software security. In fact, providing a trusted execution environment (TEE) and embedding a hardware root of trust (HRoT) as the anchor are necessary to provide a firm foundation for electronic systems security.

However, since hardware is almost impossible to patch once it is fabricated, how to build trust/security in silicon and how to verify the security of hardware components at design time become critical hardware security issues and challenges.

There are several well-known techniques for building security features into silicon, including adding security extensions, implementing Trusted Platform Modules (TPMs), and incorporating a Physical Uncloneable Function (PUF). All of these design principles and primitives are necessary to ensure that the resulting silicon gives software the tools it needs to build a trusted computing environment.

There are both US- and EU-funded hardware security projects focused on developing hardware security architectures and associated design tools that can block hardware attacks on the HRoT. Each US or EU project has its own goals, use cases, technical approaches/research fields, and notable research results.

In this column, we will first discuss the importance of providing a TEE and an HRoT as a firm foundation for electronic systems security. Next, we will address issues of trust on chip and how to build trust in silicon. Third, we will provide an overview of US and EU R&D projects on chip security. Finally, we will comment on the status of building trust in silicon, followed by a short conclusion.

Introduction

It has been well publicized that computer hardware and firmware are perceived as more dependable and trustworthy than software, since the latter is susceptible to design and implementation flaws and is not impervious to subversion by malicious code. Hardware security relies on a physical device, either a dedicated security IC or a processor with specialized security hardware, specifically designed to provide cryptographic functions and to protect itself and the associated critical data against attack. Typical examples of hardware security include Hardware Security Modules (HSMs), Trusted Platform Modules (TPMs), and Physical Uncloneable Functions (PUFs).
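To make the PUF concept more concrete, the minimal Python sketch below (illustrative only, not from this article) shows how a raw, device-unique PUF response might be condensed into a fixed-length secret key. The function names and parameters are hypothetical, the PUF read is simulated, and a production design would first apply a fuzzy extractor with public helper data to correct the noise in raw PUF bits before key derivation.

```python
import hashlib
import secrets


def read_puf_response(num_bits: int = 256) -> bytes:
    """Placeholder for reading a raw (noisy) PUF response from hardware.

    Simulated here with random bytes; a real PUF would return device-unique
    bits derived from uncontrollable manufacturing variation."""
    return secrets.token_bytes(num_bits // 8)


def derive_device_key(puf_response: bytes, context: bytes = b"device-identity-key") -> bytes:
    """Condense the PUF response into a fixed-length 256-bit secret key.

    A production design would first run error correction (a fuzzy extractor
    with helper data) so the same key is reproduced across temperature and
    voltage variation; that step is omitted in this sketch."""
    return hashlib.sha256(context + puf_response).digest()


if __name__ == "__main__":
    response = read_puf_response()
    key = derive_device_key(response)
    print("Derived 256-bit device key:", key.hex())
```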

Securing distributed 5G or IoT/AIoT networks requires verifying the authenticity of data and the identities of devices, as well as effective and efficient encryption to secure communication between 5G or IoT/AIoT nodes. This could be done by providing a trusted execution environment (TEE) and embedding a hardware root of trust (HRoT) in hardware to provide the firm foundation necessary for a more secure 5G or IoT/AIoT implementation.
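As a hedged illustration of device-identity verification anchored in an HRoT, the Python sketch below implements a simple symmetric challenge-response exchange. The key handling, function names, and in-memory "device key" are assumptions made for demonstration; in a real design the key would be provisioned into and never leave the TEE/HRoT, and the response computation would run inside that protected environment.

```python
import hmac
import hashlib
import secrets

# Assumed to be provisioned into the device's HRoT at manufacturing time and
# mirrored in the verifier's secure database (illustrative only).
DEVICE_KEY = secrets.token_bytes(32)


def device_respond(challenge: bytes, key: bytes = DEVICE_KEY) -> bytes:
    """Device side: compute an HMAC over the verifier's challenge using the
    key held in (and ideally never leaving) the hardware root of trust."""
    return hmac.new(key, challenge, hashlib.sha256).digest()


def verifier_check(challenge: bytes, response: bytes, key: bytes = DEVICE_KEY) -> bool:
    """Verifier side: recompute the expected response and compare it in
    constant time to authenticate the device."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)


if __name__ == "__main__":
    challenge = secrets.token_bytes(16)   # fresh nonce for each authentication
    response = device_respond(challenge)  # would run inside the TEE/HRoT
    print("Device authenticated:", verifier_check(challenge, response))
```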

As noted above, hardware is almost impossible to patch once it is fabricated, so building trust/security into silicon and verifying the security of hardware components at design time become critical hardware security issues and challenges.

Recently, there have been many proposals for applying the TEE and HRoT to 5G or IoT/AIoT security. Although most of them claim to leverage the TEE and HRoT to provide efficient, flexible, and scalable security for 5G or IoT/AIoT, many of them actually overlook or fail to address key security issues involved in deploying TEE- and HRoT-based solutions. The two key issues are "whether the TEE and HRoT are themselves built securely" and "how to develop effective vulnerability mitigations and secure design rules so that the TEE and HRoT can be securely integrated into a larger system".
