The Promise of Intel SGX — Have You Read the Fine Print?

Intel Software Guard Extensions (SGX) is one of Intel’s next big promises for enhanced security and privacy in cloud services. Intel provides a new instruction set and a supporting development kit (SDK) that allow a developer to create security “islands” (called “enclaves”) inside an application. Specifically, the application comprises a trusted part, the enclave, and an untrusted part that includes the operating system, external libraries, the hypervisor, etc. Developing an enclave requires a set of libraries providing the C runtime, cryptographic functions, memory management and the SGX runtime. As we all know, the devil is in the details: the security of a system is only as strong as its weakest link, and in our case that weakest link is the development environment itself!

A few words about Intel SGX

Intel SGX is a set of hardware/software extensions that allow a developer to run a secure computation on a computer where all the privileged software (kernel, hypervisor, etc.) is potentially malicious. The trusted hardware establishes a secure container, and the remote computation service lets the user upload the desired computation and data into that container. The trusted hardware then protects the data’s confidentiality and integrity while the computation is performed on it.

Sounds Good, Right?

So far it seems that Intel provides us with a robust security solution, one that allows creating a safe execution environment for cryptographic calculations. But, as we said before, “the security of a system is only as strong as its weakest link”.

When building the enclave library (the “trusted component”, a DLL file in our case), all the necessary libraries, including the C runtime and the standard C function libraries, are linked together. Unlike a regular build, when building an Intel SGX DLL we link against Intel’s static libraries, i.e. libraries provided by Intel itself.

After the linking process is completed, the entire trusted component (the DLL file) is signed with the ISV’s private key, so that the library cannot be tampered with after creation.

What happens if we modify one of the static libraries before building and signing the entire enclave library?

Surprisingly, it is possible to patch Intel’s “safe” CRT libraries, inject the modified code into the enclave, and have these changes executed inside our trusted environment.

Intel addresses this issue in the SGX manual:

Even though Intel is aware of this issue, we believe it is a major security gap that makes the SGX solution not as secure as it is expected to be. Additionally, it raises questions regarding the recommended development process for cloud services that rely on secure enclaves. Since one of the reasons for using SGX in the first place is that we don’t trust the cloud provider, we will never ship source code to a provider whose toolchain takes source code as input and outputs a deployable enclave-based service. The only alternative is to use a “sanitized” environment for the development process. The risk for enclaves produced by enterprises is that a rogue employee (or a hacker) could compromise this environment, leading to compromised enclaves that are nevertheless signed by the enterprise.

Manipulating the Library Files

Our environment consists of an Intel i5 processor (64-bit, with SGX enabled), the Windows 8 operating system and Microsoft Visual Studio 2015 with the SGX plugin installed. Visual Studio provides a C/C++ compiler and, like every other compiler, it comes with C runtime libraries packed as *.lib files. Lib files are simple archive containers (which 7zip can open) for object files (*.obj).

By viewing the Visual Studio linker options, it is easy to find out which libraries will be statically linked into the output binary:

Once you have found the desired lib file, you can easily examine its contents using 7zip. After locating Intel’s SGX CRT libraries, we can further inspect them and extract the relevant object files from them.
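
7zip does this interactively. For completeness, a small Python sketch that lists and extracts members from a .lib archive is shown below; it assumes the standard ar/COFF archive layout used by MSVC, and the library and member names in the usage example are placeholders rather than exact names taken from the SGX SDK.

    import sys

    ARCH_MAGIC = b"!<arch>\n"

    def iter_members(data):
        """Yield (name, payload) for every member of an ar/COFF archive (.lib)."""
        assert data[:8] == ARCH_MAGIC, "not a .lib (ar) archive"
        longnames = b""
        offset = 8
        while offset + 60 <= len(data):
            header = data[offset:offset + 60]
            name = header[:16].decode("ascii").rstrip()
            size = int(header[48:58])
            payload = data[offset + 60:offset + 60 + size]
            if name == "//":                      # long-name table
                longnames = payload
            elif name != "/":                     # "/" members are linker symbol tables
                if name.startswith("/"):          # "/123" -> offset into the long-name table
                    start = int(name[1:])
                    name = longnames[start:longnames.index(b"\0", start)].decode("ascii")
                yield name.rstrip("/"), payload
            offset += 60 + size + (size & 1)      # members are 2-byte aligned

    def extract(lib_path, member_name, out_path):
        with open(lib_path, "rb") as f:
            data = f.read()
        for name, payload in iter_members(data):
            if name.endswith(member_name):        # stored names may carry a build-path prefix
                with open(out_path, "wb") as f:
                    f.write(payload)
                return
        raise SystemExit(member_name + " not found in " + lib_path)

    if __name__ == "__main__":
        # Example (placeholder names): python extract_obj.py sgx_tstdc.lib ceil_stub.obj ceil_stub.obj
        extract(sys.argv[1], sys.argv[2], sys.argv[3])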

Unlike lib files, which are easy to manipulate, changing object files requires binary patching.

We chose to patch the ceil() function, which is part of the C standard library (later on we also patched the sgx_ecdsa_verify function). As a POC, we made a tiny patch, replacing two bytes in the ceil object file (ceil_stub.obj). We used a short Python script to perform the patching.
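
A minimal sketch of such a patcher is shown below. The file offset and the original/replacement byte values are placeholder assumptions; in practice they come from disassembling the real ceil_stub.obj (e.g. in IDA). The script simply overwrites two bytes at a known offset, after first checking that the expected bytes are actually there.

    # Placeholder values: the real offset and bytes depend on the compiler
    # version and on which instruction is being overwritten.
    PATCH_OFFSET = 0x1A0           # file offset of the instruction to change (assumption)
    ORIGINAL     = b"\x72\x0e"     # bytes expected at that offset (assumption)
    PATCHED      = b"\x90\x90"     # replacement bytes, e.g. two NOPs

    def patch_object(path, offset, original, replacement):
        with open(path, "rb") as f:
            data = bytearray(f.read())
        # Refuse to patch if the file does not look like the version we analyzed.
        assert data[offset:offset + len(original)] == original, "unexpected bytes at offset"
        data[offset:offset + len(replacement)] = replacement
        with open(path, "wb") as f:
            f.write(data)

    if __name__ == "__main__":
        patch_object("ceil_stub.obj", PATCH_OFFSET, ORIGINAL, PATCHED)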

Before (ceil_stub.obj)

After patching (ceil_stub.obj)

After patching the object file, we needed to re-pack it together with all the other object files in order to create the modified lib file. The re-packing was done using the LIB.EXE utility provided with Visual Studio, which lets you pack a set of object files into a lib file. It also lets you modify an existing lib file by adding or removing object files (the whole step can be scripted as well, as sketched after the examples). For example:

· Creating a new lib file: LIB.EXE /OUT:MYLIB.LIB file1.obj file2.obj
· Adding an object file to an existing container: LIB.EXE MYLIB.LIB +file3.obj
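
A rough sketch of scripting this step from Python follows. The path to LIB.EXE, the library name and the member name are assumptions that must be adjusted to the local installation; running lib /LIST <lib> shows the exact member names, which may include a build-path prefix.

    import subprocess

    # Assumption: adjust to the local Visual Studio installation.
    LIB_EXE = r"C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\bin\lib.exe"

    def repack(original_lib, member_name, patched_obj, out_lib):
        # 1. Drop the original member from the archive.
        subprocess.check_call([LIB_EXE, original_lib,
                               "/REMOVE:" + member_name,
                               "/OUT:stripped.lib"])
        # 2. Add the patched object file in its place.
        subprocess.check_call([LIB_EXE, "stripped.lib", patched_obj,
                               "/OUT:" + out_lib])

    if __name__ == "__main__":
        # Placeholder names for the SGX CRT library and the patched object.
        repack("sgx_tstdc.lib", "ceil_stub.obj", "ceil_stub.obj", "sgx_tstdc_patched.lib")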

After the patching process, we rebuilt the enclave DLL (a.k.a. “the trusted component”) of a tiny program that calls the ceil function. Surprisingly, the patched version of ceil was statically linked into our enclave’s binary, which was then signed with our private key by Visual Studio. Without any problems, we were able to load and execute our compromised DLL as a “safe” enclave.

Results

Following the success of this POC, we decided to create a demonstration with a practical use. We patched one of the SGX environment’s cryptographic functions, sgx_ecdsa_verify, which verifies an ECDSA signature over a given message. In this demo, the patched version returns SIGNATURE_CORRECT for every given signature, regardless of its correctness.

We intentionally chose only naïve examples to demonstrate the use of this ability. There are much more dangerous scenarios that could take advantage of it:

· Patching the PRNG function in order to create a “weak” random number generator.

· Mounting side-channel and data-leakage attacks by forcing strcpy or memcpy to expose a private key in an output buffer.

An ordinary programmer will never validate or test the built-in C library functions used to build the output binary. We usually trust our compiler and our development environment.

Conclusions

“Reflections on Trusting Trust”: Injecting malicious code into the compiler isn’t a new idea. In 1984, Ken Thompson gave a lecture called “Reflections on Trusting Trust”, in which he described what became known as the KTH (Ken Thompson Hack): he injected malicious code into his compiler, so that the compromised compiler infects every computer program it compiles.

It isn’t that difficult: replacing lib files didn’t require any sophistication or special tools; all we needed were IDA, 7zip and LIB.EXE, all of which are freely available. There is no mechanism that prevents binary patching of core libraries. Microsoft and Intel could solve this problem by adding a mechanism to the operating system that sets special system permissions for “sensitive” files, or by using cryptographic signatures to detect binary changes.
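
As a rough illustration of the second idea, a build machine could pin the hashes of the SDK’s static libraries and refuse to build if any of them has changed. This is only a sketch: the manifest file name and library paths are assumptions, and in a real deployment the manifest itself would have to be signed and distributed out of band.

    import hashlib, json, sys

    # Hypothetical manifest mapping each SDK library to its expected SHA-256 digest,
    # generated once from a known-clean installation.
    MANIFEST = "sgx_sdk_hashes.json"

    def sha256(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def verify(manifest_path):
        with open(manifest_path) as f:
            expected = json.load(f)          # {"C:\\...\\sgx_tstdc.lib": "ab12...", ...}
        bad = [path for path, digest in expected.items() if sha256(path) != digest]
        if bad:
            print("Tampered or unexpected libraries:", *bad, sep="\n  ")
            sys.exit(1)
        print("All SDK libraries match the manifest.")

    if __name__ == "__main__":
        verify(MANIFEST)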

Intel’s answer to this issue is to demand a “clean” development environment. Is that a reasonable demand? And how can we be sure that our development environment is 100% “clean” of malicious software?

Final words

This research was conducted with Prof. Danny Dolev and Dr. Yaron Weinsberg (IBM Research, Israel) as part of the course “Advanced Operating Systems & Cloud technologies” at The Hebrew University of Jerusalem, Spring Semester 2016.

We would be happy to receive your comments and thoughts. You can e-mail us at:

Nir Moshe — baghdad@mail.huji.ac.il

Iris Altman — iris.altman@mail.huji.ac.il

References

1. Intel SGX homepage: https://software.intel.com/en-us/sgx

2. 7zip: http://www.7-zip.org/

3. Lib.exe (from MSDN): https://msdn.microsoft.com/en-us/library/e17b885t.aspx

4. More information about lib.exe: http://www.digitalmars.com/ctg/lib.html

5. The Ken Thompson Hack: http://c2.com/cgi/wiki?TheKenThompsonHack

6. Reflections on Trusting Trust: http://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thompson.pdf

7. IDA: https://www.hex-rays.com/index.shtml