Susan Hohenberger defended her thesis Friday at MIT. Susan’s thesis work is on developing secure algorithms for proxy cryptography. These are new cryptographic constructions that are designed to allow a third party, the proxy, to take a cryptographic object produced for (or by) a particular key and transform it so that it is a valid object for (or by) a different key. Susan presented new definitions for security of proxy re-encryption and re-signatures and algorithms that meet these definitions.
Why are these constructions interesting? Susan gave an example where proxy re-encryption might help with secure distributed storage systems; proxy re-signatures could be used to provide proof of flow in, say, an immigration and customs scenario. I’ll present another example here where proxy re-encryption could help improve encrypted mailing lists.
Mailing lists complicate e-mail encryption in one basic way: regular encrypted e-mail requires that the sender (usually called Alice) know the public key of the recipient (Bob). A mailing list would require that Alice know the public keys of all members of the list: a membership that might fluctuate with time and may even contain people Alice has never heard of. For example, suppose Alice has a sensitive bug report that she wishes to send to the security team at CERT. CERT today has a single key for receiving incoming encrypted mail, and presumably multiple people have online access to the secret key so that they can read encrypted messages. This structure lets Alice worry about only a single key for reaching CERT, while CERT as an organization has the flexibility (and burden) of managing which of its employees have access to that key.
Of course, each of CERT’s employees presumably has their own personal encryption key. An alternative to giving every employee access to the master secret key would be to have a single person decrypt an incoming message, re-encrypt it with the keys of the people on call, and then send these newly encrypted messages to the final recipients. This would limit exposure of the master key to a single person.
In 2002, I supervised David Chau in a UROP project to develop a program that acts as this trusted person. While David built a functioning prototype, I never took it to completion or published it. In the interim, at least two alternate (but functionally near-identical) solutions have been developed: one for ezmlm and one for simple /etc/aliases-style lists. However, these software solutions share a problem: they require a computer to have programmatic access to the decryption key, and they expose the unencrypted message to the computer performing the transformation. For sensitive data, this may be an unacceptable risk. Proxy re-encryption can remove this risk.
In the world of proxy re-encryption, Alice (with her secret key ka) can create a special proxy key kab that the proxy can use to transform a message encrypted for ka to a message encrypted for kb. The proxy does not ever get to see the unencrypted message, and Alice’s real key does not need to be available during any transformation. In the CERT example, a trusted person would create proxy keys to allow messages encrypted for the master CERT key to be proxy re-encrypted to each person actually authorized to view those sensitive messages. These proxy keys do not compromise the security of any of the end recipients, and compromise of the mail server and loss of the proxy keys would not result in a compromise of the master CERT key.
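To make this concrete, here is a toy sketch of the classic bidirectional ElGamal-based proxy re-encryption scheme (in the style of Blaze, Bleumer, and Strauss). It is not Susan’s unidirectional construction, and the group parameters are tiny and purely illustrative; a real deployment would need cryptographic-size groups. It does show the key property described above: the proxy key kab = b/a lets the proxy move a ciphertext from Alice’s key to Bob’s without ever seeing the plaintext or either secret key.

```python
# Toy bidirectional proxy re-encryption over ElGamal.
# WARNING: illustrative parameters only; not secure.
import random

q = 1019        # prime order of the subgroup
p = 2 * q + 1   # safe prime: p = 2039
g = 4           # generator of the order-q subgroup of Z_p*

def keygen():
    sk = random.randrange(1, q)
    return sk, pow(g, sk, p)               # (secret key a, public key g^a)

def encrypt(pk, m):
    r = random.randrange(1, q)
    return (m * pow(g, r, p) % p,          # c1 = m * g^r
            pow(pk, r, p))                 # c2 = g^(a*r)

def decrypt(sk, ct):
    c1, c2 = ct
    gr = pow(c2, pow(sk, -1, q), p)        # (g^(a*r))^(1/a) = g^r
    return c1 * pow(gr, -1, p) % p         # recover m

def rekey(sk_a, sk_b):
    # proxy key kab = b/a mod q: created from both secrets, but on its
    # own it reveals neither, and the proxy never sees a plaintext
    return sk_b * pow(sk_a, -1, q) % q

def reencrypt(rk, ct):
    c1, c2 = ct
    return (c1, pow(c2, rk, p))            # g^(a*r) -> g^(b*r)

a_sk, a_pk = keygen()                      # Alice (e.g., the CERT master key)
b_sk, b_pk = keygen()                      # Bob (an on-call employee)

ct = encrypt(a_pk, 42)                     # message encrypted for Alice
ct_b = reencrypt(rekey(a_sk, b_sk), ct)    # proxy transforms it for Bob
assert decrypt(b_sk, ct_b) == 42           # Bob decrypts with his own key
```

Note that this scheme is bidirectional: kab also lets the proxy convert Bob’s ciphertexts to Alice’s, which is one of the limitations Susan’s unidirectional constructions address.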
This idea has been implemented by Himanshu Khurana at UIUC: his Secure E-mail List Server does not use the algorithms Susan helped develop but rather a separate proxy re-encryption scheme with slightly different features and requirements. For example, it requires that the list server generate a user/list-specific decryption key. Susan’s thesis shows how proxy re-encryption can be achieved efficiently using users’ regular keys, and it improves on previous results by demonstrating a construction that allows only unidirectional re-encryption.
This work is new and still maturing. For example, it is not yet available for day-to-day use in programs like GnuPG. Susan’s defense also highlighted additional important research questions in this field that should be investigated to improve the understanding of, and confidence in, these algorithms. However, it sounds like there are already parties adopting the algorithms into domain-specific applications, and a significant amount of interest in exploring the theory underlying her work. I suspect that in the next few years we will see some more significant applications of these ideas. Congratulations to Susan!