Written By: Audrey Woods
Ever since it came into effect on May 25th, 2018, the European Union’s General Data Protection Regulation (GDPR) has been resonating through the tech industry. In its immediate wake came a flood of updated privacy policies as companies hurried to comply with the new legal framework. Opt-in boxes showed up on commercial websites, allowing users to choose what data to share, and personalized advertising had to be rethought or risk hefty fines. CSAIL Assistant Professor Henry Corrigan-Gibbs says of the GDPR, “it's really forced people to think more carefully about what types of data they're collecting, how they're storing it, whether they really need to collect certain types of information, and architecting their systems in such a way that they can make a defensible case that they're doing things in line with best practices in terms of privacy protection.”
While data protection may be a ubiquitous topic today, Professor Corrigan-Gibbs jokes that when he first started grad school it was the “era of privacy is dead.” Back then, with tech companies growing and social media platforms perfecting the art of targeted advertising, privacy had yet to become the thriving subfield of computer science it is today. But even before he discovered a topic of such strong personal interest, Professor Corrigan-Gibbs says he was always interested in computers. In his first college internship at General Electric, he was surprised to find the inner workings of the import/export process “fascinating.” Then, while writing about technology at the New York Times, he realized he was drawn to the more technical side of real-world problems, especially those with a social component. In 2009, he traveled through the rural mountains of Nepal, updating wireless equipment and bringing the internet to some communities that had never had it before, an experience he describes as “just awesome.” Along the way, Professor Corrigan-Gibbs began to work specifically on the technical challenges surrounding privacy, writing his Stanford PhD thesis on Protecting Privacy by Splitting Trust. After a postdoc year at EPFL in Lausanne, Switzerland, Professor Corrigan-Gibbs came to MIT CSAIL to study how cryptography can be applied to protect user data and create safer, more trustworthy systems.
The Importance of Privacy
When asked why privacy drew his interest, he explains how these days “it feels like we have less and less of it. You’re less and less free to explore yourself, to explore different ideas, and to explore different ways of thinking because there’s a record of everything you do, everything you read, everything you look up, and who you talk to,” all of which he sees as a hindrance to basic human expression and curiosity. Indeed, the expanding field of privacy research reflects his concern, with a heavy focus now on data stewardship, data protection, and preventing data breaches. Much of Professor Corrigan-Gibbs’s research revolves around a simple idea: “if a company doesn’t have your data, they can’t lose your data.” He elaborates, “we’re trying to design computer systems in such a way that the user’s data can remain on the user’s device and not ever have to be stored in the cloud in unencrypted form.”
One hurdle to achieving this goal is the computational overhead. Professor Corrigan-Gibbs explains, “the big problem is that if you want to compute on encrypted data, you pay something gigantic in terms of the computational cost.” A big part of his research is therefore about bringing that cost down, either by using systems engineering to make more efficient use of the hardware or by designing custom cryptographic tools for a particular application. Professor Corrigan-Gibbs is also working on SafetyPin—a project led by UC Berkeley PhD student Emma Dauterman—which offers strong hardware-backed security protections; on methods for privacy-preserving statistics aggregation, such as Prio and Poplar; and on the study of preprocessing attacks.
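The core trick behind Prio-style private aggregation can be illustrated with additive secret sharing: each client splits its private value into random shares that sum to the value, and sends one share to each of several non-colluding servers. No single server learns anything about an individual’s value, yet the servers’ partial sums combine into the true total. The sketch below is a toy illustration of that idea only (the function names and modulus are my choices, not from the Prio paper, and real Prio additionally uses zero-knowledge proofs to check that shares are well-formed):

```python
import secrets

P = 2**61 - 1  # a prime modulus; toy choice for this sketch


def share(value, n_servers=2):
    """Split `value` into n additive shares mod P.

    Any n-1 of the shares are uniformly random and reveal
    nothing about `value`; all n shares sum to `value` mod P.
    """
    shares = [secrets.randbelow(P) for _ in range(n_servers - 1)]
    last = (value - sum(shares)) % P
    return shares + [last]


# Three clients, each holding a private value they never upload in the clear.
values = [5, 12, 7]
per_server = [[], []]
for v in values:
    s0, s1 = share(v)
    per_server[0].append(s0)  # sent to server 0
    per_server[1].append(s1)  # sent to server 1

# Each server sums only the shares it holds -- it sees random field elements.
partials = [sum(col) % P for col in per_server]

# Combining the two partial sums recovers the aggregate, and nothing more.
total = sum(partials) % P
print(total)  # 24
```

The servers learn the sum (24) but neither one, on its own, can reconstruct any client’s individual value.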
Bringing About Change
Professor Corrigan-Gibbs says that a common misconception about privacy technology is that “if you're encrypting data that’s good enough to protect your privacy.” While encryption is an essential first line of defense, it is not sufficient on its own. He says, “even if you have a very strong encrypted pipe between your user and your server, if you’re storing that data on your server in unencrypted form, it’s still at risk.” One way to combat this issue is to encrypt data at rest, which provides another layer of protection. However, the only real way to protect user data is to gather and store as little of it as possible.
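The distinction matters in code: encrypting the connection protects data in transit, but protecting data at rest means the record written to disk is itself ciphertext, with the key kept away from the storage server. The toy sketch below makes that concrete using only the Python standard library. It is an illustrative stream-cipher construction of my own devising, not something any production system should use — real deployments rely on vetted authenticated encryption such as AES-GCM:

```python
import hashlib
import secrets


def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy keystream: hash key||nonce||counter repeatedly (illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)  # fresh nonce for every record
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))


def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(c ^ k for c, k in zip(ct, ks))


key = secrets.token_bytes(32)          # held by the user, never by the server
stored = encrypt(key, b"user record")  # only this ciphertext hits the disk
assert decrypt(key, stored) == b"user record"
```

A breach of the storage server now yields only ciphertext; the point of the passage above, though, is that the strongest version of this guarantee is to never collect the record at all.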
For example, he points to problems related to the “right to be forgotten.” While it seems simple enough for a customer to request that their information be erased at any given time, “in a big computer system, data is cached and stored all over the place and the company may not even know where all your data is stored.” While businesses are beginning to architect systems that can handle these requests, it’s important for users to be empowered to advocate for themselves. Comparing it to surgery, Professor Corrigan-Gibbs says the best situation would be one in which customers can trust their privacy is being preserved—in the same way patients can trust surgical equipment will be sterilized—without having to worry about the specifics. “Ideally [these systems] will be transparent. They’ll just protect your privacy, and you won’t have to think about it.”
According to Professor Corrigan-Gibbs, privacy-preserving technology has already made huge strides, putting into practice ideas that had, until recently, been only theoretical. “I would’ve thought it would be much farther out,” he says of recent cryptographic practice, adding that “in two to five years, we’re going to see a lot of really cool applications of this.” Specifically, he predicts that privacy-preserving search engines—something he’s working to create, with PhD student Alexandra Henzinger leading the effort in his lab—are close at hand. However, “for more complicated applications, like collecting a bunch of encrypted images from users and training a neural network on those images in encrypted form, I think we’re very far from figuring out how to do that.”
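The cryptographic primitive behind private queries like these is private information retrieval (PIR): a client fetches one record from a server-held database without the server learning which record. The sketch below shows the classic two-server variant (in the spirit of Chor et al.), which works as long as the two servers do not collude; it is my own toy illustration for intuition, not the single-server, lattice-based approach used in the lab’s search-engine work, and the database and function names are invented:

```python
import secrets

# Toy database of equal-length records, held identically by two
# non-colluding servers.
DB = [b"alpha000", b"bravo000", b"charlie0", b"delta000"]


def server_answer(db, selection):
    """XOR together the records whose index bit is set in `selection`.

    The server sees only a random-looking bitmask, never the target index.
    """
    acc = bytes(len(db[0]))
    for i, rec in enumerate(db):
        if (selection >> i) & 1:
            acc = bytes(a ^ b for a, b in zip(acc, rec))
    return acc


def private_read(i, n):
    """Fetch record i without either server learning i."""
    q0 = secrets.randbits(n)   # uniformly random subset, sent to server 0
    q1 = q0 ^ (1 << i)         # same subset with bit i flipped, sent to server 1
    a0 = server_answer(DB, q0)
    a1 = server_answer(DB, q1)
    # The two subsets differ only at index i, so XORing the answers
    # cancels every record except the one requested.
    return bytes(x ^ y for x, y in zip(a0, a1))


print(private_read(2, len(DB)))  # b'charlie0'
```

Each server’s query is uniformly random on its own, so neither learns anything about the index — a small demonstration of why the search-engine setting is considered within reach, and also of the costs involved, since each server touches a large fraction of the database per query.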
In terms of market changes, Professor Corrigan-Gibbs says, “I think the increasing demand for privacy in the services that we use is really exciting. It’s going to drive adoption and development of these technologies.” As users grow more concerned about how their data is being used, forward-thinking businesses will be motivated to implement the technical tools that give their customers peace of mind, especially as the cost of doing so decreases.
When asked what he ultimately wants readers to know, Professor Corrigan-Gibbs answered, “students and industry folks should learn more about cryptography because it’s just a fascinating, fun subject.”