Meet the Woman Who Leads NightWatch, Google’s Internal Privacy Strike Force

Lea Kissner is back at her alma mater, the University of California, Berkeley, armed with a crisp gray blazer, a slide deck, and a laptop with a ‘My Other Car Is A Pynchon Novel’ sticker on it. Since graduating in 2002, she’s earned a PhD in cryptography at Carnegie Mellon and worked her way up at Google, where she manages user privacy and tries to keep things from breaking. She’s here to tell a hall of computer science students how she did it—and also how to create privacy-protective systems at a scale that you won’t find outside a handful of massive tech companies.

When privacy breaks down at a tech company, especially one the size of Google, it inevitably leads to countless headlines and congressional hearings. The names “Equifax” and “Yahoo” are more synonymous today with hacking than with any service either company offered. As if its exploitation by Russian intelligence were not enough, Facebook’s reputation has been battered over the past month as its years-long failure to protect user data from Cambridge Analytica has come to light.

It’s a fate that Google, of course, would very much like to avoid. And making sure that Google products protect the privacy of users around the world—and that Google accounts for individual users’ varying definitions of privacy—is Kissner’s job.

Kissner’s responsibilities include making sure that Google’s infrastructure behaves the way it’s supposed to, transmitting user data securely and not leaving bits of data hanging around in the wrong spots. If someone sends an email, it needs to not leak in transit. If that person deletes the email, it has to actually go away without leaving a residual copy on a maintenance server. Another part of the job is making sure Google’s products behave the way users expect them to. This also involves considering how someone with malicious intent might take advantage of a Google product and patching up those holes before they’re exploited.
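One widely used industry approach to making deletion stick, though not necessarily Google’s, is crypto-shredding: encrypt each user’s data under its own key, and destroy the key on deletion, so that any stray copies lingering on backup or maintenance servers become permanently unreadable. A minimal sketch in Python, using the third-party cryptography package; the in-memory key store here is purely illustrative:

    # pip install cryptography
    from cryptography.fernet import Fernet

    # Illustrative per-user key store. A real system would keep keys in a
    # hardened key-management service, not a dict in memory.
    key_store = {}

    def store_message(user_id: str, plaintext: bytes) -> bytes:
        """Encrypt a message under the user's own key before storing it."""
        if user_id not in key_store:
            key_store[user_id] = Fernet.generate_key()
        return Fernet(key_store[user_id]).encrypt(plaintext)

    def delete_user(user_id: str) -> None:
        """Destroy the key: every copy of the ciphertext, including
        forgotten ones on maintenance servers, becomes unreadable."""
        key_store.pop(user_id, None)

    ciphertext = store_message("alice", b"meet me at noon")
    delete_user("alice")
    # The lingering ciphertext can no longer be decrypted; the key is gone.

The appeal of the design is that deletion becomes a single auditable act, destroying one key, rather than a hunt for every replica.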

Kissner leads a team of 90 employees called NightWatch, which reviews almost all of the products that Google launches for potential privacy flaws. Sometimes, products just need a bit of work to pass muster—to meet the standard of what a former colleague of Kissner’s, Yonatan Zunger, calls “respectful computing.”

The fundamental challenge for a team like NightWatch, Zunger says, is making computing systems that people feel comfortable using. “They don’t feel safe, they don’t feel trust. They look at companies and they don’t know: Does this company have my best interests at heart at all? If you don’t deeply and intuitively understand the company’s business model, you can assume the worst,” Zunger explains.

Being respectful of a user can be as simple as giving her a way to respond to a product that bothers her, whether it’s an ad for a chicken recipe that’s not relevant for her because she’s a vegetarian or an abusive message that she wants to report. Sometimes, products have privacy failings at their core and they don’t get NightWatch’s signoff—and so they don’t launch.

“I’ve had a fair number of teams come out of that and they say, ‘We need to find a new project now because we need to cancel our project,’” Kissner tells me. “I heard a rumor that I’m scary when I go into these conversations, which I find very surprising because I don’t think I’m a very scary person.”

Kissner has even had to hit the kill switch on her own projects. She recently tried to obscure some data (which exact data she won’t say; Google is cagey about going into detail on its sidelined ventures) using cryptography, so that none of it would be visible to Google upon upload. She was looking forward to whiteboarding it out for Google’s lawyers—“Trying to explain crypto to lawyers is always exciting”—but it turned out that making the feature work would require more spare computing power than Google has in all of its data centers, combined.
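Kissner won’t name the technique, but cryptosystems that let a server compute on data it can never read do exist, and their computational cost is the classic sticking point. As a toy illustration of the general idea (emphatically not Google’s design), the Paillier scheme is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts. A self-contained Python sketch with deliberately small parameters:

    import math  # Python 3.9+ for math.lcm and pow(x, -1, n)

    # Toy Paillier keypair built from two Mersenne primes; real deployments
    # use much larger, randomly generated primes.
    p, q = 2_147_483_647, 2_305_843_009_213_693_951
    n = p * q
    n_sq = n * n
    g = n + 1                      # standard choice of generator
    lam = math.lcm(p - 1, q - 1)   # the private key
    mu = pow(lam, -1, n)           # simplification valid because g = n + 1

    def encrypt(m: int, r: int) -> int:
        """E(m) = g^m * r^n mod n^2, where r is a random blinding factor."""
        return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

    def decrypt(c: int) -> int:
        """D(c) = L(c^lam mod n^2) * mu mod n, with L(x) = (x - 1) / n."""
        return ((pow(c, lam, n_sq) - 1) // n) * mu % n

    # The homomorphic property: a server can total values it cannot read.
    c1, c2 = encrypt(20, r=12345), encrypt(22, r=67890)
    assert decrypt((c1 * c2) % n_sq) == 42

Ciphertexts are many times larger than the values they hide, and every operation is a modular exponentiation over enormous numbers, which hints at how a scheme in this family could outrun even Google’s combined computing power.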

“I’m keeping an eye on the crypto conferences in case something comes up that we can use,” Kissner says sadly. “I hope somebody else figures out how to solve a problem if I can’t solve it. One of the advantages to working at Google is that you have choices that would just be considered completely out of the question anywhere else. Even so, I can’t always get the answer right.”

Kissner is here at UC Berkeley to pitch paranoia—paranoia as a career asset, a life skill, a North Star. “There are a number of ways system failure can get really really tricky. Paranoia is awesome because then you can find them!” she cheerfully declares.

It turns out that implementing privacy at scale isn’t very captivating, at least not to this group of students gathered in a lecture hall after dark. I can see several of them chatting with each other on Facebook Messenger, one playing online chess, and another livestreaming a sports game. The problem with getting students—or anyone, really—excited about privacy at scale is that, when everything’s working as it should, it’s not exactly thrilling.

“Security is a basically adversarial thing. You are studying failures deliberately introduced by hostile actors. Your job is paranoia—literally, that is your job,” Zunger tells me.

The thing is, Kissner is not a naturally paranoid person. Kissner, who started programming in elementary school by mocking up elaborate password schemes in BASIC on her dad’s calculator, spent her childhood as an amiable, trusting geek. She played a lot of mahjong, was part of her school’s band, and worked on a Mars rover the summer after graduating from high school.

Kissner gave up her early fascination with robots to study cryptography instead, which she jokes was a decision owed to both the toxicity of solder fumes and a fascination with difficult math. “I have a lot of feelings for combinatorics and for group theory and for number theory—that stuff is beautiful,” she says, then corrects herself. “Sorry, number theory is cute. Abstract algebra is beautiful.”

I laugh, but she’s not kidding. “I’m serious!” she exclaims. “Abstract algebra is very, very, very elegant. Number theory, it’s just like all these cute things fall out when you do abstract algebra with actual numbers.”
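A standard example of what she means: Lagrange’s theorem, a fully general fact of abstract algebra (in any finite group, the order of an element divides the size of the group), applied to the multiplicative group of the integers modulo a prime p, immediately yields Fermat’s little theorem:

    a^{p-1} \equiv 1 \pmod{p} \quad \text{for every integer } a \not\equiv 0 \pmod{p}

One line of group theory, and a celebrated number-theoretic fact falls out.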

Getting a read on people and gauging their trustworthiness was a skill Kissner learned later in life, she says—an uncomfortable experience, but one whose lessons she applied directly to her work. “I am extremely aware that not everybody experiences the world the way I do. I’m actually surprised when I meet somebody who experiences the world the way I do,” Kissner says. “It took me a lot of work to be able to understand other humans at all—not at all, but reasonably well.”

Intimidatingly precise, focused, and eager to drill into complex topics, Kissner talks me through the finer points of building robots before breaking down the particulars of the European Union’s General Data Protection Regulation. She’ll spend the next several months painstakingly preparing Google’s products for compliance with GDPR—sweeping new data-privacy rules that apply to any company serving customers within the EU—but the company is fortunately in pretty good shape already, thanks to its prior commitments to data portability.

As a grad student, Kissner attended her first CRYPTO, one of the largest academic cryptography conferences in the world. She’d gotten a paper accepted to the conference—a career-making opportunity.

But the trip was soured by a run-in with another conference attendee, a man who followed her around the conference making remarks about how he’d been dreaming about her. Kissner won’t say who he was, just that he was influential enough to have his own Wikipedia page.

“I didn’t know whether they would directly be able to influence my career. But I didn’t want that to be the thing people knew about me when I went to go look for a job in a few years,” she recalls. “The best case would be, ‘Oh it’s that girl that that guy did that creepy thing to,’ which is not really what I wanted to be known for.”

Kissner certainly isn’t the only woman working in cybersecurity who would probably prefer to be evaluated on the merits of her work. But even when you’re not thinking of your gender, cybersecurity is one of those industries that will remind you, often abruptly, that you are practically alone in the room. When the massive security conference RSA announced its keynote speakers earlier this year, there was just one woman in the lineup—which prompted women in the industry to spin up a conference of their own. OURSA, a one-day conference taking place on Tuesday, is the result of that work. The entire event was planned in less than five days and sold out in under 12 hours, indicating the high demand for diverse conversations about security and privacy. Kissner will chair one of the conference tracks, Practical Privacy Protection.

Kissner has worked to make sure that diversity is reflected not just at OURSA but at NightWatch. The team recruits people with as many different skill sets and backgrounds as possible so that they’ll be better at recognizing privacy problems that others might not see.

Still, there are limits that NightWatch can’t recruit its way out of, Kissner concedes. Everyone on the team has to be a Google employee, for instance, since the job involves reviewing the company’s unreleased products.

“I think it’s incredibly important to have a lot of different ideas when you are designing security and privacy,” Kissner says. “You are taking care of the incredibly diverse set of people in the world and it’s hard to understand what they need and take care of that unless you have voices from all different backgrounds and skill sets.”

It seems like the Berkeley students aren’t buying into Kissner’s paranoia—until the end of the lecture when a student asks whether Google actually deletes data when it claims to and how long that process takes. Never mind that Kissner has just walked through exactly how Google deletes data—in moderate technical detail, no less.

“We really delete the data, like for reals,” she replies. But she can’t say exactly how long it takes for the data to disappear, even when the student presses the point.

“I want to tell people things we’ve learned. I want to build the world I want to live in, and the world I want to live in includes things like products being designed respectfully of users and systems being designed respectfully for users. I don’t think everybody has to learn everything the hard way,” Kissner tells me later. Then, the mathematician in her kicks in and she adds, “It’s very inefficient if nothing else.”
