Policy

Going Public on Privacy

Ex-White House tech team member Dipayan Ghosh on ethics, game theory, and a (mostly) market solution 7/10/2017 8:00 AM Eastern
Dipayan Ghosh on broadband privacy: "[I]t helps to think about it from a game theory perspective: Pit the end user against the corporation..."

Dipayan Ghosh would like to be an “algorithmic ethicist.” If that sounds daunting and a bit indecipherable, no worries. He’s got a Ph.D. from Cornell in electrical and computer engineering and is working with Harvard and think tank New America on web civil-rights issues. Then there were the years spent working on tech policy at the White House under President Barack Obama, trying to put legislative legs under the Broadband Bill of Rights — which, like many other efforts over the past few years, never became law. That was followed by a stint as privacy and policy adviser at Facebook. He’s now with New America as part of a new public-interest technology team of fellows. Ghosh talked with Multichannel News Washington bureau chief John Eggerton about network neutrality, the search for algorithmic diversity and how game theory could drive a marketplace model for privacy.

[For the full-length, uncut version of this interview, click here.]

MCN: Tell us about yourself.
Dipayan Ghosh: I’m a computer scientist by training and did my Ph.D. in electrical engineering and computer science. In graduate school, I grew more interested in information theory, which is the study of how you send information from point A to point B as securely and efficiently as possible.

I grew very interested in the subfield of security and privacy. After the [Edward] Snowden disclosures, the Obama administration was under a lot of pressure to respond to the public outcry and, at the time, there was a lot of scrutiny about what surveillance practices the government, more broadly, might employ and what kind of protections there were and should be in the future on individual privacy.

MCN: So you decided to join that effort?
DG: Yes, it was a really interesting time to be there. John Podesta came in to lead the approach to these really difficult issues. The president commissioned a report, led by Podesta, that would be a comprehensive review of Big Data and privacy and the implications for individual rights across society.

MCN: And what has happened to the report since then?
DG: Out of that report came the Consumer Privacy Bill of Rights Act, a legislative proposal released by the White House in February 2015, as well as the Student Digital Privacy Act. That garnered bicameral, bipartisan support on the Hill, even though it hasn’t moved forward. The conversation with Europe has progressed to where we renegotiated Safe Harbor and made a lot of progress, though there are new questions Europe is asking given the new political situation in the U.S.

We worked on net neutrality, which ultimately resulted in the president’s recommendation to reclassify broadband providers as common carriers under Title II. That recommendation came out on YouTube, of all places. My appointment was across the National Economic Council and the Office of Science and Technology Policy.

MCN: What made you move to Facebook?
DG: The Safe Harbor negotiations with Europe. I was particularly interested in moving to a company at the forefront of privacy. I worked on the privacy team within the policy organization at Facebook, partly on how we develop products in a way that is sensitive to users’ needs.

MCN: What are those needs, and does this generation even want privacy?
DG: Obviously, the conceptualization of privacy has evolved over time. Decades ago, it meant being behind a wall, having your own physical space where you could reasonably expect that nobody was spying on you. Today, there are a lot of different platforms and a lot of different universes where we need to be aware of how we are perceived by others.

When I was studying this issue through an academic lens, I found it helps to think about it from a game theory perspective: Pit the end user against the corporation, say, a utility. Both want to maximize their bottom lines and don’t want to compromise them. For the consumer, that bottom line is their utility; for the corporation, it’s their profits over time. As a consumer, you might value the privacy and security of your information. As a company, you might value that same information so that you can route power more effectively or monetize it in different ways, say, in the advertising ecosystem. At the end of the day, both want to maximize their value.
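The consumer-versus-corporation framing Ghosh describes can be made concrete as a toy two-player game. The strategy names and payoff numbers below are entirely hypothetical, chosen only to show how each side’s best response, and the resulting equilibrium, can be computed; this is a sketch of the general idea, not a model from the interview.

```python
# Toy game: a consumer decides whether to share data; a company decides
# whether to monetize or protect it. All payoffs are made-up illustrations.
from itertools import product

CONSUMER = ["share", "withhold"]
COMPANY = ["monetize", "protect"]

# payoffs[(consumer_move, company_move)] = (consumer_utility, company_profit)
payoffs = {
    ("share", "monetize"):    (1, 3),  # consumer gets service, loses some privacy
    ("share", "protect"):     (3, 1),  # consumer gets service and privacy
    ("withhold", "monetize"): (0, 0),  # no data to monetize, no service received
    ("withhold", "protect"):  (0, 1),  # company earns goodwill but little value
}

def nash_equilibria(payoffs):
    """Return pure-strategy profiles where neither player gains by deviating."""
    eqs = []
    for c, f in product(CONSUMER, COMPANY):
        u_c, u_f = payoffs[(c, f)]
        consumer_ok = all(payoffs[(alt, f)][0] <= u_c for alt in CONSUMER)
        company_ok = all(payoffs[(c, alt)][1] <= u_f for alt in COMPANY)
        if consumer_ok and company_ok:
            eqs.append((c, f))
    return eqs

print(nash_equilibria(payoffs))  # → [('share', 'monetize')]
```

With these hypothetical numbers, the only equilibrium is data sharing plus monetization, which echoes the tension Ghosh describes: the outcome both sides settle into is not the one that maximizes the consumer’s privacy.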

MCN: What are you working on now?
DG: I started this fellowship at New America, and I want to look at the intersection of privacy and civil rights: how privacy and anti-discrimination policy can be incorporated into algorithms. The White House report raised the issue of Big Data discrimination, also called algorithmic discrimination or algorithmic bias. People refer to it in different ways, but essentially it is the idea that algorithms [sets of computer commands to accomplish a task] can have a discriminatory impact on society if they are not designed responsibly.

MCN: Sort of like algorithmic red-lining?
DG: Yes. One example referenced in the White House report involves an application called Street Bump, which crowd-sources pothole information.

Boston partnered with them to learn where potholes were in real time, then go out and repair them. It turned out that the potholes getting repaired were in richer, younger neighborhoods. Obviously, you don’t fault the people who designed the algorithm from the get-go; it’s very hard to foresee that this issue might surface until you observe it. But then they went in and compensated for it, which is the most incredible thing. If they hadn’t, it would have potentially created a mechanism for discriminatory impacts.
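The bias mechanism Ghosh describes is a sampling problem: repairs track who reports, not who needs. A small simulation can show the effect. The neighborhood names, adoption rates, and pothole counts below are invented for illustration and have nothing to do with real Street Bump data.

```python
# Toy simulation of crowd-sourced reporting bias. Two neighborhoods have
# identical pothole counts but different smartphone adoption among drivers;
# all numbers are hypothetical illustrations, not Street Bump data.
import random

random.seed(0)

neighborhoods = {
    "affluent":   {"potholes": 100, "smartphone_rate": 0.8},
    "low_income": {"potholes": 100, "smartphone_rate": 0.2},
}

def simulate_reports(neighborhoods):
    """Count potholes that get reported (and would thus be repaired first)."""
    reports = {}
    for name, info in neighborhoods.items():
        reported = sum(
            1 for _ in range(info["potholes"])
            if random.random() < info["smartphone_rate"]
        )
        reports[name] = reported
    return reports

reports = simulate_reports(neighborhoods)
print(reports)  # the affluent area generates far more reports despite equal need
```

Even with identical need, the data arriving at the city skews toward wherever smartphones are common, which is exactly the kind of disparity Boston had to compensate for after observing it.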

