The Big Idea: Bruce Schneier
Posted on March 4, 2015 by John Scalzi
What’s your electronic data worth to you? What is it worth to others? And what’s the dividing line between your privacy and your convenience? These are questions Bruce Schneier thinks a lot about, and as he shows in Data and Goliath, they are questions that have an impact on where society and technology are going next.
BRUCE SCHNEIER:
Data and Goliath is a book about surveillance, both government and corporate. It’s an exploration in three parts: what’s happening, why it matters, and what to do about it. This is a big and important issue, and one that I’ve been working on for decades now. We’ve been on a headlong path of more and more surveillance, fueled by fear – of terrorism, mostly – on the government side, and convenience on the corporate side. My goal was to step back and say “wait a minute; does any of this make sense?” I’m proud of the book, and hope it will contribute to the debate.
But there’s a big idea here too, and that’s the balance between group interest and self-interest. Data about us is individually private, and at the same time valuable to all of us collectively. How do we decide between the two? If President Obama tells us that we have to sacrifice the privacy of our data to keep our society safe from terrorism, how do we decide if that’s a good trade-off? If Google and Facebook offer us free services in exchange for allowing them to build intimate dossiers on us, how do we know whether to take the deal?
There are a lot of these sorts of deals on offer. Waze gives us real-time traffic information, but does it by collecting the location data of everyone using the service. The medical community wants our detailed health data to perform all sorts of health studies and to get early warning of pandemics. The government wants to know all about you to better deliver social services. Google wants to know everything about you for marketing purposes, but will “pay” you with free search, free e-mail, and the like.
Here’s another one I describe in the book: “Social media researcher Reynol Junco analyzes the study habits of his students. Many textbooks are online, and the textbook websites collect an enormous amount of data about how — and how often — students interact with the course material. Junco augments that information with surveillance of his students’ other computer activities. This is incredibly invasive research, but its duration is limited and he is gaining new understanding about how both good and bad students study — and has developed interventions aimed at improving how students learn. Did the group benefit of this study outweigh the individual privacy interest of the subjects who took part in it?”
Again and again, it’s the same trade-off: individual value versus group value.
I believe this is the fundamental issue of the information age, and solving it means careful thinking about the specific issues and a moral analysis of how they affect our core values.
You can see that in some of the debate today. I know hardened privacy advocates who think it should be a crime for people to withhold their medical data from the pool of information. I know people who are fine with pretty much any corporate surveillance but want to prohibit all government surveillance, and others who advocate the exact opposite.
When possible, we need to figure out how to get the best of both: how to design systems that make use of our data collectively to benefit society as a whole, while at the same time protecting people individually.
The world isn’t waiting; decisions about surveillance are being made for us – often in secret. If we don’t figure this out for ourselves, others will decide what they want to do with us and our data. And we don’t want that. I say: “We don’t want the FBI and NSA to secretly decide what levels of government surveillance are the default on our cell phones; we want Congress to decide matters like these in an open and public debate. We don’t want the governments of China and Russia to decide what censorship capabilities are built into the Internet; we want an international standards body to make those decisions. We don’t want Facebook to decide the extent of privacy we enjoy amongst our friends; we want to decide for ourselves.”
In my last chapter, I write: “Data is the pollution problem of the information age, and protecting privacy is the environmental challenge. Almost all computers produce personal information. It stays around, festering. How we deal with it — how we contain it and how we dispose of it — is central to the health of our information economy. Just as we look back today at the early decades of the industrial age and wonder how our ancestors could have ignored pollution in their rush to build an industrial world, our grandchildren will look back at us during these early decades of the information age and judge us on how we addressed the challenge of data collection and misuse.”
That’s it; that’s our big challenge. Some of our data is best shared with others. Some of it can be “processed” – anonymized, maybe – before reuse. Some of it needs to be disposed of properly, either immediately or after a time. And some of it should be saved forever. Knowing what data goes where is a balancing act between group and self-interest, a trade-off that will continually change as technology changes, and one that we will be debating for decades to come.
—-
Data and Goliath: Amazon|Barnes & Noble|Indiebound|Powell’s
Read an excerpt. Visit the author’s Web site. Follow him on Twitter.
Is this “Applied Cryptography” Bruce Schneier? I have first and second editions of that work. It’s a freakin’ masterpiece.
Same Schneier who wrote Applied Cryptography. He has a blog, linked above.
Jerome: Yes, it is. And yes, it is.
This looks to be very much what we all need to understand in this internet age.
And as someone involved in genealogical research, this has been a big question for a while: when do things like government census records become public information so people can use that information to track ancestry?
In Canada, the general rule is after 92 years, but of course there are a number of people who are still alive after that long, which can make that problematic. Indeed, the laws handling this have been changed over the last few years.
Schneier is a national treasure. His monthly Crypto-Gram newsletter is well worth signing up for.
We need to make the users of this data accountable for what they are doing with it.
We need to be discussing how we can develop technical ways to monitor the users of this data, and develop the laws and regulations about what the public gets to know about its use.
It doesn’t really make sense to be concerned with what they know if we don’t know who “they” are and what “they” are doing. Is this technically possible? What kinds of constraints can be put on this?
One of the concerns I have is that much of the internet and panopticon surveillance data is a public-private partnership: corporations, including private security companies, use the power of the government to compel disclosure, but the data is then moved to the private sector, where it is “proprietary” and not accessible through FOIA and other open-government laws.
At the very least, the laws should require that everyone in a database get notified when their data is sold to another organization.
I’m thinking about all those millions of poor fools who filled out on-line job applications, not knowing they were just spilling personal details to be sold on the open market to anyone who would pay for it.