Published on October 13th, 2014 | by EJC
The Ethics Of Sensor Journalism: Community, Privacy, And Control
This article was written by Josh Stearns, and commissioned and edited by Fergus Pitt on behalf of the Tow Center for Digital Journalism at Columbia University. Republished with permission.
The rise of sensor journalism comes at a unique moment of flux in the debate around journalism ethics and raises its own set of questions and concerns. Three of the leading professional organizations serving reporters—the Society for Professional Journalists, the Poynter Institute, and the Online News Association—are undertaking major examinations of journalism ethics focused on how new technology and practices are changing the field.
At the same time, we face complex and critical questions about society’s changing relationship with data and personal information. Revelations about both commercial tracking and government surveillance have sparked a contentious national debate about our fundamental rights regarding privacy and security. Meanwhile, community participation in the journalism process has blurred the lines between news professionals and the audience. It is into this public uncertainty and ambiguity that sensor journalism emerges. This social and political context is critical to thinking about how journalists work with, and collect data about, their communities.
We should acknowledge that we are trying to define an ethical framework for a multifaceted and emergent practice. Sensor journalism, as currently constructed, already embodies a number of different forms and models, each raising different ethical questions. There are choices to be made both about what data to collect and what to do with that data once you have it.
Similarly, the ethical questions differ depending on whether you are working with willing or unwitting participants. We also need to account for the ethics surrounding sensor data collected by journalists, versus data gathered through crowd-sourced efforts for newsrooms. Below, you’ll find a focus not so much on answering questions as on asking them—questions we can ask ourselves, questions we can discuss in our newsrooms, and questions meant to spark conversations between journalists and communities.
Ethics of Community
In announcing the Poynter Institute’s new book, The New Ethics of Journalism: Principles for the 21st Century, co-editor Tom Rosenstiel wrote, “Technology has so drastically altered how news is gathered, processed and understood, it has taken journalism closer to its essential purpose. […] News always belonged to the public. Now the ethics of journalism must consider the role of the audience, and the impact of community and the potential of technology more fully.”1
The justification for our nation’s strong protection of press freedom is rooted in the idea that gathering and disseminating the news contributes to the common good. Therefore, many ethical debates in journalism turn to questions of whether potential harms are outweighed by the public interest. So, as we consider the legal and ethical questions surrounding sensor journalism, the role of the public should loom large.
However, since both our communities and our technology are in such a state of flux, this is no simple task. From the start, C.W. Anderson has argued, “The public was journalism’s totemic reference point, its reason for being.” Yet, in the digital age, while “the journalism-public relationship is still paramount,” the public is no longer a narrow, easily defined category; it is everywhere.2 Indeed, journalism is still produced for the public, but it is also increasingly produced with the public.
Passive Participants and Active Transparency
There are at least two communities to consider in the design of sensor journalism projects: The members of the community where we are collecting data, and any community members who are helping us collect that data.
There is still a lot people don’t know or don’t understand about sensors, so we should approach our journalism with as much openness and transparency as possible. Trust takes a long time to build and a very short time to break. Because of both the perceived and the real privacy concerns inherent in sensor journalism, we should not rely on passive transparency but must instead invest in a more active kind. Passive transparency means sharing the minimum amount of information possible and requires communities to seek that information out. An active transparency approach, by contrast, calls on us to engage in proactive outreach and education about our sensor use.
Whenever possible we should make the work of sensor journalism visible to those who might encounter our physical tools. We can do so by labeling sensors with signs to make them visible, by publishing maps of where the sensors are placed online and in print, by attending community meetings to explain our projects, and—if we are working in radio or TV—we can talk about the sensors on the air and point out where they are gathering data. Ideally, this public awareness work can and should begin before the sensors are deployed.
Kord Davis gave a concrete example of the sensitivities of transparency during the 2013 Tow Center sensor journalism summit. Davis pointed to the use of aerial drones to collect geocoded data in Africa that may include indigenous populations whose values—when it comes to the collection and tracking of that data—we don’t know or understand and who have not been part of the design of our study.3 While this may be an extreme case, Davis’ point is that our newsrooms and communities have values, and those values may not always be in alignment. As such, our ethical framework has to account for both our professional norms as well as the values of the communities with whom we are working.
Data from the Crowd, Responsibility for the Newsroom
Some of the most exciting early projects in sensor journalism, such as the WNYC “Cicada Tracker” project, have included crowdsourcing elements that engage communities deeply in the process of journalism and data collection. As more and more newsrooms look for meaningful ways to connect with their communities, sensors offer an exciting, hands-on opportunity for collaboration. However, employing the crowd to help collect data also raises new ethical questions about how to prepare and protect the community members with whom we work. While we need to communicate and discuss these issues with community participants, newsrooms also have to consider issues like the legal and safety risks participants might face in collecting data.
An Online News Association working group looking at the ethics of user-generated content and social journalism asked a series of questions that are also relevant here: “Is it okay to ask members of the public to compromise their safety in order to gather information or content for the news media? Or even to accept material from them when they choose to take such a risk? What are our responsibilities to them?”4 For example, if we are working with communities on a project like Public Lab’s water quality testing effort, what happens if a volunteer drowns while trying to place a sensor, or is sued for trespassing on private property on the way to the river?
If we treat community as an end, not simply a means, as Poynter suggests in its new book, we have to understand that our ethical responsibility extends beyond the subjects of our stories and the community of our readers to the collaborators with whom we work during crowd-sourcing efforts.5 This approach demands that newsrooms think about how they train, support, and engage their communities, all of which can actually help create more accurate journalism and more deeply connected audiences.
The Ethics of Privacy and Control in an Age of Surveillance
It would be impossible to discuss the intersection of sensor journalism and our communities without addressing the impact of recent revelations about privacy and surveillance by governments and corporations. While the privacy issues that sensors raise exist independent of factors like the Snowden revelations, news about the National Security Agency (NSA), and commercial data collection and tracking, these are still deeply influencing the national dialogue about privacy and security in the digital age.
While public response to the NSA revelations was initially muted, as a fuller picture of the agency’s surveillance programs emerged, public polls shifted dramatically. The Snowden documents, paired with security breaches at major retailers like Target, have spiked interest in digital privacy and security. In many respects, sensors may seem even more intrusive to the general public than the NSA’s bulk collection of metadata. Sensors collect data about our movements and our environment, which for many people feels much more immediate than recording what we do on our phones and computers. Journalists are late to the big data game, and we are entering it amid increased questions and concerns about surveillance in our lives generally.
During the Tow Center’s 2013 sensor journalism ethics panel, Robert Lefkowitz, the CTO at Sharewave, argued that the scope of the ethical questions we face is directly related to how we define sensor journalism. If sensor journalism is only about collecting environmental data, “then the issues are fairly clear,” he said, but once we begin collecting personal data they expand exponentially.
Of course, defining the lines of privacy and personal information is increasingly challenging. At the same event, another panelist, Professor Joanne Gabrynowicz, pointed out, “The whole idea of public and private spaces is being turned on its head.” This concept is reinforced in an important paper on data protection laws by Eloïse Gratton, in which she writes, “A literal interpretation of the definition of personal information is no longer workable.”6
Changes in technology and in the amount of data people leave behind, both intentionally and unintentionally, are challenging what we mean by personal data. As the power and accuracy of sensors expand and tools for parsing data sets develop, analysts and algorithms can connect the dots in ways that turn seemingly anonymous information into personally identifiable data. In her paper, Gratton argues that, given these changes, a literal definition of personal data can produce three contradictory consequences when used to guide ethics and policy. A literal definition of personally identifiable information can be too inclusive (threatening the free flow of information), too lax (ignoring how the context of individual data points changes their sensitivity), or simply too vague (leaving uncertainty about the point at which a piece of data becomes personally identifiable).7
Thus, returning to Lefkowitz’s initial point above, it becomes more difficult to differentiate “benign” environmental data, for example, from “sensitive” personal information. Instead of drawing arbitrary lines based on outdated notions of personal information, Gratton suggests we focus on the purpose of the protection—to avoid harm. Using risk of harm as a guide in thinking about how we collect and handle data has its own layers of uncertainty and interpretation, but it resonates both with the community-focused approach outlined earlier and with journalism ethics that have long grappled with balancing the needs of privacy with the public interest.
In a very real way, sensors are expanding the web of surveillance, and so we need to consider the various risks of harm embedded in our actions. Much as journalists should base their own security precautions on real-life threat models, we should assess what threats our sensor projects pose to our communities. This raises important questions for journalists regarding how they collect, store, and protect the data they gather (as well as legal questions around how governments can and will request access to that data).
Controlling the Data
In a 2012 article, Alistair Croll argues that big data is this generation’s key civil rights issue. He points out how personal data collected on and offline is being used to make discriminatory decisions in everything from credit cards to loan offers. “Data doesn’t invade people’s lives,” he writes, “lack of control over how it’s used does.”8 Indeed, the question of who gets to control the data collected about us was also a key point in a set of recommendations delivered to the White House in early 2014 by a coalition of civil and human rights groups. “Individuals should have meaningful, flexible control over how a corporation gathers data from them, and how it uses and shares that data,” wrote the groups.9
Both Croll and the civil rights coalition recognize that data can be a powerful force for positive change in health, environment, and social justice, but that without real agency from communities it can also be a double-edged sword. “While data is essential for documenting persistent inequality and discrimination,” said Wade Henderson, President and CEO of The Leadership Conference on Civil and Human Rights, “no one—no matter their color, ethnicity, or gender—should be unfairly targeted by businesses or the government for dragnet surveillance, discriminatory decisions, or any other unwarranted intrusions.”
It is easy to see how journalists bringing sensors into a community might be seen as an unwarranted, or at least unwelcomed, intrusion. We have to understand the history and current state of surveillance in diverse communities around the United States, and the discrimination and threats linked to that surveillance.10 Our decisions have to be informed by that history and in conversation with communities.
In addition to the transparency and community engagement outlined above, whenever possible journalists interested in employing sensors should develop meaningful ways for communities to access, control, and discuss their participation and data. Developing these systems will only come through hands-on application, and will demand flexibility based on the journalism project, the newsroom, and the community. A few questions for consideration as newsrooms develop sensor journalism projects include:
- Who has control over the data we collect (both during collection and afterwards)? Can communities and individuals access the data?
- Can individuals and communities opt-in or opt-out? When is such an opt-in process important and when does the public interest outweigh the need for an opt-in?
- If publishing open data sets, can we (or our communities) control whether or not the data gets used for commercial purposes?
Protecting the Data
A key part of how we control the data we collect relates to how we store and protect it. There has been a growing emphasis on transparency, including calls for data journalists to publish their data openly on the Web. While that level of openness should be the standard for most data, depending on what our sensors collect and the feedback from our communities, we will need to think about whether and how we share what we collect publicly. In the planning stages of a data journalism project, newsrooms should discuss their policies and procedures around anonymizing and sharing data.
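To make the anonymization discussion concrete, here is a minimal sketch of what a pre-publication step might look like. The field names (`lat`, `lon`, `contributor`, `value`) and the salt are hypothetical, not drawn from any real project; note too that coarsening coordinates and hashing IDs only pseudonymizes the data—as Gratton’s work suggests, re-identification may still be possible from context.

```python
import hashlib

def anonymize_reading(reading, salt="newsroom-secret"):
    """Coarsen location and pseudonymize the contributor ID of one
    hypothetical sensor reading before publishing it openly."""
    return {
        # Round coordinates to two decimal places (~1 km precision)
        # so individual homes cannot be pinpointed from the data set.
        "lat": round(reading["lat"], 2),
        "lon": round(reading["lon"], 2),
        # Replace the volunteer's ID with a truncated, salted one-way
        # hash, so readings stay linkable without exposing the person.
        "contributor": hashlib.sha256(
            (salt + reading["contributor"]).encode()
        ).hexdigest()[:12],
        "value": reading["value"],
    }

raw = {"lat": 40.80779, "lon": -73.96369,
       "contributor": "vol-017", "value": 7.1}
print(anonymize_reading(raw))
```

Agreeing on a step like this during project planning—rather than after collection—is precisely the kind of policy decision the paragraph above calls for.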
Part of that planning must also address how the data will be stored. Newsrooms need to understand the best practices for safely storing information, and be up front about how long the data will be kept and for what purpose. In his article on big data and civil rights, Croll notes, “If I collect information on the music you listen to, you might assume I will use that data in order to suggest new songs […] But instead, I could use it to guess at your racial background. And then I could use that data to deny you a loan.” This example implores us to ask how we will control future uses of the data we collect and store.
Also, when our sensors are our sources, there will be times when we need to consider how we will protect that data just as we would protect a source. Jonathan Peters, the press freedom correspondent for the Columbia Journalism Review, has written about the security and legal risks of journalists’ increased reliance on cloud storage for files and data. “The legal protections for journalists using the cloud is, well, cloudy,” he writes. Historically, the Privacy Protection Act of 1980 protects journalists from government searches and seizures but Peters notes, “It’s unclear whether the things journalists store in the cloud enjoy the same legal protection as the things they store on personal computers, on local servers, and in desk drawers.”11
In the early days of sensor journalism it might be difficult to imagine a project whose data set could be of interest to government investigators, but in many ways, that is the point. We should grapple with these questions now and develop ethical guidelines for sensors that address all the kinds of data we collect—recognizing that, as our identities are increasingly defined by our data, more and more of that data is personal.
The intersection of changes in technology and shifts in newsroom processes and ethics mean it is nearly impossible to define any clear-cut guidelines for sensor journalism ethics. Instead, we can and should identify values and questions that can guide us appropriately. Recognizing that sensor journalism projects will vary greatly in project design, the technology they use, and the context of the communities where they are working, our ultimate goal should be to align those various elements in ways that build both new knowledge and safer communities.
Alistair Croll suggests that one of the most dangerous things we do when we collect data is “collect first and ask questions later.” In this model, he argues, “This means we collect information long before we decide what it’s for.” If we don’t ask these questions in advance, there is no way we can give our communities any control, or ensure that we are protecting their privacy and storing their data securely. As Croll puts it, “This act of deciding what to store and how to store it is called designing the schema, and in many ways, it’s the moment where someone decides what the data is about. It’s the instant of context.”12
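Croll’s “instant of context” can be illustrated with a small sketch of a privacy-minded schema decision made at collection time. The field names and granularity here are hypothetical: the point is simply that a newsroom might choose to retain only neighborhood-level hourly averages, so that raw per-device readings—the most re-identifiable form of the data—are never stored at all.

```python
from collections import defaultdict
from statistics import mean

def aggregate_for_storage(readings):
    """Keep only hourly neighborhood averages instead of raw
    per-device readings: a schema decision that limits what the
    stored data can later reveal about individuals."""
    buckets = defaultdict(list)
    for r in readings:
        # Group raw values by (neighborhood, hour); the raw rows
        # themselves are discarded after aggregation.
        buckets[(r["neighborhood"], r["hour"])].append(r["value"])
    return [
        {"neighborhood": n, "hour": h,
         "avg_value": round(mean(vals), 2), "n_sensors": len(vals)}
        for (n, h), vals in sorted(buckets.items())
    ]

sample = [
    {"neighborhood": "Red Hook", "hour": 9, "value": 3.0},
    {"neighborhood": "Red Hook", "hour": 9, "value": 5.0},
    {"neighborhood": "Astoria", "hour": 9, "value": 4.0},
]
print(aggregate_for_storage(sample))
```

Whether such coarse aggregation is appropriate depends entirely on the project and the community—which is why, as the surrounding text argues, the schema question must be asked before collection begins.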
While I’ve referenced the ongoing ethical debates and inquiries at the major journalism institutions, the questions outlined here need to be better integrated into those debates. As a field, we need to weave these threads together and consider ethical guidelines that are not so narrow that every new technology challenges them and sends us back to the drawing board. To that end, we should also draw on the expertise of those groups working at the intersection of press freedom and civil liberties like the American Civil Liberties Union, Free Press, and the Electronic Frontier Foundation. These groups, which have long worked with questions at the heart of journalism and individual privacy, can add a lot to our discussions and serve as a check against our industry’s assumptions and culture.
If sensor journalism is to expand and thrive, however, we’ll need communities, elected officials, technologists, and newsroom stakeholders to buy in. It is not enough to define the ethics of sensor journalism through the lens of newsrooms and journalism ethics; we must also shape our choices through the lens of communities and their values. We need to be aware of historic issues around power and racism that are also embedded in debates about surveillance and move forward in a way that invites more people to the table to discuss these issues. This is an organizing challenge and we should rise to meet it.
1. “Poynter Publishes Definitive New Journalism Ethics Book,” Poynter Institute, 11 Sep. 2013.
2. C.W. Anderson, “The Public is Still a Problem, and Other Lessons From ‘Rebuilding the News,’” Nieman Journalism Lab, 18 Jan. 2013, www.niemanlab.org/2013/01/c-w-anderson-the-public-is-still-a-problem-and-other-lessons-from-rebuilding-the-news/.
3. “The Laws, Ethics, and Politics of Sensor Journalism,” panel at the Sensor Journalism Workshop, Tow Center for Digital Journalism, Columbia University, 6 Jun. 2013, http://towcenter.org/blog/sensors-the-laws-ethics-and-politics-of-sensor-journalism.
4. Fergus Bell and Eric Carvin, “Social Newsgathering: Charting an Ethical Course,” Online News Association, 6 Mar. 2014, http://journalists.org/2014/03/06/social-newsgathering-charting-an-ethical-course/.
5. Poynter replaced the traditional “minimize harm” philosophy with “engage community as an end rather than as a means.” See a discussion of the changes here: www.poynter.org/latest-news/
6. Eloïse Gratton, “If Personal Information is Privacy’s Gatekeeper, then Risk of Harm is the Key: A Proposed Method for Determining What Counts as Personal Information,” Albany Law Journal of Science and Technology, Vol. 24, No. 1 (2013), http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2334938.
7. Ibid.
8. Alistair Croll, “Big Data is Our Generation’s Civil Rights Issue, and We Don’t Know It,” Radar, 2 Aug. 2012, http://radar.oreilly.com/2012/08/big-data-is-our-generations-civil-rights-issue-and-we-dont-know-it.html.
9. “Civil Rights Principles for the Era of Big Data,” The Leadership Conference, 2014, www.
10. For a good discussion of some of these issues related to the history of surveillance and communities of color, see the summary of a panel discussion held by Free Press, www.freepress.net/blog/2013/11/01/panelists-say-surveillance-nothing-new-communities-color, and remarks from Amalia Deloney at the Media Consortium meeting, delivered in February 2014, www.mag-net.org/sites/default/files/amalia_TMC_comments_media_consortium.pdf.
11. Jonathan Peters, “Updating the Privacy Protection Act for the Digital Era: Law Protecting Journalists From Searches Didn’t Anticipate Cloud Computing,” Columbia Journalism Review, 30 Jan. 2012, www.cjr.org/behind_the_news/updating_the_privacy_protectio.php.
12. Croll, “Big Data is Our Generation’s Civil Rights Issue, and We Don’t Know It.”