Data-Crunched Democracy: Data-Driven Campaigning’s Lessons for Re-Imagining Governance


Joseph Turow and Daniel Kreiss (photo: @NoraADraper)

On Friday, May 31st, the Annenberg School for Communication at the University of Pennsylvania hosted Daniel Kreiss and Joseph Turow’s Data-Crunched Democracy Conference on the growing use of data analysis and voter modeling to inform campaigning. The conference brought together practitioners, journalists and academics interested in this burgeoning but still under-examined field. Rasmus Kleis Nielsen, author of Ground Wars, a book on the renewed importance of (data-informed) campaign volunteerism, posted an extended recap of the event on his blog, and #datapolitics provides another look at the lively discussion and debate the conference inspired.

A conference on data-driven campaigning might seem like the last place to find insights on opening government: much of the discussion revolved around journalists’ and academics’ frustration with campaigns’ lack of transparency about the practice, even though, as Yale’s Eitan Hersh noted, voter targeting is largely a function of public-record availability. Setting that lack of transparency aside, however, campaigns’ embrace of data, both in the form of modeling and of evidence-driven decision making, offers a number of valuable big data lessons as we try to make government more effective.

Below are five key points made by conference participants regarding political data mining and voter modeling that can help guide efforts to re-imagine governance:

Data Can Be a Source of Empowerment, Through Protection or Availability

A number of participants, notably Personal Democracy Media’s Micah Sifry and Ohio State law professor Peter Swire, characterized personal or institutional control over data as a means of gaining or retaining power. Sifry argued that campaigns’ control over data grants them power while removing it from citizens; Swire, on the other hand, discussed the tension between “data empowerment” and “data protection.” As it becomes increasingly evident that greater access to data yields greater empowerment, entities that control data are, not surprisingly, finding more and more reasons to protect it and to fight against relinquishing control over it. Campaigns may be reluctant to ease their grip on data because of privacy concerns and the fear of losing their competitive advantage, among other reasons.

Government, on the other hand, runs the risk of being overly protective of public data, and in the process stunting the empowerment of the populace, out of fear of disrupting the status quo or of releasing information that portrays it in an unfavorable light, among other concerns. While there is certainly cause for vetting data and ensuring that potentially harmful information is not opened to the public, government needs to recognize when it is being overly protective of public data and, in effect, diminishing the empowerment of its citizens, who, when given that power, can create a wealth of public value.

Incomplete Data Can Still Yield Results

Rayid Ghani, former Obama for America Chief Data Scientist and current advisor to the University of Chicago’s Data Science for Social Good fellowship program, noted that voter modeling is particularly difficult for campaigns because voting is an action people don’t take very often, and the context changes drastically between the times they take it. Ghani and other practitioners consistently referenced the challenge of gaining actionable insights from such inconsistent data. Despite this challenge, Obama for America succeeded in its somewhat tongue-in-cheek stated goal of producing insights that were “better than random.” And while Ghani and others at the conference downplayed the role of data in winning President Obama the election, despite popular belief and the continued success of members of the campaign’s data analytics team, like Chief Analytics Officer Dan Wagner, it is clear that the intelligent use of incomplete data still produced results.
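To make the modeling challenge concrete, here is a minimal, hypothetical sketch of the kind of turnout scoring described above. It is not the Obama for America pipeline; the features, data and missing-history handling are all invented for illustration, and the only ambition, in the spirit of “better than random,” is a usable probability score built from sparse voter-file records.

```python
# Hypothetical sketch: scoring turnout probability from a sparse, incomplete voter file.
# All fields and values are invented for illustration; this is not any campaign's model.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy voter-file features: age, party registration (1 = registered with a party),
# and turnout history for the last three even-year elections (NaN = no record on file).
X = np.array([
    [34, 1, 1.0,    np.nan, 0.0],
    [61, 1, 1.0,    1.0,    1.0],
    [25, 0, np.nan, np.nan, 0.0],
    [47, 0, 0.0,    1.0,    np.nan],
    [52, 1, 1.0,    1.0,    0.0],
    [29, 0, 0.0,    np.nan, np.nan],
])
y = np.array([1, 1, 0, 1, 1, 0])  # whether each voter turned out in the last election

# Crude handling of missing history: treat "no record" as non-voting.
X = np.nan_to_num(X, nan=0.0)

model = LogisticRegression().fit(X, y)

# Score a new voter who has only partial history on file.
new_voter = np.nan_to_num(np.array([[40, 1, np.nan, 1.0, np.nan]]), nan=0.0)
print("Estimated turnout probability:", model.predict_proba(new_voter)[0, 1])
```

Even this crude treatment of missing history illustrates the point: incomplete, inconsistently produced data can still rank voters usefully, which is often all a field operation needs.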

Unlike campaigns, in some areas government can already boast full access to exhaustive, actionable datasets. As government continues to develop systematic processes for data collection, aggregation and storage, and as the availability and usability of public information continue to improve as a result, we are likely to witness even greater impacts. If Obama for America could win an election by modeling actions taken every four years and leveraging inconsistently produced data, what could government achieve by intelligently leveraging the wealth of constantly produced, multi-faceted public data that is currently, or will soon be, at its disposal?

Data Analysis Can Disrupt Inertia and Guide the Smarter Allocation of Resources

While most of the attention paid to the use of data in the 2012 election revolved around voter modeling and microtargeting, data also played an essential role in guiding the allocation of resources. Carol Davidsen, Obama for America’s Director of Integration & Media Targeting, told the audience that “using data to model how much things cost is also really important.” In particular, rejecting the inertia of traditional television advertising buys and instead taking a close, evidence-based look at how campaign money could be more intelligently distributed led to a more effective, and more cost-effective, television advertising strategy for the campaign.
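A hypothetical sketch of the underlying arithmetic might look like the following: rank available ad slots by cost per persuadable viewer rather than by convention, then buy the cheapest reach first until the budget runs out. The program names, audience estimates, prices and greedy allocation rule are invented for illustration and are not the campaign’s actual figures or method.

```python
# Hypothetical sketch: ranking TV ad slots by cost per persuadable viewer.
# All programs, audience figures and prices are invented for illustration.

ad_slots = [
    # (program, estimated persuadable viewers reached, cost of a 30-second spot in $)
    ("Local evening news", 40_000, 5_000),
    ("Late-night rerun",   15_000,   800),
    ("Daytime talk show",  22_000, 1_500),
    ("Cable cooking show",  9_000,   300),
]

# Sort slots from cheapest to most expensive reach (dollars per persuadable viewer).
ranked = sorted(ad_slots, key=lambda slot: slot[2] / slot[1])

budget = 3_000
spend_plan = []
for program, viewers, cost in ranked:
    if cost <= budget:        # greedily buy the most cost-effective slots first
        spend_plan.append(program)
        budget -= cost

print("Buy, cheapest reach first:", spend_plan)
```

Under these made-up numbers, the conventional “local news” buy is the least cost-effective slot and never makes the plan, which is the spirit of Davidsen’s point about modeling what things cost.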

Like Obama for America, government could take great steps toward improved efficiency by rejecting inertia and preconceived notions of what works and instead using evidence to guide decision making. Just as Davidsen and her team found that potential voters could be reached more efficiently and inexpensively in previously unexpected areas of the TV universe, government could use data and experimentation to discover surprisingly cost-effective replacements for entrenched practices.

Big Data Doesn’t Overrule Human Common Sense

Davidsen made another important point when an audience member asked whether Obama for America truly ignored the content of the television programs during which it advertised and focused solely on the results of data analysis. Her efforts clearly moved beyond content bias (for instance, the assumption that the best time to reach potentially valuable voters is during a local news broadcast rather than during cheaper Seinfeld reruns), but she was quick to respond that even as data overtook content as the central arbiter of where to spend television advertising dollars, big data did not overrule human common sense. Data analysis did not make advertising decisions; it guided decisions made by humans. Davidsen assured the audience that while the campaign rejected traditional advertising practices that lacked evidence of effectiveness, there was never any risk of an Obama for America ad appearing on a program that would reflect poorly on the campaign.
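One way to read Davidsen’s point is as a human-in-the-loop constraint on the optimization: the model can rank, but people decide. A minimal, hypothetical sketch, with invented program names and scores, might look like this:

```python
# Hypothetical sketch: model output filtered through human judgment.
# Programs and efficiency scores are invented for illustration.

model_ranked_slots = [
    ("Controversial reality show", 0.91),  # highest modeled efficiency
    ("Late-night rerun",           0.84),
    ("Cable cooking show",         0.77),
]

# Maintained by campaign staff, not generated by the model.
human_exclusion_list = {"Controversial reality show"}

# The model proposes; humans veto anything that would reflect poorly on the campaign.
approved = [(program, score) for program, score in model_ranked_slots
            if program not in human_exclusion_list]

print("Approved buys:", approved)
```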

As government works to become more evidence-driven and to embrace the capabilities engendered by data analysis, it will be important to put systems in place that ensure common sense and human intervention still play a role in decision making. Especially given the belief that an increased focus on data-driven decision making can help mitigate the effects of shrinking budgets, governments must not lose sight of the human element that is essential to leveraging big data effectively.

“It’s not the data; it’s the data science”

Closely related to the importance of human intervention and common sense in the big data space, Peter Pasi, vice president of the political arm of big data advertising firm Collective, reminded the audience that no matter how extensive the data, “it’s not the data; it’s the data science” that yields results. Amid extended debates over the types of data available to current campaigns, and over whether the sheer expansiveness of available data renders modern targeting efforts qualitatively different from the data-driven direct-mail and telephone targeting projects of the past, Pasi made it clear that the most essential need for any big data project is a group of practitioners capable of operationalizing that unprocessed information. The Obama campaign, for one, was able to make the most of the data available to it because employees like Rayid Ghani and Dan Wagner could mine raw data for actionable insights.

A government commitment to collecting, aggregating and storing data will have little effect if the personnel tasked with acting on that data lack the skills necessary to create value. Open data looks beyond the walls of government to engage people in other sectors who have the skills to use public data for society’s benefit, but as more internal government big data projects are undertaken, departments and agencies will need to staff accordingly.
