The following essay by Charles Lewis is excerpted from a new book, “Global Teamwork: The Rise of Collaboration in Investigative Journalism,” edited by Richard Sambrook and published by the Reuters Institute at the University of Oxford.
The future potential for increased collaborative research and journalism is enormous and exciting to imagine. And the dynamic driving this almost boundless ‘possible’ is the ever-advancing array of new computer and related technologies.
Of course, the evolution of communications in general has always been directly related to technological advances, with particular benefits redounding to the inherently inquisitive professional journalistic, academic, and non-governmental organisation (NGO) research communities.
For example, in 1846, combining the low-tech pony express with the invention of the telegraph made it possible for four New York-based newspapers attempting to cover the Mexican–American War to “actively collect news as it breaks, rather than gather already published news.” And that new technology allowed them to pool their money and send a single reporter to Mexico, his dispatches wired back to them from the closest telegraph office in the US (Alabiso et al. 1998: 173–5). That led to the creation of the Associated Press, an independent, New York-based, not-for-profit, tax-exempt news co-operative that today is the largest and oldest such news-gathering organisation in the world, with “approximately 1,700 newspaper members, 5,000 radio and television outlets, and 8,500 international broadcasters in 121 countries who received their news in five languages (Dutch, English, French, German, and Spanish).” It has staff teams in 263 locations worldwide, producing multimedia news content that is “seen by more than half the world’s population every day.”
These kinds of dynamic technological advances have benefited news organisations throughout the world, including the two other oldest, largest, and highly respected international news services, Reuters (1851) and Agence France-Presse/AFP (1944). In all cases, they have fostered additional communication and professional collaboration not only within these individual organisations and among their far-flung staffs, but also outside, with their thousands of “client” member media organisations. Everyone contributes, everyone benefits.
The Associated Press acknowledges that it “often has the right to use material from its members and subscribers; we sometimes take the work of newspapers, broadcasters and other outlets, rewrite it and transmit it without credit.” And, of course, the individual “client” news outlets benefit substantially from the national and international news information they cannot otherwise gather for financial and other reasons.
Another, very different kind of US-based, non-profit news organisation – considerably smaller and younger, begun nearly 150 years later – is the Center for Public Integrity (which I founded and began leading in 1989). It, too, began exploring journalistic collaborations with news organisations. For example, it had editorial consulting contracts at separate times with two American television network news divisions in the 1990s, under which they received embargoed, pre-publication access to national news “findings” from its large, months-long investigations, giving them time to plan and prepare their broadcast coverage but not to “break” it exclusively.
Case Study: Investigating State Legislative Ethics Issues in the US
By 1994, the five-year-old, US-based, non-profit, non-partisan investigative journalism organisation, the Center for Public Integrity, decided to expand its national accountability “watchdog” research/reporting far beyond its base of operations, Washington, DC, to the “heartland” state of Indiana. Why? Because of the urging of a frustrated local citizen there, who suggested that the Center “help [journalists] look at their state legislature the same way the Center had examined Washington,” with numerous investigations utilising and cross-meshing various primary, government records about the uses and abuses of power (Renzulli and Center for Public Integrity, 2002: 2–3).
Over the next two years, Center researchers obtained and shipped 2,000 pages of paper Indiana state legislative campaign contribution records in Indianapolis, the capital, back to their offices, and painfully typed the records of some [19,000 campaign] contributions into a single database for news organisations to use as a starting point for investigations into the legislature. The hope was that for the first time ever, news organisations across the state [would have] computerised access to campaign records that, up until then, had sat gathering dust in filing cabinets at the state capitol (Renzulli and Center for Public Integrity, 2002: 2–3).
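The Center’s hand-typed contributions database illustrates a pattern still at the heart of data journalism: once scattered paper records sit in a single queryable store, questions that took weeks of filing-cabinet work take seconds. The text does not describe the Center’s actual tools or schema; the following is only a minimal, illustrative sketch in Python, with invented donor names and amounts:

```python
import sqlite3

# Invented records standing in for the roughly 19,000 hand-typed
# campaign contributions (donor, legislator, amount in dollars).
contributions = [
    ("Acme Realty PAC", "J. Smith", 500),
    ("Acme Realty PAC", "R. Jones", 750),
    ("Hoosier Builders", "J. Smith", 1200),
]

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE contributions (donor TEXT, legislator TEXT, amount INTEGER)"
)
conn.executemany("INSERT INTO contributions VALUES (?, ?, ?)", contributions)

# Once digitised, reporters can ask questions that paper made impractical:
# for example, which donors gave the most in total?
top_donors = conn.execute(
    "SELECT donor, SUM(amount) AS total FROM contributions "
    "GROUP BY donor ORDER BY total DESC"
).fetchall()

for donor, total in top_donors:
    print(donor, total)
```

The same aggregation, run across a whole legislature’s records, is what let partner newsrooms identify patterns of political influence as a starting point for their own reporting.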
The Center made this embargoed information available via individual computer disks to a state-wide consortium of the most respected news organisations in Indiana, including the largest circulation state newspaper, the Indianapolis Star-News, the most-watched local television station in Indiana, WTHR-TV (an NBC affiliate television station in Indianapolis), the Fort Wayne Journal-Gazette, the Evansville Courier, and several other news organisations throughout the state. In addition, the same information was also provided to eight respected political scientists knowledgeable about state politics at colleges and universities throughout the state.
Just weeks before joint publication, a day-long, private meeting of all the journalistic and academic individuals involved and their organisations was subsequently held in Greencastle, Indiana (the host site: DePauw University), to discreetly analyse and discuss the major findings and trends from the political influence-related data, first conveyed in a confidential, advisory, methodologically detailed 60-page Center editorial memorandum to all participants prior to the face-to-face discussion. And a precise, public release date, when every news organisation would begin publishing its stories, was mutually agreed upon and set in February 1996 (Renzulli and Center for Public Integrity, 2002: 2–3).
The resulting, multiple news organisation exposés outraged citizens throughout Indiana. The Indianapolis Star-News published a hard-hitting, five-part series of articles titled “Statehouse Sellout: How Special Interests Have Hijacked the Legislature,” and WTHR-TV aired a multi-day series of stories, “Legislators for Sale,” including a “confrontational” investigative interview with one of the state legislative leaders. The Indianapolis Star-News reported that the state legislators “wanted to make it tougher to win product liability lawsuits. They got it. They wanted lower wages on public construction projects. They got it. They wanted teacher unions to stop collecting money from non-union teachers. They got that one, too.” According to the newspaper, lobbyists in Indiana “out-numbered lawmakers by an 8-to-1 ratio. [And] they found lawmakers from both parties who sponsored bills that would help their employers.”
The public outrage came quickly. In just a few weeks, in the case of just one of the publishing partners, “2,500 angry citizens contact[ed] the Star-News,” and soon afterwards reform legislation became law “mandating that all contribution records be made available to citizens online.”
At the same time, of course, not everyone was pleased with the aggressive investigative journalism, particularly the leaders of the Republican-controlled Indiana legislature on the receiving end of the substantial, critical news coverage. But also, less predictably, the then media critic of the Los Angeles Times, Eleanor Randolph, criticised the “outside research” done regarding public state records by the Center for Public Integrity: “the state media tackled this issue because of outside help… instead of a mystifying flutter of 19,000 paper documents, there was one, tidy computer disk, courtesy of a private, nonpartisan organisation called the Center for Public Integrity.” However, the editor of the Indianapolis Star, Frank Caperton, strongly disagreed with her criticism: “We take information every day from hundreds of people. The real question is the integrity of the information, and Chuck Lewis and his troops met every level of integrity that I know of.”
In terms of background atmospherics, context, and pushback by politicians, the Center for Public Integrity just weeks earlier also had released a highly publicised major national exposé involving the role of money in politics in the 1996 presidential campaign and the “Top Ten Career Patrons” of every major presidential candidate in both political parties during their respective careers. The book, released days before Americans began to cast their votes in the Iowa and New Hampshire and subsequent state caucuses and primaries, was titled The Buying of the President. Relatedly, a book-embargoed collaboration with PBS Frontline produced a documentary, So You Want to Buy a President?, broadcast at about the same time. And months later, the Center broke the national, Clinton administration “Lincoln Bedroom” campaign fundraising scandal, identifying 75 wealthy donors rewarded with overnight stays in the White House, in an award-winning report entitled Fat Cat Hotel.
A year later, in the considerably more populous and per capita more prosperous neighbouring state of Illinois, the Center for Public Integrity “States Project” team worked with University of Illinois political science professor Kent Redfield. Together they coded roughly 90,000 campaign contributions by industry type to determine the state’s most influential donors, and analysed nearly 23,000 campaign expenditures to find out exactly how state lawmakers spent their money. Because the Center put this database up on its website, for the first time ever, Illinois citizens could find out where state lawmakers got their money with the click of a mouse.
At least a dozen news organisations throughout the state, including the largest news organisation, the Chicago Tribune, aggressively reported on the substantial influence of money there – indeed, 30 front-page news stories hit Illinois newsstands in just one week, informing the citizens of the $73 million that went to state campaigns in the 1996 election cycle. Just four legislative leaders, known as the “Four Tops,” took in one-third of the total raised and controlled the purse strings of candidates across the state.
The first five years of the Center for Public Integrity’s data research and reporting collaborations with traditional news organisations tracking state-based campaign finance, political influence, and corruption issues, which began with Indiana and then Illinois, culminated in Our Private Legislatures: Public Service, Personal Gain. It was a national investigation of conflicts of interest by state lawmakers, displayed on the Center website. That 2000 report was discreetly disseminated in embargoed, pre-publication fashion to a consortium of 50 leading participating newspapers in 50 states. We posted, analysed, and reported on the annual financial disclosure filings of 5,716 state lawmakers throughout the nation, exposing literally hundreds of apparent conflicts of interest.
We found, for example, that 41 of America’s 50 state legislatures have part-time “citizen legislators” with other day jobs, but only seven states actually have conflict of interest ethics laws pertaining to their conduct of official business. According to an analysis of financial disclosure reports filed in 1999 by state legislators throughout the US (in 47 of 50 states – three states had no publicly available personal financial disclosure information about lawmakers), Center journalists discovered that “more than one in five lawmakers sat on a legislative committee that regulated their professional or business interest (in 41 of the 50 states, elected legislators only serve part-time, drawing an average annual salary then of $18,000).” And at least 18 percent of the nation’s state lawmakers “had financial ties to businesses or organisations that lobby state government… leav[ing] the public interest to career lawyers, bankers, farmers, lobbyists and insurance brokers in the legislature.”
This was the first national investigative journalism about apparent conflicts of interest (or the appearance of what I have called “legal corruption”) in state legislatures, and it won the second annual Investigative Reporters and Editors (IRE) online investigative reporting award. The award judges noted that “this is the first comprehensive look at all state legislators in one place and the interactive nature of the project allows voters to see for themselves how their lawmakers measure up.”
That national, state-level scrutiny has continued through the years, with major Center reports in 2004, 2006, and 2009. After these investigative revelations and the ripple effects of local media coverage, 21 states changed their financial disclosure laws, forms, or rules pertaining to lawmakers. Similarly, after the Center exposed the lax disclosure systems in states regarding lobbying, 24 states improved their lobbyist transparency requirements.
But the 2012 States investigation was the largest such effort to date, an unprecedented, data-driven analysis of transparency and accountability in all 50 states… a collaboration [between] the Center for Public Integrity, Global Integrity and Public Radio International (PRI), in co-operation with the Investigative News Network (now called the Institute for Nonprofit News, INN, comprising over 100 non-profit news member organisations). Each state received a ranking, based on 330 “Integrity Indicators” in 14 categories, such as access to information, campaign finance and executive accountability, along with others.
The project caught the public’s imagination, garnering over 1,200 news stories nationwide, including 89 local public radio stories produced and aired by 16 local public radio stations in California, Washington, New York, Texas, Pennsylvania, Massachusetts, Florida, Colorado, Oregon, North Carolina, Ohio, Missouri, New Hampshire, and Washington, DC. Several states subsequently passed new transparency and ethics-related laws.
Returning to the commercial journalism milieu, inside large news organisations intra- and inter-newsroom, domestic and foreign bureau editorial collaborations also have become substantially more feasible because of the various new media technological advances. And some of the most outstanding public service journalism certainly has benefited enormously from technologically enabled, multimedia collaborations between various news organisation bureaus, as well as editorial coordination and communication on a heretofore unimaginably large scale on important, exceedingly difficult, timely news-making projects.
For example, in the United States, the New York Times won an unprecedented seven Pulitzer Prizes in a single year, 2002 – six of them about the terrorist attacks on Sept. 11, 2001 (in the previous century, no US newspaper had ever won more than three Pulitzer Prizes in a single year). More than 160 Times reporters, photographers and editors around the US and the world were involved in the remarkable, herculean daily and long-form media coverage, which included 2,000 brief “Portraits of Grief” stories chronicling the lives and deaths of the missing at “Ground Zero” where the attacks occurred, as well as a large, heartrending book with “charts, graphs and 250 full-color photographs documenting the gripping scenes.”
And in 2010, The Washington Post two-time Pulitzer Prize winner Dana Priest and author/journalist William Arkin, both respected national security journalists, and a team of 28 “investigative reporters, cartography experts, database reporters, video journalists, researchers, interactive graphic designers, digital designers, graphic designers and graphics editors” conducted an extraordinary two-year investigation into the US government’s nearly decade-long response to the horrific terrorist attacks on Sept. 11, 2001.
The first in a series of investigative articles on “Top Secret America” was headlined “A Hidden World, Growing Beyond Control,” and the opening sentence was:
The top-secret world the government created in response to the terrorist attacks of Sept. 11, 2001, has become so large, so unwieldy and so secretive that no one knows how much money it costs, how many people it employs, how many programs exist within it or exactly how many agencies do the same work.
The investigative team learned that some 1,271 government organisations and 1,931 private companies work on programs related to counter-terrorism, homeland security and intelligence in about 10,000 locations across the United States, and that in the Washington, DC, area, “33 building complexes for top-secret intelligence work are under construction or have been built since September 2001. Together they occupy the equivalent of almost three Pentagons or 22 U.S. Capitol buildings.” The reporting/research team found that “many security and intelligence agencies do the same work, creating redundancy and waste. For example, 51 federal organisations and military commands, operating in 15 U.S. cities, track the flow of money to and from terrorist networks.”
Besides the series, their related book, Top Secret America: The Rise of the New American Security State, was a national bestseller and it was also accompanied by a PBS Frontline documentary by the same name. The investigative project’s methodology was highly sophisticated, and it detailed how they analysed an extraordinarily complex labyrinth of “hundreds of thousands of public records of government organisations and private-sector companies.” The project team ‘scraped’ thousands of corporate and local, state, and federal government agency websites, and upon publication, also presented extraordinary, state- of-the-art data visualisation graphics for the reader to better understand the myriad issues involved via straightforward presentations such as “See the map,” “Explore connections,” “Find companies,” and “Search the data.”
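The Post has not published the scraper its team used, but the general technique of “scraping” – extracting structured facts from pages built for browsers – can be sketched with Python’s standard-library HTML parser. The page fragment, tag attributes, and company names below are all invented for illustration only:

```python
from html.parser import HTMLParser

# An invented fragment standing in for a contractor-listing page of the
# kind a scraping project harvests; a real scraper would fetch pages
# over HTTP before parsing them.
PAGE = """
<ul class="contractors">
  <li data-city="McLean">Alpha Analytics LLC</li>
  <li data-city="Tampa">Beacon Systems Inc</li>
</ul>
"""

class ContractorParser(HTMLParser):
    """Collect (company, city) pairs from <li data-city=...> entries."""

    def __init__(self):
        super().__init__()
        self.rows = []
        self._city = None  # city of the <li> we are currently inside

    def handle_starttag(self, tag, attrs):
        if tag == "li":
            self._city = dict(attrs).get("data-city")

    def handle_data(self, data):
        # Only record non-whitespace text that follows an <li> start tag.
        if self._city and data.strip():
            self.rows.append((data.strip(), self._city))
            self._city = None

parser = ContractorParser()
parser.feed(PAGE)
print(parser.rows)
```

Run across thousands of corporate and government pages, results like these become rows in a database that can then be mapped, cross-referenced, and visualised, as the “Top Secret America” site did.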
The operative word here is data. Beginning with CBS’s sophisticated public opinion polling during the 1952 US presidential election, a trend that especially accelerated in the 1970s, news organisations increasingly realised the critical importance of gathering, sorting, sifting, and analysing massive amounts of computer data in order to better inform their journalism and the public.
A pre-eminent pioneer in “computer-assisted reporting” has been American journalist Philip Meyer, not only because of his public opinion research regarding vital matters of the day, but also because of a “seminal book” first published in 1973 and still read by journalists all over the world, Precision Journalism: A Reporter’s Introduction to Social Science Methods. In it he elucidated a simple but very significant idea, with amplification: “that journalists should learn adequate research methods from scientists.”
In the United States, the National Institute for Computer-Assisted Reporting (NICAR, within Investigative Reporters and Editors, IRE, a not-for-profit organisation that is the largest, oldest investigative reporting membership organisation in the world, located at the University of Missouri School of Journalism) was created in 1989. And “since then, thousands of reporters from the USA and more than 30 other countries have been trained in applying computing to their journalistic activities… and investigative journalists have built their own quantitative databases since the early 1990s” (Gynnild 2014: 718). And every year since 2005, NICAR/IRE has presented the prestigious Philip Meyer Award, which “recognises the best journalism done using social research methods.”
Separately, facilitated by the evolution of the Web and the computerisation, and thus the increased accessibility, of government data and other heretofore paper records, another important development has been “data-driven journalism.” It is obviously related to, but somewhat different from, “traditional” computer-assisted reporting because it refers specifically to open data – data that is freely available online and can be analysed with freely accessible open-source tools. The Guardian calls its Content API and Data Store the “open-platform initiative,” and as Astrid Gynnild of the University of Bergen (Norway) has noted, the Guardian not only does “original research on data they have obtained; their Data Blog also provides a searchable index of world government data which contains more than 800 datasets (as of Feb. 13, 2013)”.
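The defining promise of open data is that any reader can repeat the analysis with free tools. As an illustrative sketch of that point (the dataset below is invented; Datablog datasets were typically published as downloadable spreadsheets or CSV files), Python’s standard library alone is enough:

```python
import csv
import io

# A tiny invented dataset in the spirit of the Datablog's downloadable
# CSVs (the country names are real; the figures are made up).
RAW = """country,year,spend_per_head
Norway,2012,512.3
UK,2012,310.9
Ireland,2012,298.4
"""

# Open data plus open-source tools: anyone can re-run this and check it.
rows = list(csv.DictReader(io.StringIO(RAW)))
highest = max(rows, key=lambda r: float(r["spend_per_head"]))
print(highest["country"], highest["spend_per_head"])
```

With a real downloaded dataset, the only change is reading the file from disk instead of a string, which is exactly what makes such journalism verifiable by its audience.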
No news organisation in the world has advanced open data more than the Guardian, which has proudly (and properly) noted that its “journalists have been working with – and visualising – data since the Guardian first published in 1821.” The creator and first editor in 2009 of the online Guardian’s internationally popular, daily Datablog website, guardian.co.uk/data, was Simon Rogers, author of Facts are Sacred: The Power of Data (2013), who was named the “Best UK Internet Journalist” by the Oxford Internet Institute at Oxford University. The Guardian Datablog is “the first systematic effort to incorporate publicly available data sources into news reporting,” and it is very possibly the “world’s most popular data journalism website.”
Of course, the “biggest news” regarding data and journalism in recent years has been about the unprecedented, massive amounts of leaked secret government data, which have substantially aided and abetted collaboration between competing journalists and their respective news organisations. The three largest, most complex, and controversial secret leaked “Big Data” projects ever undertaken and reported by professional journalists in the world, according to Wired magazine (and others), have been, in chronological order: (1) the Julian Assange-led WikiLeaks “Cablegate,” a 1.73 gigabyte collection of US State Department documents that was “almost a hundred times bigger” than the leaked US Department of Defense “Pentagon Papers” (7,000 pages) in 1971; (2) former National Security Agency (NSA) contractor Edward Snowden’s leaks of approximately 1.7 million internal documents, which represent only 15 percent of the size of (3) the anonymous leak that led to the online publication of the “Panama Papers: Politicians, Criminals and the Rogue Industry that Hides their Cash” by the International Consortium of Investigative Journalists, which I founded in 1997 as a project within the Center for Public Integrity in Washington.
The Panama Papers leak consisted of 11.5 million documents that belonged to the Panamanian law firm “and corporate service provider” Mossack Fonseca, including financial and attorney–client information pertaining to over 214,000 offshore entities, including 4.8 million emails about “how rich and powerful people hide their wealth.” They were anonymously leaked by a confidential source to reporters Bastian Obermayer and Frederik Obermaier at the German newspaper Süddeutsche Zeitung and subsequently shared, organised, and published by the International Consortium of Investigative Journalists in Washington. Edward Snowden himself has correctly called the Panama Papers “the biggest leak in the history of data journalism.”
Case Study: The International Consortium of Investigative Journalists
This investigation has received numerous, prestigious awards around the world, including the Pulitzer Prize for Explanatory Reporting (along with US publishing partners McClatchy and the Miami Herald) in the United States. The Pulitzer Prize Board praised the Panama Papers exposé for its collaboration of hundreds of reporters ‘on six continents to expose the hidden infrastructure and global scale of offshore tax havens’. According to ICIJ senior editor Michael Hudson, ‘in the end, more than 400 journalists – reporters, editors, computer programmers, fact-checkers and others – worked on the project’, studied ‘millions of confidential emails and corporate documents written in French, English, Spanish, Russian, Mandarin and Arabic and us[ing] shoe-leather reporting to track down additional documents and verify facts on six continents’.
To date, the Panama Papers investigation has prompted over ‘150 inquiries, audits and investigations in 79 countries and exposed offshore companies linked to more than 150 politicians in more than 50 countries … including 14 current or former world leaders’. It also has revealed a network of people close to Russian President Vladimir Putin that ‘shuffled as much as $2 billion around the world’. And in February 2017, Panamanian government officials arrested the founders of Mossack Fonseca, the Panamanian law firm from which all of the data emanated, for money laundering.
And who was the leaker of the biggest trove of private, sensitive financial and other documents ever revealed? Intriguingly, no one knows, including Bastian Obermayer, the Süddeutsche Zeitung reporter at the receiving end of an encrypted email with this tantalising lead: “Hello, this is John Doe. Interested in data?” Seeking unequivocal anonymity, the leaker set the ground rules: “My life is in danger, we will only chat over encrypted files. No meeting ever.” Obermayer replied, “We’re very interested.” His or her motive was apparently related to income inequality issues, explaining the largest leak in history with this message, “I understood enough about their contents to realise the scale of the injustices they described.”
The reason the ICIJ could undertake and orchestrate this extensive, indeed unprecedented, global collaborative dissemination of leaked, sensitive financial and other records is that its staff and ICIJ member journalists had navigated similarly complex, international financial and tax-related issues for the preceding five years. ICIJ Director Gerard Ryle and Deputy Director Marina Walker Guevara in Washington, together with Mar Cabra, based in Madrid, Spain, the Editor overseeing the ICIJ Data & Research Unit, previously had shepherded to international publication, with media partners throughout the world, other then-unprecedented exposés of tax avoidance (legal), evasion (illegal), and “avoision” (a murky grey area of uncertain illegality or likelihood of government prosecution), also made possible by substantial bank and other leaked data. They included “Secrecy for Sale: Inside the Global Offshore Money Maze,” “Swiss Leaks: Murky Cash Sheltered by Bank Secrecy,” and “Luxembourg Leaks: Global Companies’ Secrets Exposed.”
The massive Panama Papers project was actually the ICIJ’s 26th cross-border investigation, and at the time it was published, the ICIJ was a project within the Center for Public Integrity, as it had been since its inception in late 1997. Thus, there was a substantial, 19-year, 25-investigation precedent and a logistical and technical learning curve by the organisation and its member journalists leading up to the largest investigative (or any other type of) reporting collaboration in the history of journalism.
Those prior investigations ranged widely in subject matter: illegal cigarette smuggling by the major tobacco manufacturers; the growing role of private military companies; the privatisation of water on six continents; the international trade in asbestos; the illegal black-market overfishing of the world’s oceans; and the financial “windfalls of war” to the private military companies involved in the US wars in Afghanistan and Iraq.
The ICIJ had been created in the autumn of 1997, as an internal project of the Center for Public Integrity, following five full years of exploration, planning, and fundraising. The admittedly audacious, even outlandish idea was to create an assemblage of the pre-eminent investigative reporters in the world, whom I described jokingly in private as the “Jedi Knights” of investigative journalism in each of their respective countries. I pondered the possibility, and the logistical encumbrances to be surmounted, for over five years, personally convinced that the commercial media organisations would never be able to create such a collaborative entity, frankly because of their overweening individual pride, arrogance, and competitiveness, and thus their overall inability to “play in the sandbox with others.”
And at the same time, I was firmly convinced, then and now, as I have noted in the past, that “amid a world of debilitating political dysfunction with the most dire potential consequences, the crucial concept of public accountability cannot and should not be narrowly confined by local or national borders, or the rigid strictures, orthodoxies, conceits and insecurities of traditional journalism.”
In February 2017, nearly two decades after it had been proposed and had begun as a new project of the Center for Public Integrity, for various reasons the Center and the International Consortium of Investigative Journalists agreed that it was finally time for the latter to become a separate, independent, non-profit news organisation. Incorporated in the United States, at this writing the ICIJ is awaiting formal approval by the US Internal Revenue Service (IRS) of its request to become a 501(c)(3) non-profit, tax- exempt corporation. The Panama Papers global investigation was thus the final ICIJ project published while still a project of the Center.
The Promise of Crowdsourcing and Academic–Reportorial Synergies
The relatively recent journalistic application of social science methods more common to academia, nationally and internationally, has been a de facto, implicit first stage in overall collaboration between these important spheres. From the telegraph to the computer age, the creation of the internet, the World Wide Web, and our brave new world of algorithms, bots, drone journalism, satellite imagery, and more, what is already possible in the 21st century almost defies credulity, and it is all moving at lightning speed. Consider that the terms for such recent, significant journalistic phenomena as “crowdsourcing” and “Big Data” were not even added to the Oxford English Dictionary until 2013!
“Crowdsourcing” was first used in print in a Wired magazine article in 2006 written by Jeff Howe and edited by Mark Robinson, titled “The Rise of Crowdsourcing.” And that concept and new word had been inspired in part by an important, well-received 2004 book, The Wisdom of Crowds, by James Surowiecki. The meaning of “crowdsourcing” is, according to Howe in a subsequent online blog, “the act of a company or institution taking a function once performed by employees and outsourcing it to an undefined (and generally large) network of people in the form of an open call… (a) large network of potential laborers.”
The Columbia Journalism School’s Tow Center for Digital Journalism Guide to Crowdsourcing, after 51 interviews and analysing 18 survey responses, defines journalism crowdsourcing as “the act of specifically inviting a group of people to participate in a reporting task – such as newsgathering, data collection, or analysis – through a targeted, open call for input; personal experiences; documents; or other contributions” (Onuoha 2015).
In roughly the past decade, there have been numerous, dramatic, and remarkable examples of the power and rapid evolution of citizen participation in information-gathering, including in the midst of significant national and international “news” events. Indeed, citizens’ involvement in assisting and contributing to the newsgathering process has been evolving in a dynamic, very engaging way.
For example, on the day of the “worst terrorist atrocity on British soil,” the July 7, 2005, London bombings in which four suicide bombers “with rucksacks full of explosives attacked central London, killing 52 people and injuring hundreds more,” the BBC “received 22,000 emails and text messages about the bombings and 300 photos, of which 50 were within an hour of the first bomb going off.”
On April 15, 2013, in the United States, two bombs exploded near the finish line of the Boston Marathon, killing three people and injuring more than 170 others; the pursuit of the perpetrators ended four days later, with one of them killed and the other captured. Besides the official investigation led by law enforcement officials, there was a “parallel investigation conducted by a growing movement of online sleuths, often referred to as cyber-vigilantes, or ‘digilantes.’ These groups, organically formed in ad hoc fashion, harness the power of collective knowledge and resources – ‘crowdsourcing’ – towards a common purpose. In the Boston Marathon case, cyber-sleuths were pooling information and resources in order to assist the police in their criminal investigation of the bombing.”
Nearly 12 years after the London Tube bombings and four years after the Boston Marathon bombing, a 22 May 2017 suicide bombing at a large pop concert in Manchester, England, killed 22 people and injured dozens more. Within hours the police urged those who might have “photos or video from their smartphones or dashcams to upload them to a dedicated server set up by the national U.K. authorities at ukpoliceimageappeal.co.uk.”
Possibly the most interesting and pioneering non-crime-related example of crowdsourcing came in 2009, when the Guardian created a searchable online database with thousands of spending receipts of Members of the British Parliament and asked the public to “help mine the dataset for interesting information… Over 20,000 volunteers searched more than 170,000 documents, setting a new standard for the potential of crowdsourced journalism to produce high audience engagement and tangible journalistic outcomes” (Onuoha 2015).
As Alan Rusbridger, editor-in-chief of the Guardian for 20 years, from 1995 to 2015, explained the phenomenon and the importance of news organisations directly consulting their readers about important issues of the day: “Would it be better as a newspaper to have as many other views as possible? And the answer is always yes. It has to be true. So, well, that’s it, that’s open journalism… Everywhere we tried it [crowdsourcing], it turned out to be true. We did it in sports, we did it in war reporting, we did it in education, [in] science, the environment. It was always true.”
Among U.S. news organisations, none is more involved in gathering crowdsourced information for its reporting than ProPublica, the non-profit news organisation based in New York, which has won four Pulitzer Prizes for its reporting since it began operation in 2008. And no other American news organisation “has cultivated the art of crowdsourcing like ProPublica. With patience and acumen, it has both embraced a unique mindset and developed a robust toolkit to transform enterprise journalism,” according to a Columbia University Tow Center for Digital Journalism report. Its crowdsourcing has enriched several ProPublica exposés “focusing on patient safety, nursing home inspections,” surgeons, etc. (Onuoha 2015).
But no publisher in the world utilises the combined energies and wisdom of the crowd more broadly or extensively than Wikipedia, the self-described “free online encyclopedia” founded by Jimmy Wales and Larry Sanger in 2001 and owned by the non-profit, US-based organisation Wikimedia Foundation. Not only is it “the largest and most popular general reference work on the Internet,” it is “ranked among the ten most popular websites” in the world. According to Wales, 70,000 to 80,000 people around the world edit Wikipedia at least five times a month, and within that, there is a smaller group of approximately 3,000–5,000 “core editors.”
Another important development in the annals of 21st-century journalistic progress has been the creation of The Conversation, an independent, not-for-profit media outlet that primarily publishes information from the academic and research communities. It was launched in Australia in 2011 and in the UK in 2013, co-founded by Jack Rejtman (formerly with Yahoo News) and veteran British and Australian newspaper editor Andrew Jaspan, who was the Editor and Executive Director of The Conversation for six years, from 2011 to 2017. During that time, he “secured funding and led the launches of the UK, US, Africa, French and Global editions.” As of April 2017, The Conversation worldwide had published “58,700 articles contributed by 26,000 scholars and researchers and scientists from 1,990 universities and research institutions around the world.”
The Conversation is the first entrepreneurial attempt to develop and publish editorial content written by thousands of academic and research scholars around the world, and also derive substantial operating revenue from financial contributions from colleges and universities.
In the US, at the American University (AU) School of Communication, I have informally proposed the creation of a new multidisciplinary academic field called Accountability Studies that “would involve professors with different types of accountability knowledge and expertise from throughout the university” (Lewis 2014: 66–7). I am also a member of ECOllaborative, an informal network of AU professors across six schools interested in environment-related policy and other issues. And, separately, in 2015, the non-profit news organisation I lead, the Investigative Reporting Workshop (which co-publishes/co-produces with the Washington Post and the PBS documentary programme Frontline), collaborated with a public anthropologist on the AU faculty, Associate Professor David Vine, assisting him with the graphic design and global mapping work relating to his book and publishing an excerpt from it about the astonishing number and extent of US military bases and installations throughout the world.
All of this is positive and productive in terms of ‘the possible’ eclectic research collaborations and the increasing need to “tear down” the various walls impeding their evolution and progress.
Fundamentally, for citizens of the world, the extraordinary reading, writing, and publishing possibilities and opportunities online are without precedent in history. There is a greater collective clamouring for information, for truth, for accountability now than at any previous time. And thanks to the internet and the World Wide Web, the ever-evolving global search engines, and other recent computer-related capabilities, infinitely more information is now readily available to us, and that will keep increasing exponentially. We now find ourselves in a previously unfathomable, symbiotic moment, a wholly new dimension in terms of professional, scholarly, technological, and creative communication and cooperation.
Imagine a world in which non-government organisation researchers, public interest activists, lawyers, government prosecutors and investigators, corporate investigators, forensic accountants, political scientists, computer and other scientists, investigative historians, public anthropologists, and journalists are occasionally looking in all the same places. Imagine that, to varying degrees, they are all beginning to utilise the same exciting new data technologies and analytics and other intellectual cross-pollination possibilities, exchanging ideas and sometimes working and writing together, side by side, across borders and genres.
These are collaborative, 21st-century fact-finders, fact-checkers, and, more broadly, truth-travellers and truth-tellers, searching for information, its verification, and “the truth,” each of them coming from very different perspectives, educational backgrounds, interests, and areas of professional expertise, not to mention internationally and culturally diverse geographic and economic circumstances. But despite these differences, they have much in common – they are all intrinsically curious and have an inordinate amount of patience, determination, and mettle. They are willing, if necessary, to persevere in their quest for answers for months, years, and sometimes even decades.
I find this suddenly noticeable, global community of interest in verifiable knowledge and understanding to be very exciting and auspicious when it comes to the future of truth and, more narrowly, the future of journalism. For it is in our common interest, as citizens living in a representative democracy predicated on the principle of self-determination and self-government, to be reasonably well informed and able to distinguish between reality and unreality, fact from fiction and faction. We therefore all necessarily have a shared value in needing to know the basic truth of the matter, whatever that specific matter is. As Bill Kovach and Tom Rosenstiel noted in their seminal book, The Elements of Journalism, “Journalism’s first obligation is to the truth.” But as they also note, “that, in turn, implies a two-way process. The citizen has an obligation to approach the news with an open mind and not just a desire that the news reinforce existing opinion” (Kovach and Rosenstiel 2007: 36–50, 249). As citizens, fundamentally, we all have an obligation to the truth. And I have never believed that the search for truth is, can be, or should be the exclusive preserve of journalists.
Facing the Future: Beyond the Current Conventions of Communication
All of the above explorations and initiatives regarding journalistic and other creative collaborations are important, constructive, and connote forward progress. But it is not unreasonable to also ask an inconvenient question. Are they sufficiently responsive to the serious, profound issues confronting this troubled world and, in particular, its pressing information and public accountability needs?
In this “World Wide Web” era with its shared information, increasing collaboration, “wisdom of the crowd” sensibilities, and vast social networks of millions of people broadly interested in the same subjects or thematic, cross-border issues (e.g. health, environment, human rights, security, etc.), 21st-century newsgathering must rise above traditional but ultimately parochial metropolitan and nation-state geographic boundaries. The aperture of journalists’ and citizens’ lenses must necessarily become much, much wider, beyond borders, geographical and otherwise. Accountability of those in public and private power can and must continue to be precise and granular, of course, informed by specific, publicly available, accountability-related data. But it is the view of this author that the overall concept of public accountability – and, in particular, the important journalism about it – increasingly cannot and should not be narrowly confined by mere geographic boundaries, whether a town, city, county, state, or country.
Instead, it must consist more of broader, amassed knowledge and understanding, across borders, professional disciplines, and cultures, perhaps through the precise prism of documented, reliably sourced public accountability issues in the world, in the context of the uses, the occasionally glaring, wilful non-uses, misuses, and abuses of political, corporate, and other power. Imagine combining the most authoritative known information from various disparate sectors, including journalism, but also such academic areas of expertise as investigative history, forensic accounting, computer science and statistics, political science, economics, public anthropology, human rights, public interest, and other law-related fields.
That kind of collaborative, accountability journalism, across fields, sectors, borders, and cultures, is all quite possible but it is still insufficiently explored because of various professional, political, logistical, and other encumbrances and realities.
The need to more fully illuminate the uses and abuses of power is quite obvious. Imagine a place online where you could go to find amassed, online searchable, accountability-related, primary documents-based information in the world from national and multilateral government offices that is credible, documented, and authoritative. Information, for example, about who exactly the worst corporate, financial scofflaws are, who the documented (based on government or criminal/civil court information) worst corporate violators of national or international safety, environmental, health, financial, and other laws and regulations are. Imagine a central, public registry online for all of the companies in the past decade decertified by one or more of the world’s stock exchanges for fraud or other misbehaviour, all of the private interests found to have violated national and international laws worldwide, etc.
In terms of transparency, accountability, and responsive journalism and democracy, all of these things ideally should be available and accessible to the public today. But they are not, and it is unlikely that this will change anytime soon.
But perhaps, as the 18th-century English writer Samuel Johnson reportedly found in a different context, we will unexpectedly encounter “the triumph of hope over experience.”