“Wikimedia and Censorship”

A Case Study Comparison of the Top 10 Censored Stories of 2010-2011

By Jim DeBrosse

In a case study of Wikipedia and Wikinews coverage of Project Censored’s Top 10 Censored Stories of 2010-2011, this paper found that Wikinews failed to cover any of the 10 censored stories, while Wikipedia fully covered one story and provided partial matches for five others, results that affirm the discourse theories of Habermas. Wikinews’ total failure may be attributed primarily to its lack of reporting resources and its emphasis on breaking news. And while Wikipedia had greater success than the mainstream media in capturing censored factual content, its neutral point of view and “reliable sources” requirements resulted in a failure to capture alternative points of view and to analyze and synthesize news topics in ways that challenge the hegemonic view. The more complex and controversial the topic on the Project Censored list, the more likely Wikipedia had no match or only a partial match for the story.


In recent years, the online encyclopedia Wikipedia and its sister online news site, Wikinews, have become the foci of a scholarly debate over whether their open-access, non-commercial structure and transparent, collaborative editing process are sufficient to protect both their reliability and their claim to neutrality. This is not a trivial issue given that Wikipedia, in its own words “the free encyclopedia that anyone can edit” (Wikipedia, 2012a), is the nation’s most frequently consulted online reference site and the sixth most popular site overall (Alexa, 2012a). More than a decade after its start-up in 2001, the English version listed nearly four million articles, more than 16 million registered users and 1,507 administrators (Wikipedia, 2012a). Worldwide, Wikipedia is available in more than 250 languages, including Greenlandic, which is spoken by fewer than 60,000 people (Smith, 2011).

Wikinews was spun off from Wikipedia in 2004 with the idea of capturing a greater number of breaking, less issue-oriented news stories; however, the site has been far less successful than its more established sister project. According to Alexa, Wikinews ranks 28,590th in traffic among U.S. websites (Alexa, 2012b). Perhaps more revealing, in the first 10 days of February 2012, Wikinews ran an average of about one story per day while Wikipedia’s “In the News” section averaged more than 10 stories per day (Wikipedia, 2012c). Wikinews hews to the same principles and process as its sister site, but because its stated aim is to capture a “snapshot” of history rather than encyclopedic knowledge, its stories are “locked” from any further edits after three days on the site and sent to its archives “fully protected” after a week (Wikinews, 2012a).

The growing popularity of the Wikimedia sites has brought with it increasing academic scrutiny of both their promise and their pitfalls as online public forums. Researchers have praised their openness, transparency and collaborative approach to information building as a possible means of challenging the constraints of mainstream media. But the same researchers have also expressed fears of dominance and selection bias by the sites’ most frequent and/or insistent contributors and editors, some of whom may have an agenda or at least qualities that are not representative of the larger diversity of society (Thorsen, 2008; McIntosh, 2008; Hansen et al., 2009; Hartelius, 2010; Oboler et al., 2010). In essence, the very qualities that make the sites an alternative to mainstream media – lack of ownership, cooperative format and open access to contributions and editing by anyone – also put them at risk of inaccuracies, vandalism and lop-sided struggles for control of content.

As Habermas tells us, all media are human constructs and therefore can only approach in varying degrees but cannot reach the ideal of rational public discourse where every point of view can be presented and weighed in a dispassionate, collaborative assessment of the “facts.” In Habermas’s “ideal speech situation,” the following rules apply: 1) Every competent person is allowed to take part in the discussion; 2) Every person can question any assertion, introduce any assertion and express his or her attitudes, desires and needs; and 3) No person may be prevented, by internal or external coercion, from exercising their rights in 1 and 2 (Habermas, 1990).

The question to be examined in this paper is whether Wikipedia and Wikinews, despite their obvious risks and flaws, better approximate the Habermasian ideal in their coverage of contemporary issues than do the mainstream media. In an attempt to shed light on that question, I will search Wikinews and Wikipedia for the top 10 censored stories of 2010-2011, as determined by the media watchdog Project Censored, and compare their coverage against the deficits in the mainstream media cited by Project Censored authors.

As a non-profit, university-based organization whose board members include prominent theorists of political economy such as Noam Chomsky, Project Censored can be expected to list among its censored stories those topics most affected by corporate ownership and hegemonic framing. Because Wikipedia and Wikinews have no owners or commercial interests and are open and free to all users, we should expect those sites, especially when considered in combination, to have done a more complete job of covering the top 10 censored stories than did the mainstream media.

Literature Review

Almost since its inception, the Internet has been seen as a democracy-enhancing technology that would give users more control over “what they say, what they are told and whom they talk to” (Williams, 2003, p. 91). Because of the Internet’s open access and nearly unlimited venues, Rheingold (1994) predicted that it would be a tool to repel the domination of information flow and the management of public opinion by large corporations and the state. Toffler (1980) and Negroponte (1996) also believed that the Internet would broaden choices and empower individuals to resist the influence of large, monopolistic media corporations.

Wikipedia, more than many websites on the Internet, has striven to fulfill that early promise through both its technical structure and its guiding principles for information exchange. By allowing anyone to contribute to or edit a topic, Wikipedia holds that “censorship or imposing ‘official’ points of view is extremely difficult to achieve and usually fails after a period of time. Eventually, for most articles, all notable views become fairly described and a neutral point of view reached” (Wikipedia, 2012a). The site states that its articles are written mostly by amateurs, “with expert credentials given no extra weight.” Its reliance on a diverse range of amateurs with more free time helps assure that the site “can make rapid changes in response to current events” and “significantly reduce(s) regional and cultural bias found in many other publications, and makes it very difficult for any group to censor or impose bias.”

To safeguard against vandalism and manipulation of content, Wikipedia software retains an accessible history of all edits and changes as well as the identities of those making the changes. Anyone can report and correct vandalism – that is, intentional falsification or malicious deletion of content. Contentious topics also have discussion or “talk” pages, separate from the content pages, in which contributors can air their differences. Those who abuse their user accounts, such as creating undetected multiple accounts (“sock puppets”), soliciting other parties to support their point of view or disrupting the editing process “in an annoying manner,” may have their accounts blocked, their access to the site banned and even their identities and purposes exposed to prevent future abuse of the site (Wikipedia, 2012a). As a final safety check, Wikipedia has a team of volunteer administrators who can step in when disputes are irresolvable. Administrators are “trusted with access to restricted technical features” that allow them to “protect, delete and restore pages, move pages over redirects, hide and delete page revisions, edit protected pages, and block other editors” (Wikipedia, 2012a).

Some scholars believe that Wikipedia’s open, transparent and collaborative approach to producing content “embodies the promise of technology-enabled rational discourse” that Habermas (1976, 1984) argued would guide the public sphere away from unwarranted societal control (Hansen et al., 2009, p. 38). Habermas tells us that it is only when participants can contest the validity of all claims while also sincerely desiring to arrive at a mutual understanding that rational discourse can ensue. This presupposes, above all, that the dialogue is free of all force (Habermas, 1984, p. 25). Habermas’s “ideal speech situation” assumes that every actor has the ability to participate and express themselves and to question or introduce any proposal and that no actor is subject to compulsion (Habermas, 1990).

After examining case studies of three controversial topics listed on Wikipedia – the Armenian genocide, ethanol fuel and intellectual property – Hansen et al. conclude that the site “offers an example of an information system that supports the emancipatory objectives of critical social theory” (2009, p. 53). Although they acknowledge that Wikipedia is potentially biased toward the educated and those with more persuasive verbal skills, the researchers say the site approximates the Habermasian ideal by engaging contributors “in a cooperative search for truth” (Hansen et al., 2009, p. 42). “Discourse is not only allowed, but is enabled and fully mediated, by an information system” that assumes “the good faith” of its participants to reach a consensus (Hansen et al., 2009, p. 41). Only when the consensus process fails do administrators intervene by blocking or banning certain editors who demonstrate that they cannot play by the rules (Hansen et al., 2009, p. 49). By giving equal weight to all editors, Wikipedia reflects Habermas’s anti-elitist ideal that “truth claims should be mediated by the force of the argument rather than by the credential of the individual” (Hansen et al., 2009, p. 49). The authors concede that the neutral point of view requirement on Wikipedia cannot eliminate the strategic intent of some participants to skew the content in their favor, “yet the overall outcome, due to the structure of the discourse, can be considered rational discourse,” especially since the discourse remains open-ended in a sustained effort to arrive at the truth (2009, p. 50).

Oboler et al. (2010), however, argue that Wikipedia is vulnerable to many of the same distortions of the truth that afflict traditional media, in particular gatekeeping, defined by McQuail as “the process by which selections are made in media work, especially decisions whether or not to admit a particular news story to pass through the ‘gates’ of the news medium” (1994, p. 213), and also framing, defined by Gitlin as “persistent patterns of cognition, interpretation, and presentation, of selection, emphasis and exclusion by which symbol-handlers routinely organize discourse” (1980, p. 7). The authors say their examination of the public edit histories of 16 non-governmental human rights organizations active in the Arab-Israeli conflict shows evidence of framing by certain editors who “eliminated criticism published by reliable sources,” thus “allow(ing) parts of Wikipedia to be dominated by those with an agenda” (Oboler et al., 2010, p. 295).

Their analysis of the public edit histories of the 16 NGOs broke the offending editors into four categories: 1) campaigners who edit across many topics toward a larger goal, 2) advocates who are concerned with one page or a very limited topic in which they may have a vested interest, 3) single-issue editors who drop in and briefly interact with other editors and 4) casual editors whose attention is “divided and thinly spread” over many topics (Oboler et al., 2010, p. 293). The authors conclude that “these problems limit Wikipedia’s ability to improve in quality and must likewise limit our faith in what we read there, especially on contentious topics” (Oboler et al., 2010, p. 295).

From a rhetorical perspective, Hartelius (2010) holds that Wikipedia is susceptible to “Wikilobbying” by those “seeking free advertising on its site” (p. 505) and is “compromised by the establishment of a ‘technocratic’ hierarchy” (p. 505) of younger, more educated English-speaking males (p. 516). Even so, the author says, those forces are counter-balanced by others that create hope for its democratizing potential. The site’s open and continuous dialogic process “forces scholars to reexamine fundamental assumptions and accepted symbolic orders” and, through the transparency of its editing process, renders them more accountable to others (Hartelius, 2010, p. 519).

Because of its more transitory content, Wikinews faces additional challenges to its neutral point of view and to its thoroughness as a news gathering site. Bruns (2006) looked at the stagnant growth of Wikinews articles and concluded that its strict adherence to a neutral point of view and its time-limited editing often resulted in dull prose, too-heavy reliance on traditional media and a restricted number of points of view on any article. Because of its belief in being free of personal bias, Bruns argues, Wikinews “perpetuates and condenses the traditional story format of journalism, rather than developing a mode of news coverage which is better suited to dealing with complex events and presenting multiple perspectives on the news” (p. 9).

Thorsen (2008) took a closer look at the editing histories of Wikinews articles and concluded that its neutral point of view policy often drove away contributors who saw their efforts nitpicked and eviscerated of all but the “facts” as determined by an insistent group of other contributors. In an article about a mercury spill in a residential area, she found that quotes from people directly affected by the spill were excised in the name of neutrality while quotes from official sources were more often left intact, which “clearly suggests that a greater importance is placed upon professional people or members of the media than the local residents” (p. 947). She concludes that the application of the neutral point of view to Wikinews “is clearly inconsistent and more likely to reflect the individual contributors’ interpretation than a unified concept” (p. 952). In a similar vein, McIntosh (2008) concluded that by “emphasizing an ill-defined notion of ‘facts’,” Wikinews strips stories of their meaning and context for readers, limits how many points of view are presented and fails “to tackle some of the thorny issues of the day” (p. 208).

If, indeed, Wikipedia and Wikinews shift the locus of power in creating and shaping content away from societal control and toward the collaborative (albeit neutralizing and limited) perspective of its contributors, we might still expect those sites to present many of the topics and points of view that Project Censored says “are underreported, ignored, misrepresented or censored by the U.S. corporate media” (Project Censored, 2012a).


Censorship Background

Project Censored is a media research program based at Sonoma State University in California that collaborates with independent media organizations to advocate for free press rights in the United States. Through a partnership of faculty, students, and the community, Project Censored publishes a ranking each year of the top 25 most censored nationally important news stories (2012a). The group defines censorship “as the subtle yet constant and sophisticated manipulation of reality in our mass media outlet” via the intentional exclusion of part or all of a story “based on anything other than a desire to tell the truth” (2012a). Manipulation can come from government officials, advertisers and funders as well as the threat of lawsuits from “deep-pocket individuals, corporations, and institutions” (2012a).

While some critics have labeled the program as “left leaning” (2012b), Project Censored denies such claims, pointing out that “over 200 faculty and students from multiple disciplines and political orientations work with Project Censored each year.” The project compiles its annual list of the top 25 censored stories by examining between 700 and 1,000 story submissions each year “from journalists, scholars, librarians, and concerned citizens around the world” (2012c). With help from SSU faculty, students, and community members, the group reviews the story submissions “for coverage, content, reliability of sources and national significance” and selects 25 stories to submit to a panel of prominent scholars and journalists who then rank them in order of importance (Project Censored, 2012c).

Project Censored’s sourcing for the censored stories can include mainstream and alternative English language media, public documents, research papers and academic websites. In 4 of the 10 censored stories, Project Censored listed a mainstream medium as a source. No mainstream medium was listed as a source for more than one censored story.

Search Method

This study will look at the top ten censored stories most recently compiled for Project Censored’s Annual Yearbook 2012 (stories from 2010-2011). The search for matches on Wikinews and Wikipedia will proceed through the following steps for each search entry:

1. Complete headline of censored story.

2. Keywords in headline of censored story.

3. Keywords drawn from summary of censored story.

Categorizing Search Results

Search results for Wikinews and Wikipedia will be broken down into the following categories:  Exact Match, Partial Match, and No Match.
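The search and categorization procedure above can be sketched programmatically. In the sketch below, the MediaWiki full-text search endpoint (`api.php?action=query&list=search`) is the real API that both Wikipedia and Wikinews expose; the keyword normalization, stopword list and overlap threshold, however, are illustrative assumptions of mine, not criteria used in this study.

```python
from urllib.parse import urlencode

def search_url(site, query):
    """Build a MediaWiki full-text search query URL for a wiki
    (e.g. en.wikipedia.org or en.wikinews.org)."""
    params = {"action": "query", "list": "search",
              "srsearch": query, "format": "json"}
    return f"https://{site}/w/api.php?{urlencode(params)}"

# Illustrative stopword list; "us" is stripped here only to keep the
# keyword sets small, which is itself an assumption.
STOPWORDS = {"the", "a", "an", "of", "in", "and", "to", "for", "on", "us"}

def keywords(text):
    """Reduce a headline or article title to a set of content keywords."""
    cleaned = text.lower().replace("?", "").replace(":", "")
    return {w for w in cleaned.split() if w not in STOPWORDS}

def classify(headline, article_title, threshold=0.5):
    """Classify a candidate article title against a censored-story headline
    as 'Exact Match', 'Partial Match' or 'No Match'. The 0.5 overlap
    threshold is a hypothetical stand-in for the study's judgment call."""
    h, t = keywords(headline), keywords(article_title)
    if not h:
        return "No Match"
    overlap = len(h & t) / len(h)
    if overlap == 1.0:
        return "Exact Match"
    if overlap >= threshold:
        return "Partial Match"
    return "No Match"
```

For example, `classify("Global Food Crisis Expands", "2010-2011 global food crisis")` yields a partial match, while a heading like “Ntrepid” shares no keywords with “US Military Manipulates the Social Media” and would be missed by keyword search alone, which is why the study also searches on keywords drawn from each story summary.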

Censorship Details

Each Partial Match will be broken down further into:

1) Missing Content.

2) Misreported Content.

3) Under-Represented Point of View.

4) Misrepresented Point of View.

Talk Pages and Revision Histories

Partial Matches will be further examined for evidence of deleted content or points of view as well as disputes during the editing process. A talk page, or discussion page, is a page where editors can discuss improvements to a Wikipedia or Wikinews article or deal with disputes over editing changes (Wikipedia, 2012e; Wikinews, 2012d). Both Wikipedia and Wikinews provide a revision history for each article, showing in chronological order each step in the editing process and by whom it was done. The talk pages and revision histories will be examined in the discussion section of this paper.
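These talk pages and revision histories are themselves retrievable programmatically. The sketch below builds the standard MediaWiki API query for an article’s revision history and maps an article title to its talk page; the endpoint and parameters are documented MediaWiki API features, while the helper names are my own.

```python
from urllib.parse import urlencode

def revision_history_url(site, title, limit=50):
    """Build a MediaWiki API query for an article's revision history,
    returning the timestamp, user and edit summary of each revision."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user|comment",  # who changed what, and when
        "rvlimit": limit,
        "format": "json",
    }
    return f"https://{site}/w/api.php?{urlencode(params)}"

def talk_page_title(title):
    """Map an article title to its discussion page, which lives in the
    'Talk:' namespace on both Wikipedia and Wikinews."""
    return f"Talk:{title}"
```

The same query works against either project by swapping the `site` argument, e.g. `revision_history_url("en.wikinews.org", ...)`.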


Findings and Discussion

The failure to find any of the top 10 censored stories on Wikinews is not surprising given the array of complex, highly interpretive stories on the Project Censored Top 10 list and the recognized limitations on the site’s ability to go beyond “the traditional story format” (Bruns, 2006, p. 9). As Bruns points out, Wikinews’ time limit for gathering news (3 days) and its mandatory neutral point of view prevent the site from “dealing with complex events and presenting multiple perspectives on the news” (p. 9). The results likewise support McIntosh’s argument that Wikinews’ non-interpretive notion of “facts” strips stories of their larger meaning, limits their points of view and diverts the site’s coverage from “the thorny issues of the day” (2008, p. 208). Finally, as a volunteer operation launched in 2004, Wikinews lacks the resources to touch on more than just the most superficial events of the day. As of March 11, 2012, the site was posting an average of 10 stories a day with a list of just 20 active accredited users and 21 inactive accredited users (Wikinews, 2012b). Wikinews had 202,000 registered users and 61 administrators as of April 2010 compared to Wikipedia’s 12 million users and 1,700 administrators (Wikinews, 2012c; Wikipedia, 2012a).

With no time limits on its updates and 60 times as many potential contributors as its sister project, Wikipedia achieved better but mixed results from the search. The findings suggest that Wikipedia suffers from some of the same delimiters pointed out by critics of Wikinews – a neutral point of view and a belief in uninterpreted “facts.” The more complex and controversial the topic on the Project Censored list, the more likely Wikipedia had no match or only a partial match for the story.

The one exact article match on Wikipedia was for Project Censored’s No. 2 censored story, “US Military Manipulates the Social Media.” The article was found under the unlikely heading of “Ntrepid,” the name of the California firm under U.S. military contract to provide the software for “Operation Earnest Voice.” OEV will enable U.S. service personnel to fake up to 10 online personas each for the purpose of influencing Internet conversations and spreading pro-American propaganda overseas.  The article is a straightforward presentation of “facts” culled entirely from official sources. There was little discussion or debate of the project on the article’s Talk page, an indication of low controversy. However, critics may not have noticed the article under the heading of “Ntrepid” rather than the better-known name of the project, “Operation Earnest Voice.”

“Global Food Crisis Expands” (Censored Story No. 4) was found partially covered in a single paragraph and chart provided in an article under the heading “2010-2011 global food crisis.” The article had no Talk page and no links to other Wikipedia topics. In fairness, Wikipedia did have an extensive 13-page posting under the heading “2007-2008 world food price crisis.” However, when world food prices began to fall in 2009, interest in the article apparently waned and its content was not updated. Controversy and debate are absent from its Talk pages, with only two requests to add additional causes for explaining the crisis. Out of nearly 1,000 contributors to the original article, only six revisited the issue when world food prices again began to climb in 2010-2011. The lack of follow-up could be a drawback of Wikipedia’s collaborative approach. With so many contributors to the original article, few may have felt enough ownership of the topic to revisit it when the crisis began anew. This warrants further research.

“Google Spying?” (Censored Story No. 6), about the Federal Trade Commission’s decision to drop its investigation into Google’s illegal collection of personal data from home and business Wi-Fi networks while conducting mobile surveillance for its Street View mapping project, was partially matched under two Wikipedia headings – “Criticism of Google” and “Google Street View privacy concerns.” But while Project Censored raised the issue of possible donations to the Obama campaign having quashed the FTC investigation, the Wikipedia articles mention only that Google said it had inadvertently collected the personal data while gathering information for its Street View project. Both Wikipedia articles quote critics of the privacy violation but do not include a reference to the FTC investigation. A final request on the Talk page for “Google Street View privacy concerns” asks for more information on how and why the Wi-Fi data was intercepted: “Neither the article nor the sources explain this except to say that it was inadvertent. I’m just wondering why the (Google surveillance) car would be set up to do that in the first place and I think an explanation would be a good addition to this article.” The request, posted December 14, 2010, had not been answered as of March 4, 2012.

Partial coverage of “U.S. Army and Psychology’s Largest Experiment Ever” (Censored Story No. 7) was found under the article heading of “Comprehensive Soldier Fitness,” again the official name of the U.S. Army program designed to reduce or prevent adverse psychological consequences to soldiers who endure combat. Criticism of the program has come from two directions: those who think it violates the professional ethics of psychology by “desensitizing” troops to violence and those who see an abridgement of religious freedom because the program is designed to work best with soldiers who believe in a deity. The Wikipedia article devotes an entire page to criticism of the spiritual element of the program and the Army’s defense but says nothing about the issue of desensitization. What’s more, the Talk page points out that the original article had material directly copied from a website operated by the United States Army, raising the question of whether this is an example of what Hartelius (2010) calls “Wikilobbying” by those “seeking free advertising on its site” (p. 505).

Wikipedia coverage of “Government Sponsored Technologies for Weather Modification” (Censored Story No. 9) fell into partial matches provided by three technically challenging articles – “High Frequency Active Auroral Research Program (HAARP),” “U.S. Climate Change Technology Program,” and “Cloud Seeding.” While the Project Censored summary had the advantage of drawing together and analyzing disparate government research programs with the potential to modify the environment, Wikipedia’s breakdown of each of the three modification approaches makes it harder for the reader to assimilate, and question, the government’s intentions and the potential impact of the programs. Again, as Thorsen (2008) points out, this could be the result of Wikipedia’s collaborative approach among hundreds of contributors and administrators, none of whom has the authority to synthesize disparate, complex material into a single unified concept without renegotiating the tedious months-long process of creation, revision, discussion and finalization of the article.

Wikipedia perhaps falls farthest from the Habermasian ideal with its stated preference for “reliable, published sources” for all “facts” presented in its articles (Wikipedia, 2012d). Such a preference may not entirely prevent contributors from questioning “reliable, published” assertions or from introducing alternative assertions, but it can certainly impair their ability to influence the outcome of an article. Personal observation and alternative media and research sources are given lower weight by Wikipedia administrators and often categorized as “fringe theory” (Wikipedia, 2012f). Often, too, these less weighted points of view are grouped together and marginalized within articles under a separate section titled “Conspiracy Theories” (Wikipedia, 2012g).

Wikipedia’s tendency to marginalize alternative points of view is particularly strong when dealing with topics of government secrecy, as revealed in a somewhat heated exchange between a contributor named Henri Hudson and Wikipedia administrators on the Talk page of the HAARP article. According to the Wikipedia article, the HAARP program uses high frequency radio beams to “temporarily excite a limited area of the ionosphere” in order to see if it can enhance radio communication and surveillance. Project Censored said that some scientists believe the radio waves can affect the jet stream and climate patterns. The program has been blamed for allegedly causing numerous natural disasters, from earthquakes and hurricanes to floods and droughts, because, in the words of one HAARP scientist quoted in the Wikipedia article, “its purpose seems deeply mysterious to the scientifically uninformed.”

Hudson had pushed for a separate section in the article devoted to a 1999 European Parliamentary report calling for HAARP’s “legal, ecological and ethical implications to be examined by an international independent body before any further research and testing.” A Wikipedia editor at first insisted the report belonged under a conspiracy theories section, but after Hudson objected, the editor agreed to take the issue to the Talk pages for wider review. On the Talk page, administrator Lucky Louie argued that the EU report was not a reliable source because “mention of it appears to be relegated to conspiracy websites like Prison Planet and David Icke… Wikipedia can only mirror the degree of attention that a topic has been given by reliable sources.” Hence, he argued, Wikipedia’s reliance on mainstream and official sources barred the EU reference from the article’s core content. In what appears to be a final decision, the EU report was referenced in a single sentence in the article under a separate section titled “Fringe Speculation.” The section’s first paragraph points out that HAARP “is the subject of numerous conspiracy theories.”

However, as Hudson reasonably argued, the EU report is not a theory but a joint statement of “environmental concern” from an official governing body and a request for more openness from the U.S. government about its research. In this case, Wikipedia’s “either/or” categorization for proven and unproven scientific theory fails to take into account that even non-scientists can question the aims of scientific research. While Hansen et al. (2009), perhaps more than any other scholars, believe that Wikipedia approximates the Habermasian ideal of “a cooperative search for the truth,” they also acknowledge that it has a bias toward the educated and those with more persuasive verbal skills. This article would seem to be a good example.

Wikipedia’s two partial matches for “Real Unemployment: One Out of Five in the U.S.” (Censored Story No. 10) fall under the headings of “Unemployment” and “Shadowstats.com,” the latter being a website whose author maintains that most governments manipulate their economic data to put their policies in the best possible light. In the U.S., for instance, Bureau of Labor Statistics unemployment figures count only those people who are looking for jobs and have been unemployed for less than a year. Those who have quit looking for jobs or have been unemployed longer than a year are not counted. According to Shadowstats.com, the real U.S. unemployment rate in January 2011 was 22.2%, more than double the 9% reported by the mainstream media via the Bureau of Labor Statistics.

Wikipedia’s 16-page article on “Unemployment” devotes a single paragraph to the issue of what it calls “hidden unemployment,” pointing out that the unemployed who decide to retire, continue their education, work only part-time or simply give up looking for a job are not counted as unemployed. Another, more technical section headed “Limitations of the unemployment definition” goes into more detail on this topic and points out that most economists look at an array of unemployment statistics to gauge the economy’s strength.

Wikipedia’s “Shadowstats.com” article is another and perhaps stronger example of its bias toward the educated and the credentialed. John Williams, the author of Shadowstats.com, is not described as an economist but rather as “an economic consultant with an economics BA and an MBA,” the implication being, of course, that he lacks the Ph.D. that would make him a credible analyst of economic data. Wikipedia cites a newspaper article that says Shadowstats.com subscribers “range from individual and professional investors to conspiracy theorists and gold bugs” and ends its account by saying “some experts dismiss Williams’ analysis and his claims of manipulation as ‘preposterous’.”


Did Wikipedia outperform the mainstream media in its coverage of Project Censored’s Top 10 Censored Stories of 2010-2011? In simple quantitative terms, the answer is yes. No single media outlet, either mainstream or alternative, was mentioned as a source for more than one of Project Censored’s list of 10 censored stories. Wikipedia, on the other hand, was able to fully match one of the censored stories and to provide partial though often fragmented coverage for five others. One could argue that, indeed, a collaborative, open and transparent approach to covering news provides enough freedom from the constraints of private ownership, commercial advertising and official pressures to make a noticeable difference. Further in Wikipedia’s favor, none of the exact or partial matches for the censored stories contained misreported facts or misrepresented points of view. Clearly, Wikipedia’s open and collaborative approach to editing and its emphasis on neutrality and “reliable sources” help it avoid factual errors as well as misleading representations of particular viewpoints. Indeed, a point of view is more likely not to be represented at all if its advocates don’t become part of the editing process.

But before we begin to tout Wikipedia, as Hansen has, as “embod(ying) the promise of technology-enabled rational discourse” (2009, p. 38), we must put ourselves in the place of readers who have neither the time nor the inclination to search out, aggregate and synthesize the scattered pieces of the larger picture that Wikipedia so often presents. Its information-building approach of openness, collaboration, neutrality and mainstream-defined factuality is a double-edged sword, one that also cuts away its ability to interpret, synthesize and challenge the “facts” in ways that carry the reader beyond the hegemonic point of view.

Worse, Wikipedia’s reliance on “reliable, published sources” and its discounting of alternative evidence and views condemn it to an unquestioning repetition of what the mainstream media and government officials have already supplied, a clear violation of the Habermasian ideal that all assertions remain open to question. As Wikipedia administrator Lucky Louie put it succinctly, “Wikipedia can only mirror the degree of attention that a topic has been given by reliable sources.”


Alexa. (2012a). Wikipedia.org. Retrieved February 3, 2012 from http://www.alexa.com/siteinfo/en.wikipedia.org/wiki/Main_Page.

Alexa. (2012b). Wikinews.org. Retrieved February 12, 2012 from http://www.alexa.com/siteinfo/wikinews.org.

Bruns, A. (2006). Wikinews: The next generation of alternative online news? Scan Journal, 3(1). Retrieved February 9, 2012 from http://eprints.qut.edu.au.

Gitlin, T. (1980). The whole world is watching: Mass media in the making and unmaking of the New Left. Berkeley, CA: The University of California Press.

Habermas, J. (1976). On the pragmatics of communication. Cambridge, MA: MIT Press.

Habermas, J. (1984). The theory of communicative action: Reason and the rationalization of society. Boston: Beacon Press.


Habermas, J. (1990). Discourse ethics: Notes on a program of philosophical justification. Moral consciousness and communicative action, pp. 43-115. Cambridge, MA: MIT Press.

Hansen, S., Berente, N., & Lyytinen, K. (2009). Wikipedia, critical social theory and the possibility of rational discourse. The Information Society, 25, 38-59.

Hartelius, E. J. (2010). Wikipedia and the emergence of dialogic expertise. Southern Communication Journal, 75(5), 505-526.

McIntosh, S. (2008). Collaboration, consensus, and conflict: Negotiating news the wiki way. Journalism Practice, 2(2).

McQuail, D. (1994). Mass communication theory: An introduction (3rd ed.). London: Sage.

Negroponte, N. (1996). Being Digital. London: Hodder & Stoughton.

Oboler, A., Steinberg, G., & Stern, S. (2010). The framing of political NGOs in Wikipedia through criticism elimination. Journal of Information Technology & Politics, 7, 284-299.

Project Censored. (2012a). About: What Is Modern Censorship. Retrieved February 12, 2012 from http://www.projectcensored.org/about/.

Project Censored. (2012b). Analysis of Project Censored: Are We a Left-Leaning, Conspiracy-Oriented Organization. Retrieved February 23, 2012 from http://www.projectcensored.org/top-stories/articles/analysis-of-project-censored-are-we-a-left-leaning-conspiracy-oriented-organization/.

Project Censored. (2012c). About: Project Censored Overview. Retrieved February 22, 2012 from http://www.projectcensored.org/about/.

Project Censored. (2012d). Top 25. Retrieved February 22, 2012 from http://www.projectcensored.org/the-top-25-index/.

Rheingold, H. (1994). The virtual community. London: Secker and Warburg.

Smith, B. (2011, January 15). Wikipedia turns 10 today, and while it has its critics, the online collaborative encyclopedia is one of the world’s top websites, with no plans to go commercial. The (Melbourne) Age, p. 26.

Thorsen, E. Journalistic objectivity redefined? Wikinews and the neutral point of view. New Media & Society, 10(6).

Toffler, A. The third wave. London: Pan Books.

Wikinews. (2012a) Wikinews: For Wikipedians. Retrieved February 12, 2012 from http://en.wikinews.org/wiki/Wikinews:For_Wikipedians.

Wikinews. (2012b). Wikinews: CV. Retrieved March 11, 2012 from http://en.wikinews.org/wiki/Wikinews:CV.

Wikinews. (2012c). Wikinews: Awareness. Retrieved March 11, 2012 from http://en.wikinews.org/wiki/Wikinews:Awareness_statistics#Comparison_to_other_Wikimedia_Projects.

Wikinews. (2012d). Help: Talk pages. Retrieved July 11, 2012 from http://en.wikinews.org/wiki/Help:Talk_page.

Wikipedia. (2012a). Wikipedia: About. Retrieved February 3, 2012 from http://en.wikipedia.org.

Wikipedia. (2012b). Wikipedia: Administrator. Retrieved March 11, 2012 from http://en.wikipedia.org.

Wikipedia. (2012c). Wikipedia: In the News. Retrieved February 12, 2012 from http://en.wikipedia.org/wiki/Main_Page.

Wikipedia. (2012d). Wikipedia: Identifying reliable sources. Retrieved February 12, 2012 from http://en.wikipedia.org/wiki/Wikipedia:RS.

Wikipedia. (2012e). Help: Using talk pages. Retrieved July 11, 2012 from http://en.wikipedia.org/wiki/Help:Using_talk_pages.

Wikipedia. (2012f). Wikipedia: Fringe theory. Retrieved July 12, 2012 from http://en.wikipedia.org/wiki/Fringe_theory.

Wikipedia. (2012g). Wikipedia: Conspiracy theory. Retrieved July 12, 2012 from http://en.wikipedia.org/wiki/Conspiracy_theory.

Williams, K. (2003). Understanding media theory. London: Hodder Arnold.