Collaboration/Corroboration: Questions of Power in the Collaborative Future

When I started reading Collaborative Futures I wanted to agree with the authors' notion that greater collaboration is better for society. However, the further I read, the more uneasy I began to feel. The authors of this collaboratively written work provide a nuanced examination of the benefits and potential pitfalls of collaboration. I fully support the notion that groups working collaboratively are better at solving problems and innovating creative solutions. This seems to work best in small-scale collaborations where trust can be established, direct communication is possible, and the goals and organization of the project are clearly understood. My uneasiness comes in when the discussion shifts to large-scale open collaborations. While such collaborations could be beneficial to society, they seem to be based on an altruistic notion of humanity that, from my perspective, does not exist.

Collaborative Futures tries to separate social and cultural production from the market economy. Yet the authors also invite a Bourdieuian analysis of their project when they invoke the term “cultural capital.” The invocation of Bourdieu negates the notion of altruism since, according to Bourdieu, all human action is interested action. Individuals will always act, consciously or not, in what they believe to be their best interests. This goes beyond the notion of economic incentive, however, to include the accrual of other types of capital. In this sense, the Free Culture movement establishes its own economy based on social and symbolic capital rather than monetary capital.

If we see large-scale collaborations as establishing their own economies, then we have to acknowledge that they are also sites of power struggle. Does the open structure of large-scale collaboration create the possibility that a small, well-organized group could infiltrate and wield the power of the larger collective towards its own ends? Does the very nature of collaboration drive towards a state of “group think” in which dissension is silenced by the tyranny of the majority?

The authors begin to problematize this situation in their discussion of Stephen Colbert’s actions on Wikipedia. Noticing the potential for a “group think” mentality on Wikipedia, Colbert coined the term “Wikiality” to describe a phenomenon in which a group of people can alter the perception of truth through collaboration and corroboration on Wikipedia. He and his viewers altered the Wikipedia page on elephants, “claiming that the elephant population in Africa had tripled in the past 6 months” (Collaborative Futures 53). Wikipedia responded by locking the article and deleting Colbert’s account.

I see two potentially disturbing power dynamics in this situation that raise questions:

  1. Who has the power to determine the ethics of the collaboration? Wikipedia removed Colbert’s claims, and his viewers’ corroboration of those claims, citing site vandalism. Colbert’s claims were blatantly false (it is satire, after all) and were easily detected and removed. But what happens in murkier territory? The talk page for the Oberammergau Passion Play records a debate among Wikipedia editors about whether or not to include claims of anti-Semitism in the play. Ultimately a version of the anti-Semitism argument remains in the article (at least as of 3/2/2014), but is it possible that this significant element in the history of the play could be erased in service of an editor’s, or group of editors’, personal/political opinion that this aspect is no longer important? Who makes these decisions? On what basis?
  2. The issue of “Wikiality” itself raises interesting questions in terms of collaboration and corroboration. “Wikiality,” as Colbert uses it, is a form of group think in which a group of people can will facts into existence through repetition. In editing the Wikipedia page for The Lost Colony play, I ran into trouble when I tried to check and add citations for existing information on the page. It was impossible to tell which information on the internet had served as a source for the wiki and which was merely citing the wiki as the authority. In this instance the collaboration becomes the corroboration, making claims appear to be facts. How do we avoid the dangers of this self-feedback loop?

3 thoughts on “Collaboration/Corroboration: Questions of Power in the Collaborative Future”

  1. Michael Mandiberg (they/them)

    I think your concern about the feedback loop is an interesting, and somewhat difficult, one to unravel. I have experienced this myself. It has a lot to do with the fact that WP has a CC BY-SA license, which allows for commercial use, and others slurp the database regularly to populate sites with that information. That seems to be a problem of ubiquity. How do you do new research when there is a glut of the same old information clogging the bandwidth? It seems less about groupthink.

    Regarding the problems of vandalism and ethics, an experienced editor posted on her Facebook page today something to the effect of “People fear Wikipedia vandalism, but in my experience it doesn’t happen that often, and is cleaned up very quickly. The problem is that the fear of vandalism keeps people from using or editing the site.”

    Or to put it another way: it seems to me that the talk page documents a legitimate conversation between some biased and some unbiased editors, with the NPOV winning in the end, no?

    I don’t think that the process is always smooth; you could ask Christina or Silvana about their experiences with the friction of editing. I do think that that friction, which slows down action, is sometimes productive. It requires discourse.

    The fact is that speech, and the writing of history, are always political. Publications have their own ideologies, be it the New York Times, Encyclopedia Britannica, or TDR. Each of these ideologies is embedded in the editorial process and obfuscated from the reader. I think it is kind of wonderful that you can see the entire debate over the editorial decisions on a controversial article, such as, say, Scientology.

    Yes, we don’t know who these people editing Wikipedia are. We fear that they have an agenda, and many do. But what we have seen is that articles that are well developed, such as the Scientology article, offer a rather even-handed description of the facts of the matter.

    On the flip side, here are two articles that might interest you. You could see them as moments when the system breaks, or you could see them as moments of productive friction, around Sandy and Philip Roth.

    Lastly, it is worth considering that not all edits are created equal. Research has shown that the first editors to work on a page typically shape the form of the page, as most subsequent editors preserve the structure or tone of the original work.

    What do others think? Is there enough transparency? Does the NPOV come through in the end? Silvana and Christina, what is your perspective on this, having experienced some edit conflicts?

  2. Maura A. Smale (she/her)

    Very thoughtful post, Jared. You articulate some of the concerns I also felt when I read Collaborative Futures. To pick Wikipedia as an example, I think we can see groups coming together to address some of these concerns, such as last month’s Art + Feminism editathon. But the question of power is an important one, as we’re all finding out when we edit Wikipedia this semester. While there are well-publicized rules and norms for Wikipedia, it also seems to be the case that some editors interpret the rules more strictly than others. And some editors seem to feel real ownership over certain content in Wikipedia and may be inclined to exercise their power by reverting edits to that content. In those cases trust becomes difficult, both trust in the collaborative enterprise as well as, as Pamela notes, trust in the content.

    Other than the editathon model, which seems to be very successful in many ways, I wonder whether there are other practices we can engage in to build more trust into an enterprise like Wikipedia.

  3. Pamela Thielman (she/her)

    With regard to your second point, I also had some trouble with the article I was editing on Wikipedia. In my case the source was cited, but when I checked my own copy of the book in question the page numbers for the citation were wildly inaccurate, though the previous editor and I appeared to be working from the same edition. So while the source did contain the information indicated, the citation itself was a dead end. This isn’t the feedback loop you are describing, but it raises questions of authority and accountability.

    For me, a lot of this comes down to trust. In the case of Wikipedia, we have to place trust in the editors (that they are providing accurate, verifiable, unbiased information) and in the larger apparatus (that if individual editors are not living up to their role–intentionally or unintentionally–this will be discovered and rectified by other editors). In the case of The Lost Colony, this worked. You had interest and knowledge, and you made edits that brought the article into line with the stated goals of the Wikipedia project. I’m willing to put this kind of trust in Wikipedia because, on the whole, I think it works more often than not. But I agree that if we’re going to apply this model to other situations, I’m less comfortable with it.
