

Portrait of the intellectual as a DJ
Wikipedia and the question of scientific expertise


by Philippe Lacour, 12 March 2010



Wikipedia is on its way to becoming something of an institution in today’s intellectual landscape. What role, one might ask, will scientific expertise play in it? According to Philippe Lacour, this problem will not so much be solved within the encyclopaedia’s own framework as be resolved through an external process of operationalisation, following a twofold principle of clinical and critical differentiation. Wikipedia’s development rests on the disassembling and reassembling of knowledge and digital data, foreshadowing, no doubt, a new form of intellectual life.

Wikipedia and Citizendium

Wikipedia is notorious for its editorial ambiguity. Not only is its positivist norm – neutral presentation of facts, accountability and consensual writing – not always complied with (forgeries, disputes, etc.), but, more importantly, its influence within the project is structurally challenged by other philosophical approaches (liberalism, communitarianism, relativism) [1]. The debate surrounding the online encyclopaedia Citizendium may not have received sufficient attention in this regard. In the light of Wikipedia’s success, Larry Sanger’s rival project might, at first glance, appear somewhat childish and vain. Yet, personal quarrels aside (Larry Sanger is, in fact, a cofounder of Wikipedia, even though Jimmy Wales still contests this [2]), the weakness it has pointed out (quality) and the solution it proposes (expertise) legitimately call for reconsideration.

As Larry Sanger himself has pointed out [3], his project differs in many respects from Jimmy Wales’ brainchild. Admittedly, both are wikis, that is, websites worked on by a variety of people, with reading and writing permissions that can be more or less open [4]. In addition, Citizendium uses the same wiki engine as Wikipedia (MediaWiki, open source), which accounts for the resemblance in appearance. However, Larry Sanger restricted open write access to the wiki [5], opted, right from the start, for English as the only language [6] and limited peer review [7]. Without this being its main goal, Citizendium strives to recycle Wikipedia, improving spontaneous contributions through subsequent “gentle expert oversight”, fully aware that this does not amount to real expert control [8]. Its editorial policy could be described as lying somewhere between a “bazaar” (a spontaneous and chaotic bottom-up build-up) and a “cathedral” (a controlled and planned top-down construction) [9]. It also involves legal responsibility [10]: every contributor is required to sign their articles and have their identity authenticated. With only 5,600 articles, a single language and a certain feeling of déjà vu, Citizendium might seem ridiculous. And yet, this distanced position could prove highly effective in the future.

Citizendium, in fact, and by no coincidence, picks up where Nupedia left off – the online encyclopaedia that was supposed to use its wiki space (which, soon afterwards, was to become Wikipedia) as a training ground in preparation for the real thing, a makeshift storage space for unfinished articles still awaiting approval by the team of experts. That training ground then ran off to become the untamed hybrid we all know. On the face of it, this novel and surprising autonomy seemed to have completely reversed the relationship between the two projects: Nupedia appeared to be a kind of stillbirth, irrevocably supplanted by Wikipedia’s exponential growth. In fact, this supremacy is neither as absolute nor as definitive as one might at first think. Nupedia’s scientific aspirations – a resolute editing policy, decided on by a team of experts – did not simply disappear; they can still be traced, almost in their original form, in Citizendium’s methods, with only the experts’ access made much easier. Having fully accepted Wikipedia’s precedence, this new rival plays second fiddle quite aptly, taking on a more reflective character. All in all, Larry Sanger’s only real mistake is that, by keeping a universal range of topics and holding on to an ideal of neutrality [11], he is pursuing the very same ambitions as Wikipedia, for it is, as we are about to see, precisely here that Wikipedia (Version 1.0) is likely to become once again a fierce competitor. The idea of “remixing” relatively crude material under expert control, following a responsible editing policy, on the other hand, seems destined for a bright future.

Two criteria and one problem

That Larry Sanger’s intuition was not far off becomes evident considering the ongoing debate inside Wikipedia on selecting the most reliable articles after authentication by a review team. The “Wikipedia 1.0” project is to be finalised in the medium term (probably by 2010). The French Wikipedia Version 0.5 should see the light of day in 2008, with approximately 2,000 articles (in March there were about 800 of them [12]). The English Version 0.5, on the other hand, has been out since 2007, with Version 0.7 (30,000 articles) due in 2008. The question of article selection came up when the release of an offline or print version of the encyclopaedia was being considered, leading to the idea of dividing the encyclopaedia into an unstable and a so-called stabilised version.

The two stabilisation criteria that have been adopted are an article’s quality and its importance score, the two being, however, completely independent of one another. An article’s importance can be rated low, mid, high or top. The importance, or priority, is determined relative to a specific WikiProject and can thus vary from one project to the next for one and the same article. Only, if that is the case, why insist that “the article on unemployment is more important than the one on Paris Hilton” [13]? There really seems to be some sort of contradiction here, between an absolute but implicit guideline (seriousness) and the fundamental principle of freedom (relevance depending on a project and its thematic focus).

The following categories have been established for rating an article’s quality: Start-class, C-class, B-class, GA (good article), A-class and FA (featured article), with FL (featured list) as its counterpart for lists; the same scale applies to the “portals”. Yet these quality criteria are also affected by the aforementioned editorial vagueness: apart from formal criteria (spelling, grammar, layout, pictures) and compliance with document licences, “encyclopaedic quality” requires entries to be clear, exhaustive, neutral and relevant, with their sources cited, etc. It is noteworthy that quality status is established by vote; when an article is recognised as a featured (high-quality) article (marked by a yellow star in the upper right corner), this is indicated at the bottom of the page. This is how the French article on the “1920 Jerusalem riots” was elected in the first round of voting (13 in favour), whereas the one on “Adam Smith” had to go into a second round (12 votes to one). Every user with more than 50 contributions has the right to vote, and the status is only awarded if a quorum and a qualified majority are reached (80% in favour of granting featured status).

In spite of being an eligibility criterion, this project-internal evaluation does not, however, guarantee validation. This subtlety proves quite revealing of Wikipedia’s ambiguous attitude towards expertise. In a very engaging article, Christian Vandendorpe takes a somewhat prudent stance on the crucial matter of “peer review” [14]: he points to Wikipedia’s “General Disclaimers” to recall the absence of formal peer review, only to go on to describe enthusiastically the case of a university teacher encouraging his students to write an entry on Wikipedia in order to learn to face up to the challenge of peer review (notes 2 and 11). It is exactly this sort of ambiguity that is so typical of Wikipedia. First of all, the wording is unclear: the English version does indeed speak of “peer review”, while emphasising that it is not an academic peer review; the French version, for its part, speaks of an informal “reader committee” (comité de lecture) to which one may submit an article (voluntarily) by way of a simple request. The ambiguity is furthered by the fact that the encyclopaedia in effect outsources peer review, by insisting that articles be backed up by notes and sources that are themselves preferably peer-reviewed. An article of category A should, for example, “include a sufficient number of external written references, coming preferably from print sources (wherever possible from reader committee approved publications) rather than online sources”. Some of the English project pages openly subscribe to the objective of Wikipedia becoming more and more expert-controlled, even using the term “validation” – while being perfectly aware that this will lead to a break with the project’s founding philosophy [15].

Leakage paths

As we can see, there are, essentially, two limitations in Wikipedia that make a “remix” (à la Citizendium) inevitable, which are, truth be told, inherent in any encyclopaedic project, but constitute, as far as Jimmy Wales’ venture is concerned, outright digital “leakage paths”. The first of these limitations can be called clinical, the other one critical.

Aside from its medical significance, “clinical” points to the problem of the knowledge of singularity, its modalities and, most of all, its scale (or its “pixel”, to borrow a term from photography). The encyclopaedia’s virtuality leads in the long run to its universality, as it is in a continuous state of expansion. Then again, completeness is by definition impossible, for knowledge always succumbs to the detail of reality. Take, for example, the incredibly detailed entry on “bike” (vélo) on the French Wikipedia: why does it not mention the bicycle races held in Quimper in October 2002, and link to a page on the weather that day, the colour of the leaves, the shape of the clouds, etc.? Anyone who cares to give it a try will find their article deleted for being “too detailed” to be really “relevant” [16]. In the end, any kind of knowledge is based on selecting information from the chaotic multitude that reigns in this world, rendering any hope of exhaustiveness as complete and ludicrous an illusion as Borges’ fictional 1:1 scale map. Wikipedia therefore draws the line at a certain level of generality – one that a fan of cycling in Brittany, for example, might legitimately wish to exceed in pursuit of clinical precision.

The second leakage path is the critical one [17]. Time and again, the so-called “neutral point of view” has been denounced as ambiguous, especially by commentators from the online “wiki” galaxy [18]. One critic on Meatball Wiki, for example, decries the alleged objectivity of such an ideal, the false symmetries it induces and the apathy of a forced consensus (reached by eliminating controversies). In fact, this kind of “neutral” editing policy is only possible where existing knowledge is being summarised; it is impossible where research is concerned, which carries a strong potential for controversy [19] and naturally tends to multiply points of view [20].

Remix

These two difficulties will eventually cause the fragmentation of this common-knowledge platform – Wikipedia – and the subsequent emergence of new, more specialised versions with stricter editing policies [21]. These “micro-remixes” offer a space completely different from what has existed so far, not so much a substitute for as a perfect complement to today’s encyclopaedia: a universe of small interlinked groups of scientists who face each other in an arena of conflict, their struggle over interpretations constituting the very way critical knowledge is formed (rather than being a mere remnant of it [22]). Such a “battle” of viewpoints, as envisioned by Jimmy Wales and Larry Sanger, can never take place on Wikipedia (not even on the so-called “discussion” pages), nor, in all likelihood, inside one of the small groups brought together by a common concern or interest; eventually, though, it will come into existence between these groups, born out of their mutual opposition. Dissent, especially in the social sciences and humanities, does indeed require the kind of critical editorial choices that Wikipedia’s “neutrality” makes impossible. This is why, apart from knowledge transfer, Wikipedia should in future limit itself to mapping out arenas of conflict and simply stating existing positions; further critical argument should be left to other micro-wikis. Incidentally, every new Wikipedia “remix” is likely to increase each one’s dependency on this matrix, eventually turning it into a new koinè of knowledge, a necessary starting point for opinion building [23].

Obviously, this disaggregating-and-reassembling movement does not in the least conflict with Wikipedia’s unifying tendency. Far from being an exception, such a symbiosis matches the very evolution of this new web that has so inaptly been dubbed “web 2.0” or “web 3.0” and is, in fact, essentially socio-semantic in nature [24]. This becomes obvious not only in the way online knowledge is built, but also, for example, in the search for information. Indeed, the absolute reign of the big generalist search engines seems to be drawing to an end, as a multitude of specialised, more or less “social” engines (educated by users who grade the search results [25]) make their appearance; these are in turn “remixed” by meta-engines applying ranking scales that are not simply automatic (as opposed to Google’s PageRank [26]). By turning the engine into a human-guided machine rather than a self-sufficient robot, the relevance of search results can be increased each time: the computer “assists” in a search conducted by a human [27]. In this way, micro-engines recycle search results that generalist engines have come up with, and meta-engines (like ari@ne or ixquick) reassemble results found by micro-engines and standard engines… The formation of social networks could be analysed in very much the same way [28].
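To make the aggregation idea in the preceding paragraph concrete, here is a minimal sketch in Python – purely illustrative, and not the API of any engine mentioned above – of a “meta-engine” that merges the results of several hypothetical micro-engines and folds user grades into the final ordering, so that the ranking is no longer simply automatic. All the names (Result, meta_search, the toy engines) are assumptions introduced for this example.

from dataclasses import dataclass, field

@dataclass
class Result:
    url: str
    engine_score: float                                # automatic relevance score from a micro-engine
    human_grades: list = field(default_factory=list)   # user-supplied grades (0-5), the "social" part

    def combined_score(self) -> float:
        # Blend the automatic score with the average human grade, if any.
        if not self.human_grades:
            return self.engine_score
        avg = sum(self.human_grades) / len(self.human_grades)
        return 0.5 * self.engine_score + 0.5 * (avg / 5.0)

def meta_search(query, micro_engines):
    """Collect results from every micro-engine and merge duplicates by URL."""
    merged = {}
    for engine in micro_engines:
        for r in engine(query):                        # each engine returns a list of Result objects
            if r.url in merged:
                merged[r.url].engine_score = max(merged[r.url].engine_score, r.engine_score)
                merged[r.url].human_grades += r.human_grades
            else:
                merged[r.url] = r
    # The ranking is not "simply automatic": human grades shift the order.
    return sorted(merged.values(), key=lambda r: r.combined_score(), reverse=True)

# Two toy micro-engines, one specialised, one generalist.
def cycling_engine(query):
    return [Result("https://example.org/quimper-2002", 0.4, human_grades=[5, 4])]

def general_engine(query):
    return [Result("https://example.org/velo", 0.9),
            Result("https://example.org/quimper-2002", 0.3)]

if __name__ == "__main__":
    for r in meta_search("vélo Quimper 2002", [cycling_engine, general_engine]):
        print(round(r.combined_score(), 2), r.url)

The design choice mirrored here is the one described above: the automatic relevance score supplied by each engine is only half of the story; the other half comes from the grades a community of users attaches to the results.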

One could therefore say that Wikipedia matters less in itself than the principles it stands for: collaboration, writing empowerment, openness to modification and completion. These principles will, undoubtedly, be transformed along the lines of the aforementioned critical and clinical leakage paths. The more mobile the content of the web becomes (technically [29] and legally speaking) – indefinitely deformable and apt to join the new “organs of the soul” – the more applications there will be. If one can set aside the pseudo-argument of the enormous proportion of poor-quality material produced by spontaneous contributions [30], one can see an original new intellectual coming into existence before our eyes: at once critical and committed (Sartre), specific (Foucault), collective and virtual – a veritable DJ of digital knowledge.

Translated from French by Marie Reetz


To quote this article:

Philippe Lacour, « Portrait of the intellectual as a DJ. Wikipedia and the question of scientific expertise », Books and Ideas, 12 March 2010. ISSN: 2105-3030. URL: https://laviedesidees.fr/Portrait-of-the-intellectual-as-a




Footnotes

[1] See especially Marc Foglia et al., Wikipédia. Média de la connaissance démocratique ? Quand le citoyen lambda devient encyclopédiste (Wikipedia. Medium of democratic knowledge? Citizen lambda turns into encyclopaedist), FYP, Limoges, March 2008, §3: “Le mix philosophique de Wikipédia” (The philosophical mixture inside Wikipedia), and §12.8: “La construction du consensus ?” (Constructing a consensus?).

[2] Cf. the article “Wikipedia” on Wikipedia. Larry Sanger was recruited as Nupedia’s chief editor by Jimmy Wales but, soon afterwards, was financially coerced into resigning from this position (through the withholding of his salary).

[3] See the different accounts by Larry Sanger: Why Citizendium and Introduction to Citizendium.

[4] On the difference between wiki and Wikipedia, cf. Marc Foglia et al., Wikipédia. Média de la connaissance démocratique ? Quand le citoyen lambda devient encyclopédiste (Wikipedia. Medium of democratic knowledge? Citizen lambda turns into encyclopaedist), op. cit., §2.3.

[5] Open access to the wiki; content licensed under the Creative Commons Attribution-ShareAlike 3.0 United States License.

[6] A plan does exist, however, to include other languages in the future. On Wikipedia’s plurilingualism, cf. Marc Foglia et al., Wikipédia. Média de la connaissance démocratique ? Quand le citoyen lambda devient encyclopédiste (Wikipedia. Medium of democratic knowledge? Citizen lambda turns into encyclopaedist), op. cit., §4.3.

[7] Cf. http://en.citizendium.org. Let us keep in mind that Wikipedia does not use the term “peer review” the way it is usually defined in an academic context.

[8] See the account given here.

[9] This is a reference to the famous opposition established in Eric Raymond’s The Cathedral and the Bazaar, O’Reilly Media, 1999.

[10] Whereas Wikipedia, as we all know, is not considered a publisher in the strict sense, but a simple platform. For details, see the discussion of distributors’ liability: “L’affaire Wikipédia. De la responsabilité sur internet” (The Wikipedia affair. On responsibility on the Internet) by Charlotte Dé and Marc Foglia, in Marc Foglia et al., Wikipédia. Média de la connaissance démocratique ? Quand le citoyen lambda devient encyclopédiste (Wikipedia. Medium of democratic knowledge? Citizen lambda turns into encyclopaedist), op. cit., §8.

[11] A somewhat irenic idea: to even out positions that are too extreme. This does away with the possibility of a real arena of conflict. As Larry Sanger explains in his “charter”, neutrality will be upheld, in spite of any administrative aberration, by following a rotation system when entrusting experts with project supervision.

[12] At a conference of Wikimedia France in Paris, autumn 2007. Cf. the article in Libération: “Wikipédia veut se racheter une conduite” (Wikipedia wants to mend its ways). A website has been created to that effect.

[14] Christian Vandendorpe, “Le phénomène Wikipédia. Une utopie en marche” (The Wikipedia phenomenon. A utopia on the move), Le Débat, no. 148, January-February 2008, pp. 17-30.

[15] Cf. the article “Validation”. There you will find, for example, that Article Validation “is a plan to move toward implementing a functional mechanism which would change the modus operandi of Wikipedia away from the idealistic concept of the completely open model, and give an important tool by which trusted editors can be mechanically assisted in repelling vandalism. Because such a change could theoretically have an impact upon the very philosophy by which Wikipedia has thus far succeeded, it is necessary that such mechanism be well thought out, openly corrected, and thoroughly explained”.

[16] On Wikipedia’s editorial control, especially as exercised by the administrators (and other “sheriffs”), cf. Marc Foglia et al., Wikipédia. Média de la connaissance démocratique ? Quand le citoyen lambda devient encyclopédiste, op. cit., §§ 2.1 and 11.3.

[17] See Marc Foglia et al., Wikipédia. Média de la connaissance démocratique ? Quand le citoyen lambda devient encyclopédiste (Wikipedia. Medium of democratic knowledge? Citizen lambda turns into encyclopaedist), op. cit., §§ 2 to 4.

[18] Let us not forget that Wikipedia is only the biggest wiki on the web; there are, in fact, a great many more or less active wikis, which together form a sort of nebula: cf. Marc Foglia et al., Wikipédia. Média de la connaissance démocratique ? Quand le citoyen lambda devient encyclopédiste (Wikipedia. Medium of democratic knowledge? Citizen lambda turns into encyclopaedist), op. cit., §2.3: “Wiki vs. Wikipédia”.

[19] It is impossible for scientists to publish their most recent or most innovative theories on the Wikipedia website, which is why many specialists have shown little interest in the encyclopaedia.

[20] The Hypertopic model’s main function is to express “different viewpoints on shared items”. Designed at the University of Technology of Troyes, it is meant to help understand and reuse the applications of the new web and to enable their interoperation. See the wiki.

[21] This is exactly what Google was aiming for when it came up with “knols”: non-anonymous (signed) articles on individual platforms based on user participation. There is, however, a downside to knols with “recyclable” content (Creative Commons licences): they allow (at least optionally) for advertising to be added.

[22] This is well illustrated by the universal encyclopaedia based on the knols idea.

[23] See the Wikipedia “mirror” sites; the remix sites on anarchy and liberalism are of particular interest. Some of these remix sites are available in read-only mode; other wikis allow user contributions (with varying limitations). Some simply copy Wikipedia pages without any modification; still, most of these clones (especially Citizendium) distance themselves from Wikipedia while, at the same time, making it their “source”. Apart from Uncyclopedia, a sort of “playful” remix, there are other interesting cases of composite applications (“mash-ups”) linking, for example, Wikipedia’s geographical metadata (geotags) with cartographic tools (such as Google Earth).

[24] For more details on the socio-semantic web, a concept developed at the University of Technology of Troyes, see the Wikipedia entry “Socio-semantic web”.

[25] See, for example, the “community”-based search engines (swickis) that the company Eurekster has provided for free. Any user community can “specialise” these engines according to its particular needs by tagging them, a method not unlike “social bookmarking”. See also Wikipedia’s entry on “Social search”.

[26] Google, being of course well aware of this sort of development, has, under the influence of the customisation movement, launched Google Custom Search Engine: http://www.google.com/coop/cse/ (beta version).

[27] For more on the evolution of intelligence from “machines that think” to “machines that make us think”, see the online abstract of Bruno Bachimont’s doctoral thesis (IT Thesis download).

[28] There are various general social networks competing to be the largest site (Facebook, LinkedIn, etc.). It is, of course, possible to build one’s own social network (for example on http://www.ning.com or http://www.webjam.com). There are also sites that tie together several (specialised as well as general) networks (http://www.spokeo.com, http://www.friendfeed.com, etc.), functioning as a sort of meta-network.

[29] To learn more about this idea of content “migrating” from one medium to the next, see in particular the discussion on Craowiki’s MamanWiki (a wiki farm pushing its little wikis to leave home).

[30] This is what Deleuze tells anyone who looks down on the seventh art: it is only an argument against mediocrity, not against cinema itself. Cf. L’image-mouvement. Cinéma 1 (The Movement-Image. Cinema 1), Paris, Minuit, 1983.
