Timeline of "distributed Wikipedia" proposals
A timeline of (mostly independent) proposals for a kind of distributed Wikipedia (abolishing the principle that there is only one current article version for each topic), and more specifically of proposals to apply the principles of distributed revision control (as exemplified by Git in software development) to wikis in general and Wikipedia in particular.
Also noting significant related material.
1993
- Interpedia
- "several independent 'Seal-of-approval' (SOAP) agencies were envisioned which would rate Interpedia articles based on criteria of their own choosing; users could then decide which agencies' recommendations to follow." (from the Wikipedia article, unsourced)
1997
- Ward Cunningham, Folk memory: A minimalist architecture for adaptive federation of object servers
- The Distributed Encyclopedia [dead link] (proposal by Ulrich Fuchs, who would later become one of the first admins on the German Wikipedia and in 2005 founded a fork, "Wikiweise")
- "Whenever possible, the author stores the essay in html format on his or her own web site. .... all essays will have a uniform layout. ... All essays can be accessed via a centralized index." [1].
- In his book "Good Faith Collaboration", Joseph Reagle comments: "The irony here is that while it became clear that the Web would play a fundamental role [for an Internet-based encyclopedia, something that wasn't clear in the earlier Interpedia proposal], and an enormous strength of the Web is its hypertextual and decentralized character, Wikipedia itself is not decentralized in this way. It is not a collection of articles, each written by a single author, strewn across the Web. Instead, many authors can collaborate on a single article, stored in a central database that permits easy versioning, formatting and stylistic presentation. Furthermore, there is a vibrant common culture among Wikipedians that contributes to Wikipedia's coherence." (However, the "Distency" seemed to aim at one article per topic: "... we will accept (mostly) everything on every headword. But, of course, we want to avoid two people writing on the same subject the same time. It's very unpleasant for you to write something we can't accept any more because someone other was faster with an essay about the same subject." [2])
- "Whenever possible, the author stores the essay in html format on his or her own web site. .... all essays will have a uniform layout. ... All essays can be accessed via a centralized index." [1].
2001/2002
- According to a post by Andrew Famiglietti on the CPOV mailing list, "the history of Wikipedia and Wikipedia like projects shows a long list of failures to implement a 'marketplace of ideas' model. GNUpedia, an attempt by the FSF to build its own encyclopedia in 2001, imploded after selecting a technologically ambitious plan to build a repository of texts users could filter by their own criteria. .. Wikipedia users batted around plans to build similar 'multiple stable versions' in the fall of 2001/spring 2002. None were ever implemented." (see also Wikinfo)
2004
- August: Wikipedia:Branching support
- "...we could create forks of Wikipedia with particular points-of-view or topic focuses (e.g. Christians could create a Christian fork with Biblical Point of View (BPOV) and a focus on Christian topics), but could regularly resynchronise between the parent Wikipedia and the forks -- useful material from the parent Wikipedia since the article import could be copied to the fork easily, and recent innovations in the fork could be copied to the parent."
- November: Remark in Authority metric proposal by Tim Starling
- "... Only articles which are well-written and polished would be protected in this way. New or incomplete articles would be open to editing, just like on Wikipedia. And we could implement simple methods to carry over changes from Wikipedia to the proposed site. ... the whole idea of forking is that you can try things which aren't politically possible on the original site."
2005
- July: Meta:User:TomLord/distributed wikis
- "...a kind of p2p take on wikis. I should be able to selectively import articles from yours to mine, edit them locally and upload. If I have a specialized wiki of my own, you should be able to merge it into your more general wiki..." (see also meta:Versioning and distributed data)
- August: In Ward Cunningham's keynote (meta:Transwiki:Wikimania05/Presentation-WC1 video and slides) at the first Wikimania
- a kind of distributed wiki concept, similar or identical to "Folk memory", is described within the last third of the presentation (watching the video is recommended; the slides alone are not easy to understand)
2006
- December: bug 8265 (Please add branching support to MediaWiki)
- based on Wikipedia:Branching support (linked above in 2004)
2007
- August: Benjamin Mako Hill finishes and publishes a master's thesis at the MIT Media Lab in the Electronic Publishing Group that includes a working proof-of-concept distributed wiki (TextNet) built on top of the Bazaar distributed revision control system. Much of Mako's work focused on the problem of translating systems designed for collaboration on source code to textual content. The system also includes a simple JavaScript interface for reviewing and handling merge conflicts. The project included a working implementation in Python whose source code is available online, though it appears to have suffered some bitrot; the software can be checked out using bzr. Mako also published the text of the thesis, which includes an academic description of the problem and an evaluation of a series of potential approaches. He presented the system, and gave a demonstration, at Wikimania 2007 in Taipei.
- August: "Uberfact: the ultimate social verifier" by Mencius Moldbug
- proposal for a Wikipedia where 'factional reputation' is assigned to contributions from contrasting viewpoints
- October: "Decentralizing Wikipedia and its sister projects" (Wikiversity)
- "... When someone wants to view an article on cows, they basically ping a tracker server ... The tracker server would say, 'In order the amount of trust you can put into the integrity of the article, you can find the most up to date revision of the article on Cows on the servers x, y, z, and c'. The person who requested the article on cows would then send a message off to machines x, y, z, and c ... They make their changes and send them off to a few of the servers that contain the article on cows. This would then be propagated and perhaps reviewed by editors and then integrated into the best version."
2008
- February: Possibility of a git-based fully distributed Wikipedia (thread on Foundation-l)
- Inspired by the release of git-wiki
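- For orientation, a minimal sketch of the "page = file, edit = commit" idea behind git-backed wikis such as git-wiki; this is illustrative Python driving the git command line, not the code of git-wiki or any other project mentioned here:
```python
# Minimal sketch of a git-backed wiki store: each page is a file in a Git
# repository and each edit becomes a commit. Illustrative only.

import pathlib
import subprocess

REPO = pathlib.Path("wiki-repo")


def run_git(*args):
    subprocess.run(["git", "-C", str(REPO), *args], check=True)


def init_wiki():
    REPO.mkdir(exist_ok=True)
    if not (REPO / ".git").exists():
        run_git("init")
        # Local identity so commits work even without global git config.
        run_git("config", "user.name", "wiki")
        run_git("config", "user.email", "wiki@example.invalid")


def save_page(title, wikitext, summary):
    """Write the page to its file and record the edit as a commit."""
    (REPO / f"{title}.wiki").write_text(wikitext, encoding="utf-8")
    run_git("add", f"{title}.wiki")
    run_git("commit", "-m", summary)


if __name__ == "__main__":
    init_wiki()
    save_page("Ocelot", "The ocelot is a small wild cat.\n", "create stub")
    # Distribution then comes "for free": the repository can be cloned,
    # pulled and pushed with ordinary git commands to exchange page histories.
```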
- July: "Federating Wikipedia as Open Educational Resource" presentation Wikimania 2008 by Murugan Pal from the CK-12 Foundation
- about extracting content from Wikipedia for use in other formats; also discusses synchronizing (merging content back into Wikipedia) and includes a demonstration of their "FlexBook" software.
2009
- August: Side remark in User:HaeB's Wikimania talk
- git-like forking/merging tools might foster cooperation between Wikipedia and Citizendium
- August: strategy:Proposal:Distributed Wikipedia
- "Communities can then decide who to view as 'authoritative'. In other words, the entire Wikipedia database could in theory entirely be forked. Democratically. In this way, much of the criticism of Wikipedia's process simply... melts away."
- October: Wikipedia meets git thread on Foundation-l
- Also mentions git-wiki, gitit, ikiwiki, wigit, DSMW (Distributed Semantic Media Wiki), ...
- November: Levitation
- During a very public controversy about deletions on the German Wikipedia (mainly fueled by members of the Chaos Computer Club), German hacker Scytale announces Levitation, a software project to import XML Wikipedia dumps into Git repositories. It produces a functional version (tested on some large Wikipedias), but peters out before achieving Scytale's vision of an "Omnipedia" ("everyone his own Wikipedia").
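- To illustrate the general approach (this is not Levitation's actual code), a hedged Python sketch that reads a MediaWiki XML export and records every revision of every page as a Git commit; namespace handling, filenames and metadata are heavily simplified:
```python
# Illustrative sketch of importing a MediaWiki XML dump into Git, in the
# spirit of tools like Levitation (not its code). Element names follow the
# standard <page>/<revision> structure of MediaWiki exports.

import pathlib
import subprocess
import xml.etree.ElementTree as ET

REPO = pathlib.Path("dump-repo")


def local(tag):
    """Strip the XML namespace from a tag name."""
    return tag.rsplit("}", 1)[-1]


def run_git(*args):
    subprocess.run(["git", "-C", str(REPO), *args], check=True)


def import_dump(dump_path):
    REPO.mkdir(exist_ok=True)
    if not (REPO / ".git").exists():
        run_git("init")
        run_git("config", "user.name", "dump-import")
        run_git("config", "user.email", "dump-import@example.invalid")

    title = None
    for event, elem in ET.iterparse(dump_path, events=("end",)):
        tag = local(elem.tag)
        if tag == "title":
            title = elem.text
        elif tag == "revision" and title is not None:
            text, comment = "", ""
            for child in elem:
                if local(child.tag) == "text":
                    text = child.text or ""
                elif local(child.tag) == "comment":
                    comment = child.text or ""
            # Crude filename sanitation; one file per page.
            page_file = REPO / (title.replace("/", "_") + ".wiki")
            page_file.write_text(text, encoding="utf-8")
            run_git("add", page_file.name)
            # --allow-empty keeps going when consecutive revisions are identical.
            run_git("commit", "--allow-empty", "-m", comment or f"edit to {title}")
            elem.clear()  # keep memory use bounded on large dumps
        elif tag == "page":
            title = None
            elem.clear()


if __name__ == "__main__":
    import_dump("pages-meta-history.xml")
```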
- November: mspr0: Die Multipedia: Schafft ein, zwei, viele Wikipedien! (in German, roughly "The Multipedia: Create one, two, many Wikipedias!"; Google translation)
- Written independently, but inspired by the same controversy (the title alludes to a Che Guevara quote about "two, three or many Vietnams")
- November: Distripedia
- short blog post, apparently without much follow-up
2010
- March: Maja van der Velden: "When Knowledges Meet: Database Design and the Performance of Knowledge", talk at the "Critical Point of View" (CPOV) conference (summary, video)
- Suggests "decentering Wikipedia further" to a "distributed database of local ontologies" (around 14:20 in the video), cf. [3]
- May: Gitizendium by Tom Morris
- "An attempt to move a little chunk of Citizendium into Git", mainly motivated by a desire to handle Citizendium's "approval" process, which forks articles into a stable and a "Draft" version, in a more natural way, see Citizendium forum posts: [4], [5]
- May: "DistriWiki: A Proposal" by Jason Scott.
- "Wikipedia is fucking centralized. [...] That’s why one vandal, Jimbo, was able to do so much damage, so quickly. [...] We’re lucky – the Wikipedia 'problem' I’m talking about was solved years ago. It was called Usenet. [...] I therefore propose DistriWiki, a set of protocols and MediaWiki extensions that push out compressed snapshot differences of the Wikipedia software and which allow mirror MediaWikis to receive these changes and make decisions based on them. [...] Imagine a world where these little Wikipedia mirrors have their own subsets of Wikipedia space that are different than Wikipedia, where other thoughts other than the grey goo consensus of Wikipedia rules the day."
- May: CPOV interview with Florian Cramer
- Also mentions Levitation
- July: Federating Wikipedia (presentation at Wikimania 2010, by V. Grishchenko)
- August: Making GitHub More Open: Git-backed Wikis (GitHub announcement)
- "Each wiki is a Git repository, so you're able to push and pull them like anything else."
- September: Anil Dash: Forking is a Feature
- blog post, suggesting among other observations: "... one of the best ways for Wikipedia to reinvigorate itself, and to break away from the stultifying and arcane editing discussions that are its worst feature, could be to embrace the idea that there's not One True Version of every Wikipedia article. A new-generation Wikipedia based on Git-style technologies could allow there to be not just one Ocelot article per language, but an infinite number of them, each of which could be easily mixed and merged into your own preferred version"
- November: Discussion (WebCite) about a possible fork of Citizendium
- "I've been experimenting with Gollum, a wiki engine that uses Git. Distributed revision control means we get rid of a huge amount of politics, and people can work on things in a distributed fashion, with branches and offline editing and so on. ... Gollum really rocks and could really be the future of wikis."
2011
- January: Tony Sidaway: [6]
- "A few years ago I tried to work out how a peering arrangement for parallel Wikipedias could work. Peer sites would in effect have proxy accounts, and edits would appear on selected pages. ... Peering is performed by an exchange of revisions, after which each wiki is returned to its “native” state (ie, the latest revision is a copy of the local state of the article before the peered revisions were introduced)."
- January: "Distributed Wikis" (talk by Pfctdayelise at linux.conf.au, abstract, slides, video)
- "... now that distributed version control systems (DVCS) have made forking trivial, are there implications for the political act as well? How does political forking work within collaborative prose text projects (i.e. wikis)? English Wikipedia is so large as to be practically unforkable ... One of the core Wikipedia rules is “one topic, one article”, which would seem to prohibit forking, but could we adhere to this principle and still take advantage of DVCS?"
- June: Foundation-l discussion
- Alec Conroy: "In 2002, we sort of 'forked off' from the 'mainstream' Free Software movement, and this 2002ish model of revision control is the model we use in our wikis. ... A users could create a whole new 'project' without using any Wikimedia resources at all ... If a new project was popular, it could be seamlessly and automatically shared with the entire world, again, at no expense to the foundation. 'Bad' projects would get weeded out because no one would share them, while 'good' projects would rise to the top automatically."
- David Gerard: "Adapting MediaWiki to git has been tried a few times. I suspect the problem is that the software deeply assumes a database behind it, not a version-controlled file tree."
- June: Ward Cunningham: Smallest Federated Wiki; see also his June 2012 talk "Why You Need to Host 100 New Wikis Just for Yourself":
- "The Federated Wiki provides new mechanisms for thoughtful conversation. This talk demonstrates many of these and explains why cooperating wikis offers a better environment than traditional wiki for reframing contentious issues. If you have 100 thoughts, if you engage in 100 conversations, you can use 100 Federated Wikis to develop your ideas in public."
- June: Thoughts about using a git-backed wiki for a proposed "Encyclopaedia of original research"
- August: Opening up Wikipedia's data: A lightweight approach to Wikipedia as a platform, Wikimania talk
- About opening up Wikipedia as a data platform; cites statistics from open-source projects that switched to decentralized revision control, concluding that "decentralizing interaction increases participation".
2012
- January: Adam Wight, Project idea at the San Francisco Hackathon: "No more edit conflicts"
- A proposal to avoid edit conflicts by "Enhanc[ing] revision ids to support version vector, initial goal will be to ease the 3-way merge bottleneck. Tolerate article text in an ambiguous state, meaning that unresolved conflicts will result in a fork which can be cleaned up later by the original poster or an editor. ... A rough analogy is that MediaWiki databases would act like a distributed version control system (git) such that people could clone and fork them."
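- As background on the version-vector idea mentioned in the proposal, a generic Python sketch (not Adam Wight's design) of how comparing per-editor counters distinguishes a fast-forward update from concurrent edits that should fork:
```python
# Generic illustration of version vectors: each revision carries a counter
# per editor, and comparing two vectors tells us whether one revision
# descends from the other or whether the edits are concurrent, i.e. the
# page has forked and needs a later merge.

def descends_from(a, b):
    """True if the revision with vector `a` includes everything in `b`."""
    return all(a.get(editor, 0) >= count for editor, count in b.items())


def relation(a, b):
    if descends_from(a, b) and descends_from(b, a):
        return "identical"
    if descends_from(a, b):
        return "a is newer (fast-forward)"
    if descends_from(b, a):
        return "b is newer (fast-forward)"
    return "concurrent edits: keep both as a fork, merge later"


if __name__ == "__main__":
    base = {"alice": 3, "bob": 1}
    ours = {"alice": 4, "bob": 1}    # alice edited on top of base
    theirs = {"alice": 3, "bob": 2}  # bob edited base concurrently
    print(relation(ours, base))      # a is newer (fast-forward)
    print(relation(ours, theirs))    # concurrent edits: keep both ...
```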
- July: Wikitech-l: project "to prototype an 'offline' wikipedia, similar to Kiwix, which allows the end-user to make edits and synchronize them back to a central repository like enwiki. ... Non-linear revisioning might also facilitate simpler models for page protection, and would allow the formation of multiple, independent consensuses." / "Distributing Wikipedia in a fashion similar to git will make it a lot easier to use in areas where Internet connections are not so common."
- July: Ward Cunningham announces his Federated Wiki: Wiki Inventor Sticks a Fork in His Baby (Wired, July 4, 2012)
- July: John Erling Blad opens issue "History should support branches (at this revision there was a merge/split with that revision)" (T40795)
- December: "Towards Content Neutrality in Wiki Systems" (doi:10.3390/fi4041086)
- Argues that "a collection of every-point-of-view [EPOV], contradictory, possibly emotionally charged articles may provide a better approximation to reality than a synthetic and illusionary neutral point-of-view" and outlines a distributed wiki system that already exists as "a preliminary implementation as a Mediawiki plug-in prototype".
- December: Adam Wight, mw:Requests for comment/Nonlinear versioning
- Proposes "a MediaWiki extension to allow branching histories for article content"
2013
edit- "Decentering Design: Wikipedia and Indigenous Knowledge", paper by Maja van der Velden (see also above, 2010).
- "A distributed Wikipedia would allow Indigenous knowledge communities to design and own their Wikipedia and to hostit in their own community. This control over their Wikipediawill allow the community to decide if the Wikipedia, or parts of it, will be connected with other Wikipedias."
- Review in the Wikimedia Research Newsletter
2014
- January: Some discussion at meta:Wikimedia Forum#Individualized Wikipedia (74: "Think about the difference between a distributed w:Git versus a centralized w:SVN ... in the latter, there is one central server (enWiki), but in the distributed variant, every PC ... has their own local version-control-repo. So in a nutshell, imagine if instead of needing a master's degree in EECS to get mediawiki up and running ..., it was possible to click three buttons, and have a *local* copy of mediaWiki running on your laptop... But the main focus, given the importance of mainspace enWiki, methinks would always be getting drafted changes back up into mainspace there")
2015
- January: Jon Udell, "A federated Wikipedia": "How can we ease the relentlessness of Wikipedia’s consensus engine? [...] why not encourage [forking the Wikipedia article]? [...] The network graph showing who forked that Wikipedia article, and made substantive contributions, needn’t be overwhelming. Timothy Messer-Kruse’s fork might or might not emerge as authoritative in the judgement of Wikipedia but also of the world. If it did, Wikipedia might or might not choose to merge it. But if the consensus engine is willing to listen for a while to a chorus of voices, it may be able to recruit and retain more of the voices it needs."
- May: User:AS, "Support for version branching for pages" (proposal in the 2015 Community Wishlist survey for new software features/enhancements): "Add ability to create own branch of page and ability to merge that version into main history (main branch). This would help to avoid most of edit wars and make comparing of different versions easier". One "support" vote, seven "oppose" votes, ranked #99 out of 107 proposals.
- June: J. Hernandez, "Bat-shit crazy proposal 1" (part of a brainstorming of the WMF Reading department): "I think as the foundation we need to provide base infrastructure for community to act and develop on. We would do a software that would have (as a baseline):
- Decentralized installation. Used on the normal web, or installed locally. [...]
- It can import and feed from content exported from other nodes. Either absorb full snapshots, or diffs. Auto-merge when possible. Either implement conflict resolution or just diverge the content without issues.
- Nodes can be federated and update from each other. [...] Very little or none centralization.
- Enable distribution of all the centralized existing wikis content under this format to allow these other installs of software to feed from it [...]
- With this we could [...] empower isolated silos of content to sync up with each other isolated silos. Let them make their own networks of knowledge. Let them collaborate on their content and knowledge."
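- A hedged sketch of the "auto-merge when possible ... or just diverge the content" behaviour described in this brainstorm, as a coarse whole-page three-way merge in Python (function and variable names are hypothetical; a real implementation would merge at line or paragraph level):
```python
# Illustrative sketch of "auto-merge when possible, or just diverge": compare
# local and remote texts against their common base; if both sides changed the
# page, keep both versions side by side instead of failing.

def sync_page(base, local, remote):
    """Return (merged_text, forks); forks is non-empty when we diverge."""
    if local == remote:    # nothing to do
        return local, []
    if local == base:      # only the remote node changed the page
        return remote, []
    if remote == base:     # only we changed the page
        return local, []
    # Both sides changed the page: keep both versions as a fork, so the
    # divergence can be reconciled later without blocking the sync.
    return local, [remote]


if __name__ == "__main__":
    base = "Ocelots are cats."
    local = "Ocelots are small wild cats."
    remote = "Ocelots are cats.\nThey live in the Americas."
    merged, forks = sync_page(base, local, remote)
    print(merged)
    print("diverged copies:", forks)
```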
- September: C. Scott Ananian, "Make it easy to fork, branch, and merge pages (or more)" (Phabricator task T113004, proposal for the 2016 MediaWiki developer summit).
- "It used to be conventional wisdom that forking was the death of an open source project. ... Then git arrived, and shortly after, github. Suddenly, forking wasn't evil! ... There are number of ways to experiment with 'fork-and-merge' models for editing wikimedia projects. Some concrete suggestions are fleshed out below."
- See also Wikimania 2016 proposal, below
2016
edit- January: "Github like collaboration on wiki aka fork and merge" (unconference session about T113004 at the WMF All Hands meeting)
- May: C. Scott Ananian, "Forking Wikipedia... and merging back again", discussion proposal for Wikimania 2016 (not accepted)
- "Today, "fork and merge" models are prevalent for software development -- there is no centralized authority, instead on sites like github, every user has their own copy of a community project [..] Targeted participants: Editors. Those involved in edit wars. Folks who want to build welcoming spaces."
- See also developer summit proposal, above (2015)
2018
- Everipedia aims to become a decentralized alternative to Wikipedia, according to the site's Chief Information Officer Larry Sanger (chief organizer of Wikipedia until 2002):
- "Because the network is decentralized, the network will bring together articles from multiple encyclopedias, not just Everipedia. It will be possible to have different articles on the same topic, and we will eventually have a rating system that will make it possible for people to find different articles on the same topics, rated by different categories of people, groups, and experts."
- See also this article by David Gerard
2024
- Ibis aims to build a federated alternative to Wikipedia using the ActivityPub protocol, according to Lemmy maintainer Felix Ableitner:
- "Most importantly all of this is fully federated, so it is possible to synchronize articles between instances, and interact with remote articles as if they were on the local website."
Additions are welcome, but note that this is not about the related proposals to host/distribute Wikipedia (in its current form) using P2P transfer (such as meta:P2P or Globule's "Decentralized Wikipedia" collaborative hosting, first proposed around 2007).
Other overviews
- The historical notes in V. Grishchenko's paper Deep Hypertext with Embedded Revision Control Implemented in Regular Expressions (WikiSym 2010) mention several other examples of distributed wiki software and credit Ward Cunningham (1997) as the author of the first distributed wiki proposal; the paper also describes relations to ideas from Project Xanadu.
- http://www.delicious.com/pfctdayelise/decentralisedwiki
- "The high availability wiki project" lists a few distributed wikiengines based on git or mercurial.