Wikipedia:Wikipedia Signpost/2024-03-29/Special report

Special report

19-page PDF accuses Wikipedia of bias against Israel, suggests editors be forced to reveal their real names, and demands a new feature allowing people to view the history of Wikipedia articles

Logo of Arabic Wikipedia, showing the Wikipedia "globe" colored to look like the Palestinian flag.
Arabic Wikipedia's logo in solidarity with Palestine has caused controversy

The World Jewish Congress has published a report titled "The Bias Against Israel on Wikipedia", which has been covered by The Jerusalem Post, Jewish News Syndicate, and Spectator Australia (paywalled). The author, Dr. Shlomit Aharoni Lir, is an academic, described in a related recent publication as "a poet, essayist, lecturer, and gender studies scholar [who] holds a research fellowship at Bar-Ilan University and is a lecturer at Achva College." She had previously published a peer-reviewed paper about gender bias on Wikipedia (Signpost coverage). The present report does not seem intended to be an academic publication, although it has already been used as a citation in the article Wikipedia and the Israeli–Palestinian conflict.

The 19-page report, which focuses on the English Wikipedia, "is based on research, content analysis, and interviews with Israeli Wikipedians"; its overview of challenges to Wikipedia's ideals includes "The Power of the Admins and Beurocrats" [sic], as well as the gender gap (see Signpost coverage). The report's citations on the gender gap include a survey from fourteen years ago saying that Wikipedia editors were mostly male,[1] and a paper from thirteen years ago which compared Wikipedia to Encyclopedia Britannica on coverage of women in historical biography lists and concluded that Wikipedia had "significantly greater coverage" and that its articles were "significantly longer than Britannica articles [...] for every source list".[2] It is somewhat unclear what relation, if any, the gender ratio of biography articles (or indeed of the editoriat) has to Israel and Palestine; while there is indeed a paper from ten years ago[3] (cf. our coverage) that discusses lower participation rates for female editors in greater depth, it is hard to see the connection. Indeed, the closest the report comes to making one is to say:

While this is a brilliant turn of phrase, it is also a fairly broad one, with some fairly broad implications: is the Japanese Wikipedia prima facie untrustworthy because it is edited mostly by Japanese people, or are articles on the English Wikipedia untrustworthy when written by all-female WikiProjects? It is true that we live in a society, and that the biases of that society pose issues for our attempts to write an encyclopedia that is both neutral and based on direct citations to sources written in that society (a problem apparently endemic to all encyclopedia writers, even our Britannic forebears). This is the subject of much ongoing reflection and work; but it is not clear how it relates to the conflict in Gaza.

The report criticizes Arabic Wikipedia's "blackout" in solidarity with Palestinians (see Signpost coverage), the English Wikipedia's coverage of the Holocaust, and its general "bias against Israel", which the author argues is exemplified through content and sourcing bias, "deletion attacks", editing restrictions, "selective enforcement" by administrators, and "anti-Israeli editors".

Regarding administrators, the report mentions issues with "concealment of decision-makers' actions, alongside the significant authority wielded by anonymous administrators who can delete entries and block participants without accountability". Wikipedia is one of the most transparently operated websites in history. Editorial decisions, discussions about those decisions, edits themselves, and administrative actions are meticulously logged, to the second, in full public view, on a page that automatically generates a display of precisely what changes were made in the edit, or what actions were taken, and by whom.

For example: this is a full public log of every formal action that Signpost editor and Wikipedia administrator JPxG has ever taken. You can see, in this log, that Ayamediainc was indefinitely blocked on March 12, 2024, at 04:45 UTC, for violation of the spam guideline. On their user talk page is a publicly viewable template indicating the name of the page in question, and an explanation of the specific way that it was in violation of the policy. The log for that user account indicates it was created at 02:47 that same day, and three hours later they created a page at User:Ayamediainc/sandbox with the summary "Added sections from Ayamediainc profile and added background information". The log also specifies that this edit tripped two automated alerts because text on the page matched patterns that are strongly associated with spam. The policies, the user warnings, the block, the reason for the block, and even the process of the user's unblock appeal are publicly viewable and can be audited by anyone.

One of these suggestions I (User:JPxG) can easily agree with: the "transparent editing history" item, which exhorts Wikipedia to "ensure that all changes to articles are transparent and traceable", which "helps in identifying editors who may consistently introduce bias into articles".

This is a great idea; in fact, it's such a great idea that we did it 22 years ago, and a link to the complete history of every article has been a central element of the top of every Wikipedia page since the year 2002.
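That history is machine-readable, too. As an illustrative sketch (not something from the report), the snippet below builds a query URL for the public MediaWiki Action API, which lets anyone list a page's recent revisions, including who made each edit, when, and with what edit summary; the article title and revision limit here are just example choices:

```python
# Sketch: auditing a Wikipedia page's edit history via the public
# MediaWiki Action API. Only the URL is constructed here; fetching it
# with any HTTP client returns JSON describing each revision.
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def history_url(title, limit=5):
    """Build an API URL listing a page's most recent revisions,
    with the editor, timestamp, and edit summary for each one."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user|comment",
        "rvlimit": limit,
        "format": "json",
    }
    return API + "?" + urlencode(params)

print(history_url("Wikipedia"))
```

Opening the printed URL in a browser shows the same information as the "View history" tab, in structured form.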

If there is some other website which provides greater transparency into its administrative and editorial decisions, perhaps it would provide a useful model for us to emulate. However, it is hard to come up with one. The histories viewable for every single Wikipedia article track every modification ever made to them, from major copyedits to em-dash fixes, and are permanently attributable to the editors. It's hard to come up with any way to increase the transparency of the process, except for personally doxing the administrators and editors.

This is one of the suggestions vaguely alluded to in the report, and later stated explicitly. To this, I may offer the rejoinder that in that block log you can see me issuing blocks to a wide range of people, encompassing bored schoolkids, scammers, vandals, and seriously disturbed and hostile individuals who carry decades-long grudges. I am a volunteer who edits for fun. These are not, generally speaking, people I would prefer to know where my family lived, especially not the guy who capped off a decade and change of Wikipedia harassment by going to jail for making dozens of graphic death threats to the Merriam-Webster dictionary. That was not a made-up example: this guy is real, and he is one of the hundred or so entries on the long-term abuse page. As for editorial integrity, I can certainly imagine what effect it would have on our article about any given scandal of corporate malfeasance or government corruption if somebody were able to instantly file vexatious lawsuits against individual editors; I just cannot imagine it being a good one.

The report says that "network bullying and discriminatory treatment increase when there is no personal responsibility and acting under cover of anonymity is possible". In general, my experience over the last twenty years of hanging out on the Internet suggests that network bullying also increases a lot when every person you ban from a website has trivial access to your home address, including this sicko.

One of the report's proposals for new features that should be implemented is to "host forums and discussions within the Wikipedia community to address concerns about neutrality and gather feedback for policy improvements"; it is not specified how this would fit into the existing directory of centralized discussions, the dashboard, the twenty-year-old project-wide Request for Comment process with fifteen categories, the six Village Pumps, and the dozen or so noticeboards, including a neutral-point-of-view noticeboard and a dispute resolution noticeboard.

After this, there are a number of specific examples given of pages concerning the conflict in Gaza. Frankly, this may be true: political articles are biased sometimes. They tend to be edited by a variety of people with various allegiances, who argue at great length over everything from trivial minutiae to the timelines of major events. It is not clear this can, or should, be fixed. Existing research seems fairly consistent on the idea that conversation and collaboration between people of various perspectives improves the overall quality of articles. While it is indeed the case that some people are biased towards one view or another, it seems unlikely that instituting binding top-down procedures (like an official oversight committee to dictate content at the behest of external consulting agencies and lobbyists, as is also suggested in the report) would arrive at a better result; after all, what if the official censorship committee was also biased?

Lauren Dickinson of the Wikimedia Foundation's Communications Department told The Signpost that several staff had reviewed the document, and found that "the WJC report makes a number of unsubstantiated claims of bias on Wikipedia. It lacks adequate references, quotes, links, or other sources to support its purported findings. Further, the report misunderstands Wikipedia's NPOV policy, as well as the importance of anonymity for user privacy on Wikimedia projects."

Arbitration Committee grants new editor extended-confirmed status to open case request

Subsequent to the publication of this report, on March 20, the Arbitration Committee announced that a user account created that day with zero edits (Mschwartz1) would be granted extended-confirmed status "for the exclusive purpose of participating in a case request about Israel-Palestine". Extended-confirmed status, generally, is given to accounts with over 500 edits that are at least 30 days old (and is currently a prerequisite for any editing activity in the Israel–Palestine area, formally designated a Contentious Topic).

A long discussion ensued, both at the ArbCom noticeboard and the unmentionable BADSITE, in which it was speculated that this may have been an employee of the organization publishing the report (due to timing that closely aligned with the publishing of the report). Arbitrator Barkeep49 said that it "may or may not be a coincidence", explaining that "I can say the conversation with us that led to this grant has been going on since early February." Limited information has been made available about the nature of the editor, although the rare decision to grant EC status to a zero-edit account on the day of its creation based on private correspondence with the Committee beforehand indicates that there is something unusual about the situation.

It remains uncertain whether this account has any relation to the WJC (or to any lobbying organization); commenters at the talk page for the ArbCom noticeboard have questioned whether this unknown party has standing to request a case be opened, whether a disclosure is required per WP:COI, and other issues. A more comprehensive explanation came from arbitrator Primefac:

Mschwartz1's sole edit (on the 26th) was to add this case request against Nishidani (mistakenly putting it at Wikipedia talk:Arbitration Committee instead of Wikipedia:Arbitration/Requests/Case, after which it was closed with instructions on how to post it to the correct board).

As of press time, there seems to be no conclusive evidence either way as to who or what is behind this account, despite many fairly strong opinions being expressed on the talk page.

See also related earlier coverage: "Does Wikipedia's Gaza coverage show an anti-Israel bias?" ("In the media", November 6, 2023) and "WikiProjects Israel and Palestine" ("WikiProject Report", January 10, 2024)