Wikipedia:Wikipedia Signpost/2023-10-03/Recent research
By Tilman Bayer
Concerns about limiting Wikipedia's "anyone can edit" principle "may be overstated"
A monthly overview of recent academic research about Wikipedia and other Wikimedia projects, also published as the Wikimedia Research Newsletter.
FlaggedRevs study finds that concerns about limiting Wikipedia's "anyone can edit" principle "may be overstated"
A paper titled "The Risks, Benefits, and Consequences of Prepublication Moderation: Evidence from 17 Wikipedia Language Editions",[1] from last year's CSCW conference, addresses a longstanding open question in Wikipedia research, with important implications for some current issues.
Wikipedia famously allows anyone to edit, which generally means that even unregistered editors can make changes to content that go live immediately, subject only to "postpublication moderation" by other editors afterwards. Less well known is that on many Wikipedia language versions, this principle has long been limited by a software feature called Flagged Revisions (FlaggedRevs), which was developed at the request of the German Wikipedia community, first deployed there in 2008, and has since been adopted by various other Wikimedia projects. (These do not include the English Wikipedia, which after much discussion implemented a very similar system called "Pending Changes", applied only on a case-by-case basis to a small percentage of pages.) As summarized by the authors:
FlaggedRevs is a prepublication content moderation system in that it will display the most recent “flagged” revision of any page for which FlaggedRevs is enabled instead of the most recent revision in general. FlaggedRevs is designed to “give additional information regarding quality,” by ensuring that revisions from less-trusted users are vetted for vandalism or substandard content (e.g., obvious mistakes because of sloppy editing) before being flagged and made public. The FlaggedRevs system also displays the moderation status of the contribution to readers. [...] Although there are many details that can vary based on the way that the system is configured, FlaggedRevs has typically been deployed in the following way on Wikipedia language editions. First, users are divided into groups of trusted and untrusted users. Untrusted users typically include all users without accounts as well as users who have created accounts recently and/or contributed very little. Although editors without accounts remain untrusted indefinitely, editors with accounts are automatically promoted to trusted status when they clear certain thresholds determined by each language community. For example, German Wikipedia automatically promotes editors with accounts who have contributed at least 300 revisions accompanied by at least 30 comments.
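The auto-promotion rule the authors describe for German Wikipedia can be sketched as follows. This is purely a reader's aid based on the thresholds quoted above (300 revisions, 30 comments); the function name and parameters are illustrative, not the actual FlaggedRevs configuration keys.

```python
# Sketch of the trusted/untrusted split described in the paper, using
# German Wikipedia's quoted thresholds. Each language community sets
# its own values; these numbers are specific to the German edition.

def is_trusted(has_account: bool, revisions: int, comments: int) -> bool:
    """Editors without accounts remain untrusted indefinitely; account
    holders are auto-promoted once they clear both thresholds."""
    if not has_account:
        return False
    return revisions >= 300 and comments >= 30

print(is_trusted(False, 1000, 100))  # False: no account, never promoted
print(is_trusted(True, 300, 30))     # True: clears both thresholds
print(is_trusted(True, 300, 10))     # False: too few comments
```

Under this kind of rule, an untrusted editor's revision is held for review and only shown to readers once a trusted user flags it.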
The paper studies the impact of the introduction of FlaggedRevs "on 17 Wikipedia language communities: Albanian, Arabic, Belarusian, Bengali, Bosnian, Esperanto, Persian, Finnish, Georgian, German, Hungarian, Indonesian, Interlingua, Macedonian, Polish, Russian, and Turkish" (leaving out a few non-Wikipedia sister projects that also use the system). The overall findings are that
"the system is very effective at blocking low-quality contributions from ever being visible. In analyzing its side effects, we found, contrary to expectations and most of our hypotheses, little evidence that the system neither raises transaction costs sufficiently to inhibit participation by the community as a whole, nor measurably improves the quality of contributions."
In the "Discussion" section, the authors write:
Our results suggest that prepublication moderation systems like FlaggedRevs may have a substantial upside with relatively little downside. If this is true, why are a tiny proportion of Wikipedia language editions using it? Were they just waiting for an analysis like ours? In designing this study, we carefully read the Talk page of FlaggedRevs.[supp 1] Community members commenting in the discussion agreed that prepublication review significantly reduces the chance of letting harmful content slip through and being displayed to the public. Certainly, many agreed that the implementation of prepublication review was a success story in general—especially on German Wikipedia. [...] However, the same discussion also reveals that the success of German Wikipedia is not enough to convince more wikis to follow in their footsteps. From a technical perspective, FlaggedRevs’ source code appears poorly maintained.[supp 2] [...] FlaggedRevs itself suffers from a range of specific limitations. For example, the FlaggedRevs system does not notify editors that their contribution has been rejected or approved. [...] Since April 2017, requests for deployment of the system by other wikis have been paused by the Wikimedia Foundation indefinitely. Despite these problems, our findings suggest that the system kept low-quality contributions out of the public eye and did not deter contributions from the majority of new and existing users. Our work suggests that systems like FlaggedRevs deserve more attention.
(This reviewer agrees in particular regarding the lack of notifications for new and unregistered editors that their edit has been approved, having filed, in vain, a proposal to implement this uncontroversially beneficial and already designed software feature in the annual "Community Wishlist" in 2023, 2022, and 2019.)
Interestingly, while the FlaggedRevs feature was (as summarized by the authors) developed by the Wikimedia Foundation and the German Wikimedia chapter (Wikimedia Deutschland), community complaints about a lack of support from the Foundation for the system were present even then, e.g. in a talk at Wikimania 2008 (notes, video recording) by User:P. Birken, a main driving force behind the project. Perhaps relatedly, the authors of the present study highlight a lack of researcher attention:
"Despite its importance and deployment in a number of large Wikipedia communities, very little is known regarding the effectiveness of the system and its impact. A report made by the members of the Wikimedia Foundation in 2008 gave a brief overview of the extension, its capabilities and deployment status at the time, but acknowledged that “it is not yet fully understood what the impact of the implementation of FlaggedRevs has been on the number of contributions by new users.”[supp 3] Our work seeks to address this empirical gap."
(Disclosure: This reviewer provided some advice to the authors at the beginning of their research project, but was not involved in it otherwise.)
Briefly
- Wikimania, the annual global conference of the Wikimedia movement, took place in Singapore in August (as an in-person event again for the first time since 2019). Its research track included the by now traditional State of Wikimedia Research presentation highlighting research trends from the past year (with involvement by members of this research newsletter).
- See the page of the monthly Wikimedia Research Showcase for videos and slides of past presentations.
Other recent publications
Other recent publications that could not be covered in time for this issue include the items listed below. Contributions, whether reviewing or summarizing newly published research, are always welcome.
"Wikidata as Semantic Infrastructure: Knowledge Representation, Data Labor, and Truth in a More-Than-Technical Project"
From the abstract:[2]
"Various Wikipedia researchers have commended Wikidata for its collaborative nature and liberatory potential, yet less attention has been paid to the social and political implications of Wikidata. This article aims to advance work in this context by introducing the concept of semantic infrastructure and outlining how Wikidata’s role as semantic infrastructure is the primary vehicle by which Wikipedia has become infrastructural for digital platforms. We develop two key themes that build on questions of power that arise in infrastructure studies and apply to Wikidata: knowledge representation and data labor."
"Naked data: curating Wikidata as an artistic medium to interpret prehistoric figurines"
From the abstract:[3]
"In 2019, Digital Curation Lab Director Toni Sant and the artist Enrique Tabone started collaborating on a research project exploring the visualization of specific data sets through Wikidata for artistic practice. An art installation called Naked Data was developed from this collaboration and exhibited at the Stanley Picker Gallery in Kingson, London, during the DRHA 2022 conference. [...] This article outlines the key elements involved in this practice-based research work and shares the artistic process involving the visualizing of the scientific data with special attention to the aesthetic qualities afforded by this technological engagement."
"What makes Individual I's a Collective We; Coordination mechanisms & costs"
From the abstract:[4]
"Diving into the Wikipedia ecosystem [...] we identified and quantified three fundamental coordination mechanisms and found they scale with an influx of contributors in a remarkably systemic way over three order of magnitudes. Firstly, we have found a super-linear growth in mutual adjustments (scaling exponent: 1.3), manifested through extensive discussions and activity reversals. Secondly, the increase in direct supervision (scaling exponent: 0.9), as represented by the administrators’ activities, is disproportionately limited. Finally, the rate of rule enforcement exhibits the slowest escalation (scaling exponent 0.7), reflected by automated bots. The observed scaling exponents are notably robust across topical categories with minor variations attributed to the topic complication. Our findings suggest that as more people contribute to a project, a self-regulating ecosystem incurs faster mutual adjustments than direct supervision and rule enforcement."
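To illustrate the power-law scaling relations the abstract describes (a reader's aid only, using the quoted exponents; the absolute quantities are hypothetical and only the relative growth rates matter): if a coordination cost scales as N**beta with the number of contributors N, a 10-fold influx of contributors implies very different growth for the three mechanisms.

```python
# Growth of each coordination mechanism when the contributor pool
# grows 10-fold, using the scaling exponents quoted in the abstract.
# cost ∝ N**beta, so a k-fold increase in N multiplies the cost by k**beta.

def growth_factor(k: float, beta: float) -> float:
    """Multiplier on a coordination cost when contributors grow k-fold."""
    return k ** beta

mutual_adjustment = growth_factor(10, 1.3)   # super-linear: discussions, reversals
direct_supervision = growth_factor(10, 0.9)  # sub-linear: administrator activity
rule_enforcement = growth_factor(10, 0.7)    # slowest: automated bots

print(round(mutual_adjustment, 1))   # 20.0 — costs grow ~20x
print(round(direct_supervision, 1))  # 7.9  — less than proportionally
print(round(rule_enforcement, 1))    # 5.0  — slowest escalation
```

The contrast (20x versus 5x for the same 10x influx) is what the authors mean by mutual adjustments growing "faster" than direct supervision and rule enforcement.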
"Wikidata Research Articles Dataset"
From the abstract:[5]
"The "Wikidata Research Articles Dataset" comprises peer-reviewed full research papers about Wikidata from its first decade of existence (2012-2022). This dataset was curated to provide insights into the research focus of Wikidata, identify any gaps, and highlight the institutions actively involved in researching Wikidata."
References
- ^ Tran, Chau; Champion, Kaylea; Hill, Benjamin Mako; Greenstadt, Rachel (2022-11-11). "The Risks, Benefits, and Consequences of Prepublication Moderation: Evidence from 17 Wikipedia Language Editions". Proceedings of the ACM on Human-Computer Interaction. 6 (CSCW2): 1–25. doi:10.1145/3555225. ISSN 2573-0142.
- ^ Ford, Heather; Iliadis, Andrew (2023-07-01). "Wikidata as Semantic Infrastructure: Knowledge Representation, Data Labor, and Truth in a More-Than-Technical Project". Social Media + Society. 9 (3): 20563051231195552. doi:10.1177/20563051231195552. ISSN 2056-3051.
- ^ Sant, Toni; Tabone, Enrique (2023). "Naked data: curating Wikidata as an artistic medium to interpret prehistoric figurines". International Journal of Performance Arts and Digital Media. 0 (0): 1–18. doi:10.1080/14794713.2023.2253335. ISSN 1479-4713. Retrieved 2023-09-27.
- ^ Yoon, Jisung; Kempes, Chris; Yang, Vicky Chuqiao; West, Geoffrey; Youn, Hyejin (2023-06-03), What makes Individual I's a Collective We; Coordination mechanisms & costs, arXiv, doi:10.48550/arXiv.2306.02113
- ^ Farda-Sarbas, Mariam (2023), Wikidata Research Articles Dataset, Freie Universität Berlin, doi:10.17169/refubium-40231
- Supplementary references and notes:
  1. ^ https://meta.wikimedia.org/wiki/Talk:Flagged_Revisions
  2. ^ https://phabricator.wikimedia.org/T66726#3189794
  3. ^ https://meta.wikimedia.org/wiki/FlaggedRevs_Report_December_2008
Discuss this story
FlaggedRevs
There seem to be some errors in the assumptions about how FlaggedRevs works in the paper The Risks, Benefits, and Consequences of Prepublication Moderation: Evidence from 17 Wikipedia Language Editions (https://arxiv.org/abs/2202.05548). For example:
--Zache (talk) 09:11, 4 October 2023 (UTC)[reply]
In Russian Wikipedia, as well as in Russian Wikinews, FlaggedRevs is a disaster. You say Germans are guilty in that? --ssr (talk) 06:09, 27 October 2023 (UTC)[reply]
Wikidata
ChatGPT v. Wikipedia
The study authors comment on prose quality. I happened to ask ChatGPT yesterday to explain what government shutdowns in the U.S. are and what effects they have. I got the following answer:
I then compared that to the lead of Government shutdowns in the United States:
Personally I found ChatGPT's output a lot more readable than the Wikipedia lead – it is just better written. The English Wikipedia text often required me to go back and read the sentence again.
Take the first sentence:
At first I parsed "when funding legislation" as an indication of when shutdowns occur (i.e. "when you are funding legislation"). I needed to read on to realise that this wasn't where the sentence was going. Next, Wikipedia uses the rather technical expression "when funding legislation ... is not enacted" (which is also passive voice) where ChatGPT uses the much easier-to-understand "when Congress fails to pass a budget" (active voice).
Where ChatGPT speaks of a "temporary suspension of non-essential government services", Wikipedia says the federal government "curtails agency activities and services, ceases non-essential operations", etc. I find the ChatGPT phrase easier to understand and faster to read while providing much the same information as the quoted Wikipedia passage (a point the study authors commented on specifically).
The Wikipedia sentence
leaves me wondering even now what the word "it" at the end of the sentence is meant to refer to. I suspect our sentence construction and word use are not helping us win friends. It's one thing when we are the only service available; it's another when there is a new kid on the block. Andreas JN466 13:56, 4 October 2023 (UTC)[reply]
Even if ChatGPT or its successor becomes the predominant internet search tool, that doesn't mean Wikipedia will be obsolete. It likely means that Wikipedia will go back to its theoretical origin as a reference work rather than the internet search tool many readers use it as. Thebiguglyalien (talk) 16:11, 4 October 2023 (UTC)[reply]
Ah, the rise of AI. I've used it to get ideas for small projects in the past, but people prefer LLMs over Wikipedia? That's, just... sad. The Master of Hedgehogs is back again! 22:09, 4 October 2023 (UTC)[reply]