Recent research

Concerns about limiting Wikipedia's "anyone can edit" principle "may be overstated"


A monthly overview of recent academic research about Wikipedia and other Wikimedia projects, also published as the Wikimedia Research Newsletter.


FlaggedRevs study finds that concerns about limiting Wikipedia's "anyone can edit" principle "may be overstated"

A paper titled "The Risks, Benefits, and Consequences of Prepublication Moderation: Evidence from 17 Wikipedia Language Editions",[1] from last year's CSCW conference, addresses a longstanding open question in Wikipedia research, with important implications for some current issues.

Wikipedia famously allows anyone to edit, which generally means that even unregistered editors can make changes to content that go live immediately - subject only to "postpublication moderation" by other editors afterwards. Less well known is that on many Wikipedia language versions, this principle has long been limited by a software feature called Flagged Revisions (FlaggedRevs), which was designed and developed at the request of the German Wikipedia community, first deployed there in 2008, and has since been adopted by various other Wikimedia projects. (These do not include the English Wikipedia, which after much discussion implemented a very similar system called "Pending Changes" that is only applied on a case-by-case basis to a small percentage of pages.) As summarized by the authors:

FlaggedRevs is a prepublication content moderation system in that it will display the most recent “flagged” revision of any page for which FlaggedRevs is enabled instead of the most recent revision in general. FlaggedRevs is designed to “give additional information regarding quality,” by ensuring that revisions from less-trusted users are vetted for vandalism or substandard content (e.g., obvious mistakes because of sloppy editing) before being flagged and made public. The FlaggedRevs system also displays the moderation status of the contribution to readers. [...] Although there are many details that can vary based on the way that the system is configured, FlaggedRevs has typically been deployed in the following way on Wikipedia language editions. First, users are divided into groups of trusted and untrusted users. Untrusted users typically include all users without accounts as well as users who have created accounts recently and/or contributed very little. Although editors without accounts remain untrusted indefinitely, editors with accounts are automatically promoted to trusted status when they clear certain thresholds determined by each language community. For example, German Wikipedia automatically promotes editors with accounts who have contributed at least 300 revisions accompanied by at least 30 comments.
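
To make this mechanism concrete, here is a minimal sketch (in Python, with hypothetical names - this is not the actual MediaWiki/FlaggedRevs code) of the two behaviours described above: readers are served the most recent flagged revision rather than the most recent revision in general, and registered editors are automatically promoted to trusted status once they clear community-set thresholds such as German Wikipedia's 300 revisions and 30 comments.

# Illustrative sketch only - not the actual MediaWiki/FlaggedRevs implementation.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Revision:
    text: str
    flagged: bool = False  # set to True once a trusted reviewer approves it

@dataclass
class Editor:
    registered: bool
    revision_count: int = 0
    comment_count: int = 0

# Hypothetical thresholds, mirroring the German Wikipedia example in the paper.
MIN_REVISIONS = 300
MIN_COMMENTS = 30

def is_trusted(editor: Editor) -> bool:
    """Editors without accounts stay untrusted indefinitely; registered editors
    are autopromoted once they clear the community-set thresholds."""
    return (editor.registered
            and editor.revision_count >= MIN_REVISIONS
            and editor.comment_count >= MIN_COMMENTS)

def revision_shown_to_readers(history: List[Revision]) -> Optional[Revision]:
    """Readers see the most recent flagged revision if one exists; otherwise
    the latest revision is shown (e.g. on pages that have never been reviewed)."""
    for rev in reversed(history):
        if rev.flagged:
            return rev
    return history[-1] if history else None

In the real extension, the thresholds, the fallback behaviour and the reviewer rights are all configurable per wiki, which is part of why deployments differ between language editions.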

The paper studies the impact of the introduction of FlaggedRevs "on 17 Wikipedia language communities: Albanian, Arabic, Belarusian, Bengali, Bosnian, Esperanto, Persian, Finnish, Georgian, German, Hungarian, Indonesian, Interlingua, Macedonian, Polish, Russian, and Turkish" (leaving out a few non-Wikipedia sister projects that also use the system). The overall findings are that

"the system is very effective at blocking low-quality contributions from ever being visible. In analyzing its side effects, we found, contrary to expectations and most of our hypotheses, little evidence that the system neither raises transaction costs sufficiently to inhibit participation by the community as a whole, nor measurably improves the quality of contributions."

In the "Discussion" section, the authors write

Our results suggest that prepublication moderation systems like FlaggedRevs may have a substantial upside with relatively little downside. If this is true, why are a tiny proportion of Wikipedia language editions using it? Were they just waiting for an analysis like ours? In designing this study, we carefully read the Talk page of FlaggedRevs.[supp 1] Community members commenting in the discussion agreed that prepublication review significantly reduces the chance of letting harmful content slip through and being displayed to the public. Certainly, many agreed that the implementation of prepublication review was a success story in general—especially on German Wikipedia. [...]
However, the same discussion also reveals that the success of German Wikipedia is not enough to convince more wikis to follow in their footsteps. From a technical perspective, FlaggedRevs’ source code appears poorly maintained.[supp 2] [...] FlaggedRevs itself suffers from a range of specific limitations. For example, the FlaggedRevs system does not notify editors that their contribution has been rejected or approved. [...] Since April 2017, requests for deployment of the system by other wikis have been paused by the Wikimedia Foundation indefinitely.[supp 3] Despite these problems, our findings suggest that the system kept low-quality contributions out of the public eye and did not deter contributions from the majority of new and existing users. Our work suggests that systems like FlaggedRevs deserve more attention.

(This reviewer agrees in particular regarding the lack of notifications for new and unregistered editors that their edit has been approved - having filed, in vain, a proposal to implement this uncontroversially beneficial and already designed software feature to the annual "Community Wishlist", in 2023, 2022, and 2019.)


Interestingly, while the FlaggedRevs feature was (as summarized by the authors) developed by the Wikimedia Foundation and the German Wikimedia chapter (Wikimedia Deutschland), community complaints about a lack of support from the Foundation for the system were present even then, e.g. in a talk at Wikimania 2008 (notes, video recording) by User:P. Birken, a main driving force behind the project. Perhaps relatedly, the authors of the present study highlight a lack of researcher attention:

"Despite its importance and deployment in a number of large Wikipedia communities, very little is known regarding the effectiveness of the system and its impact. A report made by the members of the Wikimedia Foundation in 2008 gave a brief overview of the extension, its capabilities and deployment status at the time, but acknowledged that “it is not yet fully understood what the impact of the implementation of FlaggedRevs has been on the number of contributions by new users.”[supp 4] Our work seeks to address this empirical gap."

Still, it may be worth mentioning that there have been at least two preceding attempts to study this question (neither of which has been published in peer-reviewed form, so their omission from the present study is understandable). They likewise don't seem to have identified major concerns that FlaggedRevs might contribute to community decline:

  1. A talk at Wikimania 2010 presented preliminary results from a study commissioned by Wikimedia Germany, e.g. that on German Wikipedia, "In general, flagged revisions did not [affect] anonymous editing" and that "most revisions got approved very rapidly" (the latter result surely doesn't hold everywhere; e.g. on Russian Wikipedia, the median time for an unregistered editor's edit to get reviewed is over 13 days at the time of writing - see the sketch after this list). It also found, unsurprisingly, a "reduced impact of vandalism", consistent with the present study.
  2. An informal investigation of an experiment conducted by the Hungarian Wikipedia in 2018/19 similarly found that FlaggedRevs had "little impact on the growth of the editor community" overall. The experiment consisted of deactivating the feature of FlaggedRevs that hides unreviewed revisions from readers. As a second question, the Hungarian Wikipedians asked "How much extra load does [deactivating FlaggedRevs] put on patrollers?" They found that "[t]he ratio of bad faith or damaging edits grew minimally (2-3 percentage points); presumably it is a positive feedback for vandals that they see their edits show up publicly. The absolute number of such edits grew significantly more than that, since the number of anonymous edits grew [...]."
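
As an aside on how such review-lag figures are derived: conceptually, the Russian Wikipedia number above is just the median of the delays between when edits were submitted and when they were reviewed. Below is a minimal sketch, assuming the (submitted, reviewed) timestamp pairs have already been extracted; the pairs shown are made up for illustration, and this is not a real FlaggedRevs query.

# Illustrative only: computing a median review delay from (submitted, reviewed)
# timestamp pairs extracted beforehand; the pairs below are made-up examples.
from datetime import datetime
from statistics import median

pairs = [
    ("2023-09-01T10:00:00", "2023-09-15T09:30:00"),
    ("2023-09-02T08:00:00", "2023-09-02T08:05:00"),
    ("2023-09-03T12:00:00", "2023-09-20T18:00:00"),
]

def review_delay_days(submitted: str, reviewed: str) -> float:
    fmt = "%Y-%m-%dT%H:%M:%S"
    delta = datetime.strptime(reviewed, fmt) - datetime.strptime(submitted, fmt)
    return delta.total_seconds() / 86400

print(f"median review delay: {median(review_delay_days(s, r) for s, r in pairs):.1f} days")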

In any case, the CSCW paper reviewed here presents a much more comprehensive and methodical approach, not just in studying the impact of FlaggedRevs across multiple wikis, but also in formalizing various research hypotheses and in using more reliable statistical techniques.

(Disclosure: This reviewer provided some input to the authors at the beginning of their research project, but was not involved in it otherwise.)

See also related earlier coverage: "Sociological analysis of debates about flagged revisions in the English, German and French Wikipedias" (2012)


Briefly

Other recent publications

Other recent publications that could not be covered in time for this issue include the items listed below. Contributions, whether reviewing or summarizing newly published research, are always welcome.

"Wikidata as Semantic Infrastructure: Knowledge Representation, Data Labor, and Truth in a More-Than-Technical Project"

From the abstract:[2]

"Various Wikipedia researchers have commended Wikidata for its collaborative nature and liberatory potential, yet less attention has been paid to the social and political implications of Wikidata. This article aims to advance work in this context by introducing the concept of semantic infrastructure and outlining how Wikidata’s role as semantic infrastructure is the primary vehicle by which Wikipedia has become infrastructural for digital platforms. We develop two key themes that build on questions of power that arise in infrastructure studies and apply to Wikidata: knowledge representation and data labor."


"Naked data: curating Wikidata as an artistic medium to interpret prehistoric figurines"

From the abstract:[3]

"In 2019, Digital Curation Lab Director Toni Sant and the artist Enrique Tabone started collaborating on a research project exploring the visualization of specific data sets through Wikidata for artistic practice. An art installation called Naked Data was developed from this collaboration and exhibited at the Stanley Picker Gallery in Kingson, London, during the DRHA 2022 conference. [...] This article outlines the key elements involved in this practice-based research work and shares the artistic process involving the visualizing of the scientific data with special attention to the aesthetic qualities afforded by this technological engagement."

"What makes Individual I's a Collective We; Coordination mechanisms & costs"

From the abstract:[4]

"Diving into the Wikipedia ecosystem [...] we identified and quantified three fundamental coordination mechanisms and found they scale with an influx of contributors in a remarkably systemic way over three order of magnitudes. Firstly, we have found a super-linear growth in mutual adjustments (scaling exponent: 1.3), manifested through extensive discussions and activity reversals. Secondly, the increase in direct supervision (scaling exponent: 0.9), as represented by the administrators’ activities, is disproportionately limited. Finally, the rate of rule enforcement exhibits the slowest escalation (scaling exponent 0.7), reflected by automated bots. The observed scaling exponents are notably robust across topical categories with minor variations attributed to the topic complication. Our findings suggest that as more people contribute to a project, a self-regulating ecosystem incurs faster mutual adjustments than direct supervision and rule enforcement."

"Wikidata Research Articles Dataset"

From the abstract:[5]

"The "Wikidata Research Articles Dataset" comprises peer-reviewed full research papers about Wikidata from its first decade of existence (2012-2022). This dataset was curated to provide insights into the research focus of Wikidata, identify any gaps, and highlight the institutions actively involved in researching Wikidata."

References

  1. ^ Tran, Chau; Champion, Kaylea; Hill, Benjamin Mako; Greenstadt, Rachel (2022-11-11). "The Risks, Benefits, and Consequences of Prepublication Moderation: Evidence from 17 Wikipedia Language Editions". Proceedings of the ACM on Human-Computer Interaction. 6 (CSCW2): 333:1–333:25. doi:10.1145/3555225 (closed access). / Tran, Chau; Champion, Kaylea; Hill, Benjamin Mako; Greenstadt, Rachel (2022-11-07). "The Risks, Benefits, and Consequences of Prepublication Moderation: Evidence from 17 Wikipedia Language Editions". Proceedings of the ACM on Human-Computer Interaction. 6 (CSCW2): 1–25. doi:10.1145/3555225. ISSN 2573-0142.
  2. ^ Ford, Heather; Iliadis, Andrew (2023-07-01). "Wikidata as Semantic Infrastructure: Knowledge Representation, Data Labor, and Truth in a More-Than-Technical Project". Social Media + Society. 9 (3): 20563051231195552. doi:10.1177/20563051231195552. ISSN 2056-3051.
  3. ^ Sant, Toni; Tabone, Enrique (2023). "Naked data: curating Wikidata as an artistic medium to interpret prehistoric figurines". International Journal of Performance Arts and Digital Media. 0 (0): 1–18. doi:10.1080/14794713.2023.2253335. ISSN 1479-4713. Retrieved 2023-09-27.
  4. ^ Yoon, Jisung; Kempes, Chris; Yang, Vicky Chuqiao; West, Geoffrey; Youn, Hyejin (2023-06-03), What makes Individual I's a Collective We; Coordination mechanisms & costs, arXiv, doi:10.48550/arXiv.2306.02113
  5. ^ Farda-Sarbas, Mariam (2023), Wikidata Research Articles Dataset, Freie Universität Berlin, doi:10.17169/refubium-40231
Supplementary references and notes:

  1. ^ https://meta.wikimedia.org/wiki/Talk:Flagged_Revisions
  2. ^ https://phabricator.wikimedia.org/T66726#3189794
  3. ^ https://phabricator.wikimedia.org/T66726#3189794
  4. ^ https://meta.wikimedia.org/wiki/FlaggedRevs_Report_December_2008
