Talk:Incident Reporting System

Incident Reporting System

The following Wikimedia Foundation staff monitor this page:

In order to notify them, please link their username when posting a message.

You can leave your comments here; however, we are particularly interested in answers to the questions below.

  1. How do you currently deal with inappropriate behaviour in your community?
  2. What are some of the most common UCoC violations in your community?
  3. How are these currently being reported and processed? Is there a standard approach?
  4. What would you like to change or do differently from the current approach?
  5. The need to have a private space where users can file a report without feeling exposed or unsafe came up in our research. Many of the current processes require all conversations to be public. What do you think about exploring ways users can file private reports? What are some possible solutions? Is there anything particularly concerning we should be aware of when thinking about this?
  6. What worries you about this project?
  7. Are we missing something? Is there a question we didn't ask?

Questions

Hi guys, I would love to answer your questions in a more private way. Do you have these questions on a form or an email where I can send the answers? Thank you and congratulations on your work! XenoF (talk) 22:17, 24 August 2022 (UTC)

We will make a Google Form available. Until then, you can also answer privately in an email to @MAna (WMF) ––– STei (WMF) (talk) 14:15, 25 August 2022 (UTC)

Use something like the Volunteer Response Team system?

I'd encourage you to look at how the Volunteer Response Team works (both generally, and see commons:Commons:Volunteer Response Team), and perhaps you could share infrastructure to avoid setting up a completely new system? That system already exists and seems to work well, but it could also do with improvements that you'd want anyway for a new system. Thanks. Mike Peel (talk) 12:57, 25 August 2022 (UTC)

It will need to be a new system. This is something specifically requested in the UCoC enforcement guidelines (a project/process-agnostic, private, MediaWiki-based reporting tool). Vermont 🐿️ (talk) 14:17, 25 August 2022 (UTC)
@Vermont: MediaWiki-based sounds good. Maybe it would be something that VRT could use in the future, instead of a separate system? Either way, I think there are synergies here, and experience that could be shared. Thanks. Mike Peel (talk) 16:28, 25 August 2022 (UTC)
I included in my list of suggestions below a recommendation that enforcement processes be able to have the tool forward reports to VRT queues as emails. It's necessary to have that option, though optimally the primary ticket-handling method would be through MediaWiki, like the global rename queue. Vermont 🐿️ (talk) 16:30, 25 August 2022 (UTC)
Thank you Mike, this is valuable. ––– STei (WMF) (talk) 14:17, 25 August 2022 (UTC)

Some thoughts

There are a lot of questions to answer and unpack in developing this project - questions which would usually be the result of months of large-scale community discussion on a local project. And this is something with global effects: a reporting tool that should be easily accessible to both reporters and enforcement processes.

If this consultation does not effectively communicate with the people who we hope will use this tool once it's created, those people will not use it. This stage of discussion is incredibly important, and unfortunately not everyone is equipped to give informed opinions on what is needed from a reporting tool. I recommend engaging in more targeted consultations than village pumps and talk pages, specifically communicating with the established enforcement processes that we hope will end up using this tool. This can be part of identifying pilot wikis, but the difficulty there would be creating a centralized tool with global modularity without catering too much to the specific needs of the pilot wikis.

Something like Special:GlobalRenameQueue in terms of design and ticket management could be cool. The difficulty would be making it project-agnostic, and allowing projects to pass individual reports between each other or share access.

Some things such a tool would need (a rough data-model sketch follows the list):

  • Everything as defined in the UCoC Enforcement Guidelines, including ongoing support.
  • Opt-in from local projects with the capacity to handle this and interest in doing so
  • Ability to specify which enforcement body on a project you want to send the report to. Stewards or Meta-Wiki admins on Meta-Wiki. A project's CheckUsers or ArbCom or local admins, etc.
  • As a corollary of the above, the ability for enforcement processes receiving a report to easily forward that report to other enforcement processes, and/or allow view or reply access from those other enforcement processes. Reports should by default be visible only to the entity they are reported to, and possibly some other specific groups (U4C members, Stewards, Ombuds, T&S).
  • Users should be able to access the central reporting system, and whatever ticket management results from it, from any Wikimedia project without losing the centralized aspect, so that forwarding and changes of ticket ownership are easy.
  • Varying degrees of privacy. Allow reports to be made publicly, allow them to be made privately, and allow the reporter to specify whether they are okay with their private report being made public (which people handling reports can then change).
  • Local project modularity. It's important to ensure that it is globally accessible and easily forwardable/viewable between processes, but it's also important to allow local projects to decide what questions are asked (what fields are open in the report) to the reporter. This should include the option for some projects to have their reports automatically converted to an email, specifically for those who prefer VRT queues.
  • Process-agnostic. Mentioned a few times in this list, but...allow the reporter to select, from a drop-down or something, which enforcement process to send their report to. Processes can be listed on an opt-in basis by those processes.
  • Something to clarify that this is not for content disputes or basic things that can be solved with talk page discussions.
  • Option for enforcement processes to leave internal comments on tickets/reports not visible to the reporter
  • Reporting should be an action that shows up in Special:CheckUser
  • Options for declining, closing without comment, closing with comment, etc.
  • Email updates to reporter if preferred, otherwise on-wiki notification updates.
  • When it comes to whether IPs can report, I'd say probably the best answer is no, but...maybe allow enforcement processes to permit IP reports on a process-by-process basis?
  • Specific users to be listed with certain access levels based on user rights, AND based on lists. For example, arbcoms aren't a user right, but admins/checkusers/stewards/ombuds are.
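
To make the shape of these requirements concrete, here is a minimal sketch, in illustrative Python, of how a report and its routing between enforcement processes might be modelled. Every name here is hypothetical; nothing reflects an actual MediaWiki API or the team's design.

  from dataclasses import dataclass, field
  from enum import Enum

  class Visibility(Enum):
      PUBLIC = "public"
      PRIVATE = "private"  # visible only to the receiving process (plus U4C, Stewards, Ombuds, T&S)

  @dataclass
  class EnforcementProcess:
      """An opt-in queue: Stewards, a local ArbCom, Meta-Wiki admins, a VRT queue, etc."""
      name: str
      wiki: str
      forward_as_email: bool = False    # for processes that prefer VRT-style email queues
      accepts_ip_reports: bool = False  # off by default, enabled process-by-process

  @dataclass
  class Report:
      reporter: str
      target_process: EnforcementProcess
      fields: dict[str, str]            # answers to whatever fields the local project defined
      visibility: Visibility = Visibility.PRIVATE
      ok_to_publish: bool = False       # reporter's stated comfort with later publication
      internal_comments: list[str] = field(default_factory=list)  # never shown to the reporter

      def forward(self, other: EnforcementProcess) -> None:
          """Hand the ticket to another enforcement process without losing its history."""
          self.target_process = other

The forward method is the design point of the corollary bullet above: a ticket changes hands between processes without losing its contents or access history.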

This is a very significant tool, with a lot of idiosyncrasies to be aware of and work with. There are many community-maintained tools that engage in report management like this, and many more community processes; making a centralized reporting system is a difficult task. Best, Vermont 🐿️ (talk) 15:39, 25 August 2022 (UTC)

I will have additional thoughts but I wanted to wholeheartedly endorse Vermont's points here in the interim. Best, KevinL (aka L235 · t) 05:49, 26 August 2022 (UTC)
Thanks for this, super interesting and lots to process! I'm curious in particular about the varying degrees of privacy and the thinking behind them. What are we looking to solve here? Is it about user preference/comfort, or is it more that different types of harassment cases would require different approaches when it comes to privacy? MAna (WMF) (talk) 17:20, 26 August 2022 (UTC)
MAna, thank you for your reply! How we currently handle privacy is quite complicated. It depends on the reporter, it depends on the violation being reported, and it depends on a near-infinite number of other factors. It also varies significantly between enforcement processes. A focus on allowing enforcement processes to customize how they accept reports is necessary for those enforcement processes to use this tool.
In my view, at least these few options are needed (a rough sketch of how they might compose follows the list):
  • For the reporter
    • Option to post the initial report publicly or send privately
    • If privately, option to state their comfort level with it being made public
  • For the enforcement process
    • Ability to limit the reporter's options, say to accept only public reports, or only private ones.
    • Option to make individual private reports public, with the consent of the reporter. Note that some reports may by necessity go unactioned if the issue is something that would ordinarily be addressed publicly but cannot be because of the reporter's preference for privacy. This is something for individual enforcement processes to weigh.
    • Option to make individual public reports private
    • Whether to allow anonymous reports (basically everyone's going to say no but it could be beneficial to have the option here)
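
As a rough illustration of how those options might compose, continuing the hypothetical Python classes sketched above (these names are mine, not a proposed API):

  class ProcessPrivacyPolicy:
      """Per-process configuration of what the reporter may choose."""
      def __init__(self, allow_public=True, allow_private=True, allow_anonymous=False):
          self.allow_public = allow_public
          self.allow_private = allow_private
          self.allow_anonymous = allow_anonymous

      def publish(self, report: Report) -> None:
          """Make a private report public, but only with the reporter's consent."""
          if report.visibility is Visibility.PRIVATE and not report.ok_to_publish:
              raise PermissionError("reporter has not consented to publication")
          report.visibility = Visibility.PUBLIC

      def withdraw_from_public(self, report: Report) -> None:
          """Make an individual public report private."""
          report.visibility = Visibility.PRIVATE
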
It's an incredibly difficult task to make a reporting structure that Wikimedia enforcement processes are going to want to use, and it will require inordinate levels of customization. Frequent communication with people involved in those enforcement processes is needed; if not, no one's going to want to use this system. PIRS not only has to be as good as existing tools, but better, to warrant a shift. Vermont 🐿️ (talk) 19:34, 26 August 2022 (UTC)
MAna, a question: I noticed that there doesn't seem to be a part of the two-step plan where designs for this (whether wireframes, flow charts, feature lists, etc.) go through any sort of community review prior to the actual software being made. This is prone to missing needed features that may be very difficult to add post-implementation, and could make the effort put into the project moot. Special:Investigate, for example, is not widely used, in large part because of insufficient iteration and discussion with the people expected to use the tool. Is there the possibility of changing the timeline or plan to account for this? Vermont 🐿️ (talk) 04:22, 27 August 2022 (UTC)
We definitely want to have designs go through some sort of community discussion/feedback. It is not specifically listed, but we are thinking about it as part of Phase 1. MAna (WMF) (talk) 17:04, 29 August 2022 (UTC)
I also want to add something about question 5. Many established enforcement processes accept reports via email; the ability to accept those is not really a problem. The difficulty is that many discussions are required by community consensus to take place publicly. So someone may send in a private email about a content dispute that should be reported to the relevant public noticeboard...but they will be told that their options are either to report it publicly themselves or to disengage from the dispute. This tool cannot change those practices; that is a matter of community policy/guidelines. Vermont 🐿️ (talk) 00:19, 30 August 2022 (UTC)

Process and Feedback

Hello - I'm going to split my thoughts on the process and some comments in response to the questions asked.

Process

My thoughts here have some, but not total, overlap with Vermont's well-elucidated section above.

  • I would note that the tool's scope (even for consultation) might have been better written after we'd seen the amendments from the Revision Committee, particularly with regard to the right-to-privacy/right-to-be-heard balance.
  • I would support Vermont's statement that specific consultation with functionary groups (more specifically, distinct outreach to Stewards, arbs, and CUOS) is needed.
  • But I would also add that while those groups handle the most complex cases, admins/eliminators handle the most by number, so I wouldn't deprioritise getting feedback from that source.
  • Just a stress on the need for genuinely responsive consultation; I'd suggest something along the lines of the discussion tools process. Otherwise it risks blowback in an area that can least afford it, and the loss of a potentially very helpful tool, not to mention the waste of the team's time!

Thoughts

  • This tool is going to need a simply insane amount of customisation capability to be able to cover both all the local projects and Steward actions.
  • Privacy - currently this tool is named the "Private Incident Reporting System", which I think must change. The tool can enable privacy, but the name creates an anticipation that it will always provide it. The revisions to the Enforcement Guidelines are likely to significantly reduce the required level of privacy in most cases. Thus we have the first customisation point - how much privacy those raising conduct issues will have through the system. Some projects might want a high level, but others will be using the status quo + UCoC (whichever is the higher).
  • The tool can't just default to a high level of privacy, then; it needs project-by-project calibration, or a low-level default if that isn't possible.
  • Conduct pathways - one of the best ways this tool can be helpful is to ease, automate, and guide conduct submissions from newer editors. En-wiki, for example, has numerous conduct fora (AIV, ANI, AN, AN3, ArbCom, etc.). A platform that can take them step by step (complaint, evidence, requested remedy, notification of the accused, etc.) would smooth many things. These pathways need not all be written by the team, but if not, a project's editors need to be able to create them readily (see the sketch after this list).
  • The platform should then either post the submission on the relevant board (for public cases) or convert it to an email for those cases the project feels should be private.
  • Many conduct issues (especially the most common, such as vandalism) lead to a warning. Routing every identified instance of vandalism to either a board or an admin would lead to a large expenditure of time and lots of "insufficient prior warnings, editor not blocked but warned" outcomes. Whether this might be resolved by just excluding lower-level issues like that, or by encouraging the editor to go issue a warning themselves, I'm not sure, but it should be considered.
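
One way such a project-editable pathway could look, as a simple declarative structure in the same illustrative Python as above (the step names come from the conduct-pathways bullet; the page title and email address are placeholders, not real destinations):

  # A pathway is an ordered list of prompts plus a delivery rule.
  # Projects could maintain these themselves, e.g. as on-wiki JSON pages.
  ENWIKI_CONDUCT_PATHWAY = {
      "steps": [
          {"id": "complaint", "prompt": "Describe the conduct issue."},
          {"id": "evidence", "prompt": "Link the relevant diffs."},
          {"id": "remedy", "prompt": "What outcome are you requesting?"},
          {"id": "notification", "prompt": "Confirm you have notified the accused editor."},
      ],
      # Public cases get posted to a noticeboard; private ones convert to email.
      "delivery": {
          "public_board": "Wikipedia:Administrators' noticeboard/Incidents",
          "private_email": "conduct-queue@example.org",
      },
  }

Whether pathways would live as on-wiki JSON, templates, or something else entirely is an open design question.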

I will likely return to add a few more answers to the team's specific questions, but the above were my primary concerns and thoughts. Please feel free to ping me to clarify or talk! :) Nosebagbear (talk) 20:44, 25 August 2022 (UTC)

The tool should by default be as private as email, imo, unless the handling enforcement process or the reporter wants otherwise. This is primarily for issues that require sensitivity; it isn't for content disputes or most things that can go on public noticeboards. And yep, enforcement processes should be able to have subqueues of some sort, like your conduct pathways idea here, as we do in VRT. Vermont 🐿️ (talk) 21:22, 25 August 2022 (UTC)
Conduct pathways - are you thinking about something like a step-by-step guide of what to do, where to go, and what kind of information to submit? MAna (WMF) (talk) 17:39, 26 August 2022 (UTC)

Comments from zzuuzz@enwiki

Hello. I've got to admit I'm unclear about this project's scope, and even the UCoC's scope, and I'm nowhere near having any useful questions or answers. However, in the spirit of answering some of the questions above, I'll respond with the following, from my own perspective as an enwiki admin and checkuser who has read the UCoC.
The vast majority of UCoC violations can be classed as generic vandalism. It's pervasive. This is probably followed by POV-pushing, including paid editing and spam. Hey, you did ask. Following that, the most common violations that I see are threats of violence, outing, persistent defamation, harassment, hate speech, and other abuse from sockpuppets and long-term abusers. To respond to any of this we generally warn, report, and block. There are stronger measures, such as edit filters, and lesser approaches, such as topic bans, dispute resolution and just having a chat, but most of anything that's inappropriate, especially grossly or repeatedly inappropriate, will at least result in a block. We're really not shy about blocking.
Violations are reported to admins through various noticeboards and talk pages, occasionally privately through various routes, and are also frequently processed without any report (i.e. by admins at large). I wouldn't say the variety of venues, or identifying abuse, is generally a problem, and there are lots of signposts to, e.g., the help desks, if you look for them. Make a report to someone somewhere, or be abusive, and someone will probably notice it.
Personally, I would like to see the WMF work better with everyday admins, with ISPs to help them enforce their terms, and with legal processes against unwelcome criminal behaviour, including outside the US. For me, then, I would view such a system as a two-way pipeline to T&S or other departments to build cases to take action in the real world against people who are already blocked or banned. That's where UCoC violations and an 'unsafe environment' are really happening. We currently have a pipeline through email addresses, but in my experience they typically resemble a black hole. It's a type of interaction that I'll rarely bother with, on a par with writing a personal blog that no one reads. I read the description of this project and wonder what proportion of reports are going to be generic vandalism and spam that could and should have been reported in public, so they're not restricted to an unnecessarily small crowd. If you're not careful, I think you're at risk of being overloaded. -- zzuuzz (talk) 01:02, 28 August 2022 (UTC)

Please respect Wikimedia community volunteer attention and time

I would like to voice my concern that the Wikimedia Foundation has a history of proposing projects like this, making significant investments, then abandoning the projects without closing them out with community stakeholders. This cycle is a major disruption to Wikimedia community planning, and creates a conflict wherein Wikimedia Foundation staff get paid regardless, while community members have no power or resources to advocate for themselves to get a usable final product delivered.

Can you please do the following:

  1. Publicly disclose the labor or resource investment which WMF will put into this
  2. Publicly give an estimate of how much volunteer labor you expect to recruit to advance this
  3. Commit to publishing at least annual updates on this project

I am advocating for the Wikimedia community here and requesting sympathy. When these projects have failed in the past, the Wikimedia community has been left with broken promises where the WMF committed help. This has also caused wasted time, as the WMF discouraged the community from using resources to seek other solutions in favor of advancing the staff one. If you are going to do this, make strong commitments to follow through. Also, Wikimedia community labor is not an endless resource to be recruited! I know that our community members volunteer generously, but they do this because of trust in Wikipedia, not because they owe it to a paid-staff cycle of tool development! Do not violate that trust! Please disclose in advance how many volunteer hours you will be recruiting and from which communities, and give explicit published credit to communities which source volunteers to support this project!

Past discussions include the 2014 Grants:IdeaLab/Centralised harassment reporting and referral service and the 2019 Community health initiative/User reporting system consultation 2019. In addition to these, I would estimate that there were five other seemingly major Wikimedia Foundation initiatives which recruited community members to interviews and focus groups to address this challenge, but they either were not documented or I do not know where to find the records.

I appreciate and support this project. Please proceed ethically and in a socially appropriate way! Bluerasberry (talk) 16:11, 29 August 2022 (UTC)Reply

Get data to third-party researchers

[Image: a chart to illustrate a process for managing complaints]

It is more important that we get data about complaints than that we actually address the complaints themselves, because we are currently unable to study the problem well. If we have data to study the problem, then we will find the solution; if instead we proceed without collecting data, whatever justice we are able to provide will only be a short-term fix for a problem we do not understand.

Some data about those complaints should go to a trusted neutral third party: not the Wikimedia Foundation, and not volunteers, but probably either a locked dataset which cannot experience tampering, or a university researcher. The reason we need this data is that we do not know the scope, depth, and variety of harassment, and also that the Wikimedia community has low trust in reporting after a generation of inadequate processes. I want to say specifically that Wikimedia community members will report Wikimedia Foundation staff for harassment, and that in the past, Wikimedia Foundation staff have been accused of abusing power to dismiss such complaints. Because of this, the design of the system should eliminate all speculation that Wikimedia Foundation staff can pressure or influence the system to avoid accusations. If there were a trusted system in place, then I think much harassment would be prevented.

About the proposal pictured here, we need three things. First, everyone needs to have trust that their complaint is received: when they submit, they need a receipt confirming that their report exists. Second, the reporting system is unrelated to any justice system, so we do not need to create justice, but rather give referrals to justice processes where they exist elsewhere, either on wiki or off, as available. Finally, and most important, deidentified data about complaints needs to be available to third-party researchers who can report trends and outcomes, even if they are embarrassing. A rough sketch of the receipt and deidentification ideas follows.
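
As a minimal sketch of those two mechanics, in the same illustrative Python as above (the hashing scheme and the field choices are my own assumptions, not a vetted privacy design):

  import hashlib
  import uuid
  from datetime import datetime, timezone

  def issue_receipt(report_id: str) -> dict:
      """Give the reporter proof that their complaint exists in the system."""
      return {
          "report_id": report_id,
          "received_at": datetime.now(timezone.utc).isoformat(),
          "receipt_token": uuid.uuid4().hex,
      }

  def deidentify(report: dict, salt: str) -> dict:
      """Strip identities before release to third-party researchers.

      Usernames are replaced with salted hashes so trends (e.g. repeat
      reporters or targets) stay visible without exposing who was involved.
      """
      def pseudonym(name: str) -> str:
          return hashlib.sha256((salt + name).encode()).hexdigest()[:12]

      return {
          "category": report["category"],  # e.g. "harassment", "outing"
          "wiki": report["wiki"],
          "reporter": pseudonym(report["reporter"]),
          "outcome": report.get("outcome", "open"),
      }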

Thanks - Bluerasberry (talk) 16:25, 29 August 2022 (UTC)Reply

My Comments

https://altilunium.my.id/p/PIRS Rtnf (talk) 23:19, 29 August 2022 (UTC)

Scope / Universal_Code_of_Conduct

I find it worrying that the Universal Code of Conduct isn't mentioned until nearly the end of the document. In particular, it's not mentioned in the background and focus sections. Unless the scope is clear to all participants, I'm sure we can expect confusion, non-optimal feedback, and dashed expectations. For example:

  • Is PIRS meant to be a reporting system covering all Universal Code of Conduct violations, or just 'harassment and other forms of abuse'?
  • Is PIRS meant to cover just Universal Code of Conduct issues, or also other issues that need to be reported and handled privately (security breaches, legal issues, etc.)?
  • Presumably there'll be a separate reporting method for issues involving those with access to PIRS internals?

Looking forward to clarity. Stuartyeates (talk) 09:33, 30 August 2022 (UTC)