
End-user image suppression

From Meta, a Wikimedia project coordination wiki

Revision as of 16:36, 23 February 2005

This article seeks to explore the implementation of end-user-controlled suppression of potentially offensive images on Wikimedia projects. It does not seek to implement the idea, but to explore how it might be implemented if it is wanted.

Following on from recent debate about the appropriateness of sexually explicit images on Wikipedia, there has been some level of agreement to develop a technical solution to the problem of users being presented with potentially offensive images, particularly sexually explicit or medically explicit images.

The primary goal is to avoid offending some of our users while also avoiding the need to participate in self-censorship. A secondary goal would be the PR value this system would provide in staving off attacks by those who might seek to discredit Wikimedia projects, even though we have content disclaimers.

This solution would primarily involve the tagging of potentially offensive images coupled with some combination of the following:

  1. a site-based preference option allowing users to remove, hide or show this content by default
  2. a cookie-based preference option allowing users to remove, hide or show this content by default
  3. browser-based filtration, when browsers support it
  4. proxy and IP based filtration for organisations (such as primary schools, etc.)

Tagging images

Tagging images would simply involve ensuring that potentially offensive images have all been given categories as per normal procedures.
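Since tagging relies only on ordinary categories, a suppression check reduces to a set intersection between an image's categories and the categories designated as suppressible. A minimal sketch, with illustrative category names (the actual category tree would be decided by each project):

```python
# Sketch of a category-based suppression check.
# The category names below are hypothetical examples.

SUPPRESSIBLE_CATEGORIES = {
    "Sexually explicit images",
    "Medically explicit images",
}


def is_potentially_offensive(image_categories):
    """An image is suppressible if any of its categories is in the
    designated set of potentially offensive categories."""
    return not SUPPRESSIBLE_CATEGORIES.isdisjoint(image_categories)
```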

Default setting

A default setting for new and anonymous users would need to be chosen, although this may be largely redundant if we use a cookie.
