End-user image suppression

This page deals with the technical implications of implementing end-user-controlled suppression of potentially offensive images on Wikimedia projects. For discussion of whether it would be desirable, see Desirability of end-user content suppression.

This page originated from a discussion on the WikiEN-l mailing list, where some level of consensus was reached between the free-speech and content-suppression parties that a technical solution is possible which would go a long way towards satisfying most mainstream positions in the debate.

The primary goal would be to give end-users the choice of whether they see certain potentially offensive images by default, thus avoiding giving offence and minimising the urge of editors to engage in self-censorship, while also placating those who would seek to discredit Wikimedia projects based on the adult nature of some content (disclaimers aside).

It would primarily involve ensuring that potentially offensive images are categorised, coupled with some combination of the following:

  1. a site-based preference option allowing users to remove, hide or show potentially offensive images by default
  2. a cookie-based preference option allowing users to remove, hide or show potentially offensive images by default (see the sketch after this list)
  3. browser-based filtration (when supported in the future)
  4. proxy- and IP-based filtration for organisations (such as primary schools)
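
As a rough illustration of how the cookie-based option (item 2) might work on the client side, here is a minimal sketch in TypeScript. The cookie name ("showPotentiallyOffensive") and the marker class ("potentially-offensive") are assumptions made for this example, not existing MediaWiki conventions; the idea is simply that categorised images carry some marker in the rendered page that a small script can act on.

  // Hide images flagged as potentially offensive unless the user's cookie
  // says otherwise. Cookie name and marker class are illustrative only.
  function readCookie(name: string): string | null {
    const pair = document.cookie
      .split("; ")
      .find((entry) => entry.startsWith(name + "="));
    return pair ? decodeURIComponent(pair.split("=")[1]) : null;
  }

  function applyImagePreference(): void {
    // Default to hiding unless the user has explicitly opted in.
    const show = readCookie("showPotentiallyOffensive") === "1";
    document
      .querySelectorAll<HTMLElement>("img.potentially-offensive")
      .forEach((img) => {
        img.style.display = show ? "" : "none";
      });
  }

  function setImagePreference(show: boolean): void {
    // Persist the choice for roughly a year; path=/ makes it site-wide.
    const maxAge = 60 * 60 * 24 * 365;
    document.cookie =
      "showPotentiallyOffensive=" + (show ? "1" : "0") +
      "; max-age=" + maxAge + "; path=/";
    applyImagePreference();
  }

  document.addEventListener("DOMContentLoaded", applyImagePreference);

The site-based preference (item 1) could drive the same marker from the server side instead, so that images are omitted from the page entirely when a logged-in user has chosen "remove".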

Categorising images

Categorising images would simply involve ensuring that all potentially offensive images have been given categories as per normal procedures.
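
To connect that categorisation to the suppression mechanisms above, software would need some way of asking which categories an image belongs to. The following is a minimal sketch against the MediaWiki query API (action=query, prop=categories); the category name "Category:Potentially offensive images" is hypothetical and used only for illustration.

  // Ask the MediaWiki API whether a file sits in a nominated category.
  // The category name passed in is an assumption, not an agreed category.
  interface CategoryQueryResponse {
    query?: {
      pages?: Array<{ title: string; categories?: Array<{ title: string }> }>;
    };
  }

  async function isInCategory(
    fileTitle: string,   // e.g. "File:Example.jpg"
    category: string,    // e.g. "Category:Potentially offensive images"
    apiUrl = "https://commons.wikimedia.org/w/api.php"
  ): Promise<boolean> {
    const params = new URLSearchParams({
      action: "query",
      prop: "categories",
      titles: fileTitle,
      clcategories: category, // only report membership of this one category
      format: "json",
      formatversion: "2",
      origin: "*",            // allow anonymous cross-origin requests
    });
    const response = await fetch(apiUrl + "?" + params.toString());
    const data = (await response.json()) as CategoryQueryResponse;
    const page = data.query?.pages?.[0];
    return (page?.categories ?? []).some((c) => c.title === category);
  }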

Default setting

A default setting for new and anonymous users would need to be chosen, although this may be largely redundant if we use a cookie.
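
A sketch of how the effective setting might be resolved for a given request, assuming the order of precedence is an explicit account preference, then the cookie, then the project-wide default (all names here are illustrative):

  // Resolve the effective image mode: the account preference wins over the
  // cookie, which wins over the project default for new/anonymous users.
  type ImageMode = "show" | "hide" | "remove";

  const PROJECT_DEFAULT: ImageMode = "show"; // whichever default is chosen

  function resolveImageMode(
    accountPreference: ImageMode | null, // site-based option, if logged in
    cookiePreference: ImageMode | null   // cookie-based option, if set
  ): ImageMode {
    return accountPreference ?? cookiePreference ?? PROJECT_DEFAULT;
  }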
