End-user image suppression

This page deals with the technical implications of implementing [[Wikipedia:end-user|end-user]] controlled suppression of [[potentially offensive images]] on [[Wikimedia projects]].
*For discussion of censorship on Wikimedia projects, see [[Censorship on Wikimedia projects]].
*For discussion of whether it would be desirable, see [[desirability of end-user image suppression]].

This page originated from a discussion on the WikiEN-L [[mailing list]], where some level of consensus was reached between the free-speech and content-suppression parties that a technical solution is possible that would go a long way towards satisfying most mainstream positions in the debate. It should be stressed, however, that this page is ''not'' intended to imply that such a solution is a ''fait accompli''; it is merely an initial exploration of the specification and feasibility of a possible solution. The features described below are ''not'' currently under development, and will certainly not be made live until their desirability has been firmly established.

== Goal ==

The primary goal would be to give [[Wikipedia:end-user|end-users]] the choice of whether or not they see certain [[potentially offensive images]] by default. This would avoid causing offence, and hence minimise the urge of editors to participate in [[Wikipedia:Self-censorship|self-censorship]], while also placating those who would seek to discredit Wikimedia projects based on the adult nature of some content (disclaimers aside).

== Basic idea ==

It would primarily involve ensuring that [[potentially offensive images]] are categorised, coupled with some combination of the following:

# a site-based preference option allowing users to remove, hide or show [[potentially offensive images]] by default
# a cookie-based preference option allowing users to remove, hide or show [[potentially offensive images]] by default
# browser-based filtration (when supported in the future)

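The first two options in the list above amount to resolving a single effective mode from up to two sources. A minimal sketch of that precedence, using hypothetical names and a placeholder project-wide default (none of these identifiers come from MediaWiki itself):

```python
# Illustrative only: the mode names, function name, and fallback value are
# assumptions, not part of any actual MediaWiki implementation.

SITE_DEFAULT = "show"  # whatever initial default the community chooses


def effective_mode(site_pref, cookie_pref):
    """site_pref / cookie_pref are 'show', 'hide', 'remove', or None."""
    if site_pref is not None:    # registered user with a saved preference
        return site_pref
    if cookie_pref is not None:  # anonymous user who made a session choice
        return cookie_pref
    return SITE_DEFAULT          # nobody has chosen anything yet
```

Here a saved site preference wins over a session cookie, which in turn wins over the project-wide default; that precedence is itself a design decision, not something the proposal fixes.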
=== Initial default setting ===
An initial default setting for newly registered users would need to be chosen.

For anonymous users we could use a [[Wikipedia:HTTP_cookie|cookie]]. The first time an anonymous user attempts to view a page containing [[potentially offensive images]], they would be presented with a page asking them to choose a default setting for their session, and that choice would be stored in the cookie.
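
The cookie flow described above can be sketched as follows; the cookie name, the mode values, and the function names are illustrative assumptions, not an actual MediaWiki interface:

```python
# Sketch of the cookie-driven default for anonymous users. The cookie name
# "img_suppression" and the three mode values are invented for illustration.

MODES = {"show", "hide", "remove"}


def read_suppression_mode(cookie_header):
    """Return the stored preference from a raw Cookie header, or None."""
    for part in cookie_header.split(";"):
        name, _, value = part.strip().partition("=")
        if name == "img_suppression" and value in MODES:
            return value
    return None  # no choice has been made yet


def needs_choice_page(cookie_header, page_has_tagged_images):
    """The choice page is shown only the first time an anonymous user
    hits a page that actually contains potentially offensive images."""
    return page_has_tagged_images and read_suppression_mode(cookie_header) is None
```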


=== User preference system ===

An Image Suppression preference section would be created. In this would be a choice to show all images by default. Under that would be the ability to choose amongst [[categories]] of [[potentially offensive images]] and then choose to have one of the following happen:

# Hide images (this would create a placeholder where images would normally be, which the user could then choose to display on a case-by-case basis)
# Remove images (this would remove any link or placeholder, but not, of course, the image from the server)


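The hide/remove preference could translate into a per-image rendering decision along these lines (the data shapes and returned strings are illustrative placeholders, not MediaWiki output):

```python
# Toy per-image rendering decision. `preferences` maps a category name to
# 'hide' or 'remove'; images in no selected category are shown as normal.

def render_image(image_categories, preferences):
    modes = {preferences[c] for c in image_categories if c in preferences}
    if "remove" in modes:
        return ""  # no link or placeholder at all
    if "hide" in modes:
        return "[placeholder: click to display this image]"
    return "<img src=...>"  # shown as normal
```

Note that "remove" is given priority over "hide" when an image falls into several selected categories; that tie-break is an assumption, not part of the proposal.
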
=== Categorising images ===

Tagging images would simply involve ensuring that [[potentially offensive images]] have all been given [[categories]] as per normal procedures.
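
Since suppression would key entirely off categories, an image carrying no categories at all could never be filtered. A toy consistency check, with an invented data shape (image name mapped to its list of category names):

```python
# Toy check: images with no categories are invisible to a category-based
# filter, so they may need manual review. Data shape is invented.

def uncategorised(images):
    """Return names of images that carry no categories, sorted."""
    return sorted(name for name, cats in images.items() if not cats)
```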

== Relevant links ==

*[[Potentially offensive images]]
*[[en:Wikipedia:Descriptive image tagging]]

== See also ==

*[[Wikipedia:Wikipedia:Image censorship|Wikipedia:Image censorship]]
*[[Wikipedia:Wikipedia:Content disclaimer|Wikipedia:Content disclaimer]]
*[[Wikipedia:Wikipedia:Profanity|Wikipedia:Profanity]]
*[[en:Wikipedia:Self-censorship]]



Revision as of 07:32, 25 February 2005


