Commons talk:Sexual content: Difference between revisions

#{{Support}} In my personal capacity as an editor. This does not represent an official statement of support or opposition from the Wikimedia Foundation. [[User:Philippe|Philippe]] ([[User talk:Philippe|<span class="signature-talk">talk</span>]]) 15:00, 7 December 2010 (UTC)
#{{Support}} - Time to raise the bar. - [[User:Josette|Josette]] ([[User talk:Josette|<span class="signature-talk">talk</span>]]) 15:55, 7 December 2010 (UTC)
#{{Support}} Requesting that the poster at least claim to have obtained consent in particular is a reasonable expectation. [[User:Sxeptomaniac|Sxeptomaniac]] ([[User talk:Sxeptomaniac|<span class="signature-talk">talk</span>]]) 17:37, 7 December 2010 (UTC)
#{{Support}} Having a policy is certainly better than not having one, and helps combat censorship. &mdash; [[User:Loadmaster|Loadmaster]] ([[User talk:Loadmaster|<span class="signature-talk">talk</span>]]) 17:47, 7 December 2010 (UTC)
#{{Sup}} - The proposed policy seems to cover the basics. [[User:Killiondude|Killiondude]] ([[User talk:Killiondude|<span class="signature-talk">talk</span>]]) 18:08, 7 December 2010 (UTC)
#::with respect to the 2 above user-voters, ''commons is a media repository'' '''''NOT''''' an encyclopedia. our project goals here are different from those of wikipedia; if the wmc project's goals were not different, there would be no reason to have a separate wmc project. see [[Commons:Project scope]] [[User:Lx 121|Lx 121]] ([[User talk:Lx 121|<span class="signature-talk">talk</span>]]) 03:13, 8 December 2010 (UTC)
#:::I chose my wording wrong, I meant Wikimedia Commons, not wikipedia. I was transitioning back and forth from both wikipedia to wmc continuously. Although I still agree with this policy. [[User:Creation7689|Creation7689]] ([[User talk:Creation7689|<span class="signature-talk">talk</span>]]) 20:20, 9 December 2010 (UTC)
#::::fair enough, it happens :) & yes, of course everyone is free to support or oppose the proposed policy. you may have slightly missed my point though; it wasn't meant as a technicality. wikimedia commons '''''is different''''' from wikipedia. our purpose here is different; we're '''''not only''''' collecting media materials for wikipedia, wikimedia et cie. we're collecting free and/or open-source media files (for educational purposes) for '''''anyone & everyone''''' to make use of. if all we wanted to do was stock up for wm projects, we wouldn't need a full standalone project for that; just a central co-ordination & storage service. we could also get rid of at least half the files we already have as redundant/unnecessary. it falls within our purview to have as wide & varied a collection of media files on here as possible. [[User:Lx 121|Lx 121]] ([[User talk:Lx 121|<span class="signature-talk">talk</span>]]) 01:01, 10 December 2010 (UTC)
#{{support}} - This proposed policy seems to be consonant with other wiki policies and meets the legal concerns of the Foundation. [[User:Wabbott9|Wabbott9]] ([[User talk:Wabbott9|<span class="signature-talk">talk</span>]]) 03:36, 8 December 2010 (UTC)
#{{support}} - This should have been on poll long ago. It's what one would call "common sense," IMHO. [[User:Jsayre64|Jsayre64]] ([[User talk:Jsayre64|<span class="signature-talk">talk</span>]]) 03:56, 8 December 2010 (UTC)

Revision as of 01:01, 10 December 2010

Archive 1 (Dec 2008-Apr 2009) | Archive 2 (April 2010) | Archive 3 (May 2010) | Archive 4 (June 2010) | Archive 5 (June - October 2010)

Recent Changes

I did a pretty major copy-edit to the whole policy. No content changes of any kind were intended. Here's a summary of what I did.

  • Fixed typos and obvious grammar errors
  • Shortened phrases for brevity
  • Rewrote some wikilinks for flow and simplicity
  • Cleaned up header and list formatting
  • Added H2 header to combine File Descriptions and Categorization under 'Working with sexual content'
  • Added H2 header to combine 2257 and obscenity law under 'Legal issues'
  • Removed duplicate policy links
  • Rearranged sections to put most important info first
  • Added italics to two key areas
  • Refactored legal discussion to separate Wikipedia procedure from historical background

If there are any questions, shoot. The diff won't be very helpful, since there was a lot of shuffling and it happened over 10 different edits, but there are no content changes. I'd just give it a top to bottom read to make sure that it didn't change anything substantive, and hopefully it's a cleaner, shorter version of an important policy. Ocaasi (talk) 05:20, 2 November 2010 (UTC)[reply]

Limits on what 2257 applies to

Should we add a note that {{2257}} should not be used indiscriminately and never on pictures of children? I removed 2257 from a lot of pictures of nude children; pictures of children that 2257 applied to would need deletion, not tagging. In the larger sense, I don't want to see this indiscriminately spammed across a bunch of files in Category:Nudity when it only applies to sexual activity.--Prosfilaes (talk) 07:29, 2 November 2010 (UTC)[reply]

That seems like an important point: record keeping is not for illegal images; deletion is for illegal images. I don't know when exactly the line is drawn, but if you have that sense, it sounds good. Ocaasi (talk) 09:36, 2 November 2010 (UTC)[reply]
That's not my point at all; my point is that record keeping is not for mere nudity.--Prosfilaes (talk) 11:07, 2 November 2010 (UTC)[reply]
Alright, misread then. I think they're both good points. I will add yours as a counterweight, explaining both ends of the 2257 spectrum. Check in and see if it works. Sorry I identified you as privatemusings in the edit comment, I had just written on that talk page. Ocaasi (talk) 12:45, 2 November 2010 (UTC)[reply]
I tried this: Content identified as a sexual depiction of a minor should not be tagged with a 2257 template or amended to include 2257 information; it should be deleted. Ocaasi (talk) 10:21, 2 November 2010 (UTC)[reply]
New Version: 2257 requirements apply specifically to sexual content involving subjects of legal age. If content is of child nudity that does not qualify as sexual, then 2257 notifications are not needed. On the other hand, if child nudity is identified as sexual, then it should not be tagged with a 2257 template or amended to include 2257 information; it should be deleted and reported to the WMF. Ocaasi (talk) 12:55, 2 November 2010 (UTC)
Yes - I can say unequivocally that 2257 should never be used on images of minors (unless the image happens to also include images of adults who fit it). Explicitly sexual images of minors should be deleted, not tagged - mere nudity is generally not a reason for 2257 tagging. I've added brief notes to the tag regarding this. Dcoetzee (talk) 21:44, 4 November 2010 (UTC)[reply]

What does this mean?

Can someone tell me what this sentence is trying to say:

By sporadic application of the First Amendment obscenity prosecution for adult sexual content or for sexual artwork, depicting apparent minors has become rare and unpredictable, but explicit sexual content depicting actual minors has not been protected by the courts and its prohibition is strictly enforced. Ocaasi (talk) 01:16, 4 November 2010 (UTC)[reply]
The Feds could sue a site hosting w:lolicon claiming that the material was legally obscene, but given that obscenity requires that it be without artistic, etc. value, it's never cut and dried, so they rarely do so. However, material showing actual children doesn't have such requirements, and will get slammed down quickly and hard. (In practice, I understand a lot of cases are stopped by the fact that the FBI won't generally go after teenage CP unless they can get a confirmed ID and date of birth. But that's not really relevant for us.)--Prosfilaes (talk) 01:45, 4 November 2010 (UTC)[reply]

Check please:

Obscenity prosecutions of adult sexual content, sexual art, or caricatures depicting apparent minors have become rare and unpredictable; however, explicit sexual content depicting actual minors has received no protection and prohibitions against it are strictly enforced.

--Ocaasi (talk) 09:57, 4 November 2010 (UTC)[reply]

I read that, and it still doesn't make sense, because there is nothing qualifying 'protection' - the sentence above uses the term 'modification', and 'protection' does not obviously apply when what is actually being referred to is a loosening of restrictions on the depiction of such materials. How about Obscenity prosecutions of adult sexual content, sexual art, or caricatures depicting apparent minors have become rare and unpredictable; however, there has been no such modification where explicit sexual content depicting actual minors is concerned, and prohibitions against it are strictly enforced. --Elen of the Roads (talk) 22:50, 5 December 2010 (UTC)[reply]
Ah, I see what you mean. The perils of a la carte fixes. It will probably have to wait until after the poll, but that should be an uncontroversial improvement. I think we should start a list of to-fix-afterwards edits, since all of the new eyeballs will be wasted while the draft is !vote-locked. Maybe I'll do that. Ocaasi (talk) 18:14, 6 December 2010 (UTC)[reply]

Archive

Would someone mind archiving the old threads on this page? I'm a bit inexperienced in the archive business...Ocaasi (talk) 01:16, 4 November 2010 (UTC)[reply]

I've given it a go - although we may not even require 4 / 5 months of threads above? cheers, Privatemusings (talk) 02:19, 4 November 2010 (UTC)[reply]
Excellent...Ocaasi (talk) 07:37, 4 November 2010 (UTC)[reply]

Question about content without explicit consent

Edit went from:

This shall not in itself be a reason to delete them, but the community will discuss whether or not consent can be assumed in individual cases.

to

The community is free to discuss whether or not consent can be assumed in individual cases.

Do we want to remove the statement that lack of explicit consent is not sufficient for deletion? Is 'community is free to discuss' weaker than 'community will discuss'? Which is intended... Ocaasi (talk) 07:48, 4 November 2010 (UTC)[reply]

I think we need the statement, because it's been such a controversial question in this discussion and is likely to arise over and over. A small but persistent minority have wanted a strict requirement of explicit consent for everything. I think it's important that we make it clear that is not the policy we are proposing. - Jmabel ! talk 15:08, 4 November 2010 (UTC)[reply]
Agreed. There are many works for which explicit consent is unnecessary and unreasonable (e.g. notable artwork that happens to include a nude person). Dcoetzee (talk) 21:39, 4 November 2010 (UTC)[reply]
That sentence is not about "everything". There is already a list of cases where explicit consent is not necessary (including non-photographic artwork..., and old photographs). That sentence is only about old uploads. We compromised that there would be no mass deletion of old untagged uploads, but that individual discussions would proceed. So, how does the new wording not convey that? --99of9 (talk) 09:40, 5 November 2010 (UTC)[reply]
Logically, it shouldn't matter if we say explicit consent is necessary or not if we're leaving it to case by case discussion anyway. Even so, I agree that I'd like to keep the extra statement anyway, just to be clear about it. Wnt (talk) 09:37, 5 November 2010 (UTC)[reply]
I think there's a risk of being overly verbose, a bit confusing, and perhaps even mildly prejudicial, to be honest. The intent of my edit (as requested) was to keep it simple - a deletion discussion where an older file is deleted due to perceived consent issues is ok, as is a deletion discussion where an older image is kept despite consent issues being raised. If our intention here is not to mandate things either way, which I think it is, then I think it's prolly best the way it is :-) Privatemusings (talk) 23:25, 7 November 2010 (UTC)[reply]
I combined the bullet point with the other point about this to reduce confusion (I hope). Looking at the text and messing about with it this way and that, I didn't see any obvious way to make the text clearer without making it more confusing. ;) I think that as long as we have agreement here that we're not somehow intending this to be interpreted to allow deletion by default of old material (which I don't think is how it reads), I suppose we can spare readers of the policy the extra verbiage. Wnt (talk) 04:02, 8 November 2010 (UTC)[reply]
I kept the combination, but reworded it a bit, dropping the future tense and instead stating it like this:
Uploaders of self-produced sexual content are expected to make a positive assertion that the subjects of the photos consented to the photograph and to its public release. Provision of further evidence of consent is welcome (using the OTRS process) but is not normally required. Content which lacks assertions of consent, usually from files uploaded before implementation of this policy, should not be automatically deleted. The community should instead discuss whether or not consent can be assumed in each case.
I think that covers both issues ok, while keeping it as straightforward as possible. Ocaasi (talk) 17:42, 8 November 2010 (UTC)[reply]

< I tried again, because I felt that 'your' version kind of implied that future uploads of sexual content which are missing explicit assertions of consent might be ok - whereas I think we've agreed that they're necessary. Hopefully it's even clearer now :-) cheers, Privatemusings (talk) 23:45, 8 November 2010 (UTC)[reply]

I see how your version was an improvement in affirming the present requirements. I tightened the phrasing just a touch, no change to content (except 'discuss' to 'determine', if you want to be really technical). On the other sentence, you went from the 'should not be automatically deleted' to 'free to discuss', which though correct is a somewhat weaker way of saying that prior content without consent 'shouldn't be deleted without discussion'. I think in cases where people get deletion-happy, 'free to discuss' doesn't set a requirement for them to make that effort. How about both changes? Ocaasi (talk) 05:55, 9 November 2010 (UTC)[reply]
I thought that version sounded a little too favorable toward random proposals for deletion of old content, while being too restrictive toward proposals to delete new content that seems very suspicious despite an assertion of consent. (A different issue, mentioned only briefly in the previous discussions, was that you might come up with some content that seems intrinsically suspicious despite an assertion - a flash snapshot of a startled couple in a hotel bed, etc. The assertion should generally be a good defense, but we weren't going to require editors to be completely blind, either) I'm not sure my last edit is the best way either... Wnt (talk) 09:14, 9 November 2010 (UTC)[reply]

I've made a small edit, but I'm still concerned by this new wording. One thing that bugs me is that old random Flickr uploads will now be treated differently from new random Flickr uploads (without assertions). The old ones need reasonable suspicion to be deleted; the new ones should not be uploaded at all. --99of9 (talk) 09:55, 9 November 2010 (UTC)[reply]

In general I think we should have some hint of positive trustworthiness (e.g. trusted uploader) to keep the old stuff, rather than just no negatives. --99of9 (talk) 10:00, 9 November 2010 (UTC)[reply]
Looks ok to me. Maybe 'preferred' instead of 'encouraged' just as a word choice. Ocaasi (talk) 14:11, 9 November 2010 (UTC)[reply]
I thought the last draft (99of9) was OK also. I just added a footnote linking an article about the Tyler Clementi case as general background - but if someone doesn't like that I won't insist on keeping it. Wnt (talk) 03:55, 11 November 2010 (UTC)[reply]
That case is really significant (or it will be soon), but I have a few thoughts. I'd like to tighten up the summary a bit just for brevity. Also, do we need to consider victimization issues? This case is receiving ample attention right now, and the family hasn't been avoiding publicity, but maybe until this actually winds up in a court ruling we should just speak generally about it. I could go either way. Thoughts? Ocaasi (talk) 20:13, 11 November 2010 (UTC)[reply]
I've seen several news cycles about Clementi, and in the U.S. his case is being used as something of a rallying cry against bullying. I don't think it would be inappropriate to name him. The legal case against the people involved in the videotaping is likely to be a significant test of a 2003 "invasion of privacy" law — as I understand it, they were operating their own webcam in their own (shared) dorm room. It may be more important in social terms, however — before this case, I think that many people would think of the consent issues more in the context of actresses suing to try to keep sex tapes off the Internet, whereas now some are going to think of it as a life and death issue. I don't think that's really fair — it was prejudice and bullying, not video, that caused this tragedy — but I can't pretend it won't be true, and editors (many of whom may not be from the U.S. and never heard of the case) should have some warning.
As people here know by now, I believe in the "not censored" principle of Wikipedia, and even now I wouldn't have acquiesced to demands for an explicit assertion of consent for future content without a tangible reason. If scenes from the Clementi video had been posted to Commons without an explicit assertion of consent, policy wouldn't touch Clementi (we wouldn't know who he was), nor most likely the uploader (who would be breaking the law and existing policy with or without an explicit statement about it, and probably wouldn't need a notice to tell him this). But it might have protected people "distributing" the video, which I think might be interpreted to affect various types of innocent editors by various arguments which it is probably not productive to detail here. By having an explicit statement to fall back on, they would have a firm defense, whereas without it prosecutors might well end up Wikilawyering in the courtroom.
I am not at all happy to have yet another way for it to be illegal to copy content you find on the Internet! But from what I've read (which questions whether the video was viewed by anyone), I don't know if the current prosecution could lead to constitutional review of that part of the law; if it did, this is not the test case I'd want to see used to make the decision. Wnt (talk) 21:46, 11 November 2010 (UTC)[reply]

Commons Study on Controversial Content

I noticed there's no mention of, or even a link to, the recent study on controversial content. Is that intentional? Does this policy assume that those recommendations have no teeth or force until the foundation decides to do something about them? Does it make sense to link to them in a 'see also' section, perhaps? Ocaasi (talk) 10:01, 4 November 2010 (UTC)[reply]

See also seems a good idea but the study is just a study, not a policy mandate. I think we ought to adopt this proposal as policy on our own rather than by being forced to via a WMF mandate, though. ++Lar: t/c 11:44, 4 November 2010 (UTC)[reply]
Yes, it's always nicer to self-legislate, like when McDonald's added apple slices in lieu of FDA regulation. I'll try a see also link, which seems to be the right fit between informing and capitulating to the (thoughtful) meddling of the outsiders. Ocaasi (talk) 11:56, 4 November 2010 (UTC)[reply]
(I really don't get the apple-slices/McDonald's reference. Was this a US problem?) Archolman (talk) 00:43, 6 December 2010 (UTC)[reply]
(There were rumblings about Big Food becoming the next Big Tobacco, facing multi-billion dollar obesity lawsuits, congressional hearings, etc. with documents showing execs tried to market to kids and get them addicted to fatty, salty, hormone-ridden snacks. So the industry took a hint and voluntarily adopted milk, apples, more salads, and a few other things to health-wash their image. Which uncharitably suggests by analogy that that is what we're doing, but there's either a) no truth to that or b) nothing wrong with it) Ocaasi (talk) 18:20, 6 December 2010 (UTC)[reply]
The study makes few specific recommendations, and those it makes largely apply to other projects (image hiding on Wikipedia) or are not limited to sexual content as defined here (semi-nude women, which are not considered sexual content by our definition). In any case, I don't view it very favorably. Wnt (talk) 09:31, 5 November 2010 (UTC)[reply]
If the recommendations are passed on by the foundation, we can then consider whether to expand our definition of sexual content. The good thing is that what we've developed here is a framework for how to deal with whatever is considered sexual. --99of9 (talk) 09:44, 5 November 2010 (UTC)[reply]
Yes - it is important though that we adopt this (in some form) as policy before the Foundation considers the study, or else I fear they might swoop in, say we're not taking clear action, and institute something more conservative. Dcoetzee (talk) 20:53, 5 November 2010 (UTC)[reply]
It's premature to speculate on how to deal with a hypothetical arbitrary policy, which might affect anything. But I think we've designed this policy to deal with a very specific class of content, based to a large degree on American legal distinctions. If the WMF imposed some policy about bare breasts I think we would be best off (at least) to try to keep it as something separate rather than trying to integrate it here. The whole point of an imposed policy is that it is imposed on you, so how can you edit it? So how and why would you try to unify it with any other? Wnt (talk) 03:47, 8 November 2010 (UTC)[reply]
Agree with that, except that when something is 'imposed' on you, it can still be in your interest to adapt it to your purposes, or at least to integrate it with them. But it's premature to bring it up while those issues are just floating through the proposals-and-recommendations cloud. Ocaasi (talk) 17:45, 8 November 2010 (UTC)[reply]

Do we need assertions of consent from unidentifiable people?

The way the last part of the "Prohibited content" section is currently written, we "must" have an assertion of consent even for close-up photos showing, e.g., a particular kind of STD rash. Firstly, is there any reason to require an assertion of consent when the subject of the photograph isn't identifiable? Furthermore, wouldn't requiring an assertion of consent in such cases hinder the ability to illustrate educational medical sexuality articles? 71.198.176.22 16:44, 16 November 2010 (UTC)[reply]

were this proposal adopted, yes - sexual content, as defined on the proposal page, would require an explicit assertion of consent. If you read the note related to the 'close up' type of image you use as an example, I think it's likely such an image would fall outside of the purview of this policy. Personally, I think an assertion of consent would be desirable anywhooo - don't you? cheers, Privatemusings (talk) 20:10, 16 November 2010 (UTC)[reply]
No, I don't think it would be desirable to have to get consent for close up, unidentifiable images because what is the point of such a hassle? I'm not sure what the note you linked to has to do with images of unidentifiable people, either. Is that the note you meant to link to? 71.198.176.22 04:26, 18 November 2010 (UTC)[reply]
I see there was a "grammar tweak" from "with sexual content all are subject" to "all sexual content is subject".[1] This is an example of the perils of copy editing. Additionally, there was some effort to define "identifiable" quite expansively before, including e.g. EXIF information, which has since been taken out; I think the feeling was that it was best to have one single definition of identifiability under the main policy about it.
However, on consideration, I think we should think through carefully just what we want or need to say about this. For example, Wikimedia might want to steer clear of unpublished photos from a secret locker room camera, no matter how carefully cropped and anonymous they may be. On the other hand, we would want to rule in non-identifiable photos from a retired surgeon who has taken a collection of photos for personal use and in medical publications, but never obtained consent to publish them here. My first thought on processing this is that we need to exclude even non-identifiable photos if the photography itself was an illegal violation of privacy, but that we don't need to exclude them otherwise. For example, in the locker room example, it is possible that the act of investigating the dissemination of the "anonymous" photos would end up revealing that they were of a particular person, and put Wikimedia in an entirely undesirable position. But before trying to implement such a zig-zag line, I should see what others here have to say. Wnt (talk) 13:27, 18 November 2010 (UTC)[reply]
I think if you re-read the copy-edit closely, it actually removed a redundancy and did not change the meaning at all. Generally, though I respect legal writing, policies, constitutions, etc., there is something wrong if a single word or phrase can obscure the intent of the guidance. So we should make it more plain and robust if that is what happened.
I think your 'illegal itself' notion is a good distinction, but I don't know how it would be incorporated. It seems more than reasonable that content which appears to be illegally acquired should be excluded, even if it does not violate other provisions (identifiability, sexual content with children). But how would we know what 'looks illegal'? I think the only question is whether those situations should be a) speedy deleted, with the possibility of discussion afterwards; b) discussed per regular deletion, with the burden on showing it is legal; or c) discussed per regular deletion, with the burden on showing it is illegal.
My gut instinct is that where there is identifiability or sexual content with children, a) is the obvious choice; where the content is blatantly sexual or exposing body parts (or if there are subtle identifiers that could end up as clues), that b) is the best option; and if there is just a generic body part but no children, sex act, or obvious identifiers, that c) is okay. That's not very simple, but it's my best stab at it. Thoughts? Ocaasi 02:20, 19 November 2010 (UTC)
There's really no way to verify that consent was really obtained for photography under any scenario - even if we kept 2257 paperwork, which we shouldn't, it would be child's play to forge an electronic copy of a driver's license and a signature for which we don't have a valid exemplar. Here we're just trying to figure out a required standard, without much thought for serious enforcement beyond the assertion of consent. However, I think it's possible to be pretty sure in some cases that no consent was obtained, allowing AfDs to be raised. Even so, I don't think that speedy deletion will usually be appropriate for suspected lack of consent, simply because it is going to be a complex conclusion to draw from media that don't explicitly say that they are violating policy. I think that the current guidance that speedy deletion is only for uncontroversial cases is sufficient. As for burden of proof, I don't think that any AfD on Commons truly has a "reasonable doubt" standard; it's more a "preponderance of the evidence", which puts b and c as pretty much the same option. Disturbing the current language on the point could be trouble, since we've had enough trouble just agreeing that there should be a reason to suspect lack of consent... Wnt (talk) 12:00, 19 November 2010 (UTC)[reply]
I've added yet another bullet point to the section, trying to deal with this issue out in the open; it should make any other rewording unnecessary.[2] Provided I didn't screw it up... we're diving into some rather fine detail nowadays. Wnt (talk) 12:25, 19 November 2010 (UTC)[reply]
No no no. Why are we suddenly reopening something we've hammered out so hard already? I thought we had agreed to stick fairly closely to copyedits until we had a clean stable page to vote on. This "close crop" stuff allows someone to put their ex-girlfriend all over commons as long as he cuts her up small enough. Consent is important. I will revert, please obtain wide consensus before making such significant changes since we've already found a compromise point. --99of9 (talk) 13:27, 19 November 2010 (UTC)[reply]
What would be the point of a vindictive ex putting up unidentifiable images? Anything intended to embarrass is necessarily going to be identifiable. I don't think this is a legitimate point of compromise since we already have dozens of close-cropped images of genitalia and breasts, several of which are currently being used to illustrate wikipedias. The proposal as worded would require the deletion of those images unless the uploaders, many of whom are long gone, come up with assertions of consent. I don't believe anyone actually wants that. Do they?
I think Wnt's bullet point was a reasonable solution, so I'm replacing it. 71.198.176.22 03:37, 20 November 2010 (UTC)[reply]
Hmmm, I don't want to break down a fragile consensus, but at the same time I'm responding to a November 4 "grammar tweak" as explained above. I don't really want to go back to that rather peculiar phrasing either, especially since we apparently had more difference of opinion about what we were reading there than we imagined. I'd hoped that my wording might be acceptable, but is there a way we can fix this? I should emphasize that if "identifiable" is interpreted narrowly enough, 99of9 does have a point about the cut-up photo sequences. I originally considered some sentences that tried to define "identifiable" strictly, e.g. specifying that a photo can be identifiable from context, clothing, tattoos, etc. That should include cut-up photo sequences where one end has the head and one end has some other part. The problem is, I see this really as a question for the photographs of identifiable people policy - we shouldn't have two different definitions of "identifiable" knocking around or chaos will ensue. Wnt (talk) 05:20, 20 November 2010 (UTC)[reply]
Wnt, I don't understand how you could have otherwise interpreted that sentence before the grammar tweak. Can you please spell out what you thought it meant? The diff you provided seems like a straightforward improvement on the English, and I can't for the life of me see any change of meaning, even implied meaning... --99of9 (talk) 12:09, 20 November 2010 (UTC)[reply]
The pre-Nov. 4 version read "All media is subject to Commons:Photographs of identifiable people.... In addition with sexual content, all [media subject to COM:IDENT] are subject to the following requirements:" It seems to me that the following points were meant to modify the normal requirements of the COM:IDENT policy. Since the consent issue has previously been part of COM:IDENT, this is what was to be expected. Wnt (talk) 13:51, 20 November 2010 (UTC)[reply]
IP address, your statement "The proposal as worded would require deletion of ..." appears completely false to me. We have a dotpoint that explicitly addresses files uploaded prior to activation of this policy. Since that was the justification of your revert, I trust that you will restore to correct your error. Further, my point above was that "I think ... is a reasonable solution" is no longer a good enough reason to make functional changes to the text. We have a carefully agreed compromise, and one or two people is no longer sufficient consensus to make that kind of change. We are supposed to be copyediting. (PS, please consider logging in and backing your opinions with the reputation of your username.)--99of9 (talk) 12:09, 20 November 2010 (UTC)[reply]
I should agree with 99of9 that this part of the policy isn't about the lack of explicit assertions of consent, which would affect the old files, but only about actual lack of consent, which is a much narrower matter. What I'm interested in preserving here is the use of photos taken with consent but not with explicit consent for use here, primarily medical and surgical photos.
To give an example of the sort of things I'm thinking about, consider an NIH-funded paper about a surgery which is published in a private journal, but which by law is required to be under an open license. The paper may contain a low-resolution black-and-white photo of the procedure, which is published in the journal, but include "supplemental data" at the journal's, author's, or department's web address with a wider array of high-resolution color photos and video. Such photos can be used on Commons by copyright, but may not be regarded by some as having been published (especially since it is not that uncommon to update supplemental data after the paper is out...). The patient would have consented to a medical publication and other uses by the author, but probably not by us.
I think there is a way to write this to clearly rule out the cut-up ex scenario, and clearly rule in the medical images scenario. Wnt (talk) 23:18, 20 November 2010 (UTC)[reply]

I hadn't realized that the text currently includes "Explicit assertions of consent are encouraged but not required for sexual content uploaded prior to implementation of this policy" which does indeed address my concerns about earlier images above. However, I really do think requiring an assertion of consent for unidentifiable images is much less desirable than allowing them. The chance that an uploader is going to try to embarrass someone with an unidentifiable image is tiny, and there's no way it would work if they tried, by definition. That means requiring consent for unidentifiable images is needless instruction creep. Also I hope Wnt can tweak his text to address the external publication issue he describes. I don't understand why we shouldn't still be trying to make improvements to the proposal; we improve policies even after they're adopted. 71.198.176.22 06:51, 21 November 2010 (UTC)[reply]

Want to bet? Send me a full frontal nude image of yourself, your full name, and the copyright holder's release, and I'm sure I can embarrass you on commons without breaking existing policy. --99of9 (talk) 08:42, 21 November 2010 (UTC)[reply]
If you cut an image in two, where one half is identifying and the other is embarrassing, then the existing policy would allow publishing the embarrassing part here. We do not want that, as the identifying part can be published later or elsewhere. That is why we want consent for most sexual content. Truly unidentifiable images and images from serious sources are no problem. --LPfi (talk) 11:48, 22 November 2010 (UTC)[reply]
As I said, 99of9 has a good point if we define identifiability too narrowly. But this scenario extends to many things beyond this policy. After all, a full frontal nude image of a typical person, especially if female, can be quite embarrassing even if the actual genitals are obscured, and it isn't within the scope of this policy at all. Nor do I think we should consider a separate COM:BREASTS policy to deal with that; after all, it might be embarrassing for a man to have his belly cropped out and used to illustrate the obesity article and so forth, if the picture still gets linked back to his name.
My feeling is that we can't force our policies on someone with truly malicious intent - he can always falsely assert consent for the whole image, even fake "evidence" such as it is, so why focus on a later malicious release of a second half of an image? But we might warn editors against doing this unintentionally. Based on this discussion my thought is to add some redundant warning about identifiability, though trying not to set up a separate definition. Wnt (talk) 15:41, 22 November 2010 (UTC)[reply]
I agree; it doesn't matter how close-cropped an image is if it has been, or is likely to be, associated with someone's name or identity. In that case an assertion of consent should be required for new uploads, and if the person associated with an image uploaded any time wants to withdraw consent or claims that there never was consent, we should respect that and delete the image. I am not sure those points are in the policy, and they should be. I'm going to try to put them in. 71.198.176.22 21:54, 22 November 2010 (UTC)[reply]
best way forward in my view, and the one currently written up, is for the sexual content policy on consent to apply to all sexual content - it's easier, clearer, and just plain better that way :-) Privatemusings (talk) 20:26, 22 November 2010 (UTC)[reply]
"All sexual content" includes both images and text, a substantial subset of which there is a wide consensus to allow instead of delete. Which version are you referring to as "the one currently written up?" 71.198.176.22 21:54, 22 November 2010 (UTC)[reply]
not by our definition :-) - take another look..... Privatemusings (talk) 22:36, 22 November 2010 (UTC)[reply]
I'm sure that whoever wrote "media depicting" didn't intend to include textual media or textual descriptions, but text is a kind of media, and according to two of the dictionaries I just checked, depictions include text as well as pictures. Do you think it would be a good idea to resolve such ambiguities before asking people to approve the draft? 71.198.176.22 23:48, 22 November 2010 (UTC)[reply]
  • "Media" here is clearly in the sense that Commons is WMF's media repository. We can't just say "images" because it also includes audio, video, etc. Still, in the context of Commons content, when we refer to "media" we mean the media files, distinct from descriptive text. I believe this is pretty consistent across policies & guidelines. - Jmabel ! talk 05:15, 23 November 2010 (UTC)[reply]
Perhaps I was going about this the wrong way. I'm trying a different edit here [3] explicitly stating that "previously published" includes certain websites. Some of my other concerns are actually not prohibited by the current language of the first section, which only says "public release" rather than release on Commons. I'm not going to work myself up just now over the right to take other people's snapshots for personal use then release anonymized cropped bits on Wikipedia for fun; no doubt there is some case in which this is terribly important but I'm not thinking of it today. An aberration in this edit which might raise some hackles is that I refer to a Wikipedia policy in a link - unfortunately, I didn't see a Commons policy to cite about reliable sources! Wnt (talk) 19:57, 23 November 2010 (UTC)[reply]

Really not ready for prime time

I think we've got a reasonable consensus on the meaning of the policy we will present, but it's really not yet all that well written. If I have time (and focus), I may take a shot at this myself, but I'd welcome another good writer trying in my stead.

For the most part, I would hope any rewrite would be shorter, not longer, but there is one thing I think is missing: an explanation of why sexual content raises certain issues that are not found, or are found only in a lesser degree, in other content. E.g.

  1. Sexual depictions of minors can trigger criminal legal concerns in many countries.
  2. Sexual depictions of identifiable living or recently deceased people have enormous potential to prove embarrassing to those people or their associates, and therefore it is more important than usual to be confident that the subject gave any necessary consent for a photo to be distributed.
  3. To a greater degree than in other subject-matter areas, hosting numerous sexual images with little or no educational value is likely to be detrimental to Commons' reputation. Few people will consider it a serious problem if Commons has superfluous pictures of domestic cats or of the west face of St. Paul's Cathedral. Far more will consider it a problem if Commons has superfluous amateur pornographic images.

If someone wants to rework that, great, but I think something like this is needed.

Also, there is currently one very confusing passage in the lede: "This policy addresses how Wikimedia Commons should handle materials outside Commons' project scope..." Is that really what we mean to say? It seems to me that the policy is, instead, about determining the boundaries of that scope. "Materials outside Commons' project scope" are routinely deleted. - Jmabel ! talk 21:02, 16 November 2010 (UTC)[reply]

I'm not sure if writing style is justification for holding up a vote. After all, it's a policy, not a literary masterpiece, and having a policy that we agree on the meaning of already seems like an accomplishment. We keep having gaps of up to a week when no one edits the policy, and waiting for major revisions could take a very, very long time. And every change, however small, seems to risk reopening disputes. I see you've made improvements, and so far we've been delaying the vote for copy-editing; still, it has to end sometime. Wnt (talk) 11:38, 17 November 2010 (UTC)[reply]
I think Jmabel is getting at the fact that we have a good document that needs one more paragraph in the lead. It can be done. Points 1 and 2 are clearly in the article in different sections. Point 3 is somewhat more controversial because it gets into the why of this policy, and whether there are extra-legal, extra-commons concerns about mere reputation or social responsibility involved. I can't address the last point, but I'll incorporate the first two for the intro and see if we can discuss the third.
Sentences added: Sexual content must be handled differently from other content, because sexual depictions of minors can trigger legal issues in many countries; depictions of identifiable living or recently deceased people may be embarrassing or violate their privacy; and hosting sexual images can be done in a way which protects the reputation of Commons among the public to safeguard their continued support.
Question: Is point three accurate? Are we writing this policy out of any consensus-level concern about the reputation of Commons or about some general social concern?
I have to say that I oppose this paragraph altogether. The policy as written really does not treat sexual content differently: Commons has never had a policy of hosting illegal content, it has long had the Commons:Photographs of identifiable persons policy, and it does not censor photographs based on perceived external or even internal opposition to their content. Furthermore, it is not actually contributing to the policy in any identifiable way - it's exactly the sort of excess and confusing verbiage that we should be trying to chop out. Wnt (talk) 00:58, 18 November 2010 (UTC)[reply]
I do not like the paragraph either. There are other things that can trigger legal issues. Depictions of identifiable people may be embarrassing. Sure, sexual content is extra sensitive, but I think people know that. And the third sentence is strange; it sort of says the thing backwards. We want to have these things straight, so why do we say we do it to safeguard our support? --LPfi (talk) 13:53, 19 November 2010 (UTC)[reply]
I've offered a compromise (?) sentence about the above.[4] But I don't know if this will satisfy Jmabel, nor would I object to its deletion since it is more or less redundant. Wnt (talk) 01:08, 18 November 2010 (UTC)[reply]
I see what you're getting at, but if identifiability and illegality are sufficient, what is the point of this policy at all? By your description, either sexual content is already covered by existing policy and a sexual content policy is redundant (or merely guidance for applying the main policies), or else sexual content is handled differently than other content. I don't have a preference either way, but I think we should know which is which. I'm going to try rephrasing the added sentence to say the same thing with less perceived distinction. See if it gets closer.
Sentences added (2): Sexual content tends to need closer attention because of the legal and privacy-related issues it raises. In order to ensure that such content does not violate U.S. law, or other Commons policies, this policy addresses processes to handle sexual material that falls within Wikimedia Commons' project scope and to identify and remove material that does not.
Re: Wnt's point about voting despite style, I agree. Policy on Wikipedia is rephrased constantly, but it doesn't affect the basic guidance. User:Ocaasi 13:25, 17 November 2010 (UTC)[reply]
I'd be perfectly glad to see us head toward a vote at this time.
As for the matter of reputation: that's essentially what started this all in the first place. Jimbo launched a bit of a crusade, deleting even some images that were clearly significant works of art. His actions were almost certainly in response to sensationalistic news stories besmirching Commons' reputation. - Jmabel ! talk 16:19, 17 November 2010 (UTC)[reply]
I recall that his edits generally said "out of project scope", and I would like to think that whatever external pressures were brought to bear on him, that he would not have launched an all-out attack on a category of content if much of it were not actually at odds with existing policy. We know some files were deleted that should not have been, but many of his deletions stuck based on preexisting policy. Wnt (talk) 01:15, 18 November 2010 (UTC)[reply]

Wikimedia leadership response to controversial content

Linking these because they may be of interest to people watching this page, and to inform policy discussion.

  • en:Wikipedia:Wikipedia_Signpost/2010-11-15/News_and_notes#Controversial_content_and_Wikimedia_leadership
  • Sue Gardner: Making change at Wikimedia: nine patterns that work: "we’re the only major site that doesn’t treat controversial material –e.g., sexually-explicit imagery, violent imagery, culturally offensive imagery– differently from everything else. The Board wanted –in effect– to probe into whether that was helping or hurting our effectiveness at fulfilling our mission."
  • [Ting Chen] writes: "the ongoing controversial content discussion is a result of our strategic planning (development and adaption in the nonwestern cultures) and the response of the changes in public policy and in our responsibility."
  • meta:Talk:2010_Wikimedia_Study_of_Controversial_Content:_Part_Two#Recommendations_discussion: Inviting feedback on the 11 recommendations of the study, which are:
    1. no changes be made to current editing and/or filtering regimes surrounding text
    2. no changes or filters be added to text on current Wikimedia projects to satisfy the perceived needs of children
    3. creating Wikijunior
    4. review images of nudity on Commons
    5. historical, ethnographic and art images be excluded from review
    6. active curation within controversial categories
    7. user-selected shuttered images (NSFW button)
    8. no image be permanently denied with such shuttering
    9. allow registered users the easy ability to set viewing preferences
    10. tagging regimes that would allow third-parties to filter Wikimedia content be restricted
    11. principle of least astonishment to be codified

Feel free to visit the last link and provide your own feedback. Dcoetzee (talk) 04:03, 17 November 2010 (UTC)[reply]

Should we allow subjects to withdraw consent for images uploaded prior to adoption?

Is there any reason not to allow subjects associated (correctly or incorrectly) with a potentially embarrassing image to withdraw consent for its inclusion, as a BLP-strength support for deletion? I remember people suggesting that there was, but I'm having a hard time remembering why. 71.198.176.22 22:03, 22 November 2010 (UTC)[reply]

the granting of, and subsequent withdrawing of consent is a tricky issue. I'd be happy to continue discussions about the best policies in this area whilst the current draft is adopted. Privatemusings (talk) 22:37, 22 November 2010 (UTC)[reply]
I'd rather we put forth a draft addressing this issue, because it is tricky, instead of delaying addressing it before asking people whether they want to support the draft. But I'm not opposed to addressing the issue in a wider proposal instead, since it's not specific to sexual content. Commons:Living persons would seem to apply, but that has only been edited by one person about a year ago, and it's in pretty bad shape. Can we use foundation:Resolution:Biographies of living people instead? It doesn't say anything about images, but the spirit of that Resolution seems very clear to me here. Alternatively, should this issue be addressed in Commons:Photographs of identifiable people instead of here? Or do we want to limit withdrawal of consent to sexual content? 71.198.176.22 23:48, 22 November 2010 (UTC)[reply]
Currently the photographs of identifiable people policy is the only place where consent issues come from; depending on ongoing discussions above, things may or may not stay that way.
My position is that the point of Commons is to put images permanently into the public domain. So consent should not be revocable, just as the copyright terms are not revocable. That said, I think we should heavily weight the subject's statement about consent: if he says he never consented, or never consented to public distribution of the image, this outweighs the photographer's assertion of consent. Only hard evidence like a verifiable prior publication of the image is likely to outweigh the subject's statement. It is true that in many cases subjects can abuse this to effectively revoke consent, but at least we're not committing ourselves to the principle that Commons images are only temporarily public. I should note that a subject should only be able to challenge an image if he asserts that it is correctly associated with him; dispelling false rumors isn't our responsibility. Lastly, in the interests of defending the public domain, we must not remove published photos that have reached the public domain, no matter how the subject objects; otherwise we fall behind the archives of the original magazine and other copyrighted resources. Wnt (talk) 18:52, 23 November 2010 (UTC)[reply]
2 remarks, though I'm not sure either affects the policy we are discussing here:
  1. I presume you mean "public domain" in a metaphorical rather than a legal sense. I've posted tens of thousands of my photos to the Commons, but I have not placed them in the public domain.
  2. As a matter of practice, we often remove photos as a courtesy to the subject of the photo. If the picture is readily replaceable for all current uses, and there doesn't seem to be anything particularly notable about it, and if the subject of the photo objects to its presence here, we usually do this. For example, if we have a routine photo of someone around the threshold of encyclopedic notability, and they don't like the particular photo, and they provide us with a better one (or we already have a better one), we almost always accede to their wishes. - Jmabel ! talk 05:32, 24 November 2010 (UTC)[reply]
As a matter of practice, not policy, and when the image is replaceable. I would much rather have deleting photos on request a matter of practice when it's convenient or reasonable, rather than policy.--Prosfilaes (talk) 06:33, 24 November 2010 (UTC)[reply]
This is worth a separate policy about. Generally, regardless of legality, copyright, or current policy, I can't really see a good reason not to permit someone to change their mind for any reason at any time. If an image is harmful to a person's reputation, it's not worth hosting it, and we should probably be able to replace it anyway. The only situation I can see where this might not apply is mass uploads of non-sexual content (e.g. trains), where the author wakes up one day and wants to make a profit; that might be too late. Thoughts? Also, where would be the place to discuss this aspect of policy? Ocaasi (talk) 05:25, 25 November 2010 (UTC)[reply]
This discussion is getting off topic but, yes, there are lots of reasons not to allow people to revoke their permissions. None of the following has particularly to do with sexual imagery (which is why this is off topic) but...
Is there a better place where this discussion could happen?
  1. When you upload a photo you took, you grant an irrevocable license. If someone uploads a bunch of useful images, then later stalks out of the project in a fit of pique, they don't get to revoke those permissions and remove their pictures from Wikipedia.
  2. Example rather than abstraction here: the images from Abu Ghraib are certainly embarrassing to their subjects (both Iraqi and American) but we certainly wouldn't remove them because Lynndie England found an image of herself embarrassing.
  3. Example again: a U.S. Senator might object to us hosting anything other than their official photo, but it would still be entirely legitimate (for example) for us to host their high school yearbook photo or a photo contributed by someone who photographed them at a public event.
  4. Similar example, less obvious: if I take a photo of someone about whom we have a Wikipedia article, and I take it entirely in accord with the usual rules in the relevant country (e.g. photographed in a public space in the U.S.), and that person doesn't like Wikipedia having their photo, frankly, unless we decide we want to do a courtesy to them, it's their problem, not ours. Now, if it's really not a good photo, and especially if we have a better one, I'll probably support doing that courtesy, but it's a courtesy, not a matter of policy.
Jmabel ! talk 06:36, 25 November 2010 (UTC)[reply]
I think we're discussing different cases. Generally, I was only referring to pictures that the subject took themselves or granted permission themselves to use, not other public domain images which someone is merely the subject of. Re Lynndie England, she's a public figure at this point, and the photograph wasn't produced or uploaded by her--same with the Senator who presumably didn't upload the content that s/he is seeking to take down. I was trying to think of counterexamples where we should not honor a request for someone to take down a photo they took themselves or granted permission themselves. Can you think of any for those? Ocaasi (talk) 06:48, 25 November 2010 (UTC)[reply]
One of the most frustrating things Commons does is going around deleting images in use, for good reasons or bad. In an optimal world, we would never delete an image used in a current or historical version of any page on a Wikimedia project. We already give photographers many more rights than authors of encyclopedia articles, who have their names buried in the history and don't get to choose their license; there's no need to privilege them further. A photographer shouldn't get to rip their photo out of an article any more than an author could rip their text out of a WP article. We have to treat the portrayed with all the respect of a professional scholar or journalist, which usually coincides with their own desires; when it doesn't, we destroy our own quality as a news source and encyclopedia by letting them dictate to us.
Sexual content is complex, and I will freely acknowledge that the issues surrounding it mean it will get deleted much more freely than most content. I still think it important that policy demand that uploads to Wikimedia projects are not revocable, and that people think before uploading, instead of coming back and demanding that users on 14 Wikimedia projects fix up 47 pages because images that they assumed--and should have had every right to assume--were permanent are now being deleted.--Prosfilaes (talk) 07:09, 25 November 2010 (UTC)[reply]
What you're saying makes sense about the problems caused by revoked images. As long as the warnings to the uploader are sufficiently clear, I think they should at least need a 'good reason' and maybe an OTRS ticket. On the other hand, what about: a photo of a girl in a bikini which ends up on a very popular article like 'beach'; a masturbating woman who decides 5 years later that, for employment, personal, or family reasons, the image is harming her; a picture of someone's pierced genitals which has become un-anonymized from cross-publishing in a body-modification mag; a topless photo from an 18-year-old which 20 years later doesn't seem like such a good idea. I'm stretching credulity on some of those, but I'm looking for what the precise rationale is. We definitely shouldn't let people revoke permissions for content, except, perhaps, when we should. Ocaasi (talk) 07:23, 25 November 2010 (UTC)[reply]
I'm happy to let all of those be dealt with under practice, not policy. But what about some other examples: someone writes the definitive biography of a Stalinist lackey that, while its positive spin has been toned down, stands as the Wikipedia article 10 years later, when he wants it deleted because it will hurt his political career. Or a woman who contributes extensively to a Wikiversity project on polyamory who's now worried about the effects on her marriage. Or someone who rewrote masturbation to portray it as dangerous, and filled a talk page with arguments for keeping it; in any of those cases, do we shred articles or talk pages to preserve the modesty of the author?--Prosfilaes (talk) 16:27, 25 November 2010 (UTC)[reply]
We're obviously well into hypothetical-world, which is fine with me. I think the major difference between our examples is that photographs are different than text. In both cases, there is a contribution (photo, writing). In both cases, the contributions can be linked to a contributor (uploader, writer). But in the case of the photograph the uploader (or consent giver) is the subject--the contribution bears the mark of the person who uploaded it right out in the open. In the case of an article writer, a screen name could easily be anonymous; and even if the screen name was a Real Name, the text itself would not obviously reveal who did it--you'd have to dig through the history to find the user and then check the diffs, etc. A photograph gives pretty much all of that information without any effort or words, which is why I think the comparison doesn't fit for me. I agree that text contributions like the ones you described should not be revoked, but I don't think that settles the photograph situation.
You're probably right that policy can avoid this issue, but it might be worth beefing up warnings on the upload page to make clear that consent is non-revocable. Once this COM:SEX policy is set, it might be worth linking to or summarizing it as well. Something like: "You can upload sexual content to Commons if: it is of your body and you understand it will be publicly seen in any context under an open license which you cannot revoke; it is of someone else who consented to and understood those terms; it is of content already released under compatible permissions." Currently the COM:SEX policy is geared towards Commons curators rather than uploaders, which is understandable but not ideal for a broad issue that should address the concerns of both 'users' and 'editors'. Ocaasi (talk) 02:17, 27 November 2010 (UTC)[reply]
So far as I know the upload page has always said that the consent is non-revocable. The point is, once an image goes up on Commons, and especially once it is used to illustrate Wikipedia, it is going to get mirrored all over the world. Publishing a photo here isn't much different than publishing in a sex magazine. Now Wikimedia does recognize a "right to vanish", and the uploader's account name, and presumably any mention of the subject's name in the image, could be removed from the image on this basis. But bear in mind that if we make a policy decision to delete a photo, we're also making a policy decision to delete its derivative works. We're telling the contributor who has taken that picture and put it beside another and labelled up the comparative anatomy between glans and clitoris, labia and scrotum and so forth that all the work he did was for nothing. Now if we were returning the original uploader to secrecy and anonymity we might be tempted, but knowing we're not? Wnt (talk) 18:58, 29 November 2010 (UTC)[reply]
I see what's going on here and the cat's-out-of-the-bag nature of a non-revocable, open license. I think where it's pushing me is that we need to be a little more explicit about what exactly placing an image, especially on Commons, means. Basically, I think we should spell it out: "If you upload an image of your naked body or sexual act onto Wikimedia Commons, you are granting a license for anyone in the world to use the image anywhere they choose so long as they follow the terms of the license. Also, you can never revoke your consent once it has been granted." Maybe it doesn't have to be as scary as that, but it's basically just the facts, and it is a little scary. That's what I'm getting at. We shouldn't change policy regarding revoking consent (although I bet OTRS does it quietly on occasion), but we should be abundantly clear that this content is not just for a single Wikipedia article and that uploaders have absolutely no control over content once it is uploaded. For-Ev-Er. Ocaasi (talk) 05:28, 30 November 2010 (UTC)[reply]

Does the upload form create a binding contract when images are submitted? If so, what is the consideration of that contract? Does the notation that the license is non-revocable carry any weight when the uploader isn't reimbursed? 71.198.176.22 14:26, 2 December 2010 (UTC)[reply]

I'm not a lawyer, but similar terms are stated by quite a few prominent sites on the Internet. On Facebook, for example, once you upload content you give them a pretty nearly unlimited license in that content, and it is not revocable either. - Jmabel ! talk 02:10, 3 December 2010 (UTC)[reply]
Armchair analysis (also not a lawyer): Consideration is the mutual exchange of something of value, either money, goods, or a promise not to do something else (e.g. I can contractually pay you $100 not to paint your car red). In the case of cc-by-sa licenses, there is obviously no two-way exchange of value or goods (unless one considers Commons to be exchanging a service by hosting and allowing the distribution of these materials, which is nice but legally unlikely); there is also no exchange of money. Is there a promise 'not' to do something that applies? Well, many copyleft licenses require attribution, so there is a promise not to distribute the content without attribution, and only under a compatible license. Still, I don't think this is what the courts have in mind, since the license itself should probably be separate from the consideration which makes it binding. There are things which have no contractual power, but can still not be taken back. They're called gifts, and once you give them, you lose your claim of property over them. Although copyleft licenses are couched in the terminology of contract, they are more just gifts with a few strings attached (that's how I personally think of them; copyleft lawyers probably have more nuanced terminology). This 2006 article discussed consideration and the GNU GPL, seemingly coming down against contract designation (on consideration grounds) but for license status, however that differs. There's more of this out there if you want to poke around the Google or the Google Scholar, or check out http://www.eff.org or ask at the RefDesk, where they are wicked smart. Ocaasi (talk) 09:10, 3 December 2010 (UTC)[reply]
I spend some time at the RefDesk, and I wouldn't trust it for legal advice (which is specifically banned there anyway... ). The main problem here is that if uploads are revocable, it applies to every single image on Commons, so it's not specifically relevant to this policy. I would also worry that any change to the site upload notice might risk the status of all uploads and should be done only with formal legal input. The only question is whether some special courtesy deletion policy is required here, which I don't see. Wnt (talk) 15:39, 3 December 2010 (UTC)[reply]

Simulation resolution

What is the resolution between a diagram and a simulation? 71.198.176.22 20:36, 24 November 2010 (UTC)[reply]

are you asking for thoughts on whether or not / how this policy should differentiate a diagram (drawn, or computer generated I presume) from a simulation (presumably photographed?) - I'm not 100% sure how to kick off a response, so a clarification would be helpful to me :-) Privatemusings (talk) 00:30, 25 November 2010 (UTC)[reply]
Yes, if we are going to prohibit photographs and simulations both, then wouldn't it be a good idea to provide some guidance about what is a simulation and what is merely a diagrammatic drawing? 71.198.176.22 22:40, 25 November 2010 (UTC)[reply]
do you have trouble discerning the difference - aren't they different just by definition? :-) Privatemusings (talk) 01:21, 26 November 2010 (UTC)[reply]
This policy isn't about prohibiting photographs, just keeping them within Commons policies. Diagrams are specifically exempted from speedy deletion because certain uncontroversial reasons for deletion can't exist for them: they can't be child pornography, and they can't be created without consent, and the act of drawing them probably implies a certain educational purpose. Simulations aren't specifically defined here, and might be regarded as diagrams; but it is also possible to imagine looking at a "simulation" and not being sure if it is a drawing or a photograph run through a few photoshop filters, or more controversially (for example) a nude rendition of a particular person recreated from X-ray, terahertz, or other forms of scanning. In any case, no one should claim a simulation can be speedy deleted if it doesn't fall into one of the prohibited content categories at all. So I'm not going to stickle on having simulations specifically exempted from uncontroversial speedy deletion when it will probably just raise more arguments. The term is just too prone to varying interpretations to use for policy purposes. Wnt (talk) 18:46, 29 November 2010 (UTC)[reply]

I asked the question poorly. I'm referring to "simulated sexually explicit conduct" -- where is the boundary between that and diagrammatic drawings? For example, are the cartoon illustrations used for sex acts in the English Wikipedia simulated sexually explicit conduct or diagrams, and why? 71.198.176.22 12:01, 30 November 2010 (UTC)[reply]

The policy doesn't try to define a difference between simulations and diagrams. At the beginning it defines sexual content to include "simulated" images, which aren't really defined. The main dividing lines are (1) what is prohibited content, which very likely does not include them, and (2) what must go through a full deletion review, with the additional comment that "Some categories of material which are generally useful for an educational purpose include: diagrams, illustrations..." (which reiterates that diagrams and illustrations probably aren't going to be prohibited by the policy).
So to take the example of File:Wiki-analsex.png, the file probably counts as "simulated sexually explicit conduct", so it gets sorted through the policy. Then you ask if it's illegal (probably not; fortunately even photos aren't counted as "obscene" nowadays), or if it's taken without consent (nobody to consent), or if it's out of the project scope (the policy specifically says that illustrations are generally educational). And if someone wants to argue for its deletion, it has to go through a proper discussion.
This may be a roundabout logic, but it's the result of people trying to come up with a policy from different points of view, perhaps regarding situations we haven't thought of yet. And to be fair, an image like that isn't all that different from the sort of "animated/live action" image that you could find in the film A Scanner Darkly, which arguably would need (for example) the consent of the original participant. Wnt (talk) 01:00, 1 December 2010 (UTC)[reply]

Time to take it to a poll?

No, it's not perfect. But few things in this world are, and it's been pretty stable, and I'd like to see us adopt this before we are overtaken by events from the Foundation level. - Jmabel ! talk 02:26, 25 November 2010 (UTC)[reply]

I think that timeline is important. The remaining questions are:
  • Should a user be able to revoke sexual content including themselves? (If it was uploaded by them; if it was not uploaded by them; if they gave prior consent in either case)
  • How to handle 'close-up' images. What qualifies as identifiable? Is consent required? What is the process or 'burden' regarding identifiability and consent in these cases?
  • How to handle what could be an illegally acquired image? What does illegally acquired 'look like'? What is the process or 'burden' in these cases?
These are increasingly minor and should only be resolved first if they have the potential to change the outcome of the vote. Ocaasi (talk) 07:02, 25 November 2010 (UTC)[reply]

Here are my answers, fwiw :-)

  • Revoking consent will remain outside the purview of this policy, to be handled through practice. The principle of irrevocable consent is important to all wmf content generation, so the best thing for this policy to do is to invite contact with the office, which is suitably in place.
  • Close-up images which are sexual content, as defined, require assertions of consent, as does all sexual content. We've left the burden to be decided through community discussion - I'm fine with that.
  • We've stated 'Provision of further evidence of consent is welcome (using the OTRS process) but is not normally required.' - in the event that an image has a reasonable basis for being illegally acquired (as discussed and established through community discussion) this would, in my view, qualify as 'not normal', hence the community is empowered to insist on further evidence, the nature of which is left to the community discussion to evaluate. I'm ok with that for now too.

that's my thoughts. Privatemusings (talk) 22:09, 25 November 2010 (UTC)[reply]

At a quick read, I agree completely with Privatemusings here. And none of this entails any further rewordings. - Jmabel ! talk 02:05, 26 November 2010 (UTC)[reply]
I like those answers too. Maybe they'd be useful for a COM:SEX FAQ. In fact, an FAQ, even an informal one might help explain the intention of some of the policies and let voters browse through issues. That's not to say the policy shouldn't be self-explanatory or that the FAQ is binding--just that a guide to a new policy might help introduce the policy itself (see: W:Wikipedia:Neutral_point_of_view/FAQ for example). Ocaasi (talk) 05:30, 26 November 2010 (UTC)[reply]
I don't actually understand what you mean, but it looks like people want a poll, so I'll start the section, closely based on the preview text above (which I'll archive now to avoid any misplaced votes). Wnt (talk) 13:56, 4 December 2010 (UTC)[reply]
  • maybe we are ready for the polls, but not yet, i feel, till the question of "irrevocable consent" is well settled. "sign" jayan 05:51, 9 December 2010 (UTC)

Summary of poll arguments

This section is intended to be a brief, neutral summary of opinions raised in the policy poll below. Please make NPOV edits for completeness, concision, neatness, or accuracy, but not for persuasion (that's what the poll comments are for). Also, !nosign! any changes to keep it tidy. Thanks, Ocaasi (talk) 15:50, 7 December 2010 (UTC)[reply]

Like it...

  • Policy is needed
  • Good enough for a start
  • Represents Compromise
  • Not having the policy is worse
  • Not a major change of existing policies, just a focused reorganization
  • Can help avoid social backlash and panic
  • Needed for legal reasons
  • Needed for ethical reasons
  • Protects the WMF
  • Prevents more strict censorship
  • Legally sound
  • Prevents becoming a porn repository
  • Better to have a centralized policy than multiple deletion rationales
  • Preempts WMF from imposing a less nuanced sexual content policy

Don't like it...

  • Slippery slope to more censorship
  • Educational purpose is not clearly defined
  • Doesn't explicitly protect content from deletion
  • Policies on illegality, privacy, pornography, and nudity already cover the material
  • Better to handle on a case by case basis
  • Instruction creep
  • Policy addresses legal issues but is not written by lawyers
  • Policy addresses legal issues but commons users are not legal experts
  • Sexual content should not be treated differently than other content
  • Not strong enough at keeping out pornography
  • Vague wording
  • The phrase 'low-quality'
  • Out-of-scope deletions should be normal not speedy
  • US-Centric legal concerns
  • Legal concerns that are unresolved by courts
  • Addresses sex but not offensive or violent content
  • After Wales deletions, implementation cannot be trusted
  • Biased against Western cultural taboos (it's too conservative)
  • Biased against non-Western cultural taboos (it's too liberal)
  • Better as a guideline
  • Needs more specific criteria for inclusion and exclusion
  • Better to solve this on the user end (personal image settings/filters)
  • Insufficient protection for erotic art
  • Only available in English language, which makes fair overall understanding and voting impossible

Questions

  • What does 'low-quality' mean?
  • What qualifies sexual content as educational?
  • What content cannot be deleted?
  • Would it be better as a guideline?
  • Can the legal discussion be rephrased in normal language?
  • Should out-of-scope decisions be normal rather than speedy deletions?

Tweaks

  • Shorter
  • Less legalese
  • Elaboration on legal interpretation
  • More specific criteria for inclusion/exclusion

Second poll for promotion to policy (December 2010)

This is a poll to adopt Commons:Sexual content as a Wikimedia Commons policy.

Please give your opinion below with {{Support}}, {{Oppose}}, or {{Neutral}}, with a brief explanation.

The poll concerns whether people accept or reject the November 26 revision. Edits made to the policy during the polling period may be temporarily reverted, and will need to be adopted by consensus, like changes to the other established policies.

Voting on the poll will proceed until 06:08, 15 December, ten days after the poll was first advertised at MediaWiki:Sitenotice.

A previous poll was closed as "no consensus" at Commons talk:Sexual content/Archive 4#Poll for promotion to policy. Wnt (talk) 07:47, 23 October 2010 (UTC)[reply]



Support

  1.  Support I fully support this stance.
  2.  Support There definitely needs to be a policy in place here. This one may need some tweaking, eventually, but it is a sufficient starting point, at least... Sebastiangarth (talk) 02:52, 9 December 2010 (UTC)[reply]
  3.  Support looks altogether well; US law may be followed, as the root servers are in the US, and the US law is good. Hans Dunkelberg (talk) 17:06, 8 December 2010 (UTC)[reply]
  4.  Support will no doubt require further refinement over time, but otherwise is a great start. Bakerboy448 (talk) 02:14, 8 December 2010 (UTC)[reply]
  5.  Support May not be perfect, but it's a good start. I think it is important that we go on record with a policy in this area; further debate, especially on edge cases, can follow on the basis of this general consensus. - Jmabel ! talk 03:26, 5 December 2010 (UTC)[reply]
  6.  Support BUT only for educational purposes. Thank you, --???????8790Say whatever you want 06:40, 5 December 2010 (UTC)[reply]
  7.  Support as a reasonable initial revision. While it will no doubt require further refinement over time, this is best done as a live policy handling real cases rather than theoretical analysis as a draft. -- Mattinbgn/talk 08:00, 5 December 2010 (UTC)[reply]
  8.  Support I think the issues have been dealt with (see the headings above and in the archives). There is disagreement on some points, but the policy represents a reasonable compromise and not having a policy is far worse. --LPfi (talk) 08:36, 5 December 2010 (UTC)[reply]
  9.  Support as not having a policy risks bringing Wikimedia into serious disrepute. --Simple Bob (talk) 10:47, 5 December 2010 (UTC)[reply]
  10. OK But not the last decision...--Yikrazuul (talk) 11:52, 5 December 2010 (UTC)[reply]
  11.  Support Too important not to have a policy on, especially with likely input coming soon from the Foundation. Easy to make a list of edge issues and let them be clarified over time. Policy doesn't leave a giant loophole for content to be deleted--it establishes the principle (namely that commons is not censored for broadly educational content)--and then provides some pathways for handling disputes. This policy is a good idea, whether or not it's perfect, and having no policy on the issue is worse. Ocaasi (talk) 10:55, 5 December 2010 (UTC)[reply]
  12.  Support I see some opposers wanting a more restrictive policy, and a similar number wanting a less restrictive policy (or none), suggesting that this policy strikes approximately the best consensus our community could hope for. This wording has been extensively debated by editors with a wide range of opinions. --99of9 (talk) 11:12, 5 December 2010 (UTC)[reply]
  13. Moderate  Support -- mostly a restatement of laws which already apply to Commons, plus basic common-sense policies. Not sure that there's a burning need to adopt it right now, but if it helps head off the next Sanger-type panic, then it will have been worthwhile... AnonMoos (talk) 11:23, 5 December 2010 (UTC)[reply]
  14.  Support well written; takes a cautious, balanced approach to dealing with the subject. While this may repeat some basics (correctly), this has revealed itself as a problematic issue, so some extra effort for the sake of the integrity of Commons is a good idea. Hekerui (talk) 11:29, 5 December 2010 (UTC)[reply]
  15.  Support This can and will no doubt evolve further, as do all our policies, but we really do need a policy in place for this. This is a good start. --JN466 12:54, 5 December 2010 (UTC)[reply]
  16.  Support, although I agree with AnonMoos - these are just basic common-sense policies. -- deerstop. 15:16, 5 December 2010 (UTC)[reply]
  17.  Support - It may not be perfect, but something is needed if only for basic legal and ethical reasons. Smallbones (talk) 15:41, 5 December 2010 (UTC)[reply]
  18.  Support - such a policy, repugnant as it may be, is necessary to help combat the more repugnant heavy-handed censorship and insufficiently-justified deletionism and prosecution, as well as to preserve, protect, and defend the WMF.   — Jeff G. ? 15:55, 5 December 2010 (UTC)[reply]
  19.  Support (for now) As for the doubters, are there any cases of this sort of policy having been applied heavy-handedly (or abused) in the Wikipedia universe? If there are, I'd like to see them as I take your concerns seriously. Mr.choppers (talk) 16:53, 5 December 2010 (UTC)[reply]
    re: "are there any cases of this sort of policy having been applied heavy-handedly (or abused) in the Wikipedia universe" - um, yes there are. do you really need a list? o__0 if so, try the deletion & undeletion archives on here, or on pretty much any wm project Lx 121 (talk) 00:34, 7 December 2010 (UTC)[reply]
  20.  Support to echo people above, it's a sensible step forward in codifying our basic response to real issues of legality and how Commons' existing policies apply in a more focused sense. It may not be perfect, but it hits all the high notes and we need something like this. David Fuchs (talk) 17:11, 5 December 2010 (UTC)[reply]
  21.  Support the guidelines and the text of this policy seems to be well-balanced. --DenghiùComm (talk) 18:00, 5 December 2010 (UTC)[reply]
  22.  Support a good first attempt. Such a guideline could simplify some debates. --High Contrast (talk) 20:18, 5 December 2010 (UTC)[reply]
  23.  Support A policy needs to be in place, even if not perfect yet. It can be fine-tuned progressively. --P199 (talk) 20:24, 5 December 2010 (UTC)[reply]
  24.  Support I'm not a fan of it in all details, but it does seem that the Foundation is pushing us in the direction of having some sort of standard here, and a lot of this merely clarifies current practice on the ground. "Low-quality pornographic images that do not contribute anything educationally useful may be removed" is basically No Penis, a practice that will go on with or without policy, because no one wants to DR those.--Prosfilaes (talk) 20:34, 5 December 2010 (UTC)[reply]
  25.  Support Needs some sort of standard to be adopted. Miyagawa (talk) 20:52, 5 December 2010 (UTC)[reply]
  26.  Support per Jmabel and 99of9. Walter Siegmund (talk) 21:40, 5 December 2010 (UTC)[reply]
  27.  Support Could do with less words, but a reasonably clear start --Elen of the Roads (talk) 23:05, 5 December 2010 (UTC)[reply]
  28.  Support I was iffy on including "prominent depictions of the pubic area or genitalia" as "sexually explicit conduct", but I am somewhat familiar with United States v. Dost (1986) and am definitely willing to take precautions against child pornography. The policy, as I see it, is to allow the deletion of three types of images we should delete regardless of an existing sexual content policy: illegal material (child porn, etc.), material breaching personality rights, and material out of the project's scope. ?fetchcomms 23:27, 5 December 2010 (UTC)[reply]
  29.  Support. I'll be darned, you guys managed to come up with a decent policy on this that doesn't negatively impact our educational mission, but keeps us on the right side of the legal and court-of-public-opinion lines. Is it perfect? Certainly not, but policy documents can be modified just like any other page. Powers (talk) 23:45, 5 December 2010 (UTC)[reply]
  30.  Support it's a start. - KrakatoaKatie 07:29, 6 December 2010 (UTC)[reply]
  31.  Support Privatemusings (talk) 09:22, 6 December 2010 (UTC)[reply]
  32.  Support litenbjoll (talk) 10:30, 6 December 2010 (UTC)[reply]
  33.  Support Captures the existing policy and the previous existing practices and adds some details to clarify the intentions of the policy. This is an improvement and worthy of adopting now so that we have something written to explain to new users. FloNight♥♥♥ 09:42, 6 December 2010 (UTC)[reply]
  34.  Support Seems reasonable enough, being based on already existing principles and mandatory law, without giving way to moral panic (read: Fox). Opposing voices calling out that this is or will lead to "censorship" seem to me not to have read all the relevant text, or to have misinterpreted it, as there will be no more "censorship" after a policy like this is adopted than there is on Commons today. And the "slippery slope" argument is strange to me, as any policy change will be subject to community consensus, plus the "not censored" policy clearly shows that Commons isn't ruled by fear generated by moral panic in the media and society. Lokpest (talk) 10:07, 6 December 2010 (UTC)[reply]
  35.  Support. A good start. SlimVirgin (talk) 11:51, 6 December 2010 (UTC)[reply]
  36.  Support Sounds good. Agreed with Lokpest. --Millosh (talk) 12:21, 6 December 2010 (UTC)[reply]
  37.  Support good start--shizhao (talk) 13:24, 6 December 2010 (UTC)[reply]
  38.  Support A workable policy that can be adapted as necessary. Edgar181 (talk) 13:28, 6 December 2010 (UTC)[reply]
  39.  Support Better to have a good policy we agree on than the alternatives. Oppose reasons are not compelling. ++Lar: t/c 14:36, 6 December 2010 (UTC)[reply]
  40.  Support Some sort of policy needs to be adopted. MarioCRO (talk) 16:20, 6 December 2010 (UTC)[reply]
  41.  Support Speaking as a rabid free-speech nut, I like this - David Gerard (talk) 16:39, 6 December 2010 (UTC)[reply]
  42.  Support As a supporter of free-speech and anti-censorship, i support this policy. Joyson Noel Holla at me 16:51, 6 December 2010 (UTC)[reply]
  43.  Support Definite improvement. NathanT 16:53, 6 December 2010 (UTC)[reply]
  44.  Support A decent start. I share the concerns about legalese - we should just speak plainly about what is harmful and what is helpful - but otherwise this is needed. I'd encourage the oppose/neutral camp to dive in and edit parts they have strong feelings about to make it acceptable; a softer, early-stage policy is better than none at all. Steven Walling 16:56, 6 December 2010 (UTC)[reply]
  45.  Support as this area is important to have very clear policies and guidelines in place. ···日本穣Talk to Nihonjoe 17:18, 6 December 2010 (UTC)[reply]
  46.  Support Personally, I believe all content in our projects should have clear educational value, independent of what the content is about. In this sense I agree with some of the people who objected to this policy. But sexual content is a controversial topic, in non-western cultures often more than in western cultures. And in the past I do think that from time to time marginally educational, or even non-educational, content got uploaded to Commons, which does disturb and confuse our users. So I think clearly emphasizing educational value is a very important issue here. This is the main reason why I am in support of this policy--Wing (talk) 19:30, 6 December 2010 (UTC)[reply]
  47.  Support. A good start. Seems to strike a balance between privacy/scope concerns and not censoring wikipedia. Kaldari (talk) 19:42, 6 December 2010 (UTC)[reply]
  48.  Support This policy definitely needs to be elaborated on more, but the overall policy is definitely for the best. Oxguy3 (talk) 19:43, 6 December 2010 (UTC)[reply]
  49.  Support I would normally oppose a policy which treated sexual material differently from any other class of material, and it is already a requirement that we remove material illegal according to the law under which we operate. However, I see the opposition to this proposed policy is almost entirely from those who think it too permissive, and wish it rejected so they could develop a policy which would to my way of thinking be even worse. I am not willing to do anything that might tend to encourage such an effort. DGG (talk) 19:59, 6 December 2010 (UTC)[reply]
  50.  Support I agree with the policy as a working basis for a sexual content policy. It has to start somewhere, and I think this is a pretty good first step. Zach the Wanderer (talk) 20:29, 6 December 2010 (UTC)[reply]
  51.  Support The proposed policy seems appropriate and legally sound. Fred Bauder (talk) 20:31, 6 December 2010 (UTC)[reply]
  52.  Support --|EPO| da: 21:07, 6 December 2010 (UTC)[reply]
  53.  Support -- Avi (talk) 21:25, 6 December 2010 (UTC)[reply]
  54.  Support: for the most part, simply codifies current practice. --Carnildo (talk) 21:31, 6 December 2010 (UTC)[reply]
  55.  Support Mardetanha talk 22:07, 6 December 2010 (UTC)[reply]
  56. Support <--Spot the Wikipedian. Seems pretty reasonable to me. Fences and windows (talk) 22:12, 6 December 2010 (UTC)[reply]
  57. Support - Not perfect, but some standards are a good idea. KillerChihuahua (talk) 23:26, 6 December 2010 (UTC)[reply]
  58. Support This is sound and relies heavily on already existing policies. Workable and useful. notafish }<';> 23:44, 6 December 2010 (UTC)[reply]
  59.  Support I agree. No policy would be a very tricky situation indeed. If it isn't perfect now, the bugs can be worked out as soon as possible, but the groundwork should at least be laid. Timmietimtim314 (talk) 03:27, 7 December 2010 (UTC)[reply]
  60.  Support It is important to have criteria in place for such material, as, first and foremost, Wikipedia is an educational encyclopaedia used by all ages, and not an easily accessed repository for pornography. This unsigned post was made by 3atc3 (talk · contribs) at 04:44, 7 December 2010
  61.  Support - I think it is reasonable to have a guideline for those keeping the place free from non-educational, objectionable media. Nephron  T|C 04:48, 7 December 2010 (UTC)[reply]
  62.  Support Good to have sexual content on Wikimedia for educational purposes. The proposed policy seems to give enough handles to prevent abuse. Lets go with it and see how things turn out. Yhoitink (talk) 08:58, 7 December 2010 (UTC)[reply]
  63.  Support Agree with most of the reasons provided above - there is a need to prevent the misuse of the site. It's not perfect, and there is lots of room for improvement, but this will work as a foundation for future improvements. Joeyfjj (talk) 09:06, 7 December 2010 (UTC)[reply]
  64.  Support In my personal capacity as an editor. This does not represent an official statement of support or opposition from the Wikimedia Foundation. Philippe (talk) 15:00, 7 December 2010 (UTC)[reply]
  65.  Support - Time to raise the bar. - Josette (talk) 15:55, 7 December 2010 (UTC)[reply]
  66.  Support Requesting that the poster at least claim to have obtained consent in particular is a reasonable expectation. Ŝxeptomaniac (talk) 17:37, 7 December 2010 (UTC)[reply]
  67.  Support Having a policy is certainly better than not having one, and helps combat censorship. — Loadmaster (talk) 17:47, 7 December 2010 (UTC)[reply]
  68.  Support - The proposed policy seems to cover the basics. Killiondude (talk) 18:08, 7 December 2010 (UTC)[reply]
  69.  Support -- Seems reasonable, not too restrictive, seems to cover the important areas. --SarekOfVulcan (talk) 18:59, 7 December 2010 (UTC)[reply]
  70.  Support A policy is needed and I support this provisionally. DMSBel (talk) 19:21, 7 December 2010 (UTC)[reply]
  71.  Support (Ar son) - 's tús maith é / it's a good start - Alison 20:52, 7 December 2010 (UTC)[reply]
  72.  Support Ottava Rima (talk) 21:19, 7 December 2010 (UTC)[reply]
  73.  Support on the assumption the Foundation would rather support than oppose. Some WMF edicts such as the blanket ban on hosting "promotional" images of living persons I have disagreed with, but in general I support WMF initiatives because they have some real responsibility for the projects and are relatively more informed about the challenges that need to be addressed.--Brian Dell (talk) 21:31, 7 December 2010 (UTC)[reply]
  74.  Support an overarching policy on these images is necessary. This is a good place to work from. I can't understand the anti-US law opposers below when the Foundation and WPs/Commons' servers are located in the US. Moving them offshore just to escape the laws governing pornographic material would be a waste of money, a public relations disaster, and not conducive to anyone's goals. Ed [talk] [en:majestic titan] 21:40, 7 December 2010 (UTC)[reply]
  75.  Support Policies help.--Sisyphos23 (talk) 21:41, 7 December 2010 (UTC)[reply]
  76.  Support as others have said, a good start. Kevin (talk) 22:00, 7 December 2010 (UTC)[reply]
  77.  Support ɳOCTURNEɳOIR 22:09, 7 December 2010 (UTC)[reply]
  78.  Support per what others have said. This is a good starting point. Griffinofwales (talk) 22:11, 7 December 2010 (UTC)[reply]
  79.  Support This policy is not perfect but an acceptable start. Regarding the speedy deletions: That was already policy: Material which is plain illegal or that falls under COM:PORN was already speedy deleted in the past. That our media must not violate US law should be obvious. Suggestions to move Wikimedia Commons to another country with a different legislation are not helpful. --AFBorchert (talk) 23:03, 7 December 2010 (UTC)[reply]
  80.  Support This policy will make the guidelines as to what sexual content is and isn't allowed on commons much clearer. Barts1a (talk) 23:28, 7 December 2010 (UTC)[reply]
  81.  Support. Wikipedia and other Wikimedia projects are not for porn. Demize (talk) 23:40, 7 December 2010 (UTC)[reply]
  82.  Support - a sensible policy. --Bduke (talk) 23:50, 7 December 2010 (UTC)[reply]
  83.  Support - (Strictly as a Commonist, per Philippe) This seems to be a good policy laying out the kinds of images that have to be deleted (mainly for legal issues or respect for the people depicted) while making explicit the wide range of acceptable sexual content.--ragesoss (talk) 23:56, 7 December 2010 (UTC)[reply]
  84.  Support It's more than a start. This is a necessity. Half price (talk) 00:09, 8 December 2010 (UTC)[reply]
  85.  Support - Better to have good clear policies. This is a good beginning and it can be fleshed out (;) with time.
    ⋙–Berean–Hunter—► ((⊕)) 00:51, 8 December 2010 (UTC)[reply]
  86.  Support This makes sense. Lovetinkle (talk) 01:12, 8 December 2010 (UTC)[reply]
  87.  Support - a perfect start. Sjones23 (talk) 01:49, 8 December 2010 (UTC)[reply]
  88.  Support - not perfect, but better than the status quo. Kirill Lokshin (talk) 02:08, 8 December 2010 (UTC)[reply]
  89.  Support - Wikipedia is meant for educational and info purposes only, not a porn site. This would work well. Creation7689 (talk) 02:13, 8 December 2010 (UTC)[reply]
  90.  Support - This is an encyclopedia, so I'm in favor of limiting such images to truly educational/artistic purposes. BatteryIncluded (talk) 02:18, 8 December 2010 (UTC)[reply]
    with respect to the 2 above user-voters, commons is a media repository NOT an encyclopedia. our project goals here are different from those of wikipedia; if the wmc project's goals were not different, there would be no reason to have a separate wmc project. see Commons:Project scope Lx 121 (talk) 03:13, 8 December 2010 (UTC)[reply]
    I chose my wording wrong, I meant Wikimedia Commons, not wikipedia. I was transitioning back and forth from both wikipedia to wmc continuously. Although I still agree with this policy. Creation7689 (talk) 20:20, 9 December 2010 (UTC)[reply]
    fair enough, it happens :) & yes, of course everyone is free to support or oppose the proposed policy. you may have slightly missed my point though; it wasn't meant as a technicality. wikimedia commons is different from wikipedia. our purpose here is different; we're not only collecting media materials for wikipedia, wikimedia et cie. we're collecting free and/or open-source media files (for educational purposes) for anyone & everyone to make use of. if all we wanted to do was stock up for wm projects, we wouldn't need a full standalone project for that; just a central co-ordination & storage service. we could also get rid of at least half the files we already have as redundant/unnecessary. it falls within our purview to have as wide & varied a collection of media file on here as possible. Lx 121 (talk) 01:01, 10 December 2010 (UTC)[reply]
  91.  Support - This proposed policy seems to be consonant with other wiki policies and meets the legal concerns of the Foundation. Wabbott9 (talk) 03:36, 8 December 2010 (UTC)[reply]
  92.  Support - This should have been on poll long ago. It's what one would call "common sense," IMHO. Jsayre64 (talk) 03:56, 8 December 2010 (UTC)[reply]
  93.  Support Well worded and reasonable, does not seem too restrictive or too permissive, and provides clear and concise guidance for things relating to sexual content. Ks0stm (TCG) 04:01, 8 December 2010 (UTC)[reply]
  94.  Support My support for this is uncontroversial to me but what objections I have are all debatable. My vote of support is without hesitation. Bluerasberry (talk) 04:30, 8 December 2010 (UTC)[reply]
  95.  Support - I support the many reasons previously stated for implementing this: moral, legal, etc. If left unchecked, commons could easily become a repository and distribution platform for pornographic content.--JDIAZ001 (talk) 04:36, 8 December 2010 (UTC)[reply]
  96.  Support - I see this as being that missing last major policy that was needed to complete our policy 'canon'. I feel it was long needed and I generally support, but I do wish that the concerns raised by the opposers can, if possible, be alleviated and any kinks ironed out first. -- OlEnglish (talk) 04:44, 8 December 2010 (UTC)[reply]
  97.  Support - Some form of control is needed; not perfect, but better than weak control. If we wait for the perfect policy, abuses might continue to occur.-Mariordo (talk) 06:08, 8 December 2010 (UTC)[reply]
  98.  Support - Agree with comment by Jmabel (talk · contribs), above. -- Cirt (talk) 07:01, 8 December 2010 (UTC)[reply]
  99.  Support - Not perfect, needs a good copyedit and content tweak, but it's a start. Cindamuse (talk) 07:37, 8 December 2010 (UTC)[reply]
  100.  Support I'm all for it.TucsonDavid (talk) 08:10, 8 December 2010 (UTC)[reply]
  101.  Support Seems entirely consistent with the foundation's purposes and applicable laws. Jclemens (talk) 08:14, 8 December 2010 (UTC)[reply]
  102.  Support Perfect? No. Long overdue, yes. It is simply the right thing to do. --Ckatzchatspy 08:17, 8 December 2010 (UTC)[reply]
  103.  Support Not perfect, but I don't agree with any of the opposing vote reasons. VMS Mosaic (talk) 08:48, 8 December 2010 (UTC)[reply]
  104.  Support - If anything, this proposal is too lax. - Schrandit (talk) 09:17, 8 December 2010 (UTC)[reply]
  105.  Support - This does not necessarily need to start out perfectly, and there will be criticisms of it; but I believe that this project will enhance Wikipedia's sensibility on matters like this. Moreover, if possible, consider also the pleas of our colleagues who have opposed this plan. It is also for Wikipedia's good. - Thirdeyerevo (talk) 18:10, 8 December 2010 (GMT+8)
  106.  Support - Censorship should not take away from education. <3 bunny 11:22, 8 December 2010 (UTC)[reply]
  107.  Support - A step in the right direction. Esuzu (talk) 11:52, 8 December 2010 (UTC)[reply]
  108.  Support - Needed for ethical reasons. --Sachinvenga (talk) 12:04, 8 December 2010 (UTC)[reply]
  109.  Support - Unfortunate, but necessary. Beyond My Ken (talk) 12:24, 8 December 2010 (UTC)[reply]
  110.  Support - While I am loath to suggest that any form of censorship be introduced, I do agree that there should be some limits imposed for this particular topic. A brief glance through certain categories will reveal images that are not currently used in the project, probably never will be used in the project, and offer absolutely no value to the project other than serving as freely accessible pornographic content. --MacTire02 (talk) 12:48, 8 December 2010 (UTC)[reply]
  111.  Support - Indeed a good start.Lord Psyko Jo (talk) 16:21, 8 December 2010 (UTC)[reply]
  112.  Support Better than nothing, but IMO it shouldn't be restricted to sexual content; things like violence, hatred, disgusting stuff, subjectively offensive stuff (things that offend some people but not others), etc., basically anything that any significant number of people object to seeing or having available online, should also be judged under similar rules (so those types of things are only kept if they don't make Wikimedia break any laws and they positively contribute to at least one of the projects). Having said that, I haven't put much thought into this; there might be some important details I missed that would have made me change my mind and/or make other suggestions if I became aware of them. --TiagoTiago (talk) 18:01, 8 December 2010 (UTC)[reply]
  113.  Support - Wiki needs something; maybe others will see it as a step towards professionalism and start to take Wikipedia more seriously. Mike (talk) 12:35, 8 December 2010 (UTC)[reply]
  114.  Support Perhaps necessary, at all a good thing. --Singsangsung (talk) 18:38, 8 December 2010 (UTC)[reply]
  115.  Support I'm ambivalent about the "out of scope" issue, but on balance I think the proposal is a net gain. The fact is that the servers are in Florida and subject to applicable law, and no amount of votes by editors can change that fact. --Tryptofish (talk) 18:52, 8 December 2010 (UTC)[reply]
  116.  Support Not perfect, but better than no policy. Can always be improved once in place. Net gain if adopted. Kim Dent-Brown (talk) 19:12, 8 December 2010 (UTC)[reply]
  117.  Support Reasonable. Grimhim (talk) 19:37, 8 December 2010 (UTC)[reply]
  118.  Support But needs some work in the future.
  119.  Support "Good enough" start; I would like to see some policy going forward that non-logged-in users (anonymous) cannot view Sexual Content and only persons who are > 13 years old can register an account (to comply with COPPA. Safety Cap (talk) 21:57, 8 December 2010 (UTC)[reply]
  120.  Support This looks like about as much as can be done. StAnselm (talk) 21:58, 8 December 2010 (UTC)[reply]
  121.  Support Seems like a sound enunciation of policy. ScottyBerg (talk) 22:27, 8 December 2010 (UTC)[reply]
  122.  Support - Tiptoety talk 22:30, 8 December 2010 (UTC)[reply]
  123.  Support - Absolutely. We've needed this for a while. Nolelover (talk) 22:50, 8 December 2010 (UTC)[reply]
  124.  Support; although I remain mystified why people have such strong reactions to content that is of a sexual nature and yet fail to blink an eye at strong violence. Meh. This policy draft is reasonable, if necessarily imperfect, and is a good foundation. Coren (talk) 00:19, 9 December 2010 (UTC)[reply]
  125.  Support Venustas 12 (talk) 01:10, 9 December 2010 (UTC)[reply]
  126.  Support – We need a policy on this content, and this one deals with it fine. MC10 (talk) 01:46, 9 December 2010 (UTC)[reply]
  127.  Support - This policy has a VAST NET POSITIVE effect. I am so sick of people uploading and hosting pictures of their penis for no encyclopedic purpose. Aeonx (talk) 04:43, 9 December 2010 (UTC)[reply]
  128.  Support - Regardless of the other good reasons to adopt this, it addresses an issue one sees in the pop-culture fields in particular: a tendency to see Wikipedia as free server space for fan sites or, as one other person here put it, amateur porn sites. Wikipedia is not free server space for anything and everything. There have to be limits. Why not clearly establish them? --207.237.230.157 05:14, 9 December 2010 (UTC)[reply]
  129.  Support It needs work in my opinion but it's a good start. We need a good, strong policy on content of this nature not only to protect children but also to protect the Wikimedia Foundation--Dcheagle (talk) 05:16, 9 December 2010 (UTC)[reply]
  130.  Support Not perfect, but better than no policy. Gérard Janot (talk) 07:59, 9 December 2010 (UTC)[reply]
  131.  Support policy is greatly needed; too many exploit this to make a point.... 77.97.18.242 10:10, 9 December 2010 (UTC)[reply]
  132.  Support it's the start we need, but definitely not completed. --Cvf-ps (talk) 11:26, 9 December 2010 (UTC)[reply]
  133.  Support It's something we need to address; Commons exists because of the freedoms US law provides us to operate within. Gnangarra 12:18, 9 December 2010 (UTC)[reply]
  134.  Support A good compromise to start with, certainly not perfect but no policy is. --Captain-tucker (talk) 14:43, 9 December 2010 (UTC)[reply]
  135. I completely support this policy. People will only be using these dirty pages for something personal. Not for a project or research. (Zach attack)
  136. I  Support this well thought out step in the right direction. "Slippery slope" arguments in opposition seem awfully political. --SB_Johnny talk 16:33, 9 December 2010 (UTC)[reply]
    "Awefully political" can still be (and in this case is) correct. Beta M (talk) 17:32, 9 December 2010 (UTC)[reply]
    Indeed. See for example vote no. 119 in this group for an example of how slippery the slope is. --Saddhiyama (talk) 17:40, 9 December 2010 (UTC)[reply]
  137.  Support --Kjetil_r 17:35, 9 December 2010 (UTC)[reply]
  138.  Support implementation of this policy as written. Kelly (talk) 17:36, 9 December 2010 (UTC)[reply]
  139.  Support because "like it..." Rursus (talk) 17:43, 9 December 2010 (UTC)[reply]
  140.  Support
    1. As stated, Wiki Commons is a "media resource for encyclopedia articles..."
    2. Information, in all its forms, should never be censored.
    3. Wiki Commons is not a "porn site, and ... images that do not contribute anything educationally useful may (should) be removed." gatorgirl7563 | 173.65.66.133 18:01, 9 December 2010 (UTC)[reply]
  141. Reluctant  Support I suppose it's better than nothing at all, but I still think we should delete all of it, for the reason I stated below. --The High Fin Sperm Whale 20:02, 9 December 2010 (UTC)[reply]
  142.  Support - obviously not perfect, but just as obviously necessary, and well thought out. PrincessofLlyr royal court 21:30, 9 December 2010 (UTC)[reply]
  143.  Support Commonsense. Johnuniq (talk) 21:57, 9 December 2010 (UTC)[reply]
  144.  Support Long overdue. Of course, voting on these projects is pointless, as achieving consensus is no longer possible. Jennavecia (Talk) 22:05, 9 December 2010 (UTC)[reply]
  145.  Support Keep Wikipedia clean! --Wingdude88 (talk) 00:46, 10 December 2010 (UTC)[reply]

Oppose

  1.  Oppose This is a very slippery slope. Hence, my two immediate objections are: age of visitors, and policing. Neither can be monitored and enforced.Ineuw talk page on en.ws 07:23, 5 December 2010 (UTC)
    There's nothing here about the age of visitors, only the subjects of the photography. And of course we have no way to determine a visitor's age, even if we wanted to play parent. Wnt (talk) 04:49, 6 December 2010 (UTC)[reply]
  2.  Oppose for this revision. It should define more clearly what the educational purpose is, and what kind of images must not be deleted due to this issue. I also cannot get the point here: COM:SEX#Normal deletions: "Items likely to fall within the Wikimedia Commons project scope should not be deleted without using the normal deletion process unless they are copyright violations or illegal to host. These include: ..." Is it equivalent to: "If an item falls within the Wikimedia Commons project scope as listed below there, there is still a possibility for it to be asked for deletion and to be deleted due to the sexual content issue, only if we do have the normal deletion process."? If so, I simply can't agree on it. --Tomchen1989 (talk) 07:39, 5 December 2010 (UTC)[reply]
     Comment “Likely” refers to borderline cases, where the nominator thinks the image is outside project scope. The section is about deleting prohibited content. It could of course be written clearer. --LPfi (talk) 08:36, 5 December 2010 (UTC)[reply]
    It is a loophole. It would be better as: "Items likely to fall within the Wikimedia Commons project scope should not be deleted without using the normal deletion process unless...". Otherwise, things that fall within the scope cannot be immediately deleted, but the guardians can still request them for deletion via a normal deletion process; that'd be so annoying even if the item is finally kept. But even if the sentence is fixed here, it should be more specific about how to handle non-photo images. --Tomchen1989 (talk) 09:30, 5 December 2010 (UTC)[reply]
    respectfully, i think you are mis-understanding the real meaning of this section, the key meaning of this text is that any admin can speedy delete any sexual content based entirely on their own, individual opinion about whether the item is "out-of-scope", without any discussion or debate process Lx 121 (talk) 14:16, 7 December 2010 (UTC) 14:12, 7 December 2010 (UTC)[reply]
    I should note that the situation that most contributed to the current policy on this is the "Körper des Kindes" series which was undeleted after the Jimbo Wales purge. The photos were taken in the 1890s by en:Guglielmo Plüschow, a notable photographer about whom something like ten books have been written, whose old castle apparently is now a museum of his photos in Germany which is open to the public. So it's clearly educational, artistic, historically significant content. It also looks rather like child pornography, and apparently Plüschow was actually convicted of "procuring" young boys, though in keeping with the laid-back sexual mores of the Victorian era he was only sentenced to a few months in jail. So what do you do in a case like that? The answer people favored in discussion is that either (1) someone has to convince the WMF that the image is actually illegal and they need to make an office action right away (which no policy on Commons can prevent), or (2) we have a proper deletion discussion and see what the community has to say about the legal issues. Wnt (talk) 05:24, 6 December 2010 (UTC)[reply]
  3.  Oppose IMO mature content, sexual content, whatever you name it, should not be deleted just because it displays human genitals or intercourse. The line for educational use is blurred. My suggestion is that such images be tagged as such and be hidden from non-registered users. I added Category:Images_from_15th_century_sexual_book_in_Iran. The content is not deleted. But I believe users have the right to be warned before seeing any mature content. They should also have the right to choose the content types they might find inappropriate to view. The settings should be in user preferences. I suggest tags such as: HumanMaleGenitalia, HumanFemaleGenitalia, HumanSex... etc. --Nevit Dilmen (talk) 08:12, 5 December 2010 (UTC)[reply]
     Comment The proposed policy does not even suggest that anything should "be deleted just [because] it displays genitals or intercourse". Also, the educational requirement of COM:SCOPE is always blurry; we're not proposing any changes to the deletion review process for such blurry issues. Finally, your book uploads are explicitly protected from deletion without discussion as they are historical artworks. Please can you explain what you oppose about it? 99of9 (talk) 11:03, 5 December 2010 (UTC)[reply]
  4.  Oppose No need for a policy on this topic, case by case, DR by DR is better in my opinion. - Zil (d) 08:53, 5 December 2010 (UTC)[reply]
  5.  Oppose. The proposed policy is just an example of m:instruction creep. Approximately one third of it only repeats what is already in other guidelines and policies (illegal content, copyvios and BLP violations). The other third is devoted to vague and useless interpretations of various US federal laws. These interpretations are not written by lawyers and may be misleading. The remaining third contains some rather explicit descriptions of sex acts, which the authors of the policy think are inappropriate for Commons. I am interested in whether this proposed policy itself should be marked by some kind of "sexually explicit" banner. Ruslik (talk) 10:05, 5 December 2010 (UTC)[reply]
  6.  Oppose per Zil and Ruslik. Also I'm afraid that this policy will be abused by certain people to get rid of pictures they don't like. Multichill (talk) 11:50, 5 December 2010 (UTC)[reply]
  7.  Oppose Images should be judged on their possible educational value regardless of interpretations of what "sexual content" may or may not mean to different cultures, religious groups or fringe lobby groups. -- (talk) 10:52, 5 December 2010 (UTC)[reply]
  8.  Oppose This is a slippery slope. In future it may become harmful for Commons! — Preceding unsigned comment added by Amit6 (talk • contribs)
  9.  Oppose If photos depicting sexual activity can be uploaded for “educational purposes”, it will only be a short time before photos constituting pornographic material will be uploaded, with their uploaders defending how this pornographic material is "necessary for Wikicommons' success” or how it is "usable educational material". Aaaccc (talk), 5 December 2010 (UTC)
    Permitting photos depicting sexual activity to be uploaded is the status quo, and the history of this policy shows that no policy that bans that will pass. Opposing this policy does not advance your goals in that direction one bit.--Prosfilaes (talk) 18:02, 5 December 2010 (UTC)[reply]
    I am opposing this policy because I do not believe it would provide a clear enough distinction between "educational material" and "pornographic material". Aaaccc (talk), 5 December 2010 (UTC)
    To be clear, yes, this policy does allow uploads of pornographic material, unless it's actually illegal in the U.S., or lacks consent, or lacks any educational merit. It accepts that there is considerable overlap between what is pornographic and what is educational. Wnt (talk) 05:00, 6 December 2010 (UTC)[reply]
  10.  Oppose, but only because some of the wording in the policy is too vague as to be unenforceable. I'm specifically referring to the sentence low-quality pornographic images that do not contribute anything educationally useful may be removed in the lede. The term "low quality" is subjective; by what standards do we determine something is of low quality versus high quality? Does it mean some college kid who uploads fuzzy, low-res images of his genitals? Or does any image that someone does not like suddenly become "low quality"? And how do you determine whether something is useful educationally? Almost any image taken and presented in good faith has the potential to be educationally useful in some manner. This lede sentence leaves way too much up to subjective reasoning, individual bias and lobbying by people with agendas. A good policy should always be as objective as possible to eliminate these issues. Otherwise, I completely agree with the legal stuff and that Commons should not devolve into a webhost for porn (which is a real concern). The Garbage Skow (talk) 17:33, 5 December 2010 (UTC)[reply]
    That phrase came directly from COM:PORN, which was the existing policy. Apparently it was based less on ideology than on the tendency of some people to take quick snapshots of their penises to post here, leading to a glut of images without a particular use. This is presumably based on the policy that "snapshots of yourself and your friends" aren't wanted. Wnt (talk) 05:47, 6 December 2010 (UTC)[reply]
  11.  Oppose per Ruslik. AaronY (talk) 17:39, 5 December 2010 (UTC)[reply]
  12.  Oppose, strongly - slippery slope, censorship, the same reasons that have been rehearsed every time this comes up.--ukexpat (talk) 18:37, 5 December 2010 (UTC)[reply]
  13.  Oppose, strongly - illiberal and will lead to censorship, would be misused to get rid of "offensive" material --HaTe (talk) 21:29, 5 December 2010 (UTC)[reply]
    # Oppose This is in no violation of any rules, except one. OK, so we don't get to have a few instances of ancient art. IMO this is no big deal, compared to getting in the BBC News and probably losing thousands of dollars of donations. And like Ineuw said, it is a slippery slope. Where do you put the line between allowing uploads of art and pictures from a college party? Why do we do so much damage to the Wikimedia Foundation under the banner of "Commons is not censored", and for what? What a waste to throw away a respected encyclopedia's reputation to have a few porn images. --The High Fin Sperm Whale 21:03, 5 December 2010 (UTC)[reply]
     Comment Whale, that's a reason to have a policy that controls the upload of specifically sexual imagery, not a reason not to have one. --Elen of the Roads (talk) 23:10, 5 December 2010 (UTC)[reply]
    I'm afraid I was somewhat misled as to the nature of this policy. I shall change my vote. --The High Fin Sperm Whale 20:02, 9 December 2010 (UTC)[reply]
  14.  Oppose The policy itself would be welcome, but what Ruslik said is true: too many redundant instructions. --grin ? 21:26, 5 December 2010 (UTC)[reply]
  15.  Oppose illegal content was (and will be) deleted, the current out-of-scope-rules are enough without special xxx-rules Rbrausse (talk) 21:32, 5 December 2010 (UTC)[reply]
  16.  Oppose Also due to "low-quality pornographic images that do not contribute anything educationally useful may be removed". "Low quality" is too subjective, and I have argued that some questionable images are of value since everyone reads an article differently. Cptnono (talk) 21:42, 5 December 2010 (UTC)[reply]
    Follow-up: It looks like a few people are concerned about the low quality bit. I would consider supporting if that part of the line was removed. Cptnono (talk) 00:43, 7 December 2010 (UTC)[reply]
    I think "low quality" as a criterion serves a real purpose: it's a subjective standard for people to discuss in deletion requests, which helps avoid mere voting and also puts the question more in the hands of the community than some objective measure(s) of quality might. That and "educational purpose" are both better left subjective for discussion by design. 71.198.176.22 02:33, 8 December 2010 (UTC)[reply]
  17.  Oppose - bad idea the last time we tried it, worse idea now. this thing is badly written. also how did out-of-scope get snuck into the criteria for speedy deletion!? that was not on the table, the last time i checked into this long-running conversation Lx 121 (talk) 00:37, 6 December 2010 (UTC)[reply]
     Comment It didn't. The proposal requires "obviously out of scope", not "out-of-scope". Trusting administrators to be able to make a call in cases that are a long way from grey seems eminently sensible to me. If an admin starts getting this wrong, they can be chided. --99of9 (talk) 12:12, 8 December 2010 (UTC)[reply]
  18.  Oppose no way we put "out of scope" as a criterion for speedy deletion: "being in scope" is something that always, mandatorily should be discussed by the community before deleting; a single admin's evaluation can't and will never be enough. Apart from that it seems, more than a policy, a vague rant to subtly allow censorship. No way. And to the guys complaining above about possibly lost donations: We're not here for the money. We're here for a mission about free knowledge. Wikipedia could get billions by biasing its Coca Cola article to look sympathetic to them, but guess what? we don't. We have principles. Rejecting censorship is one of them. --Cyclopia (talk) 00:52, 6 December 2010 (UTC)[reply]
  19.  Oppose We have principles. Rejecting censorship is one of them. Next thing they will ask to censor images that might be offensive to some people. -- The Egyptian Liberal (talk) 01:12, 6 December 2010 (UTC)[reply]
  20.  Oppose As already said: it needs re-drafting, & "out-of-scope" is NOT a reason for speedy deletion. Archolman (talk) 01:14, 6 December 2010 (UTC)[reply]
  21.  Oppose For it to be a policy it needs to be unambiguous and clear. For a non-US resident it appears to be neither. The Dost test and Miller test seem to be important to US citizens. The Dost article's section on case law is without clear conclusions, and Miller lacks citations at crucial points. The Miller test appears flawed in not defining who the panel of jurors should be. The text above shows considerable mission creep on definitions of scope and deletion, and could be used as a future precedent for unrelated policies. All the other definitions seem a little prissy but acceptable; however, until the other issues are tightened up it must be an oppose.--ClemRutter (talk) 02:11, 6 December 2010 (UTC)[reply]
    Dost and Miller are important to the Wikimedia Foundation, because those are the laws that it has to abide by.--Prosfilaes (talk) 02:58, 6 December 2010 (UTC)[reply]
    Follow the links and look at the articles as if it were the first time you had seen them: what were the conclusions drawn from the Dost case law? The Miller test (or three-pronged test) is littered with citation needed tags. To quote the article: For legal scholars, several issues are important. One is that the test allows for community standards rather than a national standard. Those are the facts we need to use in forming a judgement. Miller is incredibly parochial, and it is being proposed that the global Wikipedia deletion policy should be determined by a village hall meeting in a place we have never heard of. While this is a legitimate concern for the staff, it is no way to form a policy. I am tempted to suggest that if the infrastructure could not support our servers in terms of bandwidth etc., they would be moved; in the same way, if our content cannot be supported because of local legal constraints, the answer is to hire a lorry.--ClemRutter (talk) 11:18, 6 December 2010 (UTC)[reply]
  22.  Oppose per Zil and Ruslik. --nsaum75¡שיחת! 05:00, 6 December 2010 (UTC)[reply]
  23.  Oppose Some rhetorical questions: Do we have a policy that facilitates deletion of certain culturally offensive imagery? No? Do we have a policy that facilitates deletion of certain imagery of violence or murder? No? Not even if the perpetrator or victim of the violence requests deletion 40 years later? Shall it become OK to flood commons with images DEPICTING AND GLORIFYING HATE AND DEATH BUT NOT LOVEMAKING? NOT ON MY WATCH! What started as obviously vile, anti-sex, agenda-driven instruction creep became an innocuous-looking, almost-reasonable-sounding but-not-quite-right proposal, due to some edits by talented and well-meaning editors. But no, I will never accept the argument that it's OK to show children how adults can kill each other, and hate each other, but not how adults can love each other. Because it makes no sense. Think about it. Amen. --Walks on Water (talk) 06:48, 6 December 2010 (UTC)[reply]
  24.  Oppose per Ruslik and for the "Material obviously outside of scope" speedy deletion: who decides what is "obviously outside of scope"? Last spring I saw tens of ancient drawings, illustrations and good-quality pictures of not-so-nude women, many in use, speedy deleted as "out of scope" by different admins. Leaving speedy deletion based on the subjective in-scope/out-of-scope opinion of a few people is IMO only a Trojan horse for new mass deletions. To remove illegal images we don't need a policy (they are illegal, so we can speedy them as we do with copyvios), and for the low quality images and nude pics we already have COM:PORN and COM:NUDE. About consent: people who upload images without consent will probably have no problem lying about it (is the disclaimer about copyright perhaps blocking all the copyright violations?). --Yoggysot (talk) 07:14, 6 December 2010 (UTC)[reply]
    The speedy deletion for out of scope material is subject to a long list of exceptions, which hopefully cover the things you describe. Apparently there's some argument for downloading low-quality penis snapshots that have been uploaded in remarkable abundance. However I would not mind removing the out of scope speedy deletion entirely; the policy as written reflects the balance of a fairly small number of editors who have been working on it and discussing it. Wnt (talk) 13:05, 6 December 2010 (UTC)[reply]
    The problem here is trust. "Out of scope" was used as justification for deleting images with a clear educational value used on Wikipedia. We had to spend hours of our time to trace the images that were deleted, and humbly ask for undeletion. It was extremely frustrating and humiliating. It can happen again. Ankara (talk) 13:35, 6 December 2010 (UTC)[reply]
    As Ankara said, in the past we have had evidence of a very wide interpretation of "out of scope" on sexual topics. Certainly, after a speedy deletion the list of exceptions could be used to request (and probably obtain) undeletion, but then Wikipedians must check the delinker log, then go to every page to reinsert the image, etc... and the next day it can be speedy deleted again without debate, just because someone thinks that "sex"="out of scope", and so on... For example, if someone reads the study of controversial content and uses its recommendations as a yardstick to apply this policy, they could speedy delete the "vast majority of" the "over 3,000 images on Commons (by our count) in various categories and sub-categories around “Female toplessness” and “Nude women”" because "they are out of scope", resulting in mass deletions worse than the Jimbo anti-sex war of this spring. IMO it's better to decide whether an image is "out of scope" in the normal deletion process, as happens with all non-sexual images. I think that a guideline (or a series of guidelines) explaining how to categorize and write the descriptions of all files on controversial topics (sex, nudity, violence, religion, etc.) could help, but I don't see the need for a preferential path to deletion for non-copyvio and non-illegal sex-related pics (or sex-related files in general).--Yoggysot (talk) 03:22, 7 December 2010 (UTC)[reply]
  25. --Elian Talk 12:25, 6 December 2010 (UTC)[reply]
  26.  Oppose. Good honest effort by lots of great people, but ultimately I don't want a special policy for sexual content-- fairness dictates that whatever solution we create should apply equally to all cultures' taboos, not just Western cultural taboos. Also, out-of-scope speedy deletion was tried and it failed. Jimmy showed conclusively that no one editor can discern scope when he speedy-deleted all that art. The founder wasn't able to use out-of-scope speedy deletion responsibly; there is no way all our admins should be trusted with a call like that. Lastly, obscenity is a legal term, for use by lawyers. Neither the average reader nor the average admin can be tasked with deciding obscenity. The colloquial use of "obscene" is very different from the legal definition, which is infinitely more narrow. Putting an 'obscenity test' inside a policy would be a very bad idea, requesting, in effect, that our editors reach legal conclusions without any legal training. --Alecmconroy (talk) 13:36, 6 December 2010 (UTC)[reply]
    •  Comment Honest effort, okay, maybe, but good is something else. Imho, obscene is a religious judgement akin to sin. Imho, we should not even be discussing it as a cause for refusal of material. Of course, the personal rights of those depicted must be respected. Thus:
  27.  Oppose--77.182.18.173 11:48, 7 December 2010 (UTC)[reply]
  28.  Oppose (a) it's plain unnecessary. (b) since we're an international and intercultural site, using US/Florida law is not acceptable. The pure suggestion to value it above all others in the world is obscene and fascist. (c) Servers and office space can be had almost everywhere on earth, so if we want to host materials challenged by some place's law, we can simply store and serve them elsewhere. (As a side note, the US government escapes its own law in Guantanamo, why not act likewise?) (d) Obscenity is too variable a concept, you cannot build on it. --Purodha Blissenbach (talk) 10:34, 6 December 2010 (UTC)[reply]
    Wikipedia's servers are located in the US state of Florida; therefore Wikipedia is specifically and only subject to Florida and US law. Wikipedia already works this way: anything against the law in Florida isn't allowed on Wikipedia. Swarm (talk) 03:57, 7 December 2010 (UTC)[reply]
  29.  Oppose - Per HaTe, Ruslik and The Egyptian Liberal. --Локомотив 15:59, 6 December 2010 (UTC)[reply]
  30.  Oppose as policy, acceptable as guideline. --Foroa (talk) 17:09, 6 December 2010 (UTC)[reply]
  31.  Oppose unacceptable as policy. I still deny the need for such a policy, the draft is not concise and specific enough to base decisions on it, and finally the rules are not dealing with the different cultures of Commons and the projects, but are despite the laudable effort still centered too much on one cultural heritage. --h-stt !? 18:29, 6 December 2010 (UTC)[reply]
  32.  Oppose Wouldn't improve anything.--Lamilli (talk) 18:39, 6 December 2010 (UTC)[reply]
  33.  Oppose - As someone who has worked in a system where sexual content has very real relevance, I can honestly say that this proposed policy (having failed before to become one) still needs a considerable amount of work before it even comes close to being suitable. It doesn't define conditions well enough, and simply relying on the DOST test as a means of assessment of potentially unsuitable images of children is not a reliable method of assessment by itself. I would recommend that the creators of this proposal go back to the drawing board, get hold of some books by David Finkelhor and start working out what they need to define in order to give accurate, usable conditions under which decisions can be made. BarkingFish (talk) 19:13, 6 December 2010 (UTC)[reply]
  34.  Oppose - Abusus non tollit usum. Kameraad Pjotr 20:03, 6 December 2010 (UTC)[reply]
  35.  Oppose - US morals aren't Wikimedia morals! This is an international project. If US laws are not good for us, the servers have to find a better place (Sweden, Switzerland or another). Marcus Cyron (talk) 20:54, 6 December 2010 (UTC)[reply]
  36.  Oppose US morality is stupid and sucks; Wikimedia and Wikipedia are free projects and shouldn't be restricted this way. --Julius1990 (talk) 20:58, 6 December 2010 (UTC)[reply]
  37.  Oppose Most users here are sensible, and though it may take time, work, and sometimes a little heat, in my opinion the right balance between the censor-everything and include-everything camps is usually arrived at. Setting things in stone with such a policy will please neither camp; some will see it as too restrictive, others too liberal.--KTo288 (talk) 21:23, 6 December 2010 (UTC)[reply]
  38.  Oppose, strongly. The guideline seems illiberal and prone to misuse. Images should be judged on their possible educational value regardless of interpretations of what "sexual content" may or may not mean to different cultures, religious groups or fringe lobby groups. --Pinnerup (talk) 21:36, 6 December 2010 (UTC)[reply]
  39.  Oppose Per Ruslik + If this was adopted, other file policy proposals may crop-up citing this as a reason to adopt it, "What's good for the goose is good for the gander" effectively. --George2001hi (Discussion) 21:46, 6 December 2010 (UTC)[reply]
  40.  Oppose, strongly. It opens the door for censorship of all kinds, and is simply not necessary. Let's continue to decide case by case. --AndreasPraefcke (talk) 21:47, 6 December 2010 (UTC)[reply]
  41.  Oppose I agree with most of the content but I do not agree with point 3 of [5] - the speedy deletions for scope. We have admins and users who think we should not host nude images, and allowing speedy deletions will probably end in a mass deletion (again). I'm sure we will see admins delete all new images without checking if the image is better than existing ones. Images out of scope should go through a regular deletion per COM:SPEEDY#Regular_deletion. --MGA73 (talk) 21:50, 6 December 2010 (UTC)[reply]
  42.  Oppose Just about everything seems to have been said already, and was put most succinctly by Ruslik. --Abderitestatos (talk) 23:30, 6 December 2010 (UTC)[reply]
  43.  Oppose - per Alecmconroy and others. MrBlueSky (talk) 00:47, 7 December 2010 (UTC)[reply]
     Oppose - please stabilise the policy proposal before asking for community opinion; the obscenity section has had a change in meaning since this poll was opened[6] Gnangarra 01:39, 7 December 2010 (UTC)[reply]
    • The change was simple clarification. - Jmabel ! talk 03:23, 7 December 2010 (UTC)[reply]
    • Whenever we get 100 sets of fresh eyes across a text, they are bound to find some improvements. Any change in meaning was extremely minor, and surely nobody's support or oppose hinged on it. If it did, they are welcome to revert the change to the original phrasing. Do you really want to oppose an entire proposed policy giving only such a small technicality as your reason? --99of9 (talk) 09:22, 7 December 2010 (UTC)[reply]
      • Being a contributor for around 4 years and an admin for around 3 years, I have not taken lightly the decision to oppose; I recognise the importance of the policy and I'm aware of the history behind its need. Given the time taken so far and the number of people involved, there should be a moratorium on changes while the poll takes place, so that everybody is agreeing to the same thing. Even since I highlighted it and said oppose because of instability there has been another change (only spelling). One cannot expect people to support the policy unless it's stable for the whole period of the poll. Gnangarra 10:11, 7 December 2010 (UTC)[reply]
        • Have you read the introductory text above this poll? We are voting on a specific version (Nov 26). We do sort of have a moratorium on even slightly contentious changes. If any of the changes since then are disliked by anyone, they can be reverted to obtain a full consensus later. --99of9 (talk) 10:19, 7 December 2010 (UTC)[reply]
          • then someone needs to lock-down the text ; i understand what you're saying about floating revisions always happening on a wiki, BUT it isn't a fair vote, if the text of the ballot question being voted on keeps getting changed during the polling. for this process to be credible, the nov.26 text needs to be what the people who are voting will see Lx 121 (talk) 14:12, 7 December 2010 (UTC)[reply]
     Comment I've withdrawn my oppose, though I still think the policy should be stable before any poll is commenced. I also believe that there is a valid and real need to implement a policy on sexual content, as Commons (the Foundation) has legal obligations that we can't ignore. Weighing up the differences, I've elected to support this because I believe that the community can address any further adjustments that need to be made. Gnangarra 12:14, 9 December 2010 (UTC)[reply]
  44.  Oppose First, let sensitive people use filter software; it is up to the software manufacturers to develop comprehensive keyword-based filtering. You can also see a lot more sexual material just passing a nearby kiosk. I am confident in the work of administrators and patrollers. --Vhorvat (talk) 01:54, 7 December 2010 (UTC)[reply]
  45.  Oppose - It would be an invitation to censorship. Playmobilonhishorse (talk) 03:58, 7 December 2010 (UTC)[reply]
  46.  Oppose, strongly - per AndreasPraefcke and others. --Rlbberlin (talk) 05:24, 7 December 2010 (UTC)[reply]
  47.  Oppose per MGA73 (i.e. I agree with much of the proposal, but not out-of-scope speedy deletion). --Avenue (talk) 11:50, 7 December 2010 (UTC)[reply]
  48.  Oppose COM:CENSOR and COM:PORN perfectly regulate such issues, particularly child pornography. Specifically, the very first sentence of COM:CENSOR says that "files and other materials which are not lawful for Commons to host on its servers in Florida will be deleted immediately upon being identified as illegal". Occam's razor is fruitful here. Brandmeister (talk) 13:17, 7 December 2010 (UTC)[reply]
  49.  Oppose-solution in search of a problem. Current policies are adequate to deal with this. -Atmoz (talk) 14:16, 7 December 2010 (UTC)[reply]
  50.  Oppose - Well, even if an image or a video contains sexually explicit content, it may be suitable for minors (for instance, the British sex-ed film Growing Up was criticized by some people since it featured actual footage of naked people, including intercourse and masturbation, rather than drawings; however, teachers and pupils gave it positive feedback), so there is no need to make a policy regarding sexual content.--RekishiEJ (talk) 14:44, 7 December 2010 (UTC)[reply]
  51.  Oppose --Habakuk (talk) 15:55, 7 December 2010 (UTC)[reply]
  52.  Oppose How many sexual pictures does Commons need anyway? Secondly, there are already some individuals angry at Commons for having their photos used here, photos taken at what they thought were private but open social gatherings and then placed on Flickr under a free licence. --Leoboudv (talk) 20:06, 7 December 2010 (UTC)[reply]
    some examples? --Yoggysot (talk) 02:18, 8 December 2010 (UTC)[reply]
  53. Strongly  Oppose. Existing policies already deal with this - Scope, Commons is not censored, Nudity, Photographs of identifiable people, General disclaimer, What Commons is not. --5ko (talk) 20:33, 7 December 2010 (UTC)[reply]
  54.  Oppose mostly per Ruslik, Alecmconroy and Brandmeister. - CharlieEchoTango (talk) 23:09, 7 December 2010 (UTC)[reply]
  55.  Oppose Cgtdk (talk) 23:39, 7 December 2010 (UTC)[reply]
  56.  Oppose - Part censorship, part redundancy. --M4gnum0n (talk) 23:48, 7 December 2010 (UTC)[reply]
  57.  Oppose as a policy, Would strongly Support as guideline. As guideline status would allow wiggle room and as other say this seem redundant with existing policies ResidentAnthropologist (talk) 23:57, 7 December 2010 (UTC)[reply]
  58.  Oppose We aren't in dire need of this (other policies cover it well enough) and this is a slippery slope down the road to censorship. Themfromspace (talk) 00:37, 8 December 2010 (UTC)[reply]
  59.  Oppose No new policy is needed which opens the door to censorship (e.g. with its vague rules for deletion ("high quality")). Illegal images were deleted in the past as well, so all was fine. I am sad, though, that many people wasted time drawing up this policy. --Saibo (Δ) 01:12, 8 December 2010 (UTC)[reply]
  60.  Oppose mainly due to the portion which allows "out-of-scope" speedy deletion. The events of this May made it abundantly clear that deciding what types of sexual content are obviously out-of-scope is an extremely subjective and contentious determination. Black Falcon (talk) 01:43, 8 December 2010 (UTC)[reply]
  61.  Oppose Existing policies already deal with this. Possible censorship.--Chrono1084 (talk) 01:52, 8 December 2010 (UTC)[reply]
  62.  Oppose i respect the intention behind this effort but the concept of "educational value" is obviously ill-defined --Jan eissfeldt (talk) 03:57, 8 December 2010 (UTC)[reply]
  63.  Oppose I oppose a policy or section that is absolutely useless. I was just asked by a Federal judge why after allowing my nudes to be shown on Wiki to minors I can be opposed to GOOG doing it also. Either place all nude images behind an age disclaimer or keep out the Googlebot-image bot. This is not just an opposition to ALMOST getting it right but a DEMAND that the Googlebot-image bot be excluded by 12-10-2010 or that all photo content by me be removed. I am comfortable with it being here and it is some of the best content and should remain. Wiki must either keep out the GOOG image search or delete content donated by me to prevent it being shown to my minor children while at school on GOOG safe searches. CurtisNeeley (talk) 04:40, 8 December 2010 (UTC)[reply]
  64.  Oppose Unneeded, slippery slope, and per Ankara (re the sum of all human knowledge, in the Comments section below). Existing policy allows immediate removal of unlawful files and the reasonably prompt removal of files deemed (by consensus) beyond scope of Project. If there's an ongoing or prospective problem, more policy isn't the answer; better enforcement is. Let's take the common sense already found here to heart and not further stigmatize sexual content with unnecessary new regulatory verbiage. Rivertorch (talk) 06:06, 8 December 2010 (UTC)[reply]
  65.  Oppose Although I commend the authors of the policy for the work they've put into it, this seems to be mostly redundant and unnecessary. I'm worried that users could construe this new policy into some form of censorship of the Commons, and it's important that we keep away from instruction creep when it's not needed. Nomader (talk) 06:13, 8 December 2010 (UTC)[reply]
  66.  Oppose As others have said, unless we are going to introduce a policy on when paintings that depict Abrahamic prophets are acceptable and when they should be deleted based on their perceived educational value, or a policy on when we can include images of dead soldiers and when we can't, then this is way too slippery a slope. Max Rebo Band"almost suspiciously excellent" 06:46, 8 December 2010 (UTC)[reply]
  67.  Oppose Not generally, but with differentiation. --Perhelion (talk) 08:03, 8 December 2010 (UTC)[reply]
  68. Strong  Oppose Wikipedia is not censored. EngineerFromVega (talk) 08:07, 8 December 2010 (UTC)[reply]
  69.  Oppose It is the task of the legislature to define which kinds of sexual content are illegal. Illegal content must be deleted. So there is no need for a general policy for sexual content. As always, a specific review of a specific file is required to decide if that file is out of scope. Yeah, this is not fun, but it is fair and just. --Catfisheye (talk) 08:25, 8 December 2010 (UTC)[reply]
  70.  Oppose I don't think the pro-arguments are really convincing while the danger of a slippery slope and censorship are real. As Nomader says above, the proposed new policy is in parts redundant to those we currently have because no one can claim that we need this new policy to delete prohibited content (like child pornography or copyright violations or images that violate Commons:Photographs of identifiable people). So it's likelier that people will try to use this new policy to justify censorship than that we really need it for clarification. Also, imho, all content on Commons should be treated equally, following the same rules. Last but not least, any policy that is, or might be applied as, stricter on sexual content than on other content can and will lead to fragmentation of content, with people uploading files deleted here back to the Wikipedias they came from if those have less strict policies, thus defeating the whole purpose of having a common repository for files. Regards SoWhy 08:28, 8 December 2010 (UTC)[reply]
  71.  Oppose Too vague, very easy to use as justification for what is actual censorship. It also seems to entirely hinge on US laws. Child pornography is already forbidden as are depictions of sexual acts which do not contribute educationally to an article. So I say, a case by case approach is good enough. A new policy is simply not needed.--Astepintooblivion (talk) 09:50, 8 December 2010 (UTC)[reply]
  72.  Oppose I don't see the requirement for a new policy here, plus the reliance on US laws is troubling to non-US people such as myself.--Topperfalkon (talk) 10:41, 8 December 2010 (UTC)[reply]
  73.  Oppose I don't believe the "slippery slope" concerns have been addressed convincingly. Even with this policy, we will be forced to deal with contentious images on a case-by-case basis because "obscenity" and "sexual content" are subjective terms and have widely different interpretations in different cultures. --Sodabottle (talk) 10:47, 8 December 2010 (UTC)[reply]
  74.  Oppose Concerned about slippery slope to censorship, and especially concerned about speedy rules. Most of the proposed page also seems to be covered by other policy pages, and so I'm not sure why this page is needed. --Falcorian (talk) 13:08, 8 December 2010 (UTC)[reply]
  75.  Oppose Although I understand the fear of prosecution is valid, one of the greatest things about WP is the fact that this site, for the most part, has been willing to toe the line on issues like this. Regular monitoring by all editors will more than likely keep the pages clean and legal. I for one know that after I change a page and add it to my watchlist, any time there's an edit I at least check the history to be sure there are no IP-address users in the history since my edits. If there are, I check the past edits up to my last edit and revise according to standards. The slippery slope theory is also valid, and one of the biggest fears of those of us who oppose this poll. Let's keep WP in the hands of those who work so hard, and often for free, to keep this site at the standards we believe it should be.
  76.  Oppose --Janneman (talk) 14:06, 8 December 2010 (UTC)[reply]
  77.  Oppose US laws mean nothing to me (and I think the mentioned laws on "obscenity" are just funny and outdated), and the proposed policy is too vague and can easily be misused in either way. --Thogo (Disk.) 14:08, 8 December 2010 (UTC)[reply]
    As an m:Steward you should be the last person to say US laws mean nothing. The Foundation is a US organisation and the servers are in the US, so they are subject to US law. Gnangarra 14:16, 8 December 2010 (UTC)[reply]
    Don't tell me what I should and what not. Stewards have to abide by the Foundation policies and by nothing else. I'm not subject to US laws unless I'm currently in that country. --Thogo (Disk.) 14:49, 8 December 2010 (UTC)[reply]
    Foundation policies abide by US law, you know. Your identity is verified as a Steward, so if you break US laws regarding privacy you will be held responsible. By using US servers you are on US sovereign soil. That is how it works. Ottava Rima (talk) 18:28, 8 December 2010 (UTC)[reply]
    The privacy policy of the Foundation is not based on US law, just so you know, it's much stronger than US law. But that's none of your business, and it definitely doesn't belong here, so every further off-topic comment will be removed. --Thogo (Disk.) 23:55, 8 December 2010 (UTC)[reply]
    It is my business as it is the business of every user here. We have an Ombudsman just to investigate those who would -dare- make such claims as the above, as it is a threat to our privacy, our safety, and a violation of US law. If you hate US law so much, keep it off Commons as -that- has no reason here. Commons is not for you to make a point. You can be sure your actions and words will come up during the next discussion of confidence regarding you, as you have a long history of actions regarding Commons that makes it seem that your actions might not line up with what is best for the Foundation and its projects as they can put us into serious legal jeopardy. Ottava Rima (talk) 02:16, 9 December 2010 (UTC)[reply]
    This could be the beginning of an ugly controversy, as Ottava Rima is prone to try things like that. The problem is, if it is crucial for a Wikimedia steward to respect the nuances of U.S. law, how can any non-American stand for election as a steward? But such a motion would be a huge kick in the face for the international Wikimedia community. One reconciliation for this is what we set out to do here: make the specific policy closely follow the U.S. law, so that as long as people follow the policy they follow the law. And if the policy becomes consensus, stewards are required to follow it. Wnt (talk) 16:38, 9 December 2010 (UTC)[reply]
    Especially in this case we could hardly have a worse reference than US law. It is nearly offensive to users from other countries, who in general don't live under these forsaken rules. Someone should move the servers out of this desert and bring them to the lands of the free. That would definitely help and ensure that Wikipedia stays free from censorship. --Niabot (talk) 18:04, 9 December 2010 (UTC)[reply]
    Foreigners who come to the US are expected to follow our laws, so it isn't hard. Look at child porn - it won't be acceptable here regardless of whether a person is in a country where it is. Ottava Rima (talk) 20:49, 9 December 2010 (UTC)[reply]
  78.  Oppose US law? This is Commons! --Gereon K. (talk) 14:12, 8 December 2010 (UTC)[reply]
  79.  Oppose The policy mostly reflects American interests. I don't think this policy provides a good framework for our educational mission. Yes, sex is a thing we need to cover, even graphically; too many people don't know enough about it. Whoever wants porn will probably find better resources than Commons. --bluNt. 14:15, 8 December 2010 (UTC)[reply]
  80.  Oppose --alexscho (talk) 14:33, 8 December 2010 (UTC)[reply]
  81.  Oppose«« Man77 »» [de]·[bar] 14:35, 8 December 2010 (UTC)[reply]
  82.  Weak oppose It's a shame it has to come to this, but here we are. The proposed policy is not bad, but I don't like the idea of speedily deleting files for being out of scope. Files that are believed to be out of scope need to go through the "normal" deletion process, not silently deleted by a single admin who has xyr own opinion on the project's scope. If point three is removed from the speedy deletion criteria, then I would be willing to support this as policy. Reach Out to the Truth (talk) 14:48, 8 December 2010 (UTC)[reply]
    I think you should reconsider your position and change it to Strong oppose. The current version of the policy creates a lot of possibilities for deleting media which should be kept, and it's very careful never to explain what stuff will definitely be kept safe. This policy can turn away many new users who will do their best to contribute, and then have their contributions deleted. Beta M (talk) 17:52, 9 December 2010 (UTC)[reply]
  83.  Oppose Per Ruslik and others. --Kbdank71 (talk) 14:57, 8 December 2010 (UTC)[reply]
  84.  Oppose Existing policies are sufficient for dealing ad hoc with really problematic files. Any further definitions, standards, tagging and sorting (in other words, censorship) of so-called "sexual content" are unacceptable to me. Again and again certain prudish groups push similar proposals, trying to bowdlerize this worldwide project and impose their one-country-based definitions on all nations, but I will resist it because I think Commons should be as free as possible. --Miaow Miaow (talk) 15:15, 8 December 2010 (UTC)[reply]
  85.  Oppose No Policy of Censorship; decide case by case ----Wmeinhart (talk) 15:25, 8 December 2010 (UTC)[reply]
  86.  Oppose as per AndreasPraefcke and blunt --Paramecium (talk) 16:48, 8 December 2010 (UTC)[reply]
  87.  Oppose as per Marcus Cyron --Gamma127 (talk) 16:58, 8 December 2010 (UTC)[reply]
  88.  Oppose as the two guys before already mentioned. --Kuebi (talk) 17:02, 8 December 2010 (UTC)[reply]
  89.  Oppose --~Lukas talk 17:20, 8 December 2010 (UTC)[reply]
  90.  Oppose --Magadan (talk) 17:28, 8 December 2010 (UTC) If US law is a problem for content on our servers then please move our servers elsewhere.[reply]
  91.  Oppose until mention of the Miller Test and US obscenity law is removed, per my discussion below. Gigs (talk) 17:36, 8 December 2010 (UTC)[reply]
  92.  Oppose ---<(kmk)>- (talk) 17:40, 8 December 2010 (UTC) Wasn't the US the mother of freedom of speech?[reply]
  93.  Oppose The porn problem on Commons seems to exist mostly in some people's imagination. PDD (talk) 17:45, 8 December 2010 (UTC)[reply]
  94.  Oppose per many of above and in particular the speedy deletion of "obviously out of scope" files. Davewild (talk) 17:54, 8 December 2010 (UTC)[reply]
  95.  Oppose Speedy deletions as "out of scope" are not acceptable—among other problematic points. --Rosenzweig δ 18:28, 8 December 2010 (UTC)[reply]
  96.  Oppose, holy no. —DerHexer (Talk) 18:41, 8 December 2010 (UTC)[reply]
  97.  Oppose - Per the "out of scope" deletion rule, as well as the unclarities and virtually unending problems I foresee from the implementation of the Miller test. Probably a lot more issues as well. --Saddhiyama (talk) 19:04, 8 December 2010 (UTC)[reply]
  98.  Oppose--Verum (talk) 19:44, 8 December 2010 (UTC)[reply]
  99.  Oppose Please don't copy extremely prudish and completely outdated US morality laws... Chaddy (talk) 19:54, 8 December 2010 (UTC)[reply]
    You do know that the US has some of the most liberal laws regarding porn and that the majority of the world belongs to cultures that oppose porn completely, right? Ottava Rima (talk) 23:09, 8 December 2010 (UTC)[reply]
    {{citationneeded}}. With all due respect, I very much doubt that. Compared to China and Saudi Arabia, sure, but against most European countries, which I think is what we are comparing here, there is just no contest. The US definitely has some of the most liberal laws when it comes to political free speech, much more so than most European countries, but regarding pornography laws they are not comparable. It just seems we can't have the best of both worlds, but I must admit that if I had to choose I would choose the freer political speech. I still oppose this policy proposal, though, as there seems to be no reason to choose, since existing policy already covers it. --Saddhiyama (talk) 16:51, 9 December 2010 (UTC)[reply]
    You mean China with 1/4th of the world's population, or Islam representing 1/5th? India has tougher laws on porn than the US, as do many countries. The US has most of the world's porn studios for a reason. Your statement seems to suggest that an insignificant portion of the world (Europe, and only a few countries in Europe at that) can somehow trample over most of the world's population. Ottava Rima (talk) 20:49, 9 December 2010 (UTC)[reply]
  100.  Oppose Per Ruslik and many others. Adrian Suter (talk) 21:10, 8 December 2010 (UTC)[reply]
  101.  Oppose <insert reason here> --Schnatzel (talk) 21:36, 8 December 2010 (UTC)[reply]
  102.  Oppose --Joe-Tomato (talk) 22:43, 8 December 2010 (UTC)[reply]
  103.  Oppose-- Neozoon (talk) 22:57, 8 December 2010 (UTC) Thanks for the time invested to generate this proposal. Two main points why this can not be a policy: Speedy deletion of "Out-Of-Scope" pictures, we had this already and the inability to validate the Miller-Test within a community of international contributors. The current situation is better than having this.[reply]
  104.  Oppose Ukko.de (talk) 23:04, 8 December 2010 (UTC) per Brandmeister, Jan eissfeldt and others[reply]
  105.  Oppose Principal objection is to Speedy Deletions, Clause 3. ("Material obviously outside of scope....[ ]. Scope policy is very general and sexual content is frequently deleted under this policy. However, several categories of sexual content detailed below are likely to fall within the project scope, and should not be speedy deleted."). All material regarded by an individual as being outside of scope should be proposed for deletion and never speedied. There is so much subjectivity here that it is bound to be unfair and unworkable. Clause 3 should be removed. Anatiomaros (talk) 23:34, 8 December 2010 (UTC)[reply]
  106.  Oppose We don't need a special policy for sexual images. If an image of whatever sort is inappropriate, it has to be explicitly argued that this is the case in the context of whatever article it is used in. US law can never be an argument that would lead to a different consensus than one that would otherwise have been reached. If this then leads to a conflict with US law, then one has to think about moving Wikimedia out of the US. Count Iblis (talk) 02:09, 9 December 2010 (UTC)[reply]
  107.  Oppose. For the most part I'd actually support it as a policy if it were not for one glaring omission: there seems to be no mention of how to deal with erotic/pornographic art. If "Content clearly not educational or otherwise in scope" were to be interpreted as excluding most erotic/pornographic art unless it belongs to famous artists, I would strongly oppose the policy. If on the other hand it would be applied only to common pornographic pictures (the usual adult-site stuff), I'd support the policy. However, for now, without clarification on that subject, I oppose the policy.--Kmhkmh (talk) 02:29, 9 December 2010 (UTC)[reply]
  108.  Oppose. Regardless of how well this proposal is watered down and polished--even with the best intentions--it would still inevitably be used by some as a platform to support eventual censorship of sexual content, as well as any other %CONTROVERSIAL_MEDIA_TYPE%.   — C M B J   05:13, 9 December 2010 (UTC)[reply]
  109.  Oppose Out of scope and speedy do not fit (and see PDD). sугсго 07:11, 9 December 2010 (UTC)[reply]
  110.  Oppose I don't see any need for this --Sargoth (talk) 07:38, 9 December 2010 (UTC)[reply]
  111.  Oppose The problem with sexual content is mainly a US problem. Whether content is pornographic or non-pornographic should be decided case by case. Commons was founded to collect free content, not to collect non-pornographic content. Such content may or may not be usable in an encyclopedia. Any censorship could undermine the aim of collecting free content altogether. There are a lot of sexual files that are usable in Wikipedia. We don't need a censor with exaggerated moral reservations. Widescreen ® 08:33, 9 December 2010 (UTC)[reply]
  112.  Oppose No need to go beyond existing legal obligations, just follow the laws that apply. Fossa?! 09:12, 9 December 2010 (UTC)[reply]
  113.  Oppose --Cartinal (talk) 10:29, 9 December 2010 (UTC)[reply]
  114.  Oppose --Don-kun (talk) 10:46, 9 December 2010 (UTC)[reply]
  115.  Oppose Since censorship is already best practice, should we now make it the default? ... -- Niabot (talk) 10:51, 9 December 2010 (UTC)[reply]
  116.  Oppose Any in-scope/out-of-scope determination needs to be made through a deletion discussion, not through admin action such as speedy deletion. If the allowance for speedy deletion of out-of-scope files is removed from the proposed policy, I will support it, because in that case the policy will merely pull together already-existing policies and guidelines (office deletion of illegal files, speedy deletion for copyvio, etc.). cmadler (talk) 10:53, 9 December 2010 (UTC)[reply]
  117.  Oppose There are so many things wrong with this that I'm afraid I will forget to mention all of them, but I'll try. 1) Having a policy that says "only educational material is allowed" is bad enough (because education with images comes from the articles they are used in; for example, an image of rape would not be educational in the article on animal liberation, but would be in something like "BDSM community and sexual violence" outlining the differences, etc.); but this policy (if it becomes policy) will give the right to determine whether or not the image (or video, audio, etc.) is non-educational judging exclusively from what is seen (or heard, etc.). 2) Any suggestion that the burden of proof should be on the person who has uploaded the content will open all the content from an individual who cannot access the internet to deletion. 3) The policy copies the current legal situation where the servers reside, which can mean trouble if those laws change. Let's say there's an (unlikely) chain of events which quickly leads to the law in Florida making it illegal to censor media based on the age of the submitter or based on the sexual practices the person is engaged in; this policy would then be in clear violation of such a law. Or let's say the obscenity law is finally stricken from the record as idiotic; it would de facto remain the law on Wikimedia Commons, because it's part of the policy. Laws of the territory can be mentioned, but only in a way that says "we must follow all the laws where the server stands at the time of the decision"; it would then be appropriate to link to some article like "en:Laws against sexuality in the USA" or some such. 4) The policy of Wikimedia Commons should serve the purpose of enlarging Commons without detracting from its educational value, rather than preserving educational value and, if possible, enlarging the sum of all works. It is a very big distinction, and people who work hard at making Commons grow should be thanked, rather than being forced to jump through extra hoops in order to prove to somebody who has not contributed such content that everything is OK. 5) The policy as written sets out to define what sexual content may exist, but is too busy talking about what must be deleted. No attempt is made to define exactly what will be "definitely approved", and as such it may deter potential contributors. Some people work very hard to submit content, and they do not want to do that as a test case for some admin to see if they can get away with deleting more than their share. ... Beta M (talk) 11:24, 9 December 2010 (UTC)[reply]
  118.  Oppose We don't need a copy of prudish US laws. If we had to follow such criteria, perhaps also according to other countries' laws, our cause would soon fail. --Blatand (talk) 12:36, 9 December 2010 (UTC)[reply]
  119.  Oppose - Policy already requires us to only host content which is legal under US law, so I don't see the need to treat sexual content separately. We also already have a policy on personality rights, so there's no need to duplicate this here either. -- ChrisiPK (Talk|Contribs) 12:58, 9 December 2010 (UTC)[reply]
  120.  Oppose There are laws already. An extra policy is not required. Hybscher (talk) 13:48, 9 December 2010 (UTC)[reply]
  121.  Oppose The three broad criteria are illegal content (which wouldn't be allowed anyway), lack of consent (we have the personality rights guideline already), and the last (and most horrifying) "not educational or otherwise in scope". I'm reminded of book burning parties. I know a broad swath of people are offended by sexual content. But Commons isn't based upon moral guidelines. Commons has no right not to be offended. What is "educational" is so nebulous as to be indefensible, and this policy, if passed, would cloud any debates about such content rather than clarify them. --Hammersoft (talk) 14:53, 9 December 2010 (UTC)[reply]
  122. strongly  Oppose as an author on Wikipedia and a contributor on Commons -- Achim Raschka (talk) 15:44, 9 December 2010 (UTC)[reply]
  123.  Oppose - I think that just because some people have a taboo shouldn't mean we have to censor information. I agree with policies that remove illegal content, but if we make a policy for sexual things, are we going to make a policy for gross cats, useless fruit pictures and whatever else? It's a slippery slope, I think. Delete the illegal pictures, and don't worry about whether the images have human body parts in them or not. Also, I am a little uncomfortable that this policy debate seems to be taking place only in English... this will be a Commons-wide policy, right? We need discussion from all minds. Anyway, cheers, Nesnad (talk) 16:26, 9 December 2010 (UTC)[reply]
  124.  Oppose -download | sign! 16:54, 9 December 2010 (UTC)[reply]
  125.  OpposeThis is not needed in my opinion.
  126.  Oppose Per Ruslik. -- kh80 (talk) 18:10, 9 December 2010 (UTC)[reply]
  127.  Oppose Per above Druifkes (talk) 18:15, 9 December 2010 (UTC)[reply]
  128.  Oppose mj 18:52, 9 December 2010 (UTC)[reply]
  129.  Oppose Aineias (talk) 19:33, 9 December 2010 (UTC)[reply]
  130.  Oppose per Marcus Cyron and Julius1990 --Grim.fandango (talk) 20:02, 9 December 2010 (UTC)[reply]
  131.  Oppose most of this proposal seems redundant to me - in particular, the speedy deletion reasons simply duplicate existing copyright, legal and scope guidelines. The normal deletions section seems rather pointless: "don't make baseless deletion requests". The prohibited content section lists child pornography and other illegal content, privacy rights and scope, all of which should be covered by existing rules. The file description and categorization guideline is too strict - there is no reason to quarantine every slightly racy picture into separate subcategories. MKFI (talk) 20:24, 9 December 2010 (UTC)[reply]
  132.  Oppose --Flussbus (talk) 20:59, 9 December 2010 (UTC)[reply]
  133.  Oppose This is Silver seren. I think that this is indeed a bit of instruction creep and, also, redundant to what is already performed on Commons. Furthermore, after the huge debacle over child pornography before, I don't exactly trust that this won't be used to further damage the collection of images here on Commons. I mean, someone could easily say that those lithographs from the 1800s and early 1900s that depict child pornography should go, since they are explicit and focus on sexual content. However, we all know that they need to be retained for their historicity and the fact that they are used on the Wikipedias as depictions in various historical articles and, of course, the article of the creator. 165.91.173.6 21:09, 9 December 2010 (UTC)[reply]
  134.  Oppose per Marcus Cyron, against censorship based on US point of view. --Alupus (talk) 21:43, 9 December 2010 (UTC)[reply]
  135.  Oppose per Marcus Cyron, against censorship based on US point of view. --Andim (talk) 21:58, 9 December 2010 (UTC)[reply]
  136.  Oppose If the content is not illegal where the servers are located, then it should be allowed. If users are offended, then they should not look for nor at the offending material. It is a big project and there are plenty of pictures to look at without seeking out these.--Die4Dixie (talk) 22:26, 9 December 2010 (UTC)[reply]
  137.  Oppose per Marcus Cyron. --Gripweed (talk) 22:55, 9 December 2010 (UTC)[reply]
  138.  Oppose It is up to the individual wikis to decide on use... and the English Wikipedia is worldwide, not USA-only. Simplicius (talk) 23:03, 9 December 2010 (UTC)[reply]

Comment

  •  Comment This was discussed many times. We don't have the resources to check whether the written consent is from the real model and thus it is of little value. On the other hand, serious Wikimedia users posing for sexual content would probably prefer to do so anonymously. --LPfi (talk) 07:02, 6 December 2010 (UTC)[reply]
  •  Comment It is noted above that the poll concerns whether people accept or reject this revision. So it would be better if supporters who predict great future improvement of this proposed guideline discussed the current revision, or discussed the improvements in detail. Once it becomes a policy, it is a policy. --Tomchen1989 (talk) 09:35, 5 December 2010 (UTC)[reply]
  •  Neutral We do not need a special policy. I do not want this. Sexual images should be treated like other types of images. If it means we get fewer donations, who cares? We will survive anyway, and my integrity is not for sale. “Imagine a world in which every single person on the planet is given free access to the sum of all human knowledge”. Sexual knowledge is also knowledge. On the other hand, perhaps this policy would make it harder for new attempts to censor and delete images (as we saw earlier this year). Ankara (talk) 21:48, 5 December 2010 (UTC)[reply]
  •  Comment Reading it now, I think it will do. It does not restrict images by how explicit they are. Handing off low-grade images uploaded from Flickr, in order to protect good quality images, is a reasonable trade-off in my view. --Elen of the Roads (talk) 23:16, 5 December 2010 (UTC)[reply]
  •  Neutral I share the policy's concern that we have a problem with certain pictures that clearly shouldn't be here, but on the other hand I think that such a policy could easily be abused for censorship, even with all the disclaimers and affirmation of the contrary in place. --rtc (talk) 23:48, 5 December 2010 (UTC)[reply]
  •  Comment The terminology used looks dilettantish and garbled in some formulations. E.g., paraphilias are defined as a predisposition (a gene), not as acts which can be photographed (any type of practice may or may not be a manifestation of a paraphilia; no type of practice is a paraphilia in itself). As regards the proposal: still at the present time, especially in the United States, an uncensored educational encyclopedia cannot be created and published. Fortunately, the most fanatical censorship of today's western power system is limited only to sexuality-related content and a few other themes. Every attempt to vindicate uncensored educational content would be welcome. If some compromises are necessary to achieve it, so be it. However, if we accept censorship on this theme, any other theme can come after. --ŠJů (talk) 06:03, 6 December 2010 (UTC)[reply]
  •  Comment Ultimately, we should have a sexual content policy. I haven't read this in detail to work out if I think the proposed is acceptable or not. However, I think consensus amongst those working here is that this is pretty good. Therefore, I'd suggest that if there's no consensus to adopt as policy, can we at least get it established as a working guideline? Then it does have some value to allowing the community to function in this area - actions based on guidelines are more likely to be accepted than actions based on a proposal.--Nilfanion (talk) 12:20, 6 December 2010 (UTC)[reply]
    I generally agree this could fly as a rough guideline. The people here put a lot of work into this, it is a valuable document that gives some insight into the issues raised by sexual content. It's good to read and to consider-- it's just not a document that should be quoted as policy. --Alecmconroy (talk) 14:53, 6 December 2010 (UTC)[reply]
  •  Neutral I've been involved in writing this policy and making it favor free speech as much as I could negotiate, and as I've responded to various Oppose votes above, some of the objections to it should be invalid. I have accepted, however, the model that current Commons policy should reflect where Wikimedia is subject to prosecution, even though Purodha makes an appealing argument that it shouldn't. To my way of thinking, Wikimedia is such a strong force for freedom of inquiry right now that we can't risk it for a small gain. In a political protest there are some people who are willing to be arrested but a lot more who aren't, and Wikimedia definitely belongs to the have-a-wife-and-kids group of moderates. If Wikimedia moves to a different country we can always discuss a lighter policy - certainly if that country has more restrictive laws on other topics people will be discussing those. I am moved by Walks on Water's complaint that this seems to stigmatize sex as opposed to, say, violence; this policy is in fact a scar, a response to a media and hence an administrative onslaught that should not have been; but it did happen. I can't disagree with the comments of Cyclopia and others that the mission of free knowledge is the fundamental principle. And Ruslik makes a good point against instruction creep; I've said at many points throughout the writing that most of what can be done by this as a policy can be done by it as an essay, merely guiding readers to preexisting policies.
However, I also have to ask, if people oppose this policy, what will happen instead? Having a policy means that people will have to watch it and make sure it doesn't creep its way down a slippery slope, yes. But settling everything by AfDs means a lot of content could get lost unless someone is watching every obscure discussion. It is also possible that there could be another rash of admins speedily deleting and undeleting, then arguing to de-admin one another... will people be around to defend good admins and demand public discussion? And then there's the risk that Powers That Be in WMF could impose a policy because "no consensus can be made anyway", and from what I've seen such policies tend to be ham-handed. Are people willing to make the kind of massive protest that would be necessary to reverse such an action? I say this as someone who has seen Pennsylvania lose Arlen Specter, a powerful voice for scientific research in the Senate, in favor of a hard-core Obama supporter, only to have all that support evaporate in the general election, leaving us with a rubber stamp the Republicans found in a bank in Hong Kong. If people are serious about rejecting the relatively small compromises imposed by this policy, we need their opposition to remain firm later.
My purpose here is to make this a decent policy, not to force people to accept it. I think the Commons community has a generally enlightened perspective and I'll trust whichever way it decides. Wnt (talk) 13:35, 6 December 2010 (UTC)[reply]
You absolutely can't speedy delete just for out-of-scope. Because once deleted, only admins can even see the image in order to discuss whether it's in scope or not. A speedy deletion alienates the majority of the editor population and cuts them out of the discussion. A speedy delete is a conversation killer, not a conversation starter. And for the very difficult question of scope, we _need_ that discussion to take place. --Alecmconroy (talk) 15:16, 6 December 2010 (UTC)[reply]
When we receive the 10,000th blurry picture of a bloke's penis with his hand clamped around it, do you suggest that we have a full-blown deletion discussion each and every time? We have better things to do, and I've seen deletion discussions sit there, stale, for weeks and months. If an admin does abuse the right to speedy, deleting stuff with a realistic educational value, then let other admins raise it in a community discussion. If their speedy deletions are out of line, the community can throw the book at them. --JN466 14:45, 7 December 2010 (UTC)[reply]
Whatever comes out of this poll, the next subject heading will probably be about speedy deletion of out of scope images. We'll need stronger facts to make any real headway on the issue in either direction. How many such penis pictures are there really? How much extra time does it actually take to close a normal deletion discussion rather than make a speedy delete? Aside from the great Jimbo purge, how often are such speedy deletions done improperly? Does the undeletion mechanism really work properly when it is needed? Wnt (talk) 15:05, 7 December 2010 (UTC)[reply]
As AFBorchert notes above, speedy deletions of illegal material or material falling under COM:PORN are already current practice. They will continue whether this is adopted or not. --JN466 02:14, 8 December 2010 (UTC)[reply]
correction to the above comment: it is the practice of some admins to do this; it is not a policy, or even a guideline, & has never been approved by a community vote. the fact that you feel "they will continue, whether this policy is adopted, or not" raises the question of whether such users should continue as admins, especially if this policy fails to achieve consensus (with that item intact). if the admins abusing speedy for this purpose would instead spend their time closing the backlogged, old deletion debates, the problem would resolve itself. & frankly, i do not find it acceptable that participating in deletion debates should be moved from full community access to an admin prerogative, which is what will happen for sexual content if that part of this policy is approved. Lx 121 (talk) 03:00, 8 December 2010 (UTC)[reply]
  •  Neutral This project is not censored into being a children's encyclopedia. We give information that needs to be given, even if it's not safe for work. At the same time, starting a controversy with parents and religious groups is not a fight an encyclopedia that wants to be a respectable, objective source of information should want to get into. I think, at least, that pages with sexual pictures should have some kind of warning that the content on that page is not safe for work. J390 (talk) 19:54, 6 December 2010 (UTC)[reply]
     Comment We tried to ensure that through the section about categories. Cases where people see the file in other ways, e.g. because it is used in an article (that should be handled by the Wikipedias) or through a search, cannot be handled by this policy: we would need technical infrastructure that is not in place. There is also the very problematic question of what media are "safe for work" (or even worse: "safe for children") in different cultures. How would the tags be maintained? --LPfi (talk) 09:41, 7 December 2010 (UTC)[reply]
    I feel like "safe for work" is a misnomer. More like "safe for work if your boss is an idiot". No reasonable manager should punish an employee because a Wikipedia page turned out to contain something unexpectedly explicit. Of course, the fact that the employee is browsing Wikipedia at all is another story, and an explicit image might draw unwanted attention - but the same is even more true of, say, a page about a union. If we really want Wikipedia to be safe for work, maybe we should implement a Rogue-style "supervisor key" to flash the window back to some work-relevant page the reader has preloaded. Wnt (talk) 15:14, 7 December 2010 (UTC)[reply]
  •  Comment i find it interesting how most of the "support" votes are one-line comments, usually just saying that they agree, without adding much to the debate, while the "oppose" section is full of discussion about the issues at stake. Lx 121 (talk) 00:30, 7 December 2010 (UTC)[reply]
     Comment It's easy for (US) Americans to agree because they're indoctrinated with the (US) American POV. We likely have a large group of them, or similarly minded persons, here, who are advocating "their" customary realms of censorship. --Purodha Blissenbach (talk) 01:47, 7 December 2010 (UTC)[reply]
    • We've sat here for months working on this; what do you want the supporters to do, recapitulate months of discussion and compromise?--Prosfilaes (talk) 01:49, 7 December 2010 (UTC)[reply]
      • um, yes actually; i was following the developments here intermittently, until it seemed like the whole thing was getting nowhere. i'd still like to know how "out-of-scope" ended up being included as a criterion for speedy deletion? Lx 121 (talk) 14:00, 7 December 2010 (UTC)[reply]
        • It was perceived that current practice is to have obviously out-of-scope media deleted ("blurry penises"). We did not want to forbid something that is actually going on without anybody protesting. It is better to have explicit statements in the policy than to have everybody turning their back when some aspect of policy is violated. And at least I felt that the "obvious" point could be handled by the administrators, with the most serious material explicitly exempted from such deletions. If not, no policy or lack of policy will do. --LPfi (talk) 11:23, 8 December 2010 (UTC)[reply]
      • Prosfilaes nails it: I've written paragraphs and it makes little sense to just repeat them here. If anyone should be accused of a one-liner mentality, it is the people who say "NOT CENSORED" and imply that no further inquiry or nuance is needed.--Brian Dell (talk) 22:19, 7 December 2010 (UTC)[reply]
        • i'm sorry, but when you call a poll to decide this issue, you should expect to reiterate your reasoning & have it challenged. "NOT CENSORED" is a pretty clear policy, & a widely supported one; this isn't. as for "inquiry or nuance", it's up to the nominator & supporting voters to make their case for deletion; if "NOT CENSORED" is relevant to the item under consideration, then that is part of it. the default option is "keep", not "remove". Lx 121 (talk) 00:04, 8 December 2010 (UTC)[reply]
  •  Comment I'm not suggesting that WMF leave the US of A just in order to get a perhaps better legal environment for hosting some specific content. Rather, other organizations, or individuals, in other states could do that. There are several states in the world where there are practically no restrictions at all.
    Judging content by its "educational value" is utterly silly. Take some judge's judgements, whatever they may be, and make an educational course explaining what made these judgements possible, how they are supported by both the content and the judge's educational background and expectations. Whatever the judgements, the specific content is now of some educational value. Since this is always possible, the conclusion has to be that there is no content having no educational value. Even before you use it so, you cannot dismiss content, because then you prohibit the education that would use it. --Purodha Blissenbach (talk) 01:37, 7 December 2010 (UTC)[reply]
I'd like to think this is true, but could you name the country? I'm thinking that some countries that have a good reputation about sexual material have very problematic laws that would interfere with articles about racism and Holocaust revisionism. Wnt (talk) 02:40, 7 December 2010 (UTC)[reply]
  •  Comment No opinion on this policy, but I do hope those closing this poll will properly discount all !votes that are just ranting about the US. Not liking US morals is not relevant to policy. Heimstern (talk) 11:42, 7 December 2010 (UTC)[reply]
a vote is still a vote Lx 121 (talk) 14:00, 7 December 2010 (UTC)[reply]
Heimstern: criticizing the proposal for having an American bias is a valid argument. We are a global project, not an American one. Ankara (talk) 14:13, 7 December 2010 (UTC)[reply]
Indeed! Being US-centric is a very real and valid concern, even among US editors. Wikipedia is full of copies of notable, truly offensive hate speech. Our Muslim readers, for example, are very, very troubled that we include insulting depictions of Muhammad. How can I explain to them that it's okay to have a policy for one taboo, porn, while we ignore all their cultural taboos?
This is a perfect case for culture-neutral, user-controlled, opt-in content filtering-- like the kind recommended by the study. It would solve all our problems without ever jeopardizing our ability to host truly notable controversial material. --Alecmconroy (talk) 16:38, 7 December 2010 (UTC)[reply]
re a claim further up, votes are not just votes on Wikipedia (or Commons to my knowledge). Assessing consensus is not narrowly defined in a way that can be gamed. I note that a person voting Support says his vote is as an editor not as a very heavily involved official, and I find that more persuasive than some "drive-by" vote. re the USA, the projects have to be situated somewhere and that means the legal norms of the hosting jurisdiction are relevant. Argue that the servers be moved, fine, but complaining about the social or cultural norms of the hosting jurisdiction is off-topic. Re moving the servers, note that the United States government has been a huge enabler of the Commons by legally designating so much of its work public domain. I do not see it as an enemy of the idea of a free encyclopedia.--Brian Dell (talk) 22:35, 7 December 2010 (UTC)[reply]
Criticizing the proposal as US-centric is fine, sure, but saying "US morals suck", which is what some of the oppose !votes say, is not valid. As for the original reply to my comment, consensus is not a vote. Also agree completely with Brian Dell. Heimstern (talk) 23:32, 7 December 2010 (UTC)[reply]
"consensus is not a vote", but you still want to make sure that the votes you do not like are removed? seems to be a bit of a contradiction there... XD Lx 121 (talk) 00:04, 8 December 2010 (UTC)[reply]
No, I said I wanted !votes discounted, not votes. And not the ones "I don't like"; the ones that make no argument save "US morals suck". Or for that matter, "US law is irrelevant", since that is patently false for as long as the WMF servers remain in Florida. Consensus means opinions (not votes) are weighed in accordance with the strength of the argument, and that's all I'm asking for here. Probably some on both sides need discounting. Heimstern (talk) 04:10, 8 December 2010 (UTC)[reply]
in that case, all the "support" votes that make no valid argument & simply say "i agree with this" should be removed also. if we are weighing this (& who is doing the weighing?) on the strength of the arguments, then this proposal is already dead. Lx 121 (talk) 18:41, 8 December 2010 (UTC)[reply]
For my part, I don't believe we should be hosting any material that is illegal under US law-- so when I talk about being too US-centric, I mean US culture, not US law. (Servers are in US, so US law applies). What I want to avoid is a case where some readers have 'more rights' than others-- if we are going to let US users hide images that offend them, we should let readers all around the world have the same right to decide for themselves what content is offensive. --Alecmconroy (talk) 23:37, 7 December 2010 (UTC)[reply]
There is a difference between valid content removal and censorship; what you are proposing is definitely at the 'censorship' end of the scale. Barts1a (talk) 23:42, 7 December 2010 (UTC)[reply]
  •  Comment I'd like to point out that we would like Wikipedia/Wikimedia to be accessible to as large an audience as possible, including children and countries other than the U.S.A., and as such we want to limit the amount of censorship imposed on that access. Categorizing images likely to be deemed "offensive" or "pornographic" helps our goal of widespread access by limiting content likely to be deemed "objectionable" to well-labeled areas (which could be censored by governments and child-protection software), freeing up all of the other image areas for free and uncensored access. The alternative is to risk the entire site being censored instead of specific sections of it. — Loadmaster (talk) 17:43, 7 December 2010 (UTC)[reply]
To me, we absolutely don't want [unsupervised] children here! The world is full of information so disturbing, it makes pornography look like a lullaby. The world has murders and genocides and diseases and abuse, and adults need to document them all so that we can learn from those horrors. But stressing over the porn in the hopes of making wikipedia child-safe is like trying to babyproof a construction site-- Wikipedia is inherently not child-safe, we can't change that fact, and we shouldn't kid parents about that reality. --Alecmconroy (talk) 18:44, 7 December 2010 (UTC)[reply]
I don't see Loadmaster as insisting that Wiki projects adhere to a prescribed notion of child safe. I rather see the entirely reasonable suggestion that we decline to play activist, something entirely in keeping with the cornerstone principle of neutrality, which means selecting an appropriate hill to die on when it comes to picking fights with external norms. If Wiki is going to stubbornly take a stance that leads to it being marginalized and ostracized even in "liberal" cultures, the cause should not be trivial.--Brian Dell (talk) 22:12, 7 December 2010 (UTC)[reply]

 Neutral - I think there are legitimate concerns being addressed by this policy, and I'm glad to see it moving ahead. If it overlaps with other policy, it should link/refer to those policies, but not restate them which might lead to confusion. I think there are other legitimate concerns about the wording, including the speedy deletions of possibly "out-of-scope" material. Keep working on it, and you'll get it right. - Themightyquill (talk) 16:49, 8 December 2010 (UTC)[reply]

  •  Comment I read the suggestion that we should label content as "safe for children". Why on earth should we? It is safe for children. Or have you ever seen a computer image stabbing a child? A defective bike may be unsafe for children to ride, but electronic images?
    Now what you are really talking about is that there are pictures which are subjectively deemed unsafe for adults, according to their respective ideologies, such as the Dalai Lama in China, images of the Prophet Muhammad in some Arab countries, or what they call obscene in Florida. While I am far from advocating that unsuspecting users should have to run the risk of being involuntarily confronted with, e.g., a funny sketch of a holy figure, I think it's not our duty to judge in someone else's name. So, if the State of the Holy See wants specific content labelled as heresy, to protect Catholics against viewing it accidentally, they should supply the labour and cost of doing so. If they want us to host or serve their judgements, they should of course pay for our additional costs. --Purodha Blissenbach (talk) 20:51, 8 December 2010 (UTC)[reply]

Alerting other projects

Since other projects are dependent on commons for image hosting, they will want to be alerted to any potential changes. Is there a way we can get neutral watchlist notices up on places like EnWiki and DeWiki, like the sitenotice currently running here on commons? --Alecmconroy (talk) 18:24, 7 December 2010 (UTC)[reply]

EnWiki and DeWiki? Why on earth? Either on all projects, or by individuals alerting projects of their own choice. --LPfi (talk) 11:32, 8 December 2010 (UTC)[reply]
Agreed. All projects would be even better. I just mentioned EN and DE since they're the two biggest, but any project that is reliant upon commons should be alerted. --Alecmconroy (talk)

Out-of-scope decisions should not be speedy deletion criteria for files

COM:CSD doesn't indicate that scope concerns are a legitimate speedy deletion criterion for files; only for pages. I think this is a serious flaw in the current proposal and would like to see it changed. Asking individuals, even admins, to make subjective scope decisions seems like a recipe for conflict. Of all the issues raised in the objections, this one seems the most substantial to me.

I'm asking here if there are any objections to removing "Material obviously outside of scope" from the proposed speedy deletion criteria additions so that the people who have already indicated their support for the proposal in general can indicate whether they have any objections. 71.198.176.22 02:12, 8 December 2010 (UTC)[reply]

Of course I object to changing something so significant in the middle of a vote! I'm afraid you'll have to vote accordingly and then wait until after the poll to re-discuss this. 99of9 (talk) 10:12, 8 December 2010 (UTC)[reply]

I'm not proposing that it be changed before the vote is closed, but I am proposing that the vote be closed in the next few days and that discussion move on to the objections raised by more than one person. We can begin such discussion at once. I'd like to start by asking people to propose alternatives to this edit, which the editor indicated as potentially problematic in the summary and which severely changed the character of the "out of scope" section. My opinion remains that we should delete the remaining, small "Material obviously out of scope" bullet point, which was made out of the earlier, much larger section that was moved into the next section. 71.198.176.22 11:01, 8 December 2010 (UTC)[reply]
As it says in the intro to the poll, it will be closed on the 15th of December. --99of9 (talk) 12:03, 8 December 2010 (UTC)[reply]
  •  Support unless the admins doing the speedy deletions can provide specific facts to make a compelling argument. It's clear from the poll comments that this part of the policy is objected to by many people, and favored by few. However, as I commented above, I would like to know how many speedy deletions really happen and how much time and effort it actually saves to do things that way rather than by normal AfDs, among other things. But I think at this point the burden of proof is on the admins; otherwise we can change the policy to rule out all speedy deletions for scope, and see what experimental data that provides. Wnt (talk) 12:46, 8 December 2010 (UTC)[reply]

If this is the loophole which allows deletion of blurry cell-phone camera snapshots of penises and similar (see File:What Commons Does Not Need.jpg), then removing it would disalign policy from practice (as explained above) and set the stage for future bickering... AnonMoos (talk) 03:00, 9 December 2010 (UTC)[reply]

this "practice" is undertaken by only a handful of admins who seem to have a particular interest in the subject. i have pointed out in the past that it is a violation of policy. with respect, it would be more accurate to say that the inclusion of this provision is an attempt to "align" policy with the preferred practices of those admins who have not been following established policy in this regard. it is also pretty clear from the responses here & in the "oppose" section, that i am not alone in feeling that all admins should be required to follow policy on deletions, & it is not "ok" for some admins to simply ignore written policy, & make up their own rules, when it suits them Lx 121 (talk) 08:44, 9 December 2010 (UTC)[reply]
Those can be speedily deleted if they don't have an explicit declaration of consent with the reason "not permitted to host." I would like to try that for a month or twelve to see how it goes before rejecting the potential solution. Ginger Conspiracy (talk) 03:33, 9 December 2010 (UTC)[reply]

Potential "Miller Test" Material-- speedy delete or not?

The current version gives conflicting instructions. First, we're told to speedy delete any material that is illegal for Wikimedia to host. Later, we're told that "Because of the complex, subjective nature of obscenity law, work should not be speedy deleted on the basis of perceived obscenity".

It's confusing to tell people to speedy delete anything illegal, then teach them all about obscenity law, only to finally inform them that obscenity is just too complex for a lay audience to decide.

Perhaps the obscenity law discussion belongs in some essay, guide, or instructional page--- something that doesn't carry the force of policy. Right now, we try to educate an international population about some US laws that hinge upon US interpretations of 'community standards' in unknown ways, thus making it utterly impossible for a non-lawyer to accurately gauge an image's potential legality.

--Alecmconroy (talk) 03:40, 8 December 2010 (UTC)[reply]

I too have concerns about the "Miller test" provision. The courts have generally been very lenient with the Miller test, rendering it nearly moot. However, the plain language of the Miller test, if applied in a reasonable way, would exclude large swaths of material. I can't really vote on this proposal as long as it states that material must comply with the language of the Miller test, considering the potential for abuse there by editors who take the Miller test at face value instead of applying it as the courts have. Gigs (talk) 04:08, 8 December 2010 (UTC)[reply]
If you can cite material explaining this greater leniency, it would make a very welcome footnote to this policy. My untutored impression is that w:Mike Diana lived in Florida. Wnt (talk) 12:54, 8 December 2010 (UTC)[reply]
Regarding the impact of US laws, this is a consequence of the fundamental NOTCENSORED pillar. I've tried to prevent the policy from excluding any sort of material except for some actual reason beyond our control. We aren't picking out stuff and saying it's off limits simply because we don't like it. As a result, the face of Commons is inevitably going to reflect the tread of the boot that stepped on it. We're not going to say that we walked into a doorknob or fell down the steps. Wnt (talk) 13:01, 8 December 2010 (UTC)[reply]
Wnt -- What "Wikipedia is not censored" means is that an image is not automatically deleted just because someone considers it offensive or unsuitable for viewing by children. However, "Wikipedia is not censored" does NOT mean that we can't make a considered deliberate decision not to host certain types of material which we have concluded does not further our goals. AnonMoos (talk) 03:07, 9 December 2010 (UTC)[reply]
i'm sorry, but your argument is sophistry; when you "make a considered deliberate decision not to host certain types of material", that is censorship. once we define the project's goals (which, for wmc, would be: media repository, educational material, free and/or open-source only), then the basic considerations for inclusion/exclusion of materials are: 1. relevance (i.e. scope); 2. legality (copyright, other legal restrictions on permitted content, etc.). if you want to add a 3rd parameter it should probably be: 3. technical limitations (storage, bandwidth, etc.). on that basis, you are welcome to argue why you feel that sexual content "does not further our goals", but please do not pretend that it's not censorship! Lx 121 (talk) 21:36, 9 December 2010 (UTC)[reply]

I think it's fair to say that administrators can make satisfactory judgements about anything outside of a wide gray area, and the information in the proposal helps them understand where and how wide the gray area is. We already have instructions at COM:CSD to delete some kinds of illegal material (copyright violations), and technically it is impossible to license a lot of the material forbidden by anti-pornography statutes, so the CSD for non-free content applies too. Therefore, the proposal represents an incremental improvement in any case. 71.198.176.22 10:47, 8 December 2010 (UTC)[reply]

i have no problem with removing material that it would be illegal for wmc to host, but which statutes are you referring to? o__0 Lx 121 (talk) 21:36, 9 December 2010 (UTC)[reply]
I think everyone agrees that obviously illegal images should be immediately deleted. Many such kinds of illegality can be judged by our editors-- copyright or consent, for example. But US obscenity case law is never obvious to untrained eyes. Non-specialists cannot correctly apply the Miller test.
If you let people start using their own personal standards to apply Miller to our content, how do you resolve disputes? All editors can do is argue back and forth over whose standards best represent 'the Florida, US community standards'.
Legal questions are for office. Don't have our admins singlehandedly decide such important and complex legal questions. Don't make our editors debate over legal obscenity, each arguing based upon their own impressions of what "FL, US community standards" look like-- that will just spark divisive flamewars over religious, cultural, and philosophical differences.
If someone thinks an image is obscene, tell OTRS and let the lawyers take a peek. If it's truly obscene, you can be sure they'll be the ones best suited to actually do the deletion and take whatever other legal steps are necessary. --Alecmconroy (talk) 11:55, 8 December 2010 (UTC)[reply]
I agree; again, I think the proposal represents an improvement over the existing COM:CSD in the direction you're supporting. 71.198.176.22 12:02, 8 December 2010 (UTC)[reply]
I would say that legal questions are for lawyers, but not necessarily that legal questions are for office. If it's something where great haste is needed to avoid discrediting Wikimedia, that's one thing, but if it's a debatable case, we might as well make the decision in public. There's no reason why Wikimedia's legal counsel can't give input directly to the AfD process. Mostly, however, my hope is that the act of discussing a borderline case in AfD will goad those involved into presenting good arguments that the content actually has educational, scientific, artistic, or political importance, thereby hardening Commons against any potential prosecutions. Wnt (talk) 12:51, 8 December 2010 (UTC)[reply]
And then the closer judges whether their arguments met the Miller Test or not? No thanks. Gigs (talk) 17:33, 8 December 2010 (UTC)[reply]
I too raised Dost and Miller as a reason for opposing. It is ironic, as I do spend a lot of time policing images out of en:wp article pages that fall within the spirit of Dost and Miller. But that is en:wp, and I can envisage occasions where even those images may have educational content. Before I make a decision on a policy, however, I need to understand the issues completely - and from the Miller test and Dost test articles I don't get that, just warning bells. They are only Start-class articles; if they were GAs then I might have enough information to take the risk and form a policy. At the moment I sense a Weapons of mass destruction fiasco is being engineered. So even before we take into account whether we wish to guide a global project by the perceived views of a small community somewhere on the 26th parallel, could someone work those two articles up to GA status?--ClemRutter (talk) 20:15, 8 December 2010 (UTC)[reply]
Illegal material will be deleted by administrators, with or without this policy. The Miller test is used to determine illegality, so it is relevant whether or not we mention it. But if a work, taken as a whole, has serious literary, artistic, political or scientific value, then it will not be regarded as obscene. That means, more or less, that no work that is in project scope is obscene by this test. Educational value is not explicitly mentioned, but I understood that would not be a problem.
So mentioning the Miller test is not about allowing censoring, but about ruling out some arguments about illegality in the deletion debates.
--LPfi (talk) 20:58, 8 December 2010 (UTC)[reply]
If applied properly, the Miller test won't be a problem for any of our content. If applied improperly, taking the test at face value, it could lead to lots of problems. Let's cut the current Miller test text-- nobody knows what that test even means within the context of a global project like ours. --Alecmconroy (talk) 06:52, 9 December 2010 (UTC)[reply]
one obvious point here: we already have a policy for deleting illegal materials, which makes that aspect of "commons:sexual content" irrelevant/redundant. as re: "the miller test", i don't see any way that we can realistically train all our admins here to administer it to a fair, even, & credible legal standard. failing that, it becomes a hopelessly subjective "bogeyman" standard, applied differently by every admin, & the source of yet more internal strife @ commons. it's a nice idea in principle, but if even the courts can't make it work as a practical, implementable, objective standard, how are we supposed to? for materials that aren't clearly illegal for us to host, the place for discussion & consideration (especially of issues re: scope!) is in open community debate, NOT with admins acting in secret, with no true open review of their actions possible. Lx 121 (talk) 21:46, 9 December 2010 (UTC)[reply]

Other languages

Well, I know that it is a lot of work, but if I see this correctly, people who do not have sufficient proficiency in English will have to deal with the outcome of this vote by people who do. Hm, I thought this is a multilingual project, where even non-English speakers are welcome? --Catfisheye (talk) 23:21, 8 December 2010 (UTC)[reply]

If you want to translate it, go ahead. But translations are hard, we don't have anyone volunteering to translate this document, and it would take several translations to even start to cover our non-English speaking users. Not to mention that disagreements are turning on delicate turns of phrase that won't translate exactly, and hence those voting based on versions other than the English one won't get an entirely accurate view of the proposal. So translation is not our first priority.--Prosfilaes (talk) 00:51, 9 December 2010 (UTC)[reply]
I considered the problems you mentioned, but knowing that, how can the result of this poll be regarded as representative, i.e. obligatory for all, if a non-negligible number of users is excluded from the beginning? That of course does not apply only to this poll, but is a general problem. Maybe a system can be adopted where translations do exist, with a caveat about the limits of what a translation can do, but voters have to vote on the English version. But how do we ensure that translations are made? Do you know which chapter of the foundation is responsible for Commons? --Catfisheye (talk) 01:47, 9 December 2010 (UTC)[reply]
This is a fair argument, except... isn't every Commons process done in English? Deletion discussions, admin noticeboards, village pump, etc. Hopefully the policy won't infringe very much on other projects --- with the notable exception of explicit consent. Handling the language issues regarding that may not be easy. Wnt (talk) 02:17, 9 December 2010 (UTC)[reply]
It's really critical that we involve non-English speakers in this decision-- they are going to be the ones most affected, since they won't even have the opportunity to come to Commons and argue for undeletion. I think I'd feel very uncomfortable about that setup if the shoe were on the other foot and it was a German-language community deciding what can and cannot be allowed here.--Alecmconroy (talk) 06:32, 9 December 2010 (UTC)[reply]
I think the way to involve non-English speakers is to have those who understand English report about the proposal and its key points. In most communities there is at least a big minority that knows English. If a community finds the proposal problematic, we can hope enough people will come here from those communities as well. There are easily some hundred people who know English in every big-language Wikipedia, enough to overturn any result here. Of course the USA and Western Europe are overrepresented, so we should listen carefully to those representing other cultures.
The same is true about being affected by policies. There is a village pump for all languages, at least in the Wikipedia project. Those not knowing English can ask for help there. I think translation of introductory texts is more important than translation of proposals: when new users feel at home in Commons, using their own language to upload and describe their files and discuss problems (at their village pump or user talk pages), they will know how to find somebody to help them with the English.
--LPfi (talk) 08:30, 9 December 2010 (UTC)[reply]
It's hard to argue that this isn't a concern, but I'm not volunteering to do the translation either. In the near term, if the policy passes, some latitude will be needed for people who don't speak English. If need be (and I don't think it really should) the other projects can also host their own images locally. Certainly on en. that has to be done quite often for copyright/fair use reasons. Wnt (talk) 16:26, 9 December 2010 (UTC)[reply]
The biggest problem of this poll is the language. Who is willing to participate in a poll that he can't understand, or at least not in detail? Commons is a project that provides images for all language versions. A poll that is held only in English is not representative at all. First the translations, then the poll. Otherwise it is like cheating on all the other projects. --Niabot (talk) 20:27, 9 December 2010 (UTC)[reply]
Whatever. I've noticed no impulse on anybody's part to start translating it, so all this amounts to is a stalling tactic, which can be used to block any new policy. Most current policies have a handful of translations; none of Commons:Blocking policy, Commons:Photographs of identifiable people, or Commons:Nudity have more than five translations, none of them into German, Russian or Arabic. There is no hope of getting this translated into every language of Wikimedia, so at the very least you're going to have to specify which languages you're agitating on behalf of.--Prosfilaes (talk) 21:57, 9 December 2010 (UTC)[reply]
Sorry, but that's not cool. Every contributor--regardless of native tongue--should be a part of all major Wikimedia-wide policy decisions. The fact that this isn't already mandated absolutely amazes me. And if finding volunteers to translate proposals is such a barrier, then we need to have the oversight to prepare a reliable plan in advance.   — C M B J   22:49, 9 December 2010 (UTC)[reply]
I agree with OP Catfisheye and others: It really is a problem in our multilingual project: confusio linguarum. But it is a problem for all polls here. It is not only relevant for this one.
The European Union has a big, big, very costly institution (en:Translation Centre for the Bodies of the European Union) to provide translations into all the languages of the EU. But Commons has more languages than the EU and cannot pay certified, professional translators to translate 100% correctly.
I do not know the solution to this problem right now - but it surely is a problem for Commons. It is especially a problem when it comes to polls where the language restriction creates a bias towards a specific result - as is surely the case with this topic. This poll cannot be representative of all members of the Commons community. Some parts of the community rely on our wonderful translated templates and manage to upload images and do the things they need to do. So this poll is not accessible to them.
This problem does not matter here as long as the poll is closed as "no consensus", but if it is not, we will have to ask whether it really is a consensus or whether it only looks like a consensus due to a strong bias in who is able to vote. Cheers --Saibo (Δ) 23:09, 9 December 2010 (UTC)[reply]

A few little queries

I've gone through some of this, copy-editing without intending to change the substantive meanings. Among little issues I've spotted (and may not understand) are these:

  • "A media file that is in use on one of the other projects of the Wikimedia Foundation is automatically considered to be useful for an educational purpose. Such a file is not liable to deletion simply because it may be of poor quality: if it is in use, that is enough. Exclusive use of a file in userspace does not constitute educational purpose." I'm surprised that no assessment would be made of whether the use of the file is educational. This is a requirement, for example, of some fair-use rules. Is material allowed on talk pages alone, and other WP space alone?
  • "Your communication will be confidential and will be viewed only by trusted volunteers who have identified themselves to the Wikimedia Foundation by name. For further information see Meta:OTRS." So WMF employees won't be able to access it? Tony1 (talk) 07:09, 9 December 2010 (UTC)[reply]
The main point is that the projects are allowed to define themselves which images are suitable and we on Commons should not delete images that are deemed useful (for an educational purpose) in the projects. We might think that an image is useless low quality porn, but if it is used on a Wikipedia article, then we should not think we know better than the editors of the Wikipedia. User page use is explicitly not considered educational use (but COM:PS has some comments about scope and user pages).
Use on talk pages is more tricky, but I think removing an image from a WP: or talk page could be very disruptive. If there is no problem with the image (other than being out of scope), I think leaving the image is certainly the right thing to do. If the number of such images is too high, then the problem is with the projects, not with Commons hosting them. Individual disturbing images might be protected this way, but there are probably many more disturbing clearly in-scope images. ("Fair-use rules" is probably referring to some individual Wikipedia? The problem is different as it can be handled locally.)
--LPfi (talk) 09:05, 9 December 2010 (UTC)[reply]
Good catch about the WMF employees - when we make a promise like that we should be able to adhere to it literally. Wnt (talk) 16:47, 9 December 2010 (UTC)[reply]

I've removed some of the mid-sentence bolding. (1) It looks messy. (2) It often excludes the critical word, which comes before the bolding. I do not believe it aids comprehension, unless perhaps bolding is used at the very start of each point, as an inline heading, as it were (such as in Speedy deletions). Tony1 (talk) 07:12, 9 December 2010 (UTC)[reply]

Images of deceased people

I don't know if this is the right place for this; if not, I'll delete it. I hate to re-open an old dispute. There need to be clearer image policies about situations where the assumed consent of a deceased person should be required.

When I joined Wikipedia completely new and inexperienced last February, I was right away immersed in a dispute regarding the nude image of a murdered child on Wikipedia. Her private area was barely visible, but it was there, and if that were someone I loved exposed that way, I would have been devastated. For the record, I hate censorship, but this was a case where the rights of a person come first.

When I tried to have the image removed, editors:

  1. Gave me the old not censored disclaimer
  2. Insulted my intelligence with pseudo logic and platitudes
  3. One editor flamed me with some serious personal attacks
  4. Tried to ignore me away under the pretext of Don't Feed the Trolls when I tried to make a peace gesture

I was so intimidated, but I didn't go away; I'm here. The image was finally deleted three months later on a copyright technicality, but the way that poor child was exposed for over a year is a stain on Wikipedia's reputation. Slightsmile (talk) 17:02, 9 December 2010 (UTC)[reply]

The goal of Wikicommons is to create a repository of free works which can then be used for educational purposes. If somebody's feelings get hurt in the process, that is bad of course, but if there is a policy, it should be to help those people deal with their feelings rather than doing them a disservice by denying their problem and removing the images. Beta M (talk) 17:28, 9 December 2010 (UTC)[reply]

while we're chatting.....

there's a deletion discussion quietly going nowhere featuring two lovely people having sex as Radiohead plays gently in the background. There's no indication that both parties agreed to the worldwide publication of their efforts, nor that the band members involved in the creation of the music approve either. In the charming wiki world we live in, one of those facts will at some point, maybe after a month or two, ensure that the file is removed - funny old game.... Privatemusings (talk) 22:52, 9 December 2010 (UTC)[reply]