Commons talk:Sexual content

# {{oppose}} Really, having a specific policy for sexual content is a cultural bias not appropriate for an encyclopedia, especially if assessments are to be conducted on idiotic and backward grounds like the [http://en.wikipedia.org/wiki/Miller_test Miller Test]. The scope and usefulness principles are enough, and I trust the admins to make sure the puritanical US laws do not threaten the hosting until Wikipedia can find a more open-minded hosting location. [[User:Leuenberg|Leuenberg]] ([[User talk:Leuenberg|<span class="signature-talk">talk</span>]]) 20:44, 12 December 2010 (UTC)
#:Real encyclopedias like Britannica don't have such images, so your rationale would back banning the sexual content completely. Thus, your oppose is really a support. [[User:Ottava Rima|Ottava Rima]] ([[User talk:Ottava Rima|<span class="signature-talk">talk</span>]]) 22:40, 12 December 2010 (UTC)
#::Britannica is a general encyclopedia; specialized encyclopedias about sexual topics (examples easily found on Amazon: [http://www.amazon.com/Encyclopedia-Unusual-Practices-Brenda-Love/dp/1569800111 Encyclopedia of Unusual Sex Practices], [http://www.amazon.com/Sexploration-Edgy-Encyclopedia-Everything-Sexual/dp/1569755051/ref=sr_1_6?ie=UTF8&qid=1292210044&sr=8-6 Sexploration: An Edgy Encyclopedia of Everything Sexual]) have illustrations and pictures about sex, and specialized medical encyclopedias also have pictures that we, per the precautionary principle (and I agree with this principle), don't have and will never have, with or without this policy (e.g. nude pictures of minors for articles about [[:en:Child development]]). With the current rules we already host only a subset of what can be found in specialized paper encyclopedias. --[[User:Yoggysot|Yoggysot]] ([[User talk:Yoggysot|<span class="signature-talk">talk</span>]]) 03:27, 13 December 2010 (UTC)
# {{oppose}} --[[User:Toter Alter Mann|Toter Alter Mann]] ([[User talk:Toter Alter Mann|<span class="signature-talk">talk</span>]]) 21:10, 12 December 2010 (UTC) <small>A reason would be nice... [[User:Barts1a|Barts1a]] ([[User talk:Barts1a|<span class="signature-talk">talk</span>]]) 22:25, 12 December 2010 (UTC)</small>
# {{oppose}} - Abusus non tollit usum. --[[User:Reddi|Reddi]] ([[User talk:Reddi|<span class="signature-talk">talk</span>]]) 23:29, 12 December 2010 (UTC) <small>A real reason would be nice... [[User:Barts1a|Barts1a]] ([[User talk:Barts1a|<span class="signature-talk">talk</span>]]) 00:46, 13 December 2010 (UTC)</small>

Archive 1 (Dec 2008-Apr 2009) | Archive 2 (April 2010) | Archive 3 (May 2010) | Archive 4 (June 2010) | Archive 5 (June - October 2010)

Recent Changes

I did a pretty major copy-edit to the whole policy. No content changes of any kind were intended. Here's a summary of what I did.

  • Fixed typos and obvious grammar errors
  • Shortened phrases for brevity
  • Rewrote some wikilinks for flow and simplicity
  • Cleaned up header and list formatting
  • Added H2 header to combine File Descriptions and Categorization under 'Working with sexual content'
  • Added H2 header to combine 2257 and obscenity law under 'Legal issues'
  • Removed duplicate policy links
  • Rearranged sections to put most important info first
  • Added italics to two key areas
  • Refactored legal discussion to separate Wikipedia procedure from historical background

If there are any questions, shoot. The diff won't be very helpful, since there was a lot of shuffling and it happened over 10 different edits, but there are no content changes. I'd just give it a top to bottom read to make sure that it didn't change anything substantive, and hopefully it's a cleaner, shorter version of an important policy. Ocaasi (talk) 05:20, 2 November 2010 (UTC)[reply]

Limits on what 2257 applies to

Should we add a note that {{2257}} should not be used indiscriminately and never on pictures of children? I removed 2257 from a lot of pictures of nude children; pictures of children that 2257 applied to would need deletion, not tagging. In the larger sense, I don't want to see this indiscriminately spammed across a bunch of files in Category:Nudity when it only applies to sexual activity.--Prosfilaes (talk) 07:29, 2 November 2010 (UTC)[reply]

That seems like an important point: record keeping is not for illegal images; deletion is for illegal images. I don't know when exactly the line is drawn, but if you have that sense, it sounds good. Ocaasi (talk) 09:36, 2 November 2010 (UTC)[reply]
That's not my point at all; my point is that record keeping is not for mere nudity.--Prosfilaes (talk) 11:07, 2 November 2010 (UTC)[reply]
Alright, misread then. I think they're both good points. I will add yours as a counterweight, explaining both ends of the 2257 spectrum. Check in and see if it works. Sorry I identified you as privatemusings in the edit comment; I had just written on that talk page. Ocaasi (talk) 12:45, 2 November 2010 (UTC)[reply]
I tried this: Content identified as a sexual depiction of a minor should not be tagged with a 2257 template or amended to include 2257 information; it should be deleted. Ocaasi (talk) 10:21, 2 November 2010 (UTC)[reply]
New Version: 2257 requirements apply specifically to sexual content involving subjects of legal age. If content is of child nudity that does not qualify as sexual, then 2257 notifications are not needed. On the other hand, if child nudity is identified as sexual, then it should not be tagged with a 2257 template or amended to include 2257 information; it should be deleted and reported to the WMF. Ocaasi (talk) 12:55, 2 November 2010 (UTC)[reply]
Yes - I can say unequivocally that 2257 should never be used on images of minors (unless the image happens to also include images of adults who fit it). Explicitly sexual images of minors should be deleted, not tagged - mere nudity is generally not a reason for 2257 tagging. I've added brief notes to the tag regarding this. Dcoetzee (talk) 21:44, 4 November 2010 (UTC)[reply]

How about artworks? They shouldn't be counted as sexual just because someone is naked (e.g. a god) unless they really contain sexual content. 221.126.247.208 10:08, 12 December 2010 (UTC)[reply]

What does this mean?

Can someone tell me what this sentence is trying to say:

By sporadic application of the First Amendment obscenity prosecution for adult sexual content or for sexual artwork, depicting apparent minors has become rare and unpredictable, but explicit sexual content depicting actual minors has not been protected by the courts and its prohibition is strictly enforced. Ocaasi (talk) 01:16, 4 November 2010 (UTC)[reply]
The Feds could sue a site hosting w:lolicon claiming that the material was legally obscene, but given that obscenity requires that it be without artistic, etc. value, it's never cut and dry, so they rarely do so. However, material showing actual children doesn't have such requirements, and will get slammed down quickly and hard. (In practice, I understand a lot of cases are stopped by the fact the FBI won't generally go after teenage CP unless they can get a confirmed ID and age of birth. But that's not really relevant for us.)--Prosfilaes (talk) 01:45, 4 November 2010 (UTC)[reply]

Check please:

Obscenity prosecutions of adult sexual content, sexual art, or caricatures depicting apparent minors have become rare and unpredictable; however, explicit sexual content depicting actual minors has received no protection and prohibitions against it are strictly enforced.

--Ocaasi (talk) 09:57, 4 November 2010 (UTC)[reply]

I read that, and it still doesn't make sense, because there is nothing qualifying 'protection' - the sentence above uses the term 'modification', and 'protection' does not obviously apply when what is actually being referred to is a loosening of restrictions on the depiction of such materials. How about Obscenity prosecutions of adult sexual content, sexual art, or caricatures depicting apparent minors have become rare and unpredictable; however, there has been no such modification where explicit sexual content depicting actual minors is concerned, and prohibitions against it are strictly enforced. --Elen of the Roads (talk) 22:50, 5 December 2010 (UTC)[reply]
Ah, I see what you mean. The perils of a la carte fixes. It will probably have to wait until after the poll, but that should be an uncontroversial improvement. I think we should start a list of to-fix-afterwards edits, since all of the new eyeballs will be wasted while the draft is !vote-locked. Maybe I'll do that. Ocaasi (talk) 18:14, 6 December 2010 (UTC)[reply]

Archive

Would someone mind archiving the old threads on this page? I'm a bit inexperienced in the archive business...Ocaasi (talk) 01:16, 4 November 2010 (UTC)[reply]

I've given it a go - although we may not even require 4 / 5 months of threads above? cheers, Privatemusings (talk) 02:19, 4 November 2010 (UTC)[reply]
Excellent...Ocaasi (talk) 07:37, 4 November 2010 (UTC)[reply]

Question about content without explicit consent

Edit went from:

This shall not in itself be a reason to delete them, but the community will discuss whether or not consent can be assumed in individual cases.

to

The community is free to discuss whether or not consent can be assumed in individual cases.

Do we want to remove the statement that lack of explicit consent is not sufficient for deletion? Is 'community is free to discuss' weaker than 'community will discuss'? Which is intended... Ocaasi (talk) 07:48, 4 November 2010 (UTC)[reply]

I think we need the statement, because it's been such a controversial question in this discussion and is likely to arise over and over. A small but persistent minority have wanted a strict requirement of explicit consent for everything. I think it's important that we make it clear that is not the policy we are proposing. - Jmabel ! talk 15:08, 4 November 2010 (UTC)[reply]
Agreed. There are many works for which explicit consent is unnecessary and unreasonable (e.g. notable artwork that happens to include a nude person). Dcoetzee (talk) 21:39, 4 November 2010 (UTC)[reply]
That sentence is not about "everything". There is already a list of cases where explicit consent is not necessary (including non-photographic artwork..., and old photographs). That sentence is only about old uploads. We compromised that there would be no mass deletion of old untagged uploads, but that individual discussions would proceed. So, how does the new wording not convey that? --99of9 (talk) 09:40, 5 November 2010 (UTC)[reply]
Logically, it shouldn't matter if we say explicit consent is necessary or not if we're leaving it to case by case discussion anyway. Even so, I agree that I'd like to keep the extra statement anyway, just to be clear about it. Wnt (talk) 09:37, 5 November 2010 (UTC)[reply]
I think there's a risk of being overly verbose, a bit confusing, and perhaps even mildly prejudicial, to be honest. The intent of my edit (as requested) was to keep it simple - a deletion discussion where an older file is deleted due to perceived consent issues is ok, as is a deletion discussion where an older image is kept despite consent issues being raised. If our intention here is not to mandate things either way, which I think it is, then I think it's prolly best the way it is :-) Privatemusings (talk) 23:25, 7 November 2010 (UTC)[reply]
I combined the bullet point with the other point about this to reduce confusion (I hope). Looking at the text and messing about with it this way and that, I didn't see any obvious way to make the text clearer without making it more confusing. ;) I think that as long as we have agreement here that we're not somehow intending this to be interpreted to allow deletion by default of old material (which I don't think is how it reads), I suppose we can spare readers of the policy the extra verbiage. Wnt (talk) 04:02, 8 November 2010 (UTC)[reply]
I kept the combination, but reworded it a bit, dropping the future tense and instead stating it like this:
Uploaders of self-produced sexual content are expected to make a positive assertion that the subjects of the photos consented to the photograph and to its public release. Provision of further evidence of consent is welcome (using the OTRS process) but is not normally required. Content which lacks assertions of consent, usually from files uploaded before implementation of this policy, should not be automatically deleted. The community should instead discuss whether or not consent can be assumed in each case.
I think that covers both issues ok, while keeping it as straightforward as possible. Ocaasi (talk) 17:42, 8 November 2010 (UTC)[reply]

< I tried again, because I felt that 'your' version kind of implied that future uploads of sexual content which are missing explicit assertions of consent might be ok - whereas I think we've agreed that they're necessary. Hopefully it's even clearer now :-) cheers, Privatemusings (talk) 23:45, 8 November 2010 (UTC)[reply]

I see how your version was an improvement in affirming the present requirements. I tightened the phrasing just a touch, no change to content (except 'discuss' to 'determine', if you want to be really technical). On the other sentence, you went from the 'should not be automatically deleted' to 'free to discuss', which though correct is a somewhat weaker way of saying that prior content without consent 'shouldn't be deleted without discussion'. I think in cases where people get deletion-happy, 'free to discuss' doesn't set a requirement for them to make that effort. How about both changes? Ocaasi (talk) 05:55, 9 November 2010 (UTC)[reply]
I thought that version sounded a little too favorable toward random proposals for deletion of old content, while being too restrictive toward proposals to delete new content that seems very suspicious despite an assertion of consent. (A different issue, mentioned only briefly in the previous discussions, was that you might come up with some content that seems intrinsically suspicious despite an assertion - a flash snapshot of a startled couple in a hotel bed, etc. The assertion should generally be a good defense, but we weren't going to require editors to be completely blind, either.) I'm not sure my last edit is the best way either... Wnt (talk) 09:14, 9 November 2010 (UTC)[reply]

I've made a small edit, but I'm still concerned by this new wording. One thing that bugs me is that old random Flickr uploads will now be treated differently from new random Flickr uploads (without assertions). The old need reasonable suspicion to delete, the new ones should not be uploaded. --99of9 (talk) 09:55, 9 November 2010 (UTC)[reply]

In general I think we should have some hint of positive trustworthiness (e.g. trusted uploader) to keep the old stuff, rather than just no negatives. --99of9 (talk) 10:00, 9 November 2010 (UTC)[reply]
Looks ok to me. Maybe 'preferred' instead of 'encouraged' just as a word choice. Ocaasi (talk) 14:11, 9 November 2010 (UTC)[reply]
I thought the last draft (99of9) was OK also. I just added a footnote linking an article about the Tyler Clementi case as general background - but if someone doesn't like that I won't insist on keeping it. Wnt (talk) 03:55, 11 November 2010 (UTC)[reply]
That case is really significant (or it will be soon), but I have a few thoughts. I'd like to tighten up the summary a bit just for brevity. Also, do we need to consider victimization issues? This case is receiving ample attention right now, and the family hasn't been avoiding publicity, but maybe until this actually winds up in a court ruling we should just speak generally about it. I could go either way. Thoughts? Ocaasi (talk) 20:13, 11 November 2010 (UTC)[reply]
I've seen several news cycles about Clementi, and in the U.S. his case is being used as something of a rallying cry against bullying. I don't think it would be inappropriate to name him. The legal case against the people involved in the videotaping is likely to be a significant test of a 2003 "invasion of privacy" law — as I understand it, they were operating their own webcam in their own (shared) dorm room. It may be more important in social terms, however — before this case, I think that many people would think of the consent issues more in the context of actresses suing to try to keep sex tapes off the Internet, whereas now some are going to think of it as a life and death issue. I don't think that's really fair — it was prejudice and bullying, not video, that caused this tragedy — but I can't pretend it won't be true, and editors (many of whom may not be from the U.S. and never heard of the case) should have some warning.
As people here know by now, I believe in the "not censored" principle of Wikipedia, and even now I wouldn't have acquiesced to demands for an explicit assertion of consent for future content without a tangible reason. If scenes from the Clementi video had been posted to Commons without an explicit assertion of consent, policy wouldn't touch Clementi (we wouldn't know who he was), nor most likely the uploader (who would be breaking the law and existing policy with or without an explicit statement about it, and probably wouldn't need a notice to tell him this). But it might have protected people "distributing" the video, which I think might be interpreted to affect various types of innocent editors by various arguments which it is probably not productive to detail here. By having an explicit statement to fall back on, they would have a firm defense, whereas without it prosecutors might well end up Wikilawyering in the courtroom.
I am not at all happy to have yet another way for it to be illegal to copy content you find on the Internet!, but from what I've read (which questions whether the video was viewed by anyone) I don't know if the current prosecution could lead to constitutional review of that part of the law; if it did this is not the test case I'd want to see used to make the decision. Wnt (talk) 21:46, 11 November 2010 (UTC)[reply]

Commons Study on Controversial Content

I noticed there's no mention, not even a link to the recent study on controversial content. Is that intentional? Does this policy assume that those recommendations have no teeth or force until the foundation decides to do something about them? Does it make sense to link to them in a 'see also' section, perhaps? Ocaasi (talk) 10:01, 4 November 2010 (UTC)[reply]

See also seems a good idea but the study is just a study, not a policy mandate. I think we ought to adopt this proposal as policy on our own rather than by being forced to via a WMF mandate, though. ++Lar: t/c 11:44, 4 November 2010 (UTC)[reply]
Yes, it's always nicer to self-legislate, like when McDonald's added apple slices in lieu of FDA regulation. I'll try a see also link, which seems to be the right fit between informing and capitulating to the (thoughtful) meddling of the outsiders. Ocaasi (talk) 11:56, 4 November 2010 (UTC)[reply]
(I really don't get the apple-slices/McDonalds reference. Was this a US problem?)Archolman (talk) 00:43, 6 December 2010 (UTC)[reply]
(There were rumblings about Big Food becoming the next Big Tobacco, facing multi-billion dollar obesity lawsuits, congressional hearings, etc. with documents showing execs tried to market to kids and get them addicted to fatty, salty, hormone-ridden snacks. So the industry took a hint and voluntarily adopted milk, apples, more salads, and a few other things to health-wash their image. Which uncharitably suggests by analogy that that is what we're doing, but there's either a) no truth to that or b) nothing wrong with it) Ocaasi (talk) 18:20, 6 December 2010 (UTC)[reply]
The study makes few specific recommendations, and those it makes largely apply to other projects (image hiding on Wikipedia) or are not limited to sexual content as defined here (semi-nude women, which are not considered sexual content by our definition). In any case, I'm not very favorable about it. Wnt (talk) 09:31, 5 November 2010 (UTC)[reply]
If the recommendations are passed on by the foundation, we can then consider whether to expand our definition of sexual content. The good thing is that what we've developed here is a framework for how to deal with whatever is considered sexual. --99of9 (talk) 09:44, 5 November 2010 (UTC)[reply]
Yes - it is important though that we adopt this (in some form) as policy before the Foundation considers the study, or else I fear they might swoop in, say we're not taking clear action, and institute something more conservative. Dcoetzee (talk) 20:53, 5 November 2010 (UTC)[reply]
It's premature to speculate on how to deal with a hypothetical arbitrary policy, which might affect anything. But I think we've designed this policy to deal with a very specific class of content, based to a large degree on American legal distinctions. If the WMF imposed some policy about bare breasts I think we would be best off (at least) to try to keep it as something separate rather than trying to integrate it here. The whole point of an imposed policy is that it is imposed on you, so how can you edit it? So how and why would you try to unify it with any other? Wnt (talk) 03:47, 8 November 2010 (UTC)[reply]
Agree with that, except that when something is 'imposed' on you, it can still be in your interest to adapt it to your purposes, or at least to integrate it with them. But it's premature to bring it up while those issues are just floating through the proposals-and-recommendations cloud. Ocaasi (talk) 17:45, 8 November 2010 (UTC)[reply]

Do we need assertions of consent from unidentifiable people?

The way the last part of the "Prohibited content" section is currently written, we "must" have an assertion of consent even for close-up photos showing, e.g., a particular kind of STD rash. Firstly, is there any reason to require an assertion of consent when the subject of the photograph isn't identifiable? Furthermore, wouldn't requiring an assertion of consent in such cases hinder the ability to illustrate educational medical sexuality articles? 71.198.176.22 16:44, 16 November 2010 (UTC)[reply]

were this proposal adopted, yes - sexual content, as defined on the proposal page, would require an explicit assertion of consent. If you read the note related to the 'close up' type of image you use as an example, I think it's likely such an image would fall outside of the purview of this policy. Personally, I think an assertion of consent would be desirable anywhooo - don't you? cheers, Privatemusings (talk) 20:10, 16 November 2010 (UTC)[reply]
No, I don't think it would be desirable to have to get consent for close up, unidentifiable images because what is the point of such a hassle? I'm not sure what the note you linked to has to do with images of unidentifiable people, either. Is that the note you meant to link to? 71.198.176.22 04:26, 18 November 2010 (UTC)[reply]
I see there was a "grammar tweak" from "with sexual content all are subject" to "all sexual content is subject".[1] This is an example of the perils of copy editing. Additionally, there was some effort to define "identifiable" quite expansively before, including e.g. EXIF information, which has since been taken out; I think the feeling was that it was best to have one single definition of identifiability under the main policy about it.
However, on consideration, I think we should think through carefully just what we want or need to say about this. For example, Wikimedia might want to steer clear of unpublished photos from a secret locker room camera, no matter how carefully cropped and anonymous they may be. On the other hand, we would want to rule in non-identifiable photos from a retired surgeon who has taken a collection of photos for personal use and in medical publications, but never obtained consent to publish them here. My first thought on processing this is that we need to exclude even non-identifiable photos if the photography itself was an illegal violation of privacy, but that we don't need to exclude them otherwise. For example, in the locker room example, it is possible that the act of investigating the dissemination of the "anonymous" photos would end up revealing that they were of a particular person, and put Wikimedia in an entirely undesirable position. But before trying to implement such a zig-zag line, I should see what others here have to say. Wnt (talk) 13:27, 18 November 2010 (UTC)[reply]
I think if you re-read the copy-edit closely, it actually removed a redundancy and did not change the meaning at all. Generally, though I respect legal writing, policy, and constitutions, etc. there is something wrong if a single word or phrase can obscure the intent of the guidance. So we should make it more plain and robust if that is what happened.
I think your 'illegal itself' notion is a good distinction, but I don't know how it would be incorporated. It seems more than reasonable that content which appears to be illegally acquired should be excluded, even if it does not violate other provisions (identifiability, sexual content with children). But how would we know what 'looks illegal'? I think the only question is whether those situations should be a) speedy deleted, with the possibility of discussion afterwards; b) discussed per regular deletion, with the burden on showing it is legal; or c) discussed per regular deletion, with the burden on showing it is illegal.
My gut instinct is that where there is identifiability or sexual content with children, a) is the obvious choice; where the content is blatantly sexual or exposing body parts (or if there are subtle identifiers that could end up as clues), b) is the best option; and if there is just a generic body part but no children, sex act, or obvious identifiers, c) is okay. That's not very simple, but it's my best stab at it. Thoughts? Ocaasi 02:20, 19 November 2010 (UTC)
There's really no way to verify that consent was really obtained for photography under any scenario - even if we kept 2257 paperwork, which we shouldn't, it would be child's play to forge an electronic copy of a driver's license and a signature for which we don't have a valid exemplar. Here we're just trying to figure out a required standard, without much thought for serious enforcement beyond the assertion of consent. However, I think it's possible to be pretty sure in some cases that no consent was obtained, allowing AfDs to be raised. Even so, I don't think that speedy deletion will usually be appropriate for suspected lack of consent, simply because it is going to be a complex conclusion to draw from media that don't explicitly say that they are violating policy. I think that the current guidance that speedy deletion is only for uncontroversial cases is sufficient. As for burden of proof, I don't think that any AfD on Commons truly has a "reasonable doubt" standard; it's more a "preponderance of the evidence", which puts b and c as pretty much the same option. Disturbing the current language on the point could be trouble, since we've had enough trouble just agreeing that there should be a reason to suspect lack of consent... Wnt (talk) 12:00, 19 November 2010 (UTC)[reply]
I've added yet another bullet point to the section, trying to deal with this issue out in the open; it should make any other rewording unnecessary.[2] Provided I didn't screw it up... we're diving into some rather fine detail nowadays. Wnt (talk) 12:25, 19 November 2010 (UTC)[reply]
No no no. Why are we suddenly reopening something we've hammered out so hard already? I thought we had agreed to stick fairly closely to copyedits until we had a clean stable page to vote on. This "close crop" stuff allows someone to put their ex-girlfriend all over commons as long as he cuts her up small enough. Consent is important. I will revert, please obtain wide consensus before making such significant changes since we've already found a compromise point. --99of9 (talk) 13:27, 19 November 2010 (UTC)[reply]
What would be the point of a vindictive ex putting unidentifiable images up? Anything intended to embarrass is necessarily going to be identifiable. I don't think this is a legitimate point of compromise since we already have dozens of close-cropped images of genitalia and breasts, several of which are currently being used to illustrate wikipedias. The proposal as worded would require the deletion of those images unless the uploaders, many of whom are long gone, come up with assertions of consent. I don't believe anyone actually wants that. Do they?
I think Wnt's bullet point was a reasonable solution, so I'm replacing it. 71.198.176.22 03:37, 20 November 2010 (UTC)[reply]
Hmmm, I don't want to break down a fragile consensus, but at the same time I'm responding to a November 4 "grammar tweak" as explained above. I don't really want to go back to that rather peculiar phrasing either, especially since we apparently had more difference of opinion about what we were reading there than we imagined. I'd hoped that my wording might be acceptable, but is there a way we can fix this? I should emphasize that if "identifiable" is interpreted narrowly enough, 99of9 does have a point about the cut-up photo sequences. I originally considered some sentences that tried to define "identifiable" strictly, e.g. specifying that a photo can be identifiable from context, clothing, tattoos, etc. That should include cut-up photo sequences where one end has the head and one end has some other part. The problem is, I see this really as a question for the photographs of identifiable people policy - we shouldn't have two different definitions of "identifiable" knocking around or chaos will ensue. Wnt (talk) 05:20, 20 November 2010 (UTC)[reply]
Wnt, I don't understand how you could have otherwise interpreted that sentence before the grammar tweak. Can you please spell out what you thought it meant? The diff you provided seems like a straightforward improvement on the English, and I can't for the life of me see any change of meaning, even implied meaning... --99of9 (talk) 12:09, 20 November 2010 (UTC)[reply]
The pre-Nov. 4 version read "All media is subject to Commons:Photographs of identifiable people.... In addition with sexual content, all [media subject to COM:IDENT] are subject to the following requirements:" It seems to me that the following points were meant to modify the normal requirements of the COM:IDENT policy. Since the consent issue has previously been part of COM:IDENT, this is what was to be expected. Wnt (talk) 13:51, 20 November 2010 (UTC)[reply]
IP address, your statement "The proposal as worded would require deletion of ..." appears completely false to me. We have a dotpoint that explicitly addresses files uploaded prior to activation of this policy. Since that was the justification of your revert, I trust that you will restore to correct your error. Further, my point above was that "I think ... is a reasonable solution" is no longer a good enough reason to make functional changes to the text. We have a carefully agreed compromise, and one or two people is no longer sufficient consensus to make that kind of change. We are supposed to be copyediting. (PS, please consider logging in and backing your opinions with the reputation of your username.)--99of9 (talk) 12:09, 20 November 2010 (UTC)[reply]
I should agree with 99of9 that this part of the policy isn't about the lack of explicit assertions of consent, which would affect the old files, but only about actual lack of consent, which is a much narrower matter. What I'm interested in preserving here is the use of photos taken with consent but not with explicit consent for use here, primarily medical and surgical photos.
To give an example of the sort of things I'm thinking about, consider an NIH-funded paper about a surgery which is published in a private journal, but which by law is required to be under an open license. The paper may contain a low resolution black and white photo of the procedure, which is published in the journal, but include "supplemental data" at the journal's, author's, or department's web address with a wider array of high-resolution color photos and video. Such photos can be used on Commons as far as copyright is concerned, but may not be regarded by some as having been published (especially since it is not that uncommon to update supplemental data after the paper is out...). The patient would have consented to a medical publication and other uses by the author, but probably not by us.
I think there is a way to write this to clearly rule out the cut-up ex scenario, and clearly rule in the medical images scenario. Wnt (talk) 23:18, 20 November 2010 (UTC)[reply]

I hadn't realized that the text currently includes "Explicit assertions of consent are encouraged but not required for sexual content uploaded prior to implementation of this policy" which does indeed address my concerns about earlier images above. However, I really do think requiring an assertion of consent for unidentifiable images is much less desirable than allowing them. The chance that an uploader is going to try to embarrass someone with an unidentifiable image is tiny, and there's no way it would work if they tried, by definition. That means requiring consent for unidentifiable images is needless instruction creep. Also I hope Wnt can tweak his text to address the external publication issue he describes. I don't understand why we shouldn't still be trying to make improvements to the proposal; we improve policies even after they're adopted. 71.198.176.22 06:51, 21 November 2010 (UTC)[reply]

Want to bet? Send me a full frontal nude image of yourself, your full name, and the copyright holder's release, and I'm sure I can embarrass you on commons without breaking existing policy. --99of9 (talk) 08:42, 21 November 2010 (UTC)[reply]
If you cut an image in two, where one half is identifying and the other is embarrassing, then the existing policy would allow publishing the embarrassing part here. We do not want that, as the identifying part can be published later or elsewhere. That is why we want consent for most sexual content. Truly unidentifiable images and images from serious sources are no problem. --LPfi (talk) 11:48, 22 November 2010 (UTC)[reply]
As I said, 99of9 has a good point if we define identifiability too narrowly. But this scenario extends to many things beyond this policy. After all, a full frontal nude image of a typical person, especially if female, can be quite embarrassing even if the actual genitals are obscured, and it isn't within the scope of this policy at all. Nor do I think we should consider a separate COM:BREASTS policy to deal with that; after all, it might be embarrassing for a man to have his belly cropped out and used to illustrate the obesity article and so forth, if the picture still gets linked back to his name.
My feeling is that we can't force our policies on someone with truly malicious intent - he can always falsely assert consent for the whole image, even fake "evidence" such as it is, so why focus on a later malicious release of a second half of an image? But we might warn editors against doing this unintentionally. Based on this discussion my thought is to add some redundant warning about identifiability, though trying not to set up a separate definition. Wnt (talk) 15:41, 22 November 2010 (UTC)[reply]
I agree; it doesn't matter how close-cropped an image is if it has been, or is likely to be, associated with someone's name or identity. In that case an assertion of consent should be required for new uploads, and if the person associated with an image uploaded any time wants to withdraw consent or claims that there never was consent, we should respect that and delete the image. I am not sure those points are in the policy, and they should be. I'm going to try to put them in. 71.198.176.22 21:54, 22 November 2010 (UTC)[reply]
best way forward in my view, and the one currently written up, is for the sexual content policy on consent to apply to all sexual content - it's easier, clearer, and just plain better that way :-) Privatemusings (talk) 20:26, 22 November 2010 (UTC)[reply]
"All sexual content" includes both images and text, a substantial subset of which there is a wide consensus to allow instead of delete. Which version are you referring to as "the one currently written up?" 71.198.176.22 21:54, 22 November 2010 (UTC)[reply]
not by our definition :-) - take another look..... Privatemusings (talk) 22:36, 22 November 2010 (UTC)[reply]
I'm sure that whoever wrote "media depicting" didn't intend to include textual media or textual descriptions, but text is a kind of media, and according to two of the dictionaries I just checked, depictions include text as well as pictures. Do you think it would be a good idea to resolve such ambiguities before asking people to approve the draft? 71.198.176.22 23:48, 22 November 2010 (UTC)[reply]
  • "Media" here is clearly in the sense that Commons is WMF's media repository. We can't just say "images" because it also includes audio, video, etc. Still, in the context of Commons content, when we refer to "media" we mean the media files, distinct from descriptive text. I believe this is pretty consistent across policies & guidelines. - Jmabel ! talk 05:15, 23 November 2010 (UTC)[reply]
Perhaps I was going about this the wrong way. I'm trying a different edit here [3] explicitly stating that "previously published" includes certain websites. Some of my other concerns are actually not prohibited by the current language of the first section, which only says "public release" rather than release on Commons. I'm not going to work myself up just now over the right to take other people's snapshots for personal use then release anonymized cropped bits on Wikipedia for fun; no doubt there is some case in which this is terribly important but I'm not thinking of it today. An aberration in this edit which might raise some hackles is that I refer to a Wikipedia policy in a link - unfortunately, I didn't see a Commons policy to cite about reliable sources! Wnt (talk) 19:57, 23 November 2010 (UTC)[reply]

Really not ready for prime time

I think we've got a reasonable consensus on the meaning of the policy we will present, but it's really not yet all that well written. If I have time (and focus), I may take a shot at this myself, but I'd welcome another good writer trying in my stead.

For the most part, I would hope any rewrite would be shorter, not longer, but there is one thing I think is missing: an explanation of why sexual content raises certain issues that are not found, or are found only in a lesser degree, in other content. E.g.

  1. Sexual depictions of minors can trigger criminal legal concerns in many countries.
  2. Sexual depictions of identifiable living or recently deceased people have enormous potential to prove embarrassing to those people or their associates, and therefore it is more important than usual to be confident that the subject gave any necessary consent for a photo to be distributed.
  3. To a greater degree than in other subject-matter areas, hosting numerous sexual images with little or no educational value is likely to be detrimental to Commons' reputation. Few people will consider it a serious problem if Commons has superfluous pictures of domestic cats or of the west face of St. Paul's Cathedral. Far more will consider it a problem if Commons has superfluous amateur pornographic images.

If someone wants to rework that, great, but I think something like this is needed.

Also, there is currently one very confusing passage in the lede: "This policy addresses how Wikimedia Commons should handle materials outside Commons' project scope..." Is that really what we mean to say? It seems to me that the policy is, instead, about determining the boundaries of that scope. "Materials outside Commons' project scope" are routinely deleted. - Jmabel ! talk 21:02, 16 November 2010 (UTC)[reply]

I'm not sure if writing style is justification for holding up a vote. After all, it's a policy, not a literary masterpiece, and having a policy that we agree on the meaning of already seems like an accomplishment. We keep having gaps of up to a week when no one edits the policy, and waiting for major revisions could take a very, very long time. And every change, however small, seems to risk reopening disputes. I see you've made improvements, and so far we've been delaying the vote for copy-editing; still, it has to end sometime. Wnt (talk) 11:38, 17 November 2010 (UTC)[reply]
I think Jmabel is getting at the fact that we have a good document that needs one more paragraph in the lead. It can be done. Points 1 and 2 are clearly in the article in different sections. Point 3 is somewhat more controversial because it gets into the why of this policy, and whether there are extra-legal, extra-commons concerns about mere reputation or social responsibility involved. I can't address the last point, but I'll incorporate the first two into the intro and see if we can discuss the third.
Sentences added: Sexual content must be handled differently from other content, because sexual depictions of minors can trigger legal issues in many countries; depictions of identifiable living or recently deceased people may be embarrassing or violate their privacy; and hosting sexual images can be done in a way which protects the reputation of Commons among the public to safeguard their continued support.
Question: Is point three accurate: are we writing this policy out of any consensus-concern about the reputation of Commons or about some general social concern?
I have to say that I oppose this paragraph altogether. The policy as written really does not treat sexual content differently: Commons has never had a policy of hosting illegal content, it has long had the Commons:Photographs of identifiable persons policy, and it does not censor photographs based on perceived external or even internal opposition to their content. Furthermore, it is not actually contributing to the policy in any identifiable way - it's exactly the sort of excess and confusing verbiage that we should be trying to chop out. Wnt (talk) 00:58, 18 November 2010 (UTC)[reply]
I do not like the paragraph either. There are other things that can trigger legal issues. Depictions of identifiable people may be embarrassing. Sure sexual content is extra sensitive, but I think people know that. And the third sentence is strange, it sort of says the thing backwards. We want to have these things straight, why do we say we do it to safeguard our support? --LPfi (talk) 13:53, 19 November 2010 (UTC)[reply]
I've offered a compromise (?) sentence about the above.[4] But I don't know if this will satisfy Jmabel, nor would I object to its deletion since it is more or less redundant. Wnt (talk) 01:08, 18 November 2010 (UTC)[reply]
I see what you're getting at, but if identifiability and illegality are sufficient, what is the point of this policy at all? By your description, either sexual content is already covered by existing policy and a sexual content policy is redundant (or merely guidance for applying the main policies), or else sexual content is handled differently than other content. I don't have a preference either way, but I think we should know which is which. I'm going to try rephrasing the added sentence to say the same thing with less perceived distinction. See if it gets closer.
Sentences added (2): Sexual content tends to need closer attention because of the legal and privacy-related issues it raises. In order to ensure that such content does not violate U.S. law, or other Commons policies, this policy addresses processes to handle sexual material that falls within Wikimedia Commons' project scope and to identify and remove material that does not.
Re: Wnt's point about voting despite style, I agree. Policy on Wikipedia is rephrased constantly, but it doesn't affect the basic guidance. User:Ocaasi 13:25, 17 November 2010 (UTC)[reply]
I'd be perfectly glad to see us head toward a vote at this time.
As for the matter of reputation: that's essentially what started this all in the first place. Jimbo launched a bit of a crusade, deleting even some images that were clearly significant works of art. His actions were almost certainly in response to sensationalistic news stories besmirching Commons' reputation. - Jmabel ! talk 16:19, 17 November 2010 (UTC)[reply]
I recall that his edits generally said "out of project scope", and I would like to think that whatever external pressures were brought to bear on him, that he would not have launched an all-out attack on a category of content if much of it were not actually at odds with existing policy. We know some files were deleted that should not have been, but many of his deletions stuck based on preexisting policy. Wnt (talk) 01:15, 18 November 2010 (UTC)[reply]

Wikimedia leadership response to controversial content

Linking these because they may be of interest to people watching this page, and to inform policy discussion.

  • en:Wikipedia:Wikipedia_Signpost/2010-11-15/News_and_notes#Controversial_content_and_Wikimedia_leadership
  • Sue Gardner: Making change at Wikimedia: nine patterns that work: "we’re the only major site that doesn’t treat controversial material –e.g., sexually-explicit imagery, violent imagery, culturally offensive imagery– differently from everything else. The Board wanted –in effect– to probe into whether that was helping or hurting our effectiveness at fulfilling our mission."
  • [Ting Chen] writes: "the ongoing controversial content discussion is a result of our strategic planning (development and adaption in the nonwestern cultures) and the response of the changes in public policy and in our responsibility."
  • meta:Talk:2010_Wikimedia_Study_of_Controversial_Content:_Part_Two#Recommendations_discussion: Inviting feedback on the 11 recommendations of the study, which are:
    1. no changes be made to current editing and/or filtering regimes surrounding text
    2. no changes or filters be added to text on current Wikimedia projects to satisfy the perceived needs of children
    3. creating Wikijunior
    4. review images of nudity on Commons
    5. historical, ethnographic and art images be excluded from review
    6. active curation within controversial categories
    7. user-selected shuttered images (NSFW button)
    8. no image be permanently denied with such shuttering
    9. allow registered users the easy ability to set viewing preferences
    10. tagging regimes that would allow third-parties to filter Wikimedia content be restricted
    11. principle of least astonishment to be codified

Feel free to visit the last link and provide your own feedback. Dcoetzee (talk) 04:03, 17 November 2010 (UTC)[reply]

Should we allow subjects to withdraw consent for images uploaded prior to adoption?

Is there any reason not to allow subjects associated (correctly or incorrectly) with a potentially embarrassing image to withdraw consent for its inclusion, as a BLP-strength support for deletion? I remember people suggesting that there was, but I'm having a hard time remembering why. 71.198.176.22 22:03, 22 November 2010 (UTC)[reply]

the granting of, and subsequent withdrawing of consent is a tricky issue. I'd be happy to continue discussions about the best policies in this area whilst the current draft is adopted. Privatemusings (talk) 22:37, 22 November 2010 (UTC)[reply]
I'd rather we put forth a draft addressing this issue, because it is tricky, instead of delaying addressing it before asking people whether they want to support the draft. But I'm not opposed to addressing the issue in a wider proposal instead, since it's not specific to sexual content. Commons:Living persons would seem to apply, but that has only been edited by one person about a year ago, and it's in pretty bad shape. Can we use foundation:Resolution:Biographies of living people instead? It doesn't say anything about images, but the spirit of that Resolution seems very clear to me here. Alternatively, should this issue be addressed in Commons:Photographs of identifiable people instead of here? Or do we want to limit withdrawal of consent to sexual content? 71.198.176.22 23:48, 22 November 2010 (UTC)[reply]
Currently the photographs of identifiable people policy is the only place where consent issues come from; depending on ongoing discussions above, things may or may not stay that way.
My position is that the point of Commons is to put images permanently into the public domain. So consent should not be revocable, just as the copyright terms are not revocable. That said, I think we should heavily weight the subject's statement about consent: if he says he never consented, or never consented to public distribution of the image, this outweighs the photographer's assertion of consent. Only hard evidence like a verifiable prior publication of the image is likely to outweigh the subject's statement. It is true that in many cases subjects can abuse this to effectively revoke consent, but at least we're not committing ourselves to the principle that Commons images are only temporarily public. I should note that a subject should only be able to challenge an image if he asserts that it is correctly associated with him; dispelling false rumors isn't our responsibility. Lastly, in the interests of defending the public domain, we must not remove published photos that have reached the public domain, no matter how the subject objects; otherwise we fall behind the archives of the original magazine and other copyrighted resources. Wnt (talk) 18:52, 23 November 2010 (UTC)[reply]
2 remarks, though I'm not sure either affects the policy we are discussing here:
  1. I presume you mean "public domain" in a metaphorical rather than a legal sense. I've posted tens of thousands of my photos to the Commons, but I have not placed them in the public domain.
  2. As a matter of practice, we often remove photos as a courtesy to the subject of the photo. If the picture is readily replaceable for all current uses, and there doesn't seem to be anything particularly notable about it, and if the subject of the photo objects to its presence here, we usually do this. For example, if we have a routine photo of someone around the threshold of encyclopedic notability, and they don't like the particular photo, and they provide us with a better one (or we already have a better one), we almost always accede to their wishes. - Jmabel ! talk 05:32, 24 November 2010 (UTC)[reply]
As a matter of practice, not policy, and when the image is replaceable. I would much rather have deleting photos on request a matter of practice when it's convenient or reasonable, rather than policy.--Prosfilaes (talk) 06:33, 24 November 2010 (UTC)[reply]
This is worth a separate policy. Generally, regardless of legality, copyright, or current policy, I can't really see a good reason not to permit someone to change their mind for any reason at any time. If an image is harmful to a person's reputation, it's not worth hosting it, and we should probably be able to replace it anyway. The only situation I can see where this might not apply is mass uploads of non-sexual content (e.g. trains), where the author wakes up one day and wants to make a profit; that might be too late. Thoughts? Also, where would be the place to discuss this aspect of policy? Ocaasi (talk) 05:25, 25 November 2010 (UTC)[reply]
This discussion is getting off topic but, yes, there are lots of reasons not to allow people to revoke their permissions. None of the following has particularly to do with sexual imagery (which is why this is off topic) but...
Is there a better place where this discussion could happen?
  1. When you upload a photo you took, you grant an irrevocable license. If someone uploads a bunch of useful images, then later stalks out of the project in a fit of pique, they don't get to revoke those permissions and remove their pictures from Wikipedia.
  2. Example rather than abstraction here: the images from Abu Ghraib are certainly embarrassing to their subjects (both Iraqi and American) but we certainly wouldn't remove them because Lynndie England found an image of herself embarrassing.
  3. Example again: a U.S. Senator might object to us hosting anything other than their official photo, but it would still be entirely legitimate (for example) for us to host their high school yearbook photo or a photo contributed by someone who photographed them at a public event.
  4. Similar example, less obvious: if I take a photo of someone about whom we have a Wikipedia article, and I take it entirely in accord with the usual rules in the relevant country (e.g. photographed in a public space in the U.S.), and that person doesn't like Wikipedia having their photo, frankly, unless we decide we want to do a courtesy to them, it's their problem, not ours. Now, if it's really not a good photo, and especially if we have a better one, I'll probably support doing that courtesy, but it's a courtesy, not a matter of policy.
Jmabel ! talk 06:36, 25 November 2010 (UTC)[reply]
I think we're discussing different cases. Generally, I was only referring to pictures that the subject took themselves or granted permission themselves to use, not other public domain images which someone is merely the subject of. Re Lynndie England, she's a public figure at this point, and the photograph wasn't produced or uploaded by her--same with the Senator who presumably didn't upload the content that s/he is seeking to take down. I was trying to think of counterexamples where we should not honor a request for someone to take down a photo they took themselves or granted permission themselves. Can you think of any for those? Ocaasi (talk) 06:48, 25 November 2010 (UTC)[reply]
One of the most frustrating things Commons does is going around deleting images in use, for good reasons or bad. In an optimal world, we would never delete an image used in a current or historical version of any page on a Wikimedia project. We already give photographers many more rights than authors of encyclopedia articles, who have their names buried in the history and don't get to choose their license; there's no need to privilege them further. A photographer shouldn't get to rip their photo out of an article any more than an author could rip their text out of a WP article. We have to treat the portrayed with all the respect of a professional scholar or journalist, which usually coincides with their own desires; when it doesn't, we destroy our own quality as a news source and encyclopedia by letting them dictate to us.
Sexual content is complex, and I will freely acknowledge that the issues surrounding it mean it will get deleted much more freely than most content. I still think it important that policy demand that uploads to Wikimedia projects are not revocable, and that people think before uploading, instead of coming back and demanding that users on 14 Wikimedia projects fix up 47 pages because images that they assumed--and should have had every right to assume--were permanent are now being deleted.--Prosfilaes (talk) 07:09, 25 November 2010 (UTC)[reply]
What you're saying makes sense about the problems caused by revoked images. As long as the warnings to the uploader are sufficiently clear, I think they should at least need a 'good reason' and maybe an OTRS ticket. On the other hand, what about: a photo of a girl in a bikini which ends up on a very popular article like 'beach'; a masturbating woman who decides 5 years later that, for employment, personal, or family reasons, the image is harming her; a picture of someone's pierced genitals which has become un-anonymized from cross-publishing in a body-modification mag; a topless photo from an 18-year-old which 20 years later doesn't seem like such a good idea. I'm stretching credulity on some of those, but I'm looking for what the precise rationale is. We definitely shouldn't let people revoke permissions for content, except, perhaps, when we should. Ocaasi (talk) 07:23, 25 November 2010 (UTC)[reply]
I'm happy to let all of those be dealt with under practice, not policy. But what about some other examples: someone writes the definitive biography of a Stalinist lackey that, while its positive spin has been toned down, stands as the Wikipedia article 10 years later, when he wants it deleted because it will hurt his political career. Or a woman who contributes extensively to a Wikiversity project on polyamory who's now worried about the effects on her marriage. Or someone who rewrote masturbation to portray it as dangerous, and filled a talk page with arguments for keeping it; in any of those cases, do we shred articles or talk pages to preserve the modesty of the author?--Prosfilaes (talk) 16:27, 25 November 2010 (UTC)[reply]
We're obviously well into hypothetical-world, which is fine with me. I think the major difference between our examples is that photographs are different than text. In both cases, there is a contribution (photo, writing). In both cases, the contributions can be linked to a contributor (uploader, writer). But in the case of the photograph the uploader (or consent giver) is the subject--the contribution bears the mark of the person who uploaded it right out in the open. In the case of an article writer, a screen name could easily be anonymous; and even if the screen name was a Real Name, the text itself would not obviously reveal who did it--you'd have to dig through the history to find the user and then check the diffs, etc. A photograph gives pretty much all of that information without any effort or words, which is why I think the comparison doesn't fit for me. I agree that text contributions like the ones you described should not be revoked, but I don't think that settles the photograph situation.
You're probably right that policy can avoid this issue, but it might be worth beefing up warnings on the upload page to make clear that consent is non-revocable. Once this COM:SEX policy is set, it might be worth linking to or summarizing as well. Something like: "You can upload sexual content to commons if: it is of your body and you understand it will be publicly seen in any context under an open license which you cannot revoke; it is of someone else who consented to and understood those terms; it is of content already released under compatible permissions." Currently the COM:SEX policy is geared towards Commons curators rather than uploaders, which is understandable but not ideal for a broad issue that should address the concerns of both 'users' and 'editors'. Ocaasi (talk) 02:17, 27 November 2010 (UTC)[reply]
So far as I know the upload page has always said that the consent is non-revocable. The point is, once an image goes up on Commons, and especially once it is used to illustrate Wikipedia, it is going to get mirrored all over the world. Publishing a photo here isn't much different than publishing in a sex magazine. Now Wikimedia does recognize a "right to vanish", and the uploader's account name, and presumably any mention of the subject's name in the image, could be removed from the image on this basis. But bear in mind that if we make a policy decision to delete a photo, we're also making a policy decision to delete its derivative works. We're telling the contributor who has taken that picture and put it beside another and labelled up the comparative anatomy between glans and clitoris, labia and scrotum and so forth that all the work he did was for nothing. Now if we were returning the original uploader to secrecy and anonymity we might be tempted, but knowing we're not? Wnt (talk) 18:58, 29 November 2010 (UTC)[reply]
I see what's going on here and the cat's-out-of-the-bag nature of a non-revocable, open license. I think where it's pushing me is that we need to be a little more explicit about what exactly placing an image on Commons means. Basically, I think we should spell it out: "If you upload an image of your naked body or sexual act onto Wikimedia Commons, you are granting license for anyone in the world to use the image anywhere they choose so long as they follow the terms of the license. Also, you can never revoke your consent once it has been granted." Maybe it doesn't have to be as scary as that, but it's basically just the facts, and it is a little scary. That's what I'm getting at. We shouldn't change policy regarding revoking consent (although I bet OTRS does it quietly on occasion), but we should be abundantly clear that this content is not just for a single Wikipedia article and that uploaders have absolutely no control over content once it is uploaded. For-Ev-Er. Ocaasi (talk) 05:28, 30 November 2010 (UTC)[reply]

Does the upload form create a binding contract when images are submitted? If so, what is the consideration of that contract? Does the notation that the license is non-revocable carry any weight when the uploader isn't reimbursed? 71.198.176.22 14:26, 2 December 2010 (UTC)[reply]

I'm not a lawyer, but similar terms are stated by quite a few prominent sites on the Internet. On Facebook, for example, once you upload content you give them a pretty nearly unlimited license in that content, and it is not revocable either. - Jmabel ! talk 02:10, 3 December 2010 (UTC)[reply]
Armchair analysis (also not a lawyer): Consideration is the mutual exchange of something of value, either money, goods, or a promise not to do something else (e.g. I can contractually pay you $100 not to paint your car red). In the case of cc-by-sa licenses, there is obviously no two-way exchange of value or goods (unless one considers Commons to be exchanging a service by hosting and allowing the distribution of these materials, which is nice but legally unlikely); there is also no exchange of money. Is there a promise 'not' to do something that applies? Well, many copyleft licenses require attribution, so there is a promise not to distribute the content without attribution and without a compatible license. Still, I don't think this is what the courts have in mind, since the license itself should probably be separate from the consideration which makes it binding. There are things which have no contractual power, but can still not be taken back. They're called gifts, and once you give them, you lose your claim of property over them. Although copyleft licenses are couched in the terminology of contract, they are more just gifts with a few strings attached (that's how I personally think of them; copyleft lawyers probably have more nuanced terminology). This 2006 article discussed consideration and the GNU GPL, seemingly coming down against contract designation (on consideration grounds) but for license status, however that differs. There's more of this out there if you want to poke around the Google or the Google Scholar, or check out http://www.eff.org or ask at the RefDesk, where they are wicked smart. Ocaasi (talk) 09:10, 3 December 2010 (UTC)[reply]
I spend some time at the RefDesk, and I wouldn't trust it for legal advice (which is specifically banned there anyway... ). The main problem here is that if uploads are revocable, it applies to every single image on Commons, so it's not specifically relevant to this policy. I would also worry that any change to the site upload notice might risk the status of all uploads and should be done only with formal legal input. The only question here is if some special courtesy deletion policy is required here, which I don't see. Wnt (talk) 15:39, 3 December 2010 (UTC)[reply]

Simulation resolution

What is the resolution between a diagram and a simulation? 71.198.176.22 20:36, 24 November 2010 (UTC)[reply]

are you asking for thoughts on whether or not / how this policy should differentiate a diagram (drawn, or computer generated I presume) from a simulation (presumably photographed?) - I'm not 100% sure how to kick off a response, so a clarification would be helpful to me :-) Privatemusings (talk) 00:30, 25 November 2010 (UTC)[reply]
Yes, if we are going to prohibit photographs and simulations both, then wouldn't it be a good idea to provide some guidance about what is a simulation and what is merely a diagrammatic drawing? 71.198.176.22 22:40, 25 November 2010 (UTC)[reply]
do you have trouble discerning the difference - aren't they different just by definition? :-) Privatemusings (talk) 01:21, 26 November 2010 (UTC)[reply]
This policy isn't about prohibiting photographs, just keeping them within Commons policies. Diagrams are specifically exempted from speedy deletion because certain uncontroversial reasons for deletion can't exist for them: they can't be child pornography, and they can't be created without consent, and the act of drawing them probably implies a certain educational purpose. Simulations aren't specifically defined here, and might be regarded as diagrams; but it is also possible to imagine looking at a "simulation" and not being sure if it is a drawing or a photograph run through a few photoshop filters, or more controversially (for example) a nude rendition of a particular person recreated from X-ray, terahertz, or other forms of scanning. In any case, no one should claim a simulation can be speedy deleted if it doesn't fall into one of the prohibited content categories at all. So I'm not going to stickle on having simulations specifically exempted from uncontroversial speedy deletion when it will probably just raise more arguments. The term is just too prone to varying interpretations to use for policy purposes. Wnt (talk) 18:46, 29 November 2010 (UTC)[reply]

I asked the question poorly. I'm referring to "simulated sexually explicit conduct" -- where is the boundary between that and diagrammatic drawings? For example, are the cartoon illustrations used for sex acts in the English Wikipedia simulated sexually explicit conduct or diagrams, and why? 71.198.176.22 12:01, 30 November 2010 (UTC)[reply]

The policy doesn't try to define a difference between simulations and diagrams. At the beginning it defines sexual content to include "simulated" images, which aren't really defined. The main dividing lines are (1) what is prohibited content, which very likely does not include them, and (2) what must go through a full deletion review, with the additional comment that "Some categories of material which are generally useful for an educational purpose include: diagrams, illustrations..." (which reiterates that diagrams and illustrations probably aren't going to be prohibited by the policy).
So to take the example of File:Wiki-analsex.png, the file probably counts as "simulated sexually explicit conduct", so it gets sorted through the policy. Then you ask if it's illegal (probably not; fortunately even photos aren't counted as "obscene" nowadays), or if it's taken without consent (nobody to consent), or if it's out of the project scope (the policy specifically says that illustrations are generally educational). And if someone wants to argue for its deletion, it has to go through a proper discussion.
This may be a roundabout logic, but it's the result of people trying to come up with a policy from different points of view, perhaps regarding situations we haven't thought of yet. And to be fair, an image like that isn't all that different from the sort of "animated/live action" image that you could find in the film A Scanner Darkly, which arguably would need (for example) the consent of the original participant. Wnt (talk) 01:00, 1 December 2010 (UTC)[reply]

Time to take it to a poll?

No, it's not perfect. But few things in this world are, and it's been pretty stable, and I'd like to see us adopt this before we are overtaken by events from the Foundation level. - Jmabel ! talk 02:26, 25 November 2010 (UTC)[reply]

I think that timeline is important. The remaining questions are:
  • Should a user be able to revoke sexual content including themselves? (If it was uploaded by them; if it was not uploaded by them; if they gave prior consent in either case)
  • How to handle 'close-up' images. What qualifies as identifiable? Is consent required? What is the process or 'burden' regarding identifiability and consent in these cases?
  • How to handle what could be an illegally acquired image? What does illegally acquired 'look like'? What is the process or 'burden' in these cases?
These are increasingly minor and should only be resolved first if they have the potential to change the outcome of the vote. Ocaasi (talk) 07:02, 25 November 2010 (UTC)[reply]

Here's my answers fwiw :-)

  • Revoking consent will remain outside the purview of this policy, to be handled through practice. The principle of irrevocable consent is important to all wmf content generation, so the best thing for this policy to do is to invite contact with the office, which is suitably in place.
  • Close-up images which are sexual content, as defined, require assertions of consent, as does all sexual content. We've left the burden to be decided through community discussion - I'm fine with that.
We've stated 'Provision of further evidence of consent is welcome (using the OTRS process) but is not normally required.' - in the event that an image has a reasonable basis for being illegally acquired (as discussed and established through community discussion) this would, in my view, qualify as 'not normal', hence the community is empowered to insist on further evidence, the nature of which is left to the community discussion to evaluate. I'm ok with that for now too.

that's my thoughts. Privatemusings (talk) 22:09, 25 November 2010 (UTC)[reply]

At a quick read, I agree completely with Privatemusings here. And none of this entails any further rewordings. - Jmabel ! talk 02:05, 26 November 2010 (UTC)[reply]
I like those answers too. Maybe they'd be useful for a COM:SEX FAQ. In fact, an FAQ, even an informal one, might help explain the intention of some of the policies and let voters browse through issues. That's not to say the policy shouldn't be self-explanatory or that the FAQ is binding--just that a guide to a new policy might help introduce the policy itself (see: W:Wikipedia:Neutral_point_of_view/FAQ for example). Ocaasi (talk) 05:30, 26 November 2010 (UTC)[reply]
I don't actually understand what you mean, but it looks like people want a poll, so I'll start the section, closely based on the preview text above (which I'll archive now to avoid any misplaced votes). Wnt (talk) 13:56, 4 December 2010 (UTC)[reply]
  • Maybe we are ready for the polls, but not yet, I feel, till the question of "irrevocable consent" is well settled. "sign" jayan 05:51, 9 December 2010 (UTC)

Summary of poll arguments

This section is intended to be a brief, neutral summary of opinions raised in the policy poll below. Please make NPOV edits for completeness, concision, neatness, or accuracy, but not for persuasion (that's what the poll comments are for). Also, !nosign! any changes to keep it tidy. Thanks, Ocaasi (talk) 15:50, 7 December 2010 (UTC)[reply]

Thank you for doing this, it's very helpful! -- Phoebe (talk) 00:26, 12 December 2010 (UTC)[reply]
Gladly, I think every poll should have one! Ocaasi (talk) 12:16, 12 December 2010 (UTC)[reply]

Like it...

  • Policy is needed
  • Good enough for a start
  • Represents a compromise
  • Not having the policy is worse
  • Not a major change of existing policies, just a focused reorganization
  • Can help avoid social backlash and panic
  • Needed for legal reasons
  • Needed for ethical reasons
  • Protects the WMF
  • Prevents more strict censorship
  • Legally sound
  • Prevents becoming a porn repository
  • Better to have a centralized policy than multiple deletion rationales
  • Preempts WMF from imposing a less nuanced sexual content policy

Don't like it...

  • Slippery slope to more censorship
  • Educational purpose is not clearly defined
  • Doesn't explicitly protect content from deletion
  • Policies on illegality, privacy, pornography, and nudity already cover the material
  • Better to handle on a case by case basis
  • Instruction creep
  • Policy addresses legal issues but is not written by lawyers
  • Policy addresses legal issues but commons users are not legal experts
  • Sexual content should not be treated differently than other content
  • Vague wording
  • The phrase 'low-quality'
  • Out-of-scope deletions should be normal not speedy
  • US-Centric legal concerns
  • Legal concerns that are unresolved by courts
  • Addresses sex but not offensive or violent content
  • After Wales deletions, implementation cannot be trusted
  • Biased towards Western cultural taboos (it's too conservative)
  • Biased against non-Western cultural taboos (it's too liberal)
  • Better as a guideline
  • Needs more specific criteria for inclusion and exclusion
  • Better to solve this on the user end (personal image settings/filters)
  • Insufficient protection for erotic art
  • Only available in English, which makes fair overall understanding and voting impossible

Questions

  • What does 'low-quality' mean?
  • What qualifies sexual content as educational?
  • What content cannot be deleted?
  • Would it be better as a guideline?
  • Can the legal discussion be rephrased in normal language?
  • Should out-of-scope decisions be normal rather than speedy deletions?

Tweaks

  • Shorter
  • Less legalese
  • Clarity of legal interpretation
  • More specific criteria for inclusion/exclusion
  • Out-of-scope deletions to be handled by regular process, not speedy

Second poll for promotion to policy (December 2010)

This is a poll to adopt Commons:Sexual content as a Wikimedia Commons policy.

Please give your opinion below with {{Support}}, {{Oppose}}, or {{Neutral}}, with a brief explanation.

The poll concerns whether people accept or reject the November 26 revision. Edits made to the policy during the polling period may be temporarily reverted, and will need to be adopted by consensus, like changes to the other established policies.

Voting on the poll will proceed until 06:08, 15 December, ten days after the time that the poll was first advertised at MediaWiki:Sitenotice.

A previous poll was closed as "no consensus" at Commons talk:Sexual content/Archive 4#Poll for promotion to policy. Wnt (talk) 07:47, 23 October 2010 (UTC)[reply]



Support

  1.  Support May not be perfect, but it's a good start. I think it is important that we go on record with a policy in this area; further debate, especially on edge cases, can follow on the basis of this general consensus. - Jmabel ! talk 03:26, 5 December 2010 (UTC)[reply]
    There is already precedent and policy to handle all the common sense guidelines this proposal aims to add (specifically Commons:Project scope#Must_be_realistically_useful_for_an_educational_purpose). The only thing this proposal actually changes is that now complicated decisions concerning legality (the line between porn and erotic art) and scope can be made subjectively by a single user, and speedy deleted without any discussion. Provocative works such as L'Origine du monde (which hangs in the Musée d'Orsay) De Figuris Veneris have already been speedy deleted without discussion by zealous admins deciding for themselves that they were "without educational purpose". There is no need for more guidelines to delete sexual content which is obviously without merit; we have standard deletion discussions and official policy already in place to handle that. AerobicFox (talk) 01:48, 12 December 2010 (UTC)[reply]
    The speedy thing is really mainly intended for deletion of multiple redundant low-quality cell-phone snapshots made by guys of their own penises while they are drunk. If there's abuse of this discretion, then paintings by famous 19th-century artists will easily be restored after a deletion review -- while the uploaders of drunken-cell-phone-self-penis-snaps rarely bother to start a deletion review (and do not very often show any particular interest in the Wikimedia Commons site itself and its goals other than their desire to upload such material). AnonMoos (talk) 06:24, 12 December 2010 (UTC)[reply]
    As far as I know, the only deletions if images of L'Origine du monde that have occurred were removals of inferior images with the usual process used for {{Universal replace}}. Can you point at a diff for the incident you are referring to? - Jmabel ! talk 01:46, 13 December 2010 (UTC)[reply]
    My bad, replaced with De Figuris Veneris, which was speedy deleted for "Out of Project Scope". Currently low quality images of people's penises are already speedy deleted without problem. Obvious cases are handled with ease without the possibility that not-so-obvious cases will be deleted unnecessarily using this as justification. AerobicFox (talk) 02:19, 13 December 2010 (UTC)[reply]
  2.  Support BUT only for educational purposes. Thank you, --???????8790Say whatever you want 06:40, 5 December 2010 (UTC)[reply]
  3.  Support as a reasonable initial revision. While it will no doubt require further refinement over time, this is best done as a live policy handling real cases rather than theoretical analysis as a draft. -- Mattinbgn/talk 08:00, 5 December 2010 (UTC)[reply]
  4.  Support I think the issues have been dealt with (see the headings above and in the archives). There is disagreement on some points, but the policy represents a reasonable compromise and not having a policy is far worse. --LPfi (talk) 08:36, 5 December 2010 (UTC)[reply]
  5.  Support as not having a policy risks bringing Wikimedia into serious disrepute. --Simple Bob (talk) 10:47, 5 December 2010 (UTC)[reply]
  6. OK But not the last decision...--Yikrazuul (talk) 11:52, 5 December 2010 (UTC)[reply]
  7.  Support Too important not to have a policy on, especially with likely input coming soon from the Foundation. Easy to make a list of edge issues and let them be clarified over time. Policy doesn't leave a giant loophole for content to be deleted--it establishes the principle (namely that commons is not censored for broadly educational content)--and then provides some pathways for handling disputes. This policy is a good idea, whether or not it's perfect, and having no policy on the issue is worse. Ocaasi (talk) 10:55, 5 December 2010 (UTC)[reply]
  8.  Support I see some opposers wanting a more restrictive policy, and a similar number wanting a less restrictive policy (or none), suggesting that this policy strikes approximately the best consensus our community could hope for. This wording has been extensively debated by editors with a wide range of opinions. --99of9 (talk) 11:12, 5 December 2010 (UTC)[reply]
  9. Moderate  Support -- mostly a restatement of laws which already apply to Commons, plus basic common-sense policies. Not sure that there's a burning need to adopt it right now, but if it helps head off the next Sanger-type panic, then it will have been worthwhile... AnonMoos (talk) 11:23, 5 December 2010 (UTC)[reply]
  10.  Support well written, takes a cautionary, balanced approach to dealing with the subject. While this may repeat some basics (correctly), this has revealed itself as a problematic issue, so some extra effort for the sake of the integrity of Commons is a good idea. Hekerui (talk) 11:29, 5 December 2010 (UTC)[reply]
  11.  Support This can and will no doubt evolve further, as do all our policies, but we really do need a policy in place for this. This is a good start. --JN466 12:54, 5 December 2010 (UTC)[reply]
  12.  Support, although I agree with AnonMoos - these are just basic common-sense policies. -- deerstop. 15:16, 5 December 2010 (UTC)[reply]
  13.  Support - It may not be perfect, but something is needed if only for basic legal and ethical reasons. Smallbones (talk) 15:41, 5 December 2010 (UTC)[reply]
  14.  Support - such a policy, repugnant as it may be, is necessary to help combat the more repugnant heavy-handed censorship and insufficiently-justified deletionism and prosecution, as well as to preserve, protect, and defend the WMF.   — Jeff G. ? 15:55, 5 December 2010 (UTC)[reply]
  15.  Support (for now) As for the doubters, are there any cases of this sort of policy having been applied heavy-handedly (or abused) in the Wikipedia universe? If there are, I'd like to see them as I take your concerns seriously. Mr.choppers (talk) 16:53, 5 December 2010 (UTC)[reply]
    re: "are there any cases of this sort of policy having been applied heavy-handedly (or abused) in the Wikipedia universe" - um, yes there are. do you really need a list? o__0 if so, try the deletion & undeletion archives on here, or on pretty much any wm project Lx 121 (talk) 00:34, 7 December 2010 (UTC)[reply]
  16.  Support to echo people above, it's a sensible step forward in codifying our basic response to real issues of legality and how Commons' existing policies apply in a more focused sense. It may not be perfect, but it hits all the high notes and we need something like this. David Fuchs (talk) 17:11, 5 December 2010 (UTC)[reply]
  17.  Support the guidelines and the text of this policy seems to be well-balanced. --DenghiùComm (talk) 18:00, 5 December 2010 (UTC)[reply]
  18.  Support a good first attempt. Such a guidline could simplify some debates. --High Contrast (talk) 20:18, 5 December 2010 (UTC)[reply]
  19.  Support A policy needs to be in place, even if not perfect yet. It can be fine-tuned progressively. --P199 (talk) 20:24, 5 December 2010 (UTC)[reply]
  20.  Support I'm not a fan of it in all details, but it does seem that the Foundation is pushing us in the direction of having some sort of standard here, and a lot of this merely clarifies current practice on the land. "Low-quality pornographic images that do not contribute anything educationally useful may be removed" is basically No Penis, a practice that will go on with or without policy, because no one wants to DR those.--Prosfilaes (talk) 20:34, 5 December 2010 (UTC)[reply]
  21.  Support Needs some sort of standard to be adopted. Miyagawa (talk) 20:52, 5 December 2010 (UTC)[reply]
  22.  Support per Jmabel and 99of9. Walter Siegmund (talk) 21:40, 5 December 2010 (UTC)[reply]
  23.  Support Could do with fewer words, but a reasonably clear start --Elen of the Roads (talk) 23:05, 5 December 2010 (UTC)[reply]
  24.  Support I was iffy on including "prominent depictions of the pubic area or genitalia" as "sexually explicit conduct", but I am somewhat familiar with USA v. Dost (1996) and am definitely willing to take precautions against child pornography. The policy, as I see it, is to allow the deletion of three types of images we should delete regardless of an existing sexual content policy: illegal material (child porn, etc.), material breaching personality rights, and material out of the project's scope. ?fetchcomms 23:27, 5 December 2010 (UTC)[reply]
  25.  Support. I'll be darned, you guys managed to come up with a decent policy on this that doesn't negatively impact our educational mission, but keeps us on the right side of the legal and court-of-public-opinion lines. Is it perfect? Certainly not, but policy documents can be modified just like any other page. Powers (talk) 23:45, 5 December 2010 (UTC)[reply]
  26.  Support it's a start. - KrakatoaKatie 07:29, 6 December 2010 (UTC)[reply]
  27.  Support Privatemusings (talk) 09:22, 6 December 2010 (UTC)[reply]
  28.  Support litenbjoll (talk) 10:30, 6 December 2010 (UTC)[reply]
  29.  Support Captures the existing policy and the previous existing practices and adds some details to clarify the intentions of the policy. This is an improvement and worthy of adopting now so that we have something written to explain to new users. FloNight♥♥♥ 09:42, 6 December 2010 (UTC)[reply]
  30.  Support Seems reasonable enough, being based on already existing principles and mandatory law, without giving way to moral panic (read: Fox). Opposing voices calling out that this is/will lead to "censorship" seem to me not to have read all the relevant text, or to have misinterpreted it, as there will be no more "censorship" after a policy like this is adopted than there is "censorship" on Commons today. And the "slippery slope" argument is strange to me, as all policy change will be subject to community consensus, plus the "not censored" policy clearly shows that Commons isn't ruled by fear generated by moral panic in the media and society. Lokpest (talk) 10:07, 6 December 2010 (UTC)[reply]
  31.  Support. A good start. SlimVirgin (talk) 11:51, 6 December 2010 (UTC)[reply]
  32.  Support Sounds good. Agreed with Lokpest. --Millosh (talk) 12:21, 6 December 2010 (UTC)[reply]
  33.  Support good start--shizhao (talk) 13:24, 6 December 2010 (UTC)[reply]
  34.  Support A workable policy that can be adapted as necessary. Edgar181 (talk) 13:28, 6 December 2010 (UTC)[reply]
  35.  Support Better to have a good policy we agree on than the alternatives. Oppose reasons are not compelling. ++Lar: t/c 14:36, 6 December 2010 (UTC)[reply]
  36.  Support Some sort of policy needs to be adopted. MarioCRO (talk) 16:20, 6 December 2010 (UTC)[reply]
  37.  Support Speaking as a rabid free-speech nut, I like this - David Gerard (talk) 16:39, 6 December 2010 (UTC)[reply]
  38.  Support As a supporter of free speech and anti-censorship, I support this policy. Joyson Noel Holla at me 16:51, 6 December 2010 (UTC)[reply]
  39.  Support Definite improvement. NathanT 16:53, 6 December 2010 (UTC)[reply]
  40.  Support A decent start. I share the concerns about legalese - we should just speak plainly about what is harmful and what is helpful - but otherwise this is needed. I'd encourage the opposes/neutral camp to dive in and edit parts they have strong feelings about to make it acceptable; a softer, early-stage policy is better than none at all. Steven Walling 16:56, 6 December 2010 (UTC)[reply]
  41.  Support as this area is important to have very clear policies and guidelines in place. ···日本穣Talk to Nihonjoe 17:18, 6 December 2010 (UTC)[reply]
  42.  Support Personally, I believe all content in our projects should have clear educational value, independent of what the content is about. In this sense I agree with some of the people who objected to this policy. But sexual content is a controversial topic, in non-western cultures often more than in western cultures. And in the past I do think that from time to time marginally educational, or even non-educational, content got uploaded on Commons, which does disturb and confuse our users. So I think to clearly emphasize educational value is a very important issue here. This is the main reason why I am in support of this policy--Wing (talk) 19:30, 6 December 2010 (UTC)[reply]
  43.  Support. A good start. Seems to strike a balance between privacy/scope concerns and not censoring wikipedia. Kaldari (talk) 19:42, 6 December 2010 (UTC)[reply]
  44.  Support This policy definitely needs to be elaborated on more, but the overall policy is definitely for the best. Oxguy3 (talk) 19:43, 6 December 2010 (UTC)[reply]
  45.  Support I would normally oppose a policy which treated sexual material differently from any other class of material, and it is already a requirement that we remove material illegal according to the law under which we operate. However, I see the opposition to this proposed policy is almost entirely from those who think it too permissive, and wish it rejected so they could develop a policy which would to my way of thinking be even worse. I am not willing to do anything that might tend to encourage such an effort. DGG (talk) 19:59, 6 December 2010 (UTC)[reply]
  46.  Support I agree with the policy as a working basis for a sexual content policy. It has to start somewhere, and I think this is a pretty good first step. Zach the Wanderer (talk) 20:29, 6 December 2010 (UTC)[reply]
  47.  Support The proposed policy seems appropriate and legally sound. Fred Bauder (talk) 20:31, 6 December 2010 (UTC)[reply]
  48.  Support --|EPO| da: 21:07, 6 December 2010 (UTC)[reply]
  49.  Support -- Avi (talk) 21:25, 6 December 2010 (UTC)[reply]
  50.  Support: for the most part, simply codifies current practice. --Carnildo (talk) 21:31, 6 December 2010 (UTC)[reply]
  51.  Support Mardetanha talk 22:07, 6 December 2010 (UTC)[reply]
  52. Support <--Spot the Wikipedian. Seems pretty reasonable to me. Fences and windows (talk) 22:12, 6 December 2010 (UTC)[reply]
  53. Support - Not perfect, but some standards are a good idea. KillerChihuahua (talk) 23:26, 6 December 2010 (UTC)[reply]
  54. Support This is sound and relies heavily on already existing policies. Workable and useful. notafish }<';> 23:44, 6 December 2010 (UTC)[reply]
  55.  Support I agree. No policy would be a very tricky situation indeed. If it isn't perfect now, the bugs can be worked out as soon as possible, but the groundwork should at least be laid. Timmietimtim314 (talk) 03:27, 7 December 2010 (UTC)[reply]
  56.  Support It is important to have criteria in place for such material, as fundamentally and foremost Wikipedia is an educational encyclopaedia used by all ages, and not an easily accessed repository for pornography. This unsigned post was made by 3atc3 (talk · contribs) at 04:44, 7 December 2010
  57.  Support - I think it is reasonable to have a guideline for those keeping the place free from non-educational, objectionable media. Nephron  T|C 04:48, 7 December 2010 (UTC)[reply]
  58.  Support Good to have sexual content on Wikimedia for educational purposes. The proposed policy seems to give enough handles to prevent abuse. Let's go with it and see how things turn out. Yhoitink (talk) 08:58, 7 December 2010 (UTC)[reply]
  59.  Support Agree with most of the reasons provided above - there is a need to prevent the misuse of the site. It's not perfect, there is lots of room for improvement, but this will work as a foundation for future improvements. Joeyfjj (talk) 09:06, 7 December 2010 (UTC)[reply]
  60.  Support In my personal capacity as an editor. This does not represent an official statement of support or opposition from the Wikimedia Foundation. Philippe (talk) 15:00, 7 December 2010 (UTC)[reply]
  61.  Support - Time to raise the bar. - Josette (talk) 15:55, 7 December 2010 (UTC)[reply]
  62.  Support Requesting that the poster at least claim to have obtained consent in particular is a reasonable expectation. Sxeptomaniac (talk) 17:37, 7 December 2010 (UTC)[reply]
  63.  Support Having a policy is certainly better than not having one, and helps combat censorship. — Loadmaster (talk) 17:47, 7 December 2010 (UTC)[reply]
  64.  Support - The proposed policy seems to cover the basics. Killiondude (talk) 18:08, 7 December 2010 (UTC)[reply]
  65.  Support -- Seems reasonable, not too restrictive, seems to cover the important areas. --SarekOfVulcan (talk) 18:59, 7 December 2010 (UTC)[reply]
  66.  Support A policy is needed and I support this provisionally. DMSBel (talk) 19:21, 7 December 2010 (UTC)[reply]
  67. Ar son - 's tús maith é / it's a good start - Alison 20:52, 7 December 2010 (UTC)[reply]
  68.  Support The very minimum if the WMF wants to have any credible educational environment and abide by US law, which it is legally and ethically bound to do. Ottava Rima (talk) 21:19, 7 December 2010 (UTC)[reply]
  69.  Support on the assumption the Foundation would rather support than oppose. Some WMF edicts such as the blanket ban on hosting "promotional" images of living persons I have disagreed with, but in general I support WMF initiatives because they have some real responsibility for the projects and are relatively more informed about the challenges that need to be addressed.--Brian Dell (talk) 21:31, 7 December 2010 (UTC)[reply]
  70.  Support an overarching policy on these images is necessary. This is a good place to work from. I can't understand the anti-US law opposers below when the Foundation and WPs/Commons' servers are located in the US. Moving them offshore just to escape the laws governing pornographic material would be a waste of money, a public relations disaster, and not conducive to anyone's goals. Ed [talk] [en:majestic titan] 21:40, 7 December 2010 (UTC)[reply]
  71.  Support Policies help.--Sisyphos23 (talk) 21:41, 7 December 2010 (UTC)[reply]
  72.  Support as others have said, a good start. Kevin (talk) 22:00, 7 December 2010 (UTC)[reply]
  73.  Support ɳOCTURNEɳOIR 22:09, 7 December 2010 (UTC)[reply]
  74.  Support per what others have said. This is a good starting point. Griffinofwales (talk) 22:11, 7 December 2010 (UTC)[reply]
  75.  Support This policy is not perfect but an acceptable start. Regarding the speedy deletions: That was already policy: Material which is plain illegal or that falls under COM:PORN was already speedy deleted in the past. That our media must not violate US law should be obvious. Suggestions to move Wikimedia Commons to another country with a different legislation are not helpful. --AFBorchert (talk) 23:03, 7 December 2010 (UTC)[reply]
  76.  Support This policy will make the guidelines as to what sexual content is and isn't allowed on commons much clearer. Barts1a (talk) 23:28, 7 December 2010 (UTC)[reply]
  77.  Support. Wikipedia and other Wikimedia projects are not for porn. Demize (talk) 23:40, 7 December 2010 (UTC)[reply]
  78.  Support - a sensible policy. --Bduke (talk) 23:50, 7 December 2010 (UTC)[reply]
  79.  Support - (Strictly as a Commonist, per Philippe) This seems to be a good policy laying out the kinds of images that have to be deleted (mainly for legal issues or respect for the people depicted) while making explicit the wide range of acceptable sexual content.--ragesoss (talk) 23:56, 7 December 2010 (UTC)[reply]
  80.  Support It's more than a start. This is a necessity. Half price (talk) 00:09, 8 December 2010 (UTC)[reply]
  81.  Support - Better to have good clear policies. This is a good beginning and it can be fleshed out (;) with time.
    ⋙–Berean–Hunter—► ((⊕)) 00:51, 8 December 2010 (UTC)[reply]
  82.  Support This makes sense. Lovetinkle (talk) 01:12, 8 December 2010 (UTC)[reply]
  83.  Support - a perfect start. Sjones23 (talk) 01:49, 8 December 2010 (UTC)[reply]
  84.  Support - not perfect, but better than the status quo. Kirill Lokshin (talk) 02:08, 8 December 2010 (UTC)[reply]
  85.  Support - Wikipedia is meant for educational and info purposes only, not a porn site. This would work well. Creation7689 (talk) 02:13, 8 December 2010 (UTC)[reply]
  86.  Support will no doubt require further refinement over time, but otherwise is a great start. Bakerboy448 (talk) 02:14, 8 December 2010 (UTC)[reply]
  87.  Support - This is an encyclopedia, so I'm in favor of limiting such images to truly educational/artistic purposes. BatteryIncluded (talk) 02:18, 8 December 2010 (UTC)[reply]
    with respect to the 2 above user-voters, commons is a media repository NOT an encyclopedia. our project goals here are different from those of wikipedia; if the wmc project's goals were not different, there would be no reason to have a separate wmc project. see Commons:Project scope Lx 121 (talk) 03:13, 8 December 2010 (UTC)[reply]
    I chose my wording wrong, I meant Wikimedia Commons, not wikipedia. I was transitioning back and forth from both wikipedia to wmc continuously. Although I still agree with this policy. Creation7689 (talk) 20:20, 9 December 2010 (UTC)[reply]
    fair enough, it happens :) & yes, of course everyone is free to support or oppose the proposed policy. you may have slightly missed my point though; it wasn't meant as a technicality. wikimedia commons is different from wikipedia. our purpose here is different; we're not only collecting media materials for wikipedia, wikimedia et cie. we're collecting free and/or open-source media files (for educational purposes) for anyone & everyone to make use of. if all we wanted to do was stock up for wm projects, we wouldn't need a full standalone project for that; just a central co-ordination & storage service. we could also get rid of at least half the files we already have as redundant/unnecessary. it falls within our purview to have as wide & varied a collection of media file on here as possible. Lx 121 (talk) 01:01, 10 December 2010 (UTC)[reply]
  88.  Support - This proposed policy seems to be consonant with other wiki policies and meets the legal concerns of the Foundation. Wabbott9 (talk) 03:36, 8 December 2010 (UTC)[reply]
  89.  Support - This should have been on poll long ago. It's what one would call "common sense," IMHO. Jsayre64 (talk) 03:56, 8 December 2010 (UTC)[reply]
  90.  Support Well worded and reasonable, does not seem too restrictive or too permissive, and provides clear and concise guidance for things relating to sexual content. Ks0stm (TCG) 04:01, 8 December 2010 (UTC)[reply]
  91.  Support My support for this is uncontroversial to me but what objections I have are all debatable. My vote of support is without hesitation. Bluerasberry (talk) 04:30, 8 December 2010 (UTC)[reply]
  92.  Support - I support the many reasons previously stated for implementing this: moral, legal, etc. If left unchecked, Commons could easily become a repository and distribution platform for pornographic content.--JDIAZ001 (talk) 04:36, 8 December 2010 (UTC)[reply]
  93.  Support - I see this as being that missing last major policy that was needed to complete our policy 'canon'. I feel it was long needed and I generally support, but I do wish that the concerns raised by the opposers, if possible, can be alleviated and get any kinks ironed out first. -- OlEnglish (talk) 04:44, 8 December 2010 (UTC)[reply]
  94.  Support - Some form of control is needed; not perfect, but better than a weak control. If we wait for the perfect policy, abuses might continue to occur.-Mariordo (talk) 06:08, 8 December 2010 (UTC)[reply]
  95.  Support - Agree with comment by Jmabel (talk · contribs), above. -- Cirt (talk) 07:01, 8 December 2010 (UTC)[reply]
  96.  Support - Not perfect, needs a good copyedit and content tweak, but it's a start. Cindamuse (talk) 07:37, 8 December 2010 (UTC)[reply]
  97.  Support I'm all for it.TucsonDavid (talk) 08:10, 8 December 2010 (UTC)[reply]
  98.  Support Seems entirely consistent with the foundation's purposes and applicable laws. Jclemens (talk) 08:14, 8 December 2010 (UTC)[reply]
  99.  Support Perfect? No. Long overdue, yes. It is simply the right thing to do. --Ckatzchatspy 08:17, 8 December 2010 (UTC)[reply]
  100.  Support Not perfect, but I don't agree with any of the opposing vote reasons. VMS Mosaic (talk) 08:48, 8 December 2010 (UTC)[reply]
  101.  Support - If anything, this proposal is too lax. - Schrandit (talk) 09:17, 8 December 2010 (UTC)[reply]
  102.  Support - It does not necessarily need to start this project perfectly, there shall be criticisms against it; but I believe that this project shall enhance Wikipedia's sensibility on matters like this. Moreover, if possible, consider also the pleas of our colleagues who have opposed this plan. It is also for Wikipedia's good. - Thirdeyerevo (talk) 18:10, 8 December 2010 (GMT+8)
  103.  Support - Censorship should not take away from education. <3 bunny 11:22, 8 December 2010 (UTC)[reply]
  104.  Support - A step in the right direction. Esuzu (talk) 11:52, 8 December 2010 (UTC)[reply]
  105.  Support - Needed for ethical reasons. --Sachinvenga (talk) 12:04, 8 December 2010 (UTC)[reply]
  106.  Support - Unfortunate, but necessary. Beyond My Ken (talk) 12:24, 8 December 2010 (UTC)[reply]
  107.  Support - While I am loath to suggest that any form of censorship be introduced, I do agree that there should be some limits imposed for this particular topic. A brief glance through certain categories will reveal images that are not currently used in the project, probably never will be used in the project, and offer absolutely no value to the project other than serving as freely accessible pornographic content. --MacTire02 (talk) 12:48, 8 December 2010 (UTC)[reply]
  108.  Support - Indeed a good start.Lord Psyko Jo (talk) 16:21, 8 December 2010 (UTC)[reply]
  109.  Support looks altogether well; US law may be followed, as the root servers are in the US, and the US law is good. Hans Dunkelberg (talk) 17:06, 8 December 2010 (UTC)[reply]
  110.  Support Better than nothing, but IMO it shouldn't be restricted to sexual content; things like violence, hatred, disgusting stuff, subjectively offensive stuff (things that offend some people but not others), etc. - basically anything that a significant number of people object to seeing or having available online should also be judged under similar rules (so those types of things are only kept if they don't make Wikimedia break any laws and they positively contribute to at least one of the projects). Having said that, I haven't put much thought into this; there might be some important details I missed that would have made me change my mind and/or make other suggestions if I became aware of them. --TiagoTiago (talk) 18:01, 8 December 2010 (UTC)[reply]
  111.  Support - Wiki needs something, maybe others will see it as a step towards professionalism and start to take Wikipedia more seriously. Mike (talk) 12:35, 8 December 2010 (UTC)[reply]
  112.  Support Perhaps necessary; in any case a good thing. --Singsangsung (talk) 18:38, 8 December 2010 (UTC)[reply]
  113.  Support I'm ambivalent about the "out of scope" issue, but on balance I think the proposal is a net gain. The fact is that the servers are in Florida and subject to applicable law, and no amount of votes by editors can change that fact. --Tryptofish (talk) 18:52, 8 December 2010 (UTC)[reply]
  114.  Support Not perfect, but better than no policy. Can always be improved once in place. Net gain if adopted. Kim Dent-Brown (talk) 19:12, 8 December 2010 (UTC)[reply]
  115.  Support Reasonable. Grimhim (talk) 19:37, 8 December 2010 (UTC)[reply]
  116.  Support But needs some work in the future.
  117.  Support "Good enough" start; I would like to see some policy going forward that non-logged-in users (anonymous) cannot view Sexual Content and only persons who are > 13 years old can register an account (to comply with COPPA. Safety Cap (talk) 21:57, 8 December 2010 (UTC)[reply]
  118.  Support This looks like about as much as can be done. StAnselm (talk) 21:58, 8 December 2010 (UTC)[reply]
  119.  Support Seems like a sound enunciation of policy. ScottyBerg (talk) 22:27, 8 December 2010 (UTC)[reply]
  120.  Support - Tiptoety talk 22:30, 8 December 2010 (UTC)[reply]
  121.  Support - Absolutely. We've needed this for a while. Nolelover (talk) 22:50, 8 December 2010 (UTC)[reply]
  122.  Support; although I remain mystified why people have such strong reactions to content that is of sexual nature and yet fail to blink an eye at strong violence. Meh. This policy draft is reasonable, if necessarily imperfect, and is a good foundation. Coren (talk) 00:19, 9 December 2010 (UTC);[reply]
  123.  Support Venustas 12 (talk) 01:10, 9 December 2010 (UTC)[reply]
  124.  Support – We need a policy on this content, and this one deals with it fine. MC10 (talk) 01:46, 9 December 2010 (UTC)[reply]
  125.  Support There definitely needs to be a policy in place here. This one may need some tweaking, eventually, but it is a sufficient starting point, at least... Sebastiangarth (talk) 02:52, 9 December 2010 (UTC)[reply]
  126.  Support - This policy has a VAST NET POSITIVE effect. I am so sick of people uploading and hosting pictures of their penis for no encyclopedic purpose. Aeonx (talk) 04:43, 9 December 2010 (UTC)[reply]
  127.  Support - Regardless of the other good reasons to adopt this, it addresses an issue one sees in the pop-culture fields in particular: a tendency to see Wikipedia as free server space for fan sites or, as one other person here put it, amateur porn sites. Wikipedia is not free server space for anything and everything. There have to be limits. Why not clearly establish them? --207.237.230.157 05:14, 9 December 2010 (UTC)[reply]
    "Commons is not an amateur porn site" is the name of an official guideline which is already clearly established. AerobicFox (talk) 02:33, 12 December 2010 (UTC)[reply]
  128.  Support It needs work in my opinion but its a good start. We need a good strong policy on content of this nature to not only protect children but to protect the Wikimedia Foundation--Dcheagle (talk) 05:16, 9 December 2010 (UTC)[reply]
  129.  Support Not perfect, but better than no policy. Gérard Janot (talk) 07:59, 9 December 2010 (UTC)[reply]
  130.  Support policy is greatly needed; too many exploit this to make a point.... 77.97.18.242 10:10, 9 December 2010 (UTC)[reply]
    1.  Support it's the start we need, but definitely not completed. --Cvf-ps (talk) 11:26, 9 December 2010 (UTC)[reply]
  131.  Support It's something we need to address; Commons exists because of the freedoms the US law provides us to operate within Gnangarra 12:18, 9 December 2010 (UTC)[reply]
  132.  Support I fully support this stance.— Preceding unsigned comment added by 212.175.44.2 (talk • contribs) 13:03, 9 December 2010
  133.  Support A good compromise to start with, certainly not perfect but no policy is. --Captain-tucker (talk) 14:43, 9 December 2010 (UTC)[reply]
  134. I completely support this policy. People will only be using these dirty pages for something personal. Not for a project or research. (Zach attack)
  135. I  Support this well thought out step in the right direction. "Slippery slope" arguments in opposition seem awfully political. --SB_Johnny talk 16:33, 9 December 2010 (UTC)[reply]
    "Awefully political" can still be (and in this case is) correct. Beta M (talk) 17:32, 9 December 2010 (UTC)[reply]
    Indeed. See for example vote no. 119 in this group for an example of how slippery the slope is. --Saddhiyama (talk) 17:40, 9 December 2010 (UTC)[reply]
  136.  Support --Kjetil_r 17:35, 9 December 2010 (UTC)[reply]
  137.  Support implementation of this policy as written. Kelly (talk) 17:36, 9 December 2010 (UTC)[reply]
  138.  Support because "like it..." Rursus (talk) 17:43, 9 December 2010 (UTC)[reply]
  139.  Support
    1. As stated, Wiki Commons is a "media resource for encyclopedia articles..."
    2. Information, in all its forms, should never be censored.
    3. Wiki Commons is not a "porn site, and ... images that do not contribute anything educationally useful may (should) be removed." gatorgirl7563 | 173.65.66.133 18:01, 9 December 2010 (UTC)[reply]
  140. Reluctant  Support I suppose it's better than nothing at all, but I still think we should delete all of it, for the reason I stated below. --The High Fin Sperm Whale 20:02, 9 December 2010 (UTC)[reply]
  141.  Support - obviously not perfect, but just as obviously necessary, and well thought out. PrincessofLlyr royal court 21:30, 9 December 2010 (UTC)[reply]
  142.  Support Commonsense. Johnuniq (talk) 21:57, 9 December 2010 (UTC)[reply]
  143.  Support Long overdue. Of course, voting on these projects is pointless, as achieving consensus is no longer possible. Jennavecia (Talk) 22:05, 9 December 2010 (UTC)[reply]
  144.  Support Keep Wikipedia clean! --Wingdude88 (talk) 00:46, 10 December 2010 (UTC)[reply]
  145.  Support A good idea in my opinion. Kithira (talk) 01:10, 10 December 2010 (UTC)[reply]
  146.  Support This is a sensible policy, most parts are common sense, and the guidelines as to what constitutes sexual content are a good rule of thumb.  -- Lear's Fool 01:21, 10 December 2010 (UTC)[reply]
  147.  Support Summarizes the consensus reached after much heated debate and hard work and accepts the legitimacy of sex education. --Simonxag (talk) 01:30, 10 December 2010 (UTC)[reply]
  148.  Support Farmercarlos (talk) 01:58, 10 December 2010 (UTC)[reply]
  149.  Support I also express my full support. 67.255.5.118 02:43, 10 December 2010 (UTC)[reply]
  150.  Support Sounds pretty well thought out to me. -98.230.81.0 03:21, 10 December 2010 (UTC)[reply]
  151.  Support in full. If you go and browse Category:Nudity and its sub-categories, you will likely find that there are a lot of nude photos that are totally unnecessary and not needed. People are using Wikimedia Commons as a repository for pornography, and that needs to stop, and violators blocked and/or warned. Having a few (and I said just a few) photos for educational purposes (e.g. on Wikipedia) is OK, but when a photo covers an angle that is already covered by other pictures, it should be deleted as unneeded (and possibly porn, if that was the intention of the uploader, and I have seen photos on here where it probably was). [|Retro00064|☎talk|✍contribs|] 04:07, 10 December 2010 (UTC)[reply]
  152. Strong support. This is a good step in the right direction. Bob the Wikipedian (talk) 04:11, 10 December 2010 (UTC)[reply]
  153.  Support It isn't perfect but nothing ever is. It is a much needed policy and does not restrict or permit too much in either direction.Matrixmania (talk) 05:59, 10 December 2010 (UTC)[reply]
  154.  Support It will be good to see if this policy works out once (if) it is implemented. It is about time that a policy like this has been proposed for Wikimedia and its sister sites, but we must see about its effectiveness. --Ano-User (talk) 07:02, 10 December 2010 (UTC)[reply]
  155.  Support, and really, screw consensus on this and use good common sense. It's a needed policy. Strange Passerby (talk) 11:44, 10 December 2010 (UTC)[reply]
  156.  Support I express my support for this cause as long as the content is used for educational purposes and anything that breaches the policy is immediately removed. — Preceding unsigned comment added by Borisagrees (talk • contribs) 07:25, 10 December 2010 (UTC)
    What a package, calls for censorship and authoritarian non-consensus decisions all for the price of one! You know, such statements are not exactly doing your cause any favours. --Saddhiyama (talk) 12:59, 10 December 2010 (UTC)[reply]
    In condemning him you have inadvertently accused all supporters of this policy of "calling for censorship". That is also out of line. 99of9 (talk) 13:08, 10 December 2010 (UTC)[reply]
    Let me rephrase: With the "out-of-scope" clauses and the "Miller test" the policy has the built-in potential to result in decisions that could be described as censorship. --Saddhiyama (talk) 13:27, 10 December 2010 (UTC)[reply]
  157.  Support We are writing an encyclopedia, folks. Non-educational material isn't wanted anyway. -- Marie Poise (talk) 13:36, 10 December 2010 (UTC)[reply]
  158.  Support A good start, I think, though I'd personally like it a bit more strict. Take what you can get. In response to the "imposing American values/mores/standards" comments, they're a total non-issue because the question is a legal one, not a cultural one. The policy in its current form clearly states that Wikimedia's offices and servers are in the US, and thus it must first and foremost conform to US law. If they were somewhere else, the discussion would obviously be different. End of discussion. White whirlwind (talk) 13:50, 10 December 2010 (UTC)[reply]
  159.  Support Bejinhan talks 13:56, 10 December 2010 (UTC)[reply]
  160.  Support This has been far too long in coming and will finally garner Wikipedia some well-deserved respect that has been lost by sneering youngsters uploading pictures of their own dicks "for the lulz." Encyclopedia Dramatica, which I admit I read, had a mocking banner ad of Jimmy which read, "Help Wikipedia: Upload a picture of your genitalia." I laughed hard, but if I'm to take things seriously and feel good contributing to the community, this needs to be approved as policy. PBF1974 (talk) 14:17, 10 December 2010 (UTC)[reply]
  161.  Support - Kmusser (talk) 15:24, 10 December 2010 (UTC)[reply]
  162.  Support The policy seems to have covered the legalities involved. Koman90 (talk), Network+ 16:19, 10 December 2010 (UTC)[reply]
  163.  Support Most of this is common sense, and needs to be in writing as a formal policy.--Ianmacm (talk) 20:20, 10 December 2010 (UTC)[reply]
  164.  Support I wish that Commons would not have to deal with the laws of a particular country, but what can you do... Europrobe (talk) 17:14, 10 December 2010 (UTC)[reply]
  165.  Support As long as there is recourse for people to go to, since there could be political or religious reasons to try to delete actual artwork or other material of interest, which could attract a disproportionate number of (possibly organized) people who want to delete something. 71.163.215.2 20:38, 10 December 2010 (UTC)[reply]
  166.  Support I completely agree that we should not be an "amateur porn site", and while this policy isn't perfect, it is a good start. Ajraddatz (talk) 23:30, 10 December 2010 (UTC)[reply]
  167.  Support I completely support sexual content as long as there is regulation. --TX55TALK 00:46, 11 December 2010 (UTC)[reply]
  168.  Support It seems a good start. Lhynard (talk) 00:52, 11 December 2010 (UTC)[reply]
  169.  Support -- Swtpc6800 (talk) 01:30, 11 December 2010 (UTC)[reply]
  170.  Support Tired of the "slippery slope" cries. This is a good starting point. Niteshift36 (talk) 02:50, 11 December 2010 (UTC)[reply]
  171.  Support It would be a good policy, especially since the servers are in the U.S., and it would protect the innocent. --Funandtrvl (talk) 02:57, 11 December 2010 (UTC)[reply]
  172.  Support There is no real reason to have it on here. WMF wikis are not free hosting. They can take their porn elsewhere. Hamtechperson (talk) 03:02, 11 December 2010 (UTC)[reply]
  173.  Support This conforms with the state/country laws (which I support) where the site is hosted. More importantly, it keeps focus on the site's purpose. SeoMac (talk) 03:06, 11 December 2010 (UTC)[reply]
  174.  Support This policy very clearly defines the scope of sexual content on Commons. I completely agree with the proposed policies for keep/delete, especially as long as the servers are in the US. B Fizz (talk) 03:15, 11 December 2010 (UTC)[reply]
  175.  Support Herostratus (talk) 03:56, 11 December 2010 (UTC)[reply]
  176.  Support Many images in commons are of little educational value. And yes, WP _must_ not be used as a porn repository. Sibi antony (talk) 05:45, 11 December 2010 (UTC)[reply]
  177.  Support Guideline is a good start. -Fnlayson (talk) 13:52, 11 December 2010 (UTC)[reply]
  178.  Support to avoid flooding the site with images which are not needed. Aberforth (talk) 13:55, 11 December 2010 (UTC)[reply]
  179.  Support Given applicable laws and our goals, we need rules here, and these are good rules. CWC(talk) 16:56, 11 December 2010 (UTC)[reply]
  180.  Support to avoid any inappropriate content. I agree with all of the users in this section, and this is a way to block any sexual content on Wikipedia. Another reason I support this policy is for educational purposes. No children should see any sexual content on the website. --Impala99 (talk) 18:19, 11 December 2010 (UTC)[reply]
  181.  Support This is long overdue. Some changes have been made since I first contributed in April however I trust the process. I'm glad this is finally happening. - Stillwaterising (talk) 18:51, 11 December 2010 (UTC)[reply]
  182.  Support Maximum flexibility and openness should be encouraged without creating sections which are better served in an entertainment venue. Jettparmer (talk) 20:24, 11 December 2010 (UTC)[reply]
  183.  Support This site is not an indiscriminate collection of unregulated porn images. Punched Cards (talk) 21:59, 11 December 2010 (UTC)[reply]
  184.  Support We need a policy on such content. It might not be perfect, but much better than having no policy at all. -- Orionisttalk 22:13, 11 December 2010 (UTC)[reply]
  185.  Support It's good to have policy definitions on Sexual Content because of the 'open' nature of Wikimedia as a whole. It's too easily abused, and without at least some defined guideline it's left up to the interpretation of an editor - which isn't always right. TBockman (talk) 00:17, 12 December 2010 (UTC)[reply]
  186.  Support Commons should be the database for Wikipedia articles, not something else. --El bes (talk) 00:54, 12 December 2010 (UTC)[reply]
    now this is an interesting opinion; maybe we should stop trying to make commons into anything more than a storage depot for wikipedia media files. comments, anyone? Lx 121 (talk) 01:04, 12 December 2010 (UTC)[reply]
    About as likely as the hopes of opposers who say this is biased because Commons should not be hosted in the US and thus subject to US law. --99of9 (talk) 02:47, 12 December 2010 (UTC)[reply]
    What I wanted to say is: if potentially offending files, that are of no actual or potential use for any wikipedia article, will be deleted, I won't shed a tear for them. --El bes (talk) 01:17, 13 December 2010 (UTC)[reply]
  187.  Support This appears to be a reasonable summary of policy, and as others have said this may not be perfect, but it makes a good starting point. Jezhotwells (talk) 01:17, 12 December 2010 (UTC)[reply]
  188.  Support Not sure whether the policy is perfect but any issues that arise can be dealt with later.--Lincolncooper (talk) 03:50, 12 December 2010 (UTC)[reply]
  189.  Support Some people may think that this policy is censorship, and I'm a person who dislikes censorship, but this policy is needed to protect people's rights, to prevent Commons hosting images which are rightfully illegal, and to address the very much unregulated sexual content on Commons. Like all policies on the Wikimedia projects, it may not be perfect, but over time it will improve, as will the content hosted here. Bidgee (talk) 03:57, 12 December 2010 (UTC)[reply]
    the thing is, we already have policy for that. it was never permitted to host illegal materials on commons, & suitability of materials (including sexual content) is governed by Commons:Scope, etc. Lx 121 (talk) 06:20, 12 December 2010 (UTC)[reply]
    Maybe so, however Commons:Sexual content makes it much clearer as proposed policy, whereas Commons:Scope doesn't go into detail. One other issue which Commons:Sexual content clears up is what counts as illegal, as the law in Florida, where the servers are hosted, is very different from the laws in other states, territories, counties and countries in which the uploader may be located (some being stronger or weaker than Florida's laws). Bidgee (talk) 10:18, 12 December 2010 (UTC)[reply]
  190.  Support What I read isn't only reasonable, it seems common sense to me. As common sense is rarely as common as it should be, it should be stated and this states it well. There's nothing draconian about it and, if anything, it may need stronger wording in future editions. Pandarsson (talk) 07:56, 12 December 2010 (UTC)[reply]
  191.  Support The draft isn't perfect, and as others have pointed out, the draft has some redundancies in that it duplicates what is stated in other policies. Nonetheless, having this policy is better than not having any policy on sexual content. It would be great to have a policy that one can directly reference whenever there is some controversial sexually explicit material uploaded. Muntuwandi (talk) 09:25, 12 December 2010 (UTC)[reply]
  192.  Support as Bidgee. Amada44  talk to me 11:33, 12 December 2010 (UTC)[reply]
  193.  Support Needed for ethical reasons, legal reasons, and prevents the site from becoming a porn repository. TechsMechs4 (talk) 12:22, 12 December 2010 (UTC)[reply]
  194.  Support People are using Commons as a place to upload images of their privates. We only need a small number of images on a certain topic (like Category:Nude cooking is right now with 3 quality images). So let's not be censored, select the best 5 (or less) images that belong, and get rid of the rest. Commons is not a porn website and it needs to comply with U.S. law. There are plenty of other websites for people to get their titillation. I've seen too many low-quality penis pictures that don't belong. Royalbroil 13:49, 12 December 2010 (UTC)[reply]
  195.  Support We need a policy to avoid nonsense. --Fox1942 (talk) 14:24, 12 December 2010 (UTC)[reply]
  196.  Support WikiMedia Commons isn't a porn site, there are plenty of those elsewhere on the internet. Forcing it to move outside the US is ridiculous for this. Eraserhead1 (talk) 15:00, 12 December 2010 (UTC)[reply]
  197.  Support Not perfect, but a good start, and it looks like we need something more explicit for people who like explicit things. Iluvalar (talk) 15:21, 12 December 2010 (UTC)[reply]
  198.  Support. as others noted, this does not appear to be perfectly good, but it's clearly an improvement. there is no need to store sexual content outside the scope of educational purposes. —Pill (talk) 15:54, 12 December 2010 (UTC)[reply]
  199.  Support as precaution. - Al Lemos (talk) 17:31, 12 December 2010 (UTC)[reply]
  200.  Support Just because sexual content may be considered vulgar/inappropriate other places, does not mean that it is here. Any excessive images are promptly deleted, lets get this as a policy. Flightx52 (talk) 17:53, 12 December 2010 (UTC)[reply]
  201.  Support Sexual content being allowed would make Wikipedia better because articles talking about sex and human anatomy could have pictures to go along with them. --205.250.214.224 18:35, 12 December 2010 (UTC)[reply]
  202.  Support This is a good step in the right direction. --Danh (talk) 18:49, 12 December 2010 (UTC)[reply]
  203.  Support Excessive images are not needed, so keep the content at an acceptable level. Mingomongo (talk) 19:03, 12 December 2010 (UTC)[reply]
  204.  Support. Commons should not be abused and flooded with inappropriate content. If a certain image is needed for an article, why can't it be uploaded to a given Wikipedia according to that Wikipedia's policy? --Michael Romanov (talk) 21:48, 12 December 2010 (UTC)[reply]
  205.  Support. Common-sense approach to a difficult, challenging topic. Content should be within scope and relevant. -- btphelps (talk) (contribs) 22:19, 12 December 2010 (UTC)[reply]
  206.  Support, necessary and a good enough start. Important to prevent abusive edits. | Moemin05 (talk) 00:37, 13 December 2010 (UTC)[reply]
  207.  Support We need a strong policy on sexual content. There is no reason for Commons to be a porn shack, and there's no reason to open ourselves up to legal action for unencyclopedic content. --JaGa (talk) 01:13, 13 December 2010 (UTC)[reply]

Oppose

  1.  Oppose In India, sexual content is not allowed. Thousands of school students are regular visitors, and it affects their surfing habits. Biswarup Ganguly 11:45, 12 December 2010 (UTC)
    Are you saying that Commons must host porn because it is illegal in India? Are you saying that the WMF should encourage people to break their laws? How does that make sense? Ottava Rima (talk) 17:10, 12 December 2010 (UTC)[reply]
  2.  Oppose I agree with the statement below. 207.114.16.194 18:59, 11 December 2010 (UTC)[reply]
  3.  Oppose This is a very slippery slope. Hence, my two immediate objections are: age of visitors, and policing. Neither can be monitored and enforced.Ineuw talk page on en.ws 07:23, 5 December 2010 (UTC)
    There's nothing here about the age of visitors, only the subjects of the photography. And of course we have no way to determine a visitor's age, even if we wanted to play parent. Wnt (talk) 04:49, 6 December 2010 (UTC)[reply]
  4.  Oppose for this revision. It should define more clearly what the educational purpose is, and what kind of images must not be deleted due to this issue. I also cannot get the point here: COM:SEX#Normal deletions: "Items likely to fall within the Wikimedia Commons project scope should not be deleted without using the normal deletion process unless they are copyright violations or illegal to host. These include: ..." Is it equivalent to: "If an item falls within the Wikimedia Commons project scope as listed below there, there is still a possibility for it to be asked for deletion and to be deleted due to the sexual content issue, only if we do have the normal deletion process."? If so, I simply can't agree on it. --Tomchen1989 (talk) 07:39, 5 December 2010 (UTC)[reply]
     Comment “Likely” refers to borderline cases, where the nominator thinks the image is outside project scope. The section is about deleting prohibited content. It could of course be written clearer. --LPfi (talk) 08:36, 5 December 2010 (UTC)[reply]
    It is a loophole. It would be better as: "Items likely to fall within the Wikimedia Commons project scope should not be deleted without using the normal deletion process unless...". Otherwise, things that fall within the scope cannot be immediately deleted, but the guardians can still request their deletion via a normal deletion process, which would be annoying even if the item is finally kept. But even if the sentence is fixed here, it should be more specific about handling non-photo images. --Tomchen1989 (talk) 09:30, 5 December 2010 (UTC)[reply]
    Respectfully, I think you are misunderstanding the real meaning of this section. The key meaning of this text is that any admin can speedy delete any sexual content based entirely on their own individual opinion about whether the item is "out of scope", without any discussion or debate process. Lx 121 (talk) 14:16, 7 December 2010 (UTC) 14:12, 7 December 2010 (UTC)[reply]
    I should note that the situation that most contributed to the current policy on this is the "Korper des Kindes" series which was undeleted after the Jimbo Wales purge. The photos were taken in the 1890s by en:Guglielmo Plüschow, a notable photographer about whom something like ten books have been written, whose old castle apparently is now a museum of his photos in Germany which is open to the public. So it's clearly educational, artistic, historically significant content. It also looks rather like child pornography, and apparently Pluschow was actually convicted of "procuring" young boys, though in keeping with the laid-back sexual mores of the Victorian era he was only sentenced to a few months in jail. So what do you do in a case like that? The answer people favored in discussion is that either (1) someone has to convince the WMF that the image is actually illegal and they need to make an office action right away (which no policy on Commons can prevent), or (2) we have a proper deletion discussion and see what the community has to say about the legal issues. Wnt (talk) 05:24, 6 December 2010 (UTC)[reply]
  5.  Oppose IMO mature content, sexual content, whatever you name it, should not be deleted just because it displays human genitals or intercourse. The line for educational use is blurred. My suggestion is that such images be tagged as such and be hidden from non-registered users. I added Category:Images_from_15th_century_sexual_book_in_Iran. The content is not deleted. But I believe users have the right to be warned before seeing any mature content. They should also have the right to choose the content types they might find inappropriate to view. The settings should be in user preferences. I suggest tags such as: HumanMaleGenitalia, HumanFemaleGenitalia, HumanSex... etc. --Nevit Dilmen (talk) 08:12, 5 December 2010 (UTC)[reply]
     Comment The proposed policy does not even suggest that anything should "be deleted just [because] it displays genitals or intercourse". Also, the educational requirement of COM:SCOPE is always blurry; we're not proposing any changes to the deletion review process for such blurry issues. Finally, your book uploads are explicitly protected from deletion without discussion as they are historical artworks. Can you please explain what you oppose about it? 99of9 (talk) 11:03, 5 December 2010 (UTC)[reply]
  6.  Oppose No need for a policy on this topic, case by case, DR by DR is better in my opinion. - Zil (d) 08:53, 5 December 2010 (UTC)[reply]
  7.  Oppose. The proposed policy is just an example of m:instruction creep. Approximately one third of it only repeats what is already in other guidelines and policies (illegal content, copyvios and BLP violations). The other third is devoted to vague and useless interpretations of various US federal laws. These interpretations are not written by lawyers and may be misleading. The remaining third contains some rather explicit descriptions of sex acts, which the authors of the policy think are inappropriate for Commons. I am interested whether this proposed policy itself should be marked by some kind of a "sexually explicit" banner? Ruslik (talk) 10:05, 5 December 2010 (UTC)[reply]
  8.  Oppose per Zil and Ruslik. Also I'm afraid that this policy will be abused by certain people to get rid of pictures they don't like. Multichill (talk) 11:50, 5 December 2010 (UTC)[reply]
  9.  Oppose Images should be judged on their possible educational value regardless of interpretations of what "sexual content" may or may not mean to different cultures, religious groups or fringe lobby groups. -- (talk) 10:52, 5 December 2010 (UTC)[reply]
  10.  Oppose This is a slippery slope. In future it may become harmful for Commons! — Preceding unsigned comment added by Amit6 (talk • contribs)
  11.  Oppose If photos depicting sexual activity can be uploaded for “educational purposes”, it will only be a short time before photos constituting pornographic material are uploaded, with their uploaders defending how this pornographic material is "necessary for Wikicommons success” or how it is "usable educational material". Aaaccc (talk), 5 December 2010 (UTC)
    Permitting photos depicting sexual activity to be uploaded is the status quo, and the history of this policy shows that no policy that bans that will pass. Opposing this policy does not advance your goals in that direction one bit.--Prosfilaes (talk) 18:02, 5 December 2010 (UTC)[reply]
    I am opposing this policy because I do not believe it would provide a clear enough distinction between "educational material" and "pornographic material". Aaaccc (talk), 5 December 2010 (UTC)
    To be clear, yes, this policy does allow uploads of pornographic material, unless it's actually illegal in the U.S., or lacks consent, or lacks any educational merit. It accepts that there is considerable overlap between what is pornographic and what is educational. Wnt (talk) 05:00, 6 December 2010 (UTC)[reply]
  12.  Oppose, but only because some of the wording in the policy is too vague as to be unenforceable. I'm specifically referring to the sentence low-quality pornographic images that do not contribute anything educationally useful may be removed in the lede. The term "low quality" is subjective; by what standards do we determine something is of low quality versus high quality? Does it mean some college kid who uploads fuzzy, low-res images of his genitals? Or does any image that someone does not like suddenly become "low quality"? And how do you determine whether something is useful educationally? Almost any image taken and presented in good faith has the potential to be educationally useful in some manner. This lede sentence leaves way too much up to subjective reasoning, individual bias and lobbying by people with agendas. A good policy should always be as objective as possible to eliminate these issues. Otherwise, I completely agree with the legal stuff and that Commons should not devolve into a webhost for porn (which is a real concern). The Garbage Skow (talk) 17:33, 5 December 2010 (UTC)[reply]
    That phrase came directly from COM:PORN, which was the existing policy. Apparently it was based less on ideology than on the tendency of some people to take quick snapshots of their penises to post here, leading to a glut of images without a particular use. This is presumably based on the policy that "snapshots of yourself and your friends" aren't wanted. Wnt (talk) 05:47, 6 December 2010 (UTC)[reply]
  13.  Oppose per Ruslik. AaronY (talk) 17:39, 5 December 2010 (UTC)[reply]
  14.  Oppose, strongly - slippery slope, censorship, the same reasons that have been rehearsed every time this comes up.--ukexpat (talk) 18:37, 5 December 2010 (UTC)[reply]
  15.  Oppose, strongly - illiberal and will lead to censorship, would be misused to get rid of "offensive" material --HaTe (talk) 21:29, 5 December 2010 (UTC)[reply]
    # Oppose This violates no rules, except one. OK, so we don't get to have a few instances of ancient art. IMO this is no big deal, compared to getting in the BBC News and probably costing thousands of dollars in donations. And like Ineuw said, it is a slippery slope. Where do you put the line between allowing uploads of art and pictures from a college party? Why do we do so much damage to the Wikimedia Foundation under the banner of "Commons is not censored", and for what? What a waste to throw away a respected encyclopedia's reputation to have a few porn images. --The High Fin Sperm Whale 21:03, 5 December 2010 (UTC)[reply]
     Comment Whale, that's a reason to have a policy that controls the upload of specifically sexual imagery, not a reason not to have one. --Elen of the Roads (talk) 23:10, 5 December 2010 (UTC)[reply]
    I'm afraid I was somewhat misled as to the nature of this policy. I shall change my vote. --The High Fin Sperm Whale 20:02, 9 December 2010 (UTC)[reply]
  16.  Oppose The policy itself would be welcome, but what Ruslik said is true: too many redundant instructions. --grin ? 21:26, 5 December 2010 (UTC)[reply]
  17.  Oppose Illegal content was (and will be) deleted; the current out-of-scope rules are enough without special XXX rules. Rbrausse (talk) 21:32, 5 December 2010 (UTC)[reply]
  18.  Oppose Also due to "low-quality pornographic images that do not contribute anything educationally useful may be removed". "Low quality" is too subjective, and I have argued that some questionable images are of value since everyone reads an article differently. Cptnono (talk) 21:42, 5 December 2010 (UTC)[reply]
    Follow-up: It looks like a few people are concerned about the low quality bit. I would consider supporting if that part of the line was removed.Cptnono (talk) 00:43, 7 December 2010 (UTC)[reply]
    I think "low quality" as a criterion serves a real purpose: it's a subjective standard for people to discuss in deletion requests, which helps avoid mere voting and also puts the question more in the hands of the community than some objective measure(s) of quality might. That and "educational purpose" are both better left subjective for discussion by design. 71.198.176.22 02:33, 8 December 2010 (UTC)[reply]
  19.  Oppose - bad idea the last time we tried it, worse idea now. this thing is badly written. also how did out-of-scope get snuck into the criteria for speedy deletion!? that was not on the table, the last time i checked into this long-running conversation Lx 121 (talk) 00:37, 6 December 2010 (UTC)[reply]
     Comment It didn't. The proposal requires "obviously out of scope", not "out-of-scope". Trusting administrators to be able to make a call in cases that are a long way from grey seems eminently sensible to me. If an admin starts getting this wrong, they can be chided. --99of9 (talk) 12:12, 8 December 2010 (UTC)[reply]
  20.  Oppose No way we put "out of scope" as a criterion for speedy deletion: "being in scope" is something that always, mandatorily, should be discussed by the community before deleting; a single admin's evaluation can't and will never be enough. Apart from that it seems, more than a policy, a vague rant to subtly allow censorship. No way. And to the guys complaining above about possibly lost donations: we're not here for the money. We're here for a mission about free knowledge. Wikipedia could get billions by biasing its Coca Cola article to look sympathetic to them, but guess what? We don't. We have principles. Rejecting censorship is one of them. --Cyclopia (talk) 00:52, 6 December 2010 (UTC)[reply]
  21.  Oppose We have principles. Rejecting censorship is one of them. Next thing they will ask to censor images that might be offensive to some people. -- The Egyptian Liberal (talk) 01:12, 6 December 2010 (UTC)[reply]
  22.  Oppose As already said, it needs re-drafting, & "out-of-scope" is NOT a reason for speedy deletion. Archolman (talk) 01:14, 6 December 2010 (UTC)[reply]
  23.  Oppose For it to be a policy it needs to be unambiguous and clear. For a non-US resident it appears to be neither. The Dost test and Miller test seem to be important to US citizens. The Dost article's section on case law is without clear conclusions, and the Miller article lacks citations at crucial points. The Miller test appears flawed in not defining who the panel of jurors should be. The text above shows considerable mission creep on definitions of scope and deletion, and could be used as a future precedent for unrelated policies. All the other definitions seem a little prissy but acceptable, but until the other issues are tightened up it must be an oppose.--ClemRutter (talk) 02:11, 6 December 2010 (UTC)[reply]
    Dost and Miller are important to the Wikimedia Foundation, because those are the laws that it has to abide by.--Prosfilaes (talk) 02:58, 6 December 2010 (UTC)[reply]
    Follow the links and look at the articles as if it were the first time you have seen them: what were the conclusions drawn from the Dost case law? The Miller test (or three-pronged test) is littered with citation needed tags. To quote the article: For legal scholars, several issues are important. One is that the test allows for community standards rather than a national standard. Those are the facts we need to use in forming a judgement. Miller is incredibly parochial, and it is being proposed that the global Wikipedia deletion policy should be determined by a village hall meeting in a place we have never heard of. While this is a legitimate concern for the staff, it is no way to form a policy. I am tempted to suggest that if the infrastructure could not support our servers in terms of bandwidth etc., they would be moved, and in the same way, if our content cannot be supported because of local legal constraints, the answer is to hire a lorry.--ClemRutter (talk) 11:18, 6 December 2010 (UTC)[reply]
  24.  Oppose per Zil and Ruslik. --nsaum75¡שיחת! 05:00, 6 December 2010 (UTC)[reply]
  25.  Oppose Some rhetorical questions: Do we have a policy that facilitates deletion of certain culturally offensive imagery? No? Do we have a policy that facilitates deletion of certain imagery of violence or murder? No? Not even if the perpetrator or victim of the violence requests deletion 40 years later? Shall it become OK to flood Commons with images DEPICTING AND GLORIFYING HATE AND DEATH BUT NOT LOVEMAKING? NOT ON MY WATCH! What started as obviously vile, anti-sex, agenda-driven instruction creep became an innocuous-looking, almost-reasonable-sounding but not-quite-right proposal, due to some edits by talented and well-meaning editors. But no, I will never accept the argument that it's OK to show children how adults can kill each other and hate each other, but not how adults can love each other. Because it makes no sense. Think about it. Amen. --Walks on Water (talk) 06:48, 6 December 2010 (UTC)[reply]
  26.  Oppose per Ruslik and for the "Material obviously outside of scope" speedy deletion: who decides what is "obviously outside of scope"? Last spring I saw tens of ancient drawings, illustrations and good-quality pictures of not-so-nude women, many in use, speedy deleted as "out of scope" by different admins. Leaving speedy deletion based on the subjective opinion of a few people about what is in scope/out of scope is, IMO, only a Trojan horse for new mass deletions. To remove illegal images we don't need a policy (they are illegal, so we can speedy them as we do with copyvios), and for the low-quality images and nude pics we already have COM:PORN and COM:NUDE. As for consent, people who upload images without consent will probably have no problem lying about it (is the disclaimer about copyright perhaps blocking all the copyright violations?). --Yoggysot (talk) 07:14, 6 December 2010 (UTC)[reply]
    The speedy deletion for out of scope material is subject to a long list of exceptions, which hopefully cover the things you describe. Apparently there's some argument for downloading low-quality penis snapshots that have been uploaded in remarkable abundance. However I would not mind removing the out of scope speedy deletion entirely; the policy as written reflects the balance of a fairly small number of editors who have been working on it and discussing it. Wnt (talk) 13:05, 6 December 2010 (UTC)[reply]
    The problem here is trust. "Out of scope" was used as justification for deleting images with a clear educational value that were used on Wikipedia. We had to spend hours of our time to trace the images that were deleted, and humbly ask for undeletion. It was extremely frustrating and humiliating. It can happen again. Ankara (talk) 13:35, 6 December 2010 (UTC)[reply]
    As Ankara said, in the past we have had evidence of a very wide interpretation of "out of scope" in sexual topics. Certainly, after the speedy deletion the list of exceptions could be used to request (and probably obtain) an undeletion, but then wikipedians must check the delinker log, then go to every page to reinsert the image, etc... and the next day it can be speedy deleted again without debate only because someone thinks that "sex" = "out of scope", and so on... For example, if someone reads the study of controversial content and uses its recommendations as a yardstick to apply this policy, he can speedy delete the "vast majority of" the "over 3,000 images on Commons (by our count) in various categories and sub-categories around “Female toplessness” and “Nude women”", because "they are out of scope", resulting in a mass deletion worse than the Jimbo anti-sex war of this spring. IMO it's better to decide if an image is "out of scope" in the normal deletion process, as happens with all non-sexual images. I think that a guideline (or a series of guidelines) explaining how to categorize and write the descriptions of all the files about controversial topics (sex, nudity, violence, religion, etc.) can help, but I don't see the need for a preferential route to deletion for non-copyvio and non-illegal sex-related pics (or sex-related files in general).--Yoggysot (talk) 03:22, 7 December 2010 (UTC)[reply]
  27. --Elian Talk 12:25, 6 December 2010 (UTC)[reply]
  28.  Oppose. Good honest effort by lots of great people, but ultimately I don't want a special policy for sexual content-- fairness dictates that whatever solution we create should apply equally to all cultures' taboos, not just Western cultural taboos. Also, out-of-scope speedy deletion was tried and it failed. Jimmy showed conclusively that no one editor can discern scope when he speedy-deleted all that art. The founder wasn't able to use out-of-scope-speedy-delete responsibly, there is no way all our admins should be trusted with a call like that. Lastly, Obscenity is a legal term, for use by lawyers. Neither the average reader nor the average admin can be tasked with deciding obscenity. The colloquial use of "Obscene" is very different than the legal definition, which is infinitely more narrow. Putting an 'obscenity test' inside a policy would be a very bad idea, requesting, in effect, that our editors reach legal conclusions without any legal training. --Alecmconroy (talk) 13:36, 6 December 2010 (UTC)[reply]
    •  Comment Honest effort, okay, maybe, but good is something else. Imho, obscene is a religious judgement akin to sin. Imho, we should not even be discussing it as a cause for refusing material. Of course, the personal rights of those depicted must be respected. Thus:
  29.  Oppose--77.182.18.173 11:48, 7 December 2010 (UTC)[reply]
  30.  Oppose (a) It's plain unnecessary. (b) Since we're an international and intercultural site, using US/Florida law is not acceptable. The mere suggestion of valuing it above all others in the world is obscene and fascist. (c) Servers and office space can be had almost everywhere on earth, so if we have materials challenged by some place's law, we can simply store and serve them elsewhere. (As a side note, the US government escapes its own law in Guantanamo, why not act likewise?) (d) Obscenity is too variable a concept, you cannot build on it. --Purodha Blissenbach (talk) 10:34, 6 December 2010 (UTC)[reply]
    Wikipedia's servers are located in the US state of Florida, therefore Wikipedia is specifically and only subject to Florida and US law. Wikipedia already works this way, anything against the law in Florida isn't allowed on Wikipedia. Swarm (talk) 03:57, 7 December 2010 (UTC)[reply]
  31.  Oppose - Per HaTe, Ruslik and The Egyptian Liberal. --Локомотив 15:59, 6 December 2010 (UTC)[reply]
  32.  Oppose as policy, acceptable as guideline. --Foroa (talk) 17:09, 6 December 2010 (UTC)[reply]
  33.  Oppose unacceptable as policy. I still deny the need for such a policy, the draft is not concise and specific enough to base decisions on it, and finally the rules are not dealing with the different cultures of Commons and the projects, but are despite the laudable effort still centered too much on one cultural heritage. --h-stt !? 18:29, 6 December 2010 (UTC)[reply]
  34.  Oppose Wouldn't improve anything.--Lamilli (talk) 18:39, 6 December 2010 (UTC)[reply]
  35.  Oppose - As someone who has worked in a system where sexual content has very real relevance, I can honestly say that this proposed policy (having failed before to become one) still needs a considerable amount of work before it even comes close to being suitable. It doesn't define conditions well enough, and simply relying on the DOST test as a means of assessment of potentially unsuitable images of children is not a reliable method of assessment by itself. I would recommend that the creators of this proposal go back to the drawing board, get hold of some books by David Finkelhor and start working out what they need to define in order to give accurate, usable conditions under which decisions can be made. BarkingFish (talk) 19:13, 6 December 2010 (UTC)[reply]
  36.  Oppose - Abusus non tollit usum (abuse does not take away use). Kameraad Pjotr 20:03, 6 December 2010 (UTC)[reply]
  37.  Oppose - US morality isn't Wikimedia morality! This is an international project. If the US laws are not good for us, the servers have to find a better place (Sweden, Switzerland or another). Marcus Cyron (talk) 20:54, 6 December 2010 (UTC)[reply]
  38.  Oppose US morality is stupid and sucks; Wikimedia and Wikipedia are free projects and shouldn't be restricted this way. --Julius1990 (talk) 20:58, 6 December 2010 (UTC)[reply]
  39.  Oppose Most users here are sensible, and though it may take time, work, and sometimes a little heat, in my opinion the right balance between the censor-everything and include-everything camps is usually arrived at. Setting things in stone with such a policy will please neither camp; some will see it as too restrictive, others as too liberal.--KTo288 (talk) 21:23, 6 December 2010 (UTC)[reply]
  40.  Oppose, strongly. The guideline seems illiberal and prone to misuse. Images should be judged on their possible educational value regardless of interpretations of what "sexual content" may or may not mean to different cultures, religious groups or fringe lobby groups. --Pinnerup (talk) 21:36, 6 December 2010 (UTC)[reply]
  41.  Oppose Per Ruslik + if this were adopted, other file policy proposals may crop up citing this as a reason to adopt them, "What's good for the goose is good for the gander" effectively. --George2001hi (Discussion) 21:46, 6 December 2010 (UTC)[reply]
  42.  Oppose, strongly. It opens the door for censorship of all kinds, and is simply not necessary. Let's continue to decide case by case. --AndreasPraefcke (talk) 21:47, 6 December 2010 (UTC)[reply]
  43.  Oppose I agree with most of the content but I do not agree with point 3 of [5] - the speedy deletions based on scope. We have admins and users who think we should not host nude images, and allowing speedy deletions will probably end in a mass deletion (again). I'm sure we will see admins delete all new images without checking if the image is better than existing ones. Images out of scope should go through a regular deletion per COM:SPEEDY#Regular_deletion. --MGA73 (talk) 21:50, 6 December 2010 (UTC)[reply]
  44.  Oppose Just about everything seems to have been said, put most succinctly by Ruslik. --Abderitestatos (talk) 23:30, 6 December 2010 (UTC)[reply]
  45.  Oppose - per Alecmconroy and others. MrBlueSky (talk) 00:47, 7 December 2010 (UTC)[reply]
     Oppose - please stabilise the policy proposal before asking for community opinion; the obscenity section has had a change in meaning since this poll was opened[6] Gnangarra 01:39, 7 December 2010 (UTC)[reply]
    • The change was simple clarification. - Jmabel ! talk 03:23, 7 December 2010 (UTC)[reply]
    • Whenever we get 100 sets of fresh eyes across a text, they are bound to find some improvements. Any change in meaning was extremely minor, and surely nobody's support or oppose hinged on it. If it did, they are welcome to revert the change to the original phrasing. Do you really want to oppose an entire proposed policy giving only such a small technicality as your reason? --99of9 (talk) 09:22, 7 December 2010 (UTC)[reply]
      • Being a contributor for around 4 years and an admin for around 3 years, I have not taken lightly the decision to oppose; I recognise the importance of the policy and I'm aware of the history behind its need. Given the time taken so far and the number of people involved, there should be a moratorium on changes while the poll takes place, so that everybody is agreeing to the same thing. Even since I highlighted it and said oppose because of instability, there has been another change (only spelling). One cannot expect people to support the policy until it's stable for the whole period of the poll. Gnangarra 10:11, 7 December 2010 (UTC)[reply]
        • Have you read the introductory text above this poll? We are voting on a specific version (Nov 26). We do sort of have a moratorium on even slightly contentious changes. If any of the changes since then are disliked by anyone, they can be reverted to obtain a full consensus later. --99of9 (talk) 10:19, 7 December 2010 (UTC)[reply]
          • then someone needs to lock-down the text ; i understand what you're saying about floating revisions always happening on a wiki, BUT it isn't a fair vote, if the text of the ballot question being voted on keeps getting changed during the polling. for this process to be credible, the nov.26 text needs to be what the people who are voting will see Lx 121 (talk) 14:12, 7 December 2010 (UTC)[reply]
     Comment I've withdrawn my oppose, though I still think the policy should be stable before any poll is commenced. I also believe that there is a valid and real need to implement a policy on sexual content, as Commons (the Foundation) has legal obligations that we can't ignore. Weighing up the differences, I've elected to support this because I believe that the community can address any further adjustments that need to be made. Gnangarra 12:14, 9 December 2010 (UTC)[reply]
  46.  Oppose First, let sensitive people use filter software. It is up to the software manufacturers to comprehensively handle such cases by keywords. You can also see a lot more sexual material just passing a nearby kiosk. I am confident in the work of administrators and patrollers. --Vhorvat (talk) 01:54, 7 December 2010 (UTC)[reply]
  47.  Oppose - It would be a welcome invitation to censorship. Playmobilonhishorse (talk) 03:58, 7 December 2010 (UTC)[reply]
  48.  Oppose, strongly - per AndreasPraefcke and others. --Rlbberlin (talk) 05:24, 7 December 2010 (UTC)[reply]
  49.  Oppose per MGA73 (i.e. I agree with much of the proposal, but not out-of-scope speedy deletion). --Avenue (talk) 11:50, 7 December 2010 (UTC)[reply]
  50.  Oppose COM:CENSOR and COM:PORN perfectly regulate such issues, particularly child pornography. Specifically, the very first sentence of COM:CENSOR says that "files and other materials which are not lawful for Commons to host on its servers in Florida will be deleted immediately upon being identified as illegal". Occam's razor is fruitful here. Brandmeister (talk) 13:17, 7 December 2010 (UTC)[reply]
  51.  Oppose - solution in search of a problem. Current policies are adequate to deal with this. -Atmoz (talk) 14:16, 7 December 2010 (UTC)[reply]
  52.  Oppose - Well, even if an image or a video contains sexually explicit content, it may be suitable for minors (for instance, the British sex-ed film Growing Up was criticized by some people since it featured actual film footage rather than drawings of naked people, including intercourse and masturbation; however, teachers and pupils then gave it positive feedback), so there is no need to make a policy regarding sexual content.--RekishiEJ (talk) 14:44, 7 December 2010 (UTC)[reply]
  53.  Oppose --Habakuk (talk) 15:55, 7 December 2010 (UTC)[reply]
  54.  Oppose How many sexual pictures does Commons need anyway? Secondly, there are already some individuals angry at Commons for having their photos used here, when they were taken at what they thought were private but open social gatherings and then placed on Flickr but licensed freely. --Leoboudv (talk) 20:06, 7 December 2010 (UTC)[reply]
    some examples? --Yoggysot (talk) 02:18, 8 December 2010 (UTC)[reply]
  55. Strongly  Oppose. Existing policies already deal with this - Scope, Commons is not censored, Nudity, Photographs of identifiable people, General disclaimer, What Commons is not. --5ko (talk) 20:33, 7 December 2010 (UTC)[reply]
  56.  Oppose mostly per Ruslik, Alecmconroy and Brandmeister. - CharlieEchoTango (talk) 23:09, 7 December 2010 (UTC)[reply]
  57.  Oppose Cgtdk (talk) 23:39, 7 December 2010 (UTC)[reply]
  58.  Oppose - Part censorship, part redundancy. --M4gnum0n (talk) 23:48, 7 December 2010 (UTC)[reply]
  59.  Oppose as a policy; would strongly Support as a guideline, as guideline status would allow wiggle room, and as others say this seems redundant with existing policies. ResidentAnthropologist (talk) 23:57, 7 December 2010 (UTC)[reply]
  60.  Oppose We aren't in dire need of this (other policies cover it well enough) and this is a slippery slope down the road to censorship. Themfromspace (talk) 00:37, 8 December 2010 (UTC)[reply]
  61.  Oppose No new policy is needed which opens the door to censorship (e.g. with its vague rules for deletion ("high quality")). Illegal images were deleted in the past as well, so all was fine. I am sad, though, that many people wasted time drawing up this policy. --Saibo (Δ) 01:12, 8 December 2010 (UTC)[reply]
  62.  Oppose mainly due to the portion which allows "out-of-scope" speedy deletion. The events of this May made it abundantly clear that deciding what types of sexual content are obviously out-of-scope is an extremely subjective and contentious determination. Black Falcon (talk) 01:43, 8 December 2010 (UTC)[reply]
  63.  Oppose Existing policies already deal with this. Possible censorship.--Chrono1084 (talk) 01:52, 8 December 2010 (UTC)[reply]
  64.  Oppose I respect the intention behind this effort, but the concept of "educational value" is obviously ill-defined. --Jan eissfeldt (talk) 03:57, 8 December 2010 (UTC)[reply]
  65.  Oppose I oppose a policy or section that is absolutely useless. I was just asked by a Federal judge why, after allowing my nudes to be shown on Wiki to minors, I can be opposed to GOOG doing it also. Either place all nude images behind an age disclaimer or keep out the Googlebot-image bot. This is not just an opposition to ALMOST getting it right but a DEMAND that the Googlebot-image bot be excluded by 12-10-2010 or that all photo content by me be removed. I am comfortable with it being here, and it is some of the best content and should remain. Wiki must either keep out the GOOG image search or delete content donated by me, to prevent it being shown to my minor children while at school on GOOG safe searches. CurtisNeeley (talk) 04:40, 8 December 2010 (UTC)[reply]
  66.  Oppose Unneeded, slippery slope, and per Ankara (re the sum of all human knowledge, in the Comments section below). Existing policy allows immediate removal of unlawful files and the reasonably prompt removal of files deemed (by consensus) beyond scope of Project. If there's an ongoing or prospective problem, more policy isn't the answer; better enforcement is. Let's take the common sense already found here to heart and not further stigmatize sexual content with unnecessary new regulatory verbiage. Rivertorch (talk) 06:06, 8 December 2010 (UTC)[reply]
  67.  Oppose Although I commend the authors of the policy for the work they've put into it, this seems to be mostly redundant and unnecessary. I'm worried that users could construe this new policy into some form of censorship of the Commons, and it's important that we keep away from instruction creep when it's not needed. Nomader (talk) 06:13, 8 December 2010 (UTC)[reply]
  68.  Oppose As others have said, unless we are going to introduce a policy on when paintings that depict Abrahamic prophets are acceptable and when they should be deleted based on their perceived educational value, or a policy on when we can include images of dead soldiers and when we can't, then this is way too slippery a slope. Max Rebo Band"almost suspiciously excellent" 06:46, 8 December 2010 (UTC)[reply]
  69.  Oppose Not generally, but with differentiation. --Perhelion (talk) 08:03, 8 December 2010 (UTC)[reply]
  70. Strong  Oppose Wikipedia is not censored. EngineerFromVega (talk) 08:07, 8 December 2010 (UTC)[reply]
  71.  Oppose It is the task of the legislature to define which kinds of sexual content are illegal. Illegal content must be deleted. So there is no need for a general policy on sexual content. As always, a specific review of a specific file is required to decide if that file is out of scope. Yeah, this is not fun, but it is fair and just. --Catfisheye (talk) 08:25, 8 December 2010 (UTC)[reply]
  72.  Oppose I don't think the pro-arguments are really convincing while the danger of a slippery slope and censorship are real. As Nomader says above, the proposed new policy is in parts redundant to those we currently have because no one can claim that we need this new policy to delete prohibited content (like child pornography or copyright violations or images that violate Commons:Photographs of identifiable people). So it's likelier that with this new policy people will try to use it to justify censorship than that we really need it for clarification. Also, imho, all content on Commons should be treated equally, following the same rules. Last but not least, any policy that is, or might be applied as, stricter on sexual content than on other content can and will lead to fragmentation of content, with people uploading such files deleted here back to the Wikipedias it came from if those have less strict policies, thus defeating the whole purpose of having a common repository for files. Regards SoWhy 08:28, 8 December 2010 (UTC)[reply]
  73.  Oppose Too vague, very easy to use as justification for what is actual censorship. It also seems to entirely hinge on US laws. Child pornography is already forbidden as are depictions of sexual acts which do not contribute educationally to an article. So I say, a case by case approach is good enough. A new policy is simply not needed.--Astepintooblivion (talk) 09:50, 8 December 2010 (UTC)[reply]
  74.  Oppose I don't see the requirement for a new policy here, plus the reliance on US laws is troubling to non-US people such as myself.--Topperfalkon (talk) 10:41, 8 December 2010 (UTC)[reply]
  75.  Oppose I don't believe the "slippery slope" concerns have been addressed convincingly. Even with this policy, we will be forced to deal with contentious images on a case-by-case basis, because "obscenity" and "sexual content" are subjective terms and have widely different interpretations in different cultures. --Sodabottle (talk) 10:47, 8 December 2010 (UTC)[reply]
  76.  Oppose Concerned about slippery slope to censorship, and especially concerned about speedy rules. Most of the proposed page also seems to be covered by other policy pages, and so I'm not sure why this page is needed. --Falcorian (talk) 13:08, 8 December 2010 (UTC)[reply]
  77.  Oppose Although I understand the fear of prosecution is valid, one of the greatest things about WP is the fact that this site, for the most part, has been willing to toe the line on issues like this. Regular monitoring by all editors will more than likely keep the pages clean and legal. I for one know that after I change a page and add it to my watchlist, any time there's an edit I at least check the history to be sure there are no IP-address users in the history since my edits. If there are, I check the past edits up to my last edit and revise according to standards. The slippery slope theory is also valid, and one of the biggest fears of us who oppose this poll. Let's keep WP in the hands of those who work so hard, and often for free, to keep this site at the standards we believe it should meet.
  78.  Oppose --Janneman (talk) 14:06, 8 December 2010 (UTC)[reply]
  79.  Oppose US laws mean nothing to me (and I think the mentioned laws on "obscenity" are just funny and outdated), and the proposed policy is too vague and can easily be misused in either way. --Thogo (Disk.) 14:08, 8 December 2010 (UTC)[reply]
    As an m:Steward you should be the last person to say US laws mean nothing. The Foundation is a US organisation, the servers are in the US, and they are subject to US law. Gnangarra 14:16, 8 December 2010 (UTC)[reply]
    Don't tell me what I should do and what not. Stewards have to abide by the Foundation policies and by nothing else. I'm not subject to US laws unless I'm currently in that country. --Thogo (Disk.) 14:49, 8 December 2010 (UTC)[reply]
    Foundation policies abide by US law, you know. Your identity is verified as a Steward, and if you break US laws regarding privacy you will be held responsible. By using US servers you are on US sovereign soil. That is how it works. Ottava Rima (talk) 18:28, 8 December 2010 (UTC)[reply]
    The privacy policy of the Foundation is not based on US law, just so you know, it's much stronger than US law. But that's none of your business, and it definitely doesn't belong here, so every further off-topic comment will be removed. --Thogo (Disk.) 23:55, 8 December 2010 (UTC)[reply]
    It is my business as it is the business of every user here. We have an Ombudsman just to investigate those who would -dare- make such claims as the above, as it is a threat to our privacy, our safety, and a violation of US law. If you hate US law so much, keep it off Commons as -that- has no reason here. Commons is not for you to make a point. You can be sure your actions and words will come up during the next discussion of confidence regarding you, as you have a long history of actions regarding Commons that makes it seem that your actions might not line up with what is best for the Foundation and its projects as they can put us into serious legal jeopardy. Ottava Rima (talk) 02:16, 9 December 2010 (UTC)[reply]
    This could be the beginning of an ugly controversy, as Ottava Rima is prone to try things like that. The problem is, if it is crucial for a Wikimedia steward to respect the nuances of U.S. law, how can any non-American stand for election as a steward? But such a motion would be a huge kick in the face for the international Wikimedia community. One reconciliation for this is what we set out to do here: make the specific policy closely follow the U.S. law, so that as long as people follow the policy they follow the law. And if the policy becomes consensus, stewards are required to follow it. Wnt (talk) 16:38, 9 December 2010 (UTC)[reply]
    Especially in this case we could hardly have a worse reference than US law. It's nearly offensive to the users from other countries, who in general don't live under these forsaken rules. Someone should move the servers out of this desert and bring them into the lands of the free. That would definitely help and ensure that Wikipedia will stay free from censorship. --Niabot (talk) 18:04, 9 December 2010 (UTC)[reply]
    Foreigners who come to the US are expected to follow our laws, so it isn't hard. Look at child porn - it won't be acceptable here regardless of a person being in a country where it is. Ottava Rima (talk) 20:49, 9 December 2010 (UTC)[reply]
    Believe it or not, not the whole world belongs to the US, and there are indeed (about 6.5 billion) people who are neither US citizens nor currently in the US and are therefore not subject to US law but to the law of their respective countries which might differ (believe it or not). --Thogo (Disk.) 22:28, 12 December 2010 (UTC)[reply]
    As long as Wikimedia servers are located in the U.S. state of Florida, the content on this server has to abide by Florida state laws and by U.S. federal laws. While not everyone around the world is a US citizen or resident, Wikimedia would be held legally responsible if its content violated Florida or US laws. Until the day Wikimedia content is hosted in another country, we have to follow US laws. A dislike of US laws and a dislike of a US focus in the legal considerations of "Sexual content" are not reasonable factors in opposition of Commons:Sexual content. However I believe that residents of other countries can be acceptable stewards. All they need to do is study the relevant US and Florida laws, and they will be good to go. WhisperToMe (talk) 00:19, 13 December 2010 (UTC)[reply]
    Believe it or not, the whole WMF belongs to the US. You knew this before starting. You don't like US laws, so who cares? You still have to abide by them while here. Ottava Rima (talk) 22:42, 12 December 2010 (UTC)[reply]
    They're right, Thogo. Give it a think. --JaGa (talk) 01:18, 13 December 2010 (UTC)[reply]
  80.  Oppose US law? This is Commons! --Gereon K. (talk) 14:12, 8 December 2010 (UTC)[reply]
    Our servers are located in the U.S. state of Florida, so Wikimedia must comply with both Florida State law and U.S. federal law. WhisperToMe (talk) 00:19, 13 December 2010 (UTC)[reply]
  81.  Oppose The policy mostly reflects American interests. I don't think this policy provides a good framework for our educational mission. Yes, sex is a thing we need to cover, even graphically. Too many people don't know enough about it. Whoever wants porn will probably find better resources than Commons. --bluNt. 14:15, 8 December 2010 (UTC)[reply]
    The policy "mostly reflects American interests" because it has to. Wikimedia servers are located in the U.S. state of Florida. WhisperToMe (talk) 00:21, 13 December 2010 (UTC)[reply]
  82.  Oppose --alexscho (talk) 14:33, 8 December 2010 (UTC)[reply]
  83.  Oppose«« Man77 »» [de]·[bar] 14:35, 8 December 2010 (UTC)[reply]
  84.  Weak oppose It's a shame it has to come to this, but here we are. The proposed policy is not bad, but I don't like the idea of speedily deleting files for being out of scope. Files that are believed to be out of scope need to go through the "normal" deletion process, not silently deleted by a single admin who has xyr own opinion on the project's scope. If point three is removed from the speedy deletion criteria, then I would be willing to support this as policy. Reach Out to the Truth (talk) 14:48, 8 December 2010 (UTC)[reply]
    I think you should reconsider your position and change it to Strong oppose. Current version of the policy creates a lot of possibilities of deleting the media which should have been created, and it's very careful never to explain what stuff will definitely be kept safe. This policy can turn away many new users who will do their best to contribute, and then have their contributions deleted. Beta M (talk) 17:52, 9 December 2010 (UTC)[reply]
  85.  Oppose Per Ruslik and others. --Kbdank71 (talk) 14:57, 8 December 2010 (UTC)[reply]
  86.  Oppose Existing policies are sufficient for ad hoc dealing with really problematic files. Any further definitions, standards, tagging and sorting (in other words, censorship) of so-called "sexual content" are unacceptable to me. Again and again certain prude groups push similar proposals trying to bowdlerize this worldwide project and impose their one-country-based definitions on all nations, but I will resist it because I think Commons should be as free as possible. --Miaow Miaow (talk) 15:15, 8 December 2010 (UTC)[reply]
  87.  Oppose No Policy of Censorship; decide case by case ----Wmeinhart (talk) 15:25, 8 December 2010 (UTC)[reply]
  88.  Oppose as per AndreasPraefcke and blunt --Paramecium (talk) 16:48, 8 December 2010 (UTC)[reply]
  89.  Oppose as per Marcus Cyron --Gamma127 (talk) 16:58, 8 December 2010 (UTC)[reply]
  90.  Oppose as the two guys before already mentioned. --Kuebi (talk) 17:02, 8 December 2010 (UTC)[reply]
  91.  Oppose --~Lukas talk 17:20, 8 December 2010 (UTC)[reply]
  92.  Oppose --Magadan (talk) 17:28, 8 December 2010 (UTC) If US law is a problem for content on our servers then please move our servers elsewhere.[reply]
  93.  Oppose until mention of the Miller Test and US obscenity law is removed, per my discussion below. Gigs (talk) 17:36, 8 December 2010 (UTC)[reply]
  94.  Oppose ---<(kmk)>- (talk) 17:40, 8 December 2010 (UTC) Wasn't the US the mother of freedom of speech?[reply]
  95.  Oppose The porn problem on Commons seems to exist mostly in some people's imagination. PDD (talk) 17:45, 8 December 2010 (UTC)[reply]
  96.  Oppose per many of above and in particular the speedy deletion of "obviously out of scope" files. Davewild (talk) 17:54, 8 December 2010 (UTC)[reply]
  97.  Oppose Speedy deletions as "out of scope" are not acceptable—among other problematic points. --Rosenzweig δ 18:28, 8 December 2010 (UTC)[reply]
  98.  Oppose, holy no. —DerHexer (Talk) 18:41, 8 December 2010 (UTC)[reply]
  99.  Oppose - Per the "out of scope" deletion rule, as well as the unclarities and virtually unending problems I foresee from the implementation of the Miller test. Probably a lot more issues as well. --Saddhiyama (talk) 19:04, 8 December 2010 (UTC)[reply]
  100.  Oppose--Verum (talk) 19:44, 8 December 2010 (UTC)[reply]
  101.  Oppose Please don't copy extremely prude and completely outdated US moral laws... Chaddy (talk) 19:54, 8 December 2010 (UTC)[reply]
    You do know that the US has some of the most liberal laws regarding porn and that the majority of the world belongs to cultures that oppose porn completely, right? Ottava Rima (talk) 23:09, 8 December 2010 (UTC)[reply]
    [citation needed]. With all due respect, I very much doubt that. Compared to China and Saudi Arabia, sure, but for most European countries, which I think is what we are comparing here, there is just no contest. The US definitely has some of the most liberal laws when it comes to political free speech, much more so than most European countries, but regarding pornography laws they are not comparable. It just seems we can't have the best of both worlds, but I must admit that if I had to choose I would choose the more free political speech. I still oppose this policy proposal, though, as there seems to be no reason to choose, since existing policy already covers it. --Saddhiyama (talk) 16:51, 9 December 2010 (UTC)[reply]
    You mean China with 1/4th of the world's population or Islam representing 1/5th? India has tougher laws on porn than the US, as do many countries. The US has most of the world's porn studios for a reason. Your statement seems to suggest an insignificant portion of the world (Europe, and only a few countries in Europe at that) can somehow trample over most of the world's population. Ottava Rima (talk) 20:49, 9 December 2010 (UTC)[reply]
  102.  Oppose Per Ruslik and many others. Adrian Suter (talk) 21:10, 8 December 2010 (UTC)[reply]
  103.  Oppose <insert reason here> --Schnatzel (talk) 21:36, 8 December 2010 (UTC)[reply]
  104.  Oppose --Joe-Tomato (talk) 22:43, 8 December 2010 (UTC)[reply]
  105.  Oppose-- Neozoon (talk) 22:57, 8 December 2010 (UTC) Thanks for the time invested to generate this proposal. Two main points why this cannot be a policy: speedy deletion of "out-of-scope" pictures, which we had already, and the inability to validate the Miller test within a community of international contributors. The current situation is better than having this.[reply]
  106.  Oppose Ukko.de (talk) 23:04, 8 December 2010 (UTC) per Brandmeister, Jan eissfeldt and others[reply]
  107.  Oppose Principal objection is to Speedy Deletions, Clause 3. ("Material obviously outside of scope....[ ]. Scope policy is very general and sexual content is frequently deleted under this policy. However, several categories of sexual content detailed below are likely to fall within the project scope, and should not be speedy deleted."). All material regarded by an individual as being outside of scope should be proposed for deletion and never speedied. There is so much subjectivity here that it is bound to be unfair and unworkable. Clause 3 should be removed. Anatiomaros (talk) 23:34, 8 December 2010 (UTC)[reply]
  108.  Oppose We don't need a special policy for sexual images. If an image of whatever sort is inappropriate, it has to be explicitly argued that this is the case in the context of whatever article it is used in. US law can never be an argument that would lead to a different consensus than one that would otherwise have been reached. If this then leads to a conflict with US law, then one has to think about moving WikiMedia out of the US. Count Iblis (talk) 02:09, 9 December 2010 (UTC)[reply]
  109.  Oppose. For the most part I'd actually support it as a policy if it were not for one glaring omission. There seems to be no mention of how to deal with erotic/pornographic art. If "Content clearly not educational or otherwise in scope" were to be interpreted as excluding most erotic/pornographic art unless it belongs to famous artists, I would strongly oppose the policy. If on the other hand it would only be applied to common pornographic pictures (the usual adult sites stuff) I'd support the policy. However for now, without clarification on that subject, I oppose the policy.--Kmhkmh (talk) 02:29, 9 December 2010 (UTC)[reply]
  110.  Oppose. Regardless of how well this proposal is watered down and polished--even with the best intentions--it would still inevitably be used by some as a platform to support eventual censorship of sexual content, as well as any other %CONTROVERSIAL_MEDIA_TYPE%.   — C M B J   05:13, 9 December 2010 (UTC)[reply]
  111.  Oppose Out of scope and speedy do not fit (and see PDD). sугсго 07:11, 9 December 2010 (UTC)[reply]
  112.  Oppose I don't see any need for this --Sargoth (talk) 07:38, 9 December 2010 (UTC)[reply]
  113.  Oppose The problem with sexual content is mainly a US problem. Whether content is pornographic or not should be decided case by case. Commons was founded to collect free content, not to collect non-pornographic content. This content may or may not be usable in an encyclopedia. Any censorship could block the aim of collecting free content at all. There are a lot of sexual files that are usable in Wikipedia. We don't need a censor with exaggerated moral reservations. Widescreen ® 08:33, 9 December 2010 (UTC)[reply]
  114.  Oppose No need to go beyond existing legal obligations, just follow the laws that apply. Fossa?! 09:12, 9 December 2010 (UTC)[reply]
    Where do you think the proposed policy goes beyond obscenity and privacy law? --JN466 14:39, 11 December 2010 (UTC)[reply]
  115.  Oppose --Cartinal (talk) 10:29, 9 December 2010 (UTC)[reply]
  116.  Oppose --Don-kun (talk) 10:46, 9 December 2010 (UTC)[reply]
  117.  Oppose Since censorship is already in practice, should we now make it the default? ... -- Niabot (talk) 10:51, 9 December 2010 (UTC)[reply]
  118.  Oppose Any in-scope/out-of-scope determination needs to be made through a deletion discussion, not through admin action such as speedy deletion. If the allowance for speedy deletion of out-of-scope files is removed from the proposed policy, I will support it, because in that case the policy will merely pull together already-existing policies and guidelines (office deletion of illegal files, speedy deletion for copyvio, etc.). cmadler (talk) 10:53, 9 December 2010 (UTC)[reply]
  119.  Oppose There are so many things wrong with this that I'm afraid I will forget to mention all of them, but I'll try. 1) Having a policy that says "only educational material is allowed" is bad enough (because the education in images comes from the articles they are used in; for example, an image of rape would not be educational in the article on animal liberation, but would be in something like "BDSM community and sexual violence" outlining the differences, etc.); but this policy (if it becomes policy) will give the right to determine whether or not an image (or video, audio, etc.) is non-educational judging exclusively from what is seen (or heard, etc.). 2) Any suggestion that the burden of proof should be on the person who has uploaded the content will open all the content from an individual who cannot access the internet to deletion. 3) The policy copies the current legal situation where the servers reside, which can mean trouble if those laws change. Let's say there's an (unlikely) chain of events which leads very quickly to a law in Florida making it illegal to censor media based on the age of the submitter or based on the sexual practices the person is engaged in. This policy would then be in clear violation of such a law. Also, let's say that obscenity law is finally stricken from the record as idiotic; it will de facto remain the law on Wikicommons, because it's a part of the policy. Laws of the territory can be mentioned, but only in a way that says "we must follow all the laws where the server stands at the time of the decision"; it would then be appropriate to link to some article like "en:Laws against sexuality in the USA" or some such. 4) The policy of Wikicommons should serve the purpose of enlarging Wikicommons without detracting from the educational value, rather than preserving educational value and, if possible, enlarging the sum of all works. It is a very big distinction, and people who work hard at making Commons grow should be thanked, rather than being forced to jump through extra hoops in order to prove to somebody who has not contributed such content that everything is ok. 5) The policy as it is written sets out to define what sexual content will exist, but is too busy talking about what must be deleted. No attempt is made to define exactly what will be "definitely approved", and as such it may deter potential contributors. Some people work very hard to submit content, and they do not want to do that as a test case for some admin to see if they can get away with deleting more than their share. ... Beta M (talk) 11:24, 9 December 2010 (UTC)[reply]
  120.  Oppose We don't need a copy of prude US laws. If we had to follow such criteria, perhaps according to other countries' laws as well, our project would soon fail. --Blatand (talk) 12:36, 9 December 2010 (UTC)[reply]
  121.  Oppose - Policy already requires us to only host content which is legal under US law, so I don't see the need to treat sexual content separately. We also already have a policy on personality rights, so there's no need to duplicate this here either. -- ChrisiPK (Talk|Contribs) 12:58, 9 December 2010 (UTC)[reply]
  122.  Oppose There are laws already. An extra policy is not required. Hybscher (talk) 13:48, 9 December 2010 (UTC)[reply]
  123.  Oppose The three broad criteria are illegal content (which wouldn't be allowed anyway), lack of consent (we have a personality rights guideline already), and the last (and most horrifying), "not educational or otherwise in scope". I'm reminded of book burning parties. I know a broad swath of people are offended by sexual content. But Commons isn't based upon moral guidelines. Commons has no right not to be offended. What is "educational" is so nebulous as to be indefensible, and this policy, if passed, would cloud any debates about such content rather than clarify them. --Hammersoft (talk) 14:53, 9 December 2010 (UTC)[reply]
  124. strongly  Oppose as an author in Wikipedia and contributor on Commons -- Achim Raschka (talk) 15:44, 9 December 2010 (UTC)[reply]
  125.  Oppose - I think that just because some people have a taboo doesn't mean we have to censor information. I agree with policies that remove illegal content, but if we make a policy for sexual things, are we going to make a policy for gross cats, useless fruit pictures and whatever else? It's a slippery slope, I think. Delete the illegal pictures, and don't worry about whether the images have human body parts in them or not. Also I am a little uncomfortable that this policy debate seems to be only taking place in English... this will be a Commons-wide policy, right? We need discussion from all minds. Anyway, cheers, Nesnad (talk) 16:26, 9 December 2010 (UTC)[reply]
  126.  Oppose -download | sign! 16:54, 9 December 2010 (UTC)[reply]
  127.  Oppose This is not needed in my opinion.
  128.  Oppose Per Ruslik. -- kh80 (talk) 18:10, 9 December 2010 (UTC)[reply]
  129.  Oppose Per above Druifkes (talk) 18:15, 9 December 2010 (UTC)[reply]
  130.  Oppose mj 18:52, 9 December 2010 (UTC)[reply]
  131.  Oppose Aineias (talk) 19:33, 9 December 2010 (UTC)[reply]
  132.  Oppose per Marcus Cyron and Julius1990 --Grim.fandango (talk) 20:02, 9 December 2010 (UTC)[reply]
  133.  Oppose Most of this proposal seems redundant to me - in particular, the speedy deletion reasons simply duplicate existing copyright, legal and scope guidelines. The normal deletions section seems rather pointless ("don't make baseless deletion requests"). The prohibited content section lists child pornography and other illegal content, privacy rights and scope, all of which should be covered by existing rules. The file description and categorization guideline is too strict - there is no reason to quarantine every slightly racy picture into separate subcategories. MKFI (talk) 20:24, 9 December 2010 (UTC)[reply]
     Comment The proposal does not "quarantine every slightly racy picture". That section is about not having images of intercourse in Category:Beds. The wording leaves the details to common sense. And isn't it good to have neutral, encyclopaedic wording in the file descriptions? I think this makes the sexual content much more acceptable for those not liking it, without hurting our mission. --LPfi (talk) 08:37, 10 December 2010 (UTC)[reply]
  134.  Oppose --Flussbus (talk) 20:59, 9 December 2010 (UTC)[reply]
  135.  Oppose This is Silver seren. I think that this is indeed a bit of instruction creep and, also, redundant to what is already performed on Commons. Furthermore, after the huge debacle over child pornography before, I don't exactly trust that this won't be used to further damage the collection of images here on Commons. I mean, someone could easily say that those lithographs from the 1800s and early 1900s that depict child pornography should go, since they are explicit and focus on sexual content. However, we all know that they need to be retained for their historicity and the fact that they are used on the Wikipedias as depictions in various historical articles and, of course, the article of the creator. 165.91.173.6 21:09, 9 December 2010 (UTC)[reply]
     Comment Erotic art is in project scope and can be deleted only if illegal, and we have to delete illegal content anyway. The issue should be clarified in the policy, but I think that would be a technical change, not influencing what should or should not be deleted. --LPfi (talk) 08:56, 10 December 2010 (UTC)[reply]
  136.  Oppose per Marcus Cyron, against censorship based on US point of view. --Alupus (talk) 21:43, 9 December 2010 (UTC)[reply]
  137.  Oppose per Marcus Cyron, against censorship based on US point of view. --Andim (talk) 21:58, 9 December 2010 (UTC)[reply]
  138.  Oppose If the content is not illegal where the servers are located, then it should be allowed. If users are offended, then they should not look for nor at the offending material. It is a big project and there are plenty of pictures to look at without seeking out these.--Die4Dixie (talk) 22:26, 9 December 2010 (UTC)[reply]
  139.  Oppose per Marcus Cyron. --Gripweed (talk) 22:55, 9 December 2010 (UTC)[reply]
  140.  Oppose It is up to the individual wikis to decide the use... and the English Wikipedia is worldwide, not USA only. Simplicius (talk) 23:03, 9 December 2010 (UTC)[reply]
  141.  Oppose Wikipedia might be the latest result of the Enlightenment; it may not be controlled by fundamentalists and their moral values. --Liberaler Humanist (talk) 01:10, 10 December 2010 (UTC)[reply]
  142.  Oppose Per all of the above. The policy is too strong in some language, and too vague in other places. This would allow it to be used for all manner of drama. Wjhonson (talk) 01:39, 10 December 2010 (UTC)[reply]
  143.  Oppose Current policies have done fairly well to be honest. If non-academic pictures are being uploaded, they will be deleted. We do not need this in addition to what we already have. Malinaccier (talk) 02:51, 10 December 2010 (UTC)[reply]
  144.  Oppose It seems to permit too much; more should be a candidate for speedy deletion. We shouldn't have to wait a week to remove exhibitionism, even if tailored to these criteria. RJC (talk) 04:39, 10 December 2010 (UTC)[reply]
  145.  Oppose The Commons already has adequate policies to deal with these types of situations. This looks like mission creep to me. Remember, it's much easier to make a law than to repeal it. --Scochran4 (talk) 06:40, 10 December 2010 (UTC)[reply]
  146.  Oppose -FASTILY (TALK) 06:58, 10 December 2010 (UTC)[reply]
  147.  Oppose Per Widescreen, #111 -Sozi (talk) 09:27, 10 December 2010 (UTC)[reply]
  148.  Oppose Per Marcus #35 --Artmax (talk) 10:04, 10 December 2010 (UTC)[reply]
  149.  Oppose As long as we show these images, we have to deal with them. Enlightenment may hurt. -- Cherubino (talk) 10:20, 10 December 2010 (UTC)[reply]
  150.  Oppose Commons is not a project for American religious fundamentalists but a serious project for the world. Say no to censorship. --Matthiasb (talk) 11:47, 10 December 2010 (UTC)[reply]
  151.  Oppose Problems with the amurrican law? Move somewhere else... --Amga (talk) 11:59, 10 December 2010 (UTC)[reply]
  152.  Oppose — Existing policies already deal with this adequately. —Dark talk 12:30, 10 December 2010 (UTC)[reply]
  153.  Oppose It may be hard to keep control over these images. Current policies seem fine to me. බිඟුවා (talk) 14:48, 10 December 2010 (UTC)[reply]
  154.  Oppose I too am troubled by the speedy deletion allowance for material outside scope. A deletion request with some analysis needs to be had for such a subjective decision. I've seen the fervor over speedy deletions of images previously due to being out of scope and that fractures the community and even led to some recommending that files be uploaded to the individual wikis, defeating the purpose of Commons as a shared media repository. On consent, the mere "assertion" of consent by the uploader has no teeth for doing what it was intended. One also questions why consent would be required for one type of content over another. The portion on categories doesn't really cover the issue of viewer surprise and sidesteps around the need for an implementation of viewer-controlled filtering as seen with Flickr. The sentence suggesting that simply subcategorizing File:Félicien Rops - Sainte-Thérèse.png into a category that still does not suggest it will contain sexual content is a very poor example. Finally, the obscenity portion is far too ambiguous for a policy. Actual child porn is covered by the section above. Though it states work should not be speedily deleted, it suggests work deemed "obscene" could be deleted with a full discussion. This is what leads others to see this proposal as pushing censorship. What is obscene will vary from person to person. Existing policies cover this issue adequately. If you want to debate an issue, debate how to implement user-controlled filtering. Let the reader decide what they deem to be "obscene" and hide such images from view. – Adrignola talk 14:55, 10 December 2010 (UTC)[reply]
    "One also questions why consent would be required for one type of content over another" US law requires legal consent for hosting the images. The standard applies to all pornography hosted by US companies. It is common sense and having naked pictures of people without consent is akin to rape and a major violation of privacy rights. Ottava Rima (talk) 21:11, 10 December 2010 (UTC)[reply]
    So host it somewhere else where the US law doesn't apply. Like, err, Guantanamo... BTW, what pornography? --Amga (talk) 21:46, 10 December 2010 (UTC)[reply]
    Oh, yes, that shows a really good argument - find a place that doesn't have laws because clearly laws are a problem? Commons isn't anarchist, and is funded by a US company regardless of where the servers are. You could start up your own Wiki system on some island without laws and have your own thing if you want. That will always be an option. Ottava Rima (talk) 04:28, 11 December 2010 (UTC)[reply]
  155.  Oppose per Marcus Cyron #35, against censorship based on US point of view Bunnyfrosch (talk) 16:01, 10 December 2010 (UTC)[reply]
  156.  Oppose -- smial (talk) 17:04, 10 December 2010 (UTC)[reply]
  157.  Oppose increased censorship and no global focus (does not consider legal situations in other countries). regards, PETER WEIS TALK 17:13, 10 December 2010 (UTC)[reply]
    So you want a complete ban because dozens of Muslim countries and China ban such images? Or do you just not like that a US company has to legally comply with US law, which it has to do no matter what? Ottava Rima (talk) 21:12, 10 December 2010 (UTC)[reply]
  158.  Oppose -jkb- (talk) 17:32, 10 December 2010 (UTC)[reply]
  159.  Oppose Nej tak. --Dansker (talk) 19:40, 10 December 2010 (UTC)[reply]
  160.  Oppose With my experiences in the German Wikipedia, I think it is less perverted and much better for the original project scope when Wikipedians use porn pictures instead of rules for their satisfaction, and most inhuman is the enforcement of rules that only the mentally retarded need (like not to upload porn pics :-)). Meanwhile this project is a derailed social experiment and therefore *educational* in a completely different meaning. --92.196.91.127 20:19, 10 December 2010 (UTC)[reply]
  161.  Oppose Policies on illegality, privacy and project scope already cover the material. We can just change a few words in them. I agree that out-of-scope deletions should be normal, not speedy. I disagree with "pornographic images that contribute nothing educationally useful to our existing collection of images are deleted", which is again used in this proposal. There is no reason to prefer images that were uploaded earlier (the existing collection of images) and delete newer images by different authors. I agree that Wikimedia should have moral responsibility and that some kinds of perverse images should be banned, but this restriction is unnecessary. --Dezidor (talk) 21:48, 10 December 2010 (UTC)[reply]
  162.  Oppose --FrobenChristoph (talk) 21:52, 10 December 2010 (UTC)[reply]
  163.  Oppose The proposed text could be used as an informal help guide for uploaders of such content, but it's not useful to make it a policy. It is not within Commoners' duties to take actions in order to avoid any infraction of anybody's prudish feelings. As long as the educational standard is kept, no picture with sexual content could IMHO do any psychological damage to a viewer. A prepubertal child simply does not have the emotional skills to understand the feelings in sexuality, so wise parents could fall back on a basically "mechanical" approach to explaining a depiction. I expect that pubertal persons confronted with sexual media on Commons will ultimately be educated toward having a sane sexuality without being subject to a fallacy like "masturbation makes you dumb" - if those persons aren't annoyed by the dull sex depictions here and not on the direct way to YouPorn and the like. Any adult on Commons must be intelligent enough to know what he's doing and to be able to avoid the respective categories. @IP92.196... above me: you made a great and wise point and left me laughing as the word "LOL" implies... Grand-Duc (talk) 21:53, 10 December 2010 (UTC)[reply]
  164.  Oppose Current rules and policies seem to handle this sort of thing well enough. User:Now_registered
  165.  Oppose Badly written with unworkable definitions – like, a "prominent" depiction of the pubic area is a depiction of "actual or simulated sexually explicit conduct"? This will generate more misery than it will prevent.  --Lambiam 23:20, 10 December 2010 (UTC)[reply]
  166.  Oppose --Cvf-ps (talk) 23:53, 10 December 2010 (UTC)[reply]
  167.  Oppose per Dezidor. Illegal content and content showing people who object to being exposed is already rightfully being removed as matters stand. I oppose a policy that allows more speedy deletions based on vague criteria and prefer normal RfDs, because only admins are able to make judgements after a speedy deletion. Possibly, I could support the draft as a guideline, but haven't studied the implications involved in that yet. --Sir48 (talk) 01:06, 11 December 2010 (UTC)[reply]
  168.  Oppose The implications for sado/maso material are very bad. Obviously non-educational and poor-quality images are already deleted without the guidelines proposed here, which seem to have more potential to be used for deleting "offensive" images unfairly than to provide added security. Many photos from Flickr are reliable and add greatly to Commons; requiring a poster who has dozens of images obviously taken by them to provide their identification isn't going to greatly increase the responsible use of the images, but it will decrease the amount available (for example, authors who speak another language (Japanese) and who can't be communicated with). Some measures are good, such as the deletion of subjects at their request, although I would assume that we would do that already. This will act mostly as a censor to Wikipedia with little gain in image quality. AerobicFox (talk) 01:17, 11 December 2010 (UTC)[reply]
  169.  Oppose: Wikipedia is not censored. And US law/morals are clearly not the right standard for an international project. If those laws are a problem, then move the servers to a free country. --helohe (talk) 02:25, 11 December 2010 (UTC)[reply]
    You came to the WMF knowing it was a US company with US servers. Saying that you disagree with US law and that the servers must move is not a legitimate rationale. Ottava Rima (talk) 04:32, 11 December 2010 (UTC)[reply]
    with respect, you are mis-characterizing the rationale for the preceding voter's choice to "oppose"; the user clearly stated that they were against this policy proposal on the basis of "not-censored". the user then went on to observe that, if us law is the problem (which it isn't, we already have policy covering legal concerns, & this vote is not about allowing or disallowing illegal materials to be hosted) then we should relocate the servers. the comment was a separate point from the primary reason cited by the user, for their decision Lx 121 (talk) 05:39, 11 December 2010 (UTC)[reply]
    Unless you are going to admit the above user is your other account, you have no grounds to make such radical revisions of their statements, regardless of whether your claims are inappropriate and wrong. You can't just pick and choose at will. People add things to a rationale for a reason. Ottava Rima (talk) 17:04, 12 December 2010 (UTC)[reply]
  170.  Oppose per Ruslik --Schlurcher (talk) 10:06, 11 December 2010 (UTC)[reply]
  171.  Oppose --Orci Disk. 10:11, 11 December 2010 (UTC) A reason would be nice... Barts1a (talk) 11:45, 11 December 2010 (UTC)[reply]
  172.  Oppose --Peng (talk) 11:09, 11 December 2010 (UTC) A reason would be nice... Barts1a (talk) 11:45, 11 December 2010 (UTC)[reply]
  173.  Oppose The upper part of the text is mostly OK, but I stumbled upon the inclusion into speedy deletions of "Material obviously outside scope". I strongly disagree with this view. In my view, material merely out of scope can wait. There is no need to hurry for it. It is not breaking any law. And in-scopeness is always controversial. I remember a proposal of mine to delete an en:contre jour picture so black that it was unusable to depict the subject described in the caption. The picture was finally kept because someone argued that, conversely, the picture could be used as an encyclopedic depiction of the "contre jour" topic. Besides, I have never heard convincing arguments enabling us to solve the question whether categories such as Category:Young people and its subcategories should be limited (or even deleted altogether with their contents), which makes the exclusion of many "family and friends" pictures as "out of scope" potentially unfair in many instances. Teofilo (talk) 14:44, 11 December 2010 (UTC)[reply]
  174.  Oppose Ruslik has said it better than I could. "Illegal to host" is one thing; anything more vague is a dangerous step. ClickRick (talk) 19:18, 11 December 2010 (UTC)[reply]
  175.  Oppose Policy seems unworkable, and to me constitutes censorship. If it is illegal, don't host it - anything else should never be censored. Keys767 (talk) 19:21, 11 December 2010 (UTC)[reply]
  176.  Oppose I have had no familiarity with the issues under discussion here, but as I received a message to respond to this poll, I have spent the past hour reading the proposal and the comments. On balance of those arguments (but with no familiarity with the actual images that have led to them) I found the arguments opposing to be more convincing. Given the undeniable international significance of Wikipedia, and with guidelines always open to interpretation no matter how good they might be, I don't see that the guidelines will provide a general result better than case-by-case decisions, and they may well lead to over-zealous deletion out of fear. Obviously Wikipedia must protect its existence, but with the current general climate of repression in many areas - not just the one under discussion - I hope Wikipedia will stand up to the fullest extent possible under the First Amendment. That said, I very strongly support respecting the integrity and values of those depicted in images on the site. Perhaps this could become a guideline?
    unsigned post by 84.0.216.227
  177.  Oppose -- Seelefant (talk) 20:45, 11 December 2010 (UTC). Current procedures for dealing with illegal and/or spam activities have been sufficient for years. The proposed policy is ambiguous and would allow removal of material without any reasonable or legal cause or requirement, introducing moral censorship throughout all wikimedia projects using commons.[reply]
  178.  Oppose We already have rules that govern inappropriate images; I don't see why we need this. Mo ainm (talk) 21:16, 11 December 2010 (UTC)[reply]
  179.  Oppose for 2 reasons: first, even if the policy is not direct moral censorship, it can be interpreted like that. So my oppose vote is first to counteract what could become a slide towards something like en:conservapedia. And second, moral and sexual policy really differs throughout the world. For example, explicit sexual content is prohibited in Japan, but suggested "underage" sexual behaviour is tolerated, and that shocks me more than seeing a picture of a nude woman. Zeugma fr (talk) 22:06, 11 December 2010 (UTC)[reply]
  180.  Oppose Yuli Klever (talk) 23:25, 11 December 2010 (UTC) A reason would be nice... Barts1a (talk) 22:26, 12 December 2010 (UTC)[reply]
  181.  Oppose Censorship imposed on a system like Wikipedia is bad censorship.
  182.  Oppose We are not sitting here under American law or American morals. If they don't agree with our decisions, let them look for a new place for the servers.--MittlererWeg (talk) 01:12, 12 December 2010 (UTC)[reply]
  183.  Oppose Needs to be more specific. Does not clarify how consent can be presumed, nor address the issue of obtaining consent for non-identifiable information well enough to be policy (or at least, well enough to outweigh its possible use for biased censorship of Wikipedia.) Lisieski (talk) 01:58, 12 December 2010 (UTC)[reply]
  184.  Oppose Commons should have standard guidelines for all kinds of pictures. No specific guideline/censoring required for sexual content.--Vssun (talk) 04:45, 12 December 2010 (UTC)[reply]
  185.  Oppose "the community will use its discretion" is the first troublesome part. The community does not have discretion or the ability to stick to decisions. Such phrasing just stirs the pot for censors. "Legal issues" should cover the entire guideline; anything that isn't a legal issue doesn't belong. Currently that section only covers US law; broader scope is needed. 70.162.119.243 06:58, 12 December 2010 (UTC)[reply]
  186.  Oppose Subjectivity X Moral bias X Cultural bias X No safety brake = Imminent Disaster. I doubt we have the means to stop people from going overboard with a nuclear solution. As said above, we can achieve a more balanced and efficient result with our existing set of tools, rules and guidelines, but I guess the Wikimedia Foundation needs to make a "flashy stand" on this matter for some public relations and marketing purposes. This is the 3rd time this year that I have seen a guideline/policy discussion process that, instead of strengthening Wikipedia as a community, is dividing it even more. --KrebMarkt (talk) 08:17, 12 December 2010 (UTC)[reply]
  187.  Oppose per M.Cyron (#36) --AM (talk) 10:38, 12 December 2010 (UTC)[reply]
  188.  Oppose as per M. Cyron (#36) --Jaellee (talk) 11:34, 12 December 2010 (UTC)[reply]
  189.  Oppose per the numerous supporters who concede that this proposed policy is not perfect, and the many opposers who have expressed concerns about speedy deletion. I'm not an admin on commons, but I am elsewhere, and I've learnt the importance of restricting the vast majority of speedy deletions to rules that can be clearly defined. If it becomes a judgement call and reasonable editors might disagree, then speedy deletion is inappropriate. I'd also point out that "unless (1) the individual in the photo is identifiable and still alive" is overly subject to interpretation. Does it mean that the face is clearly shown so that someone who knew that person forty years ago might have a chance to recognise them? Does still alive mean born in the last 117 years and not known to be dead as on EN wiki or the very different rules of other projects? WereSpielChequers (talk) 12:02, 12 December 2010 (UTC)[reply]
  190.  Oppose as per M. Cyron --Lidius (talk) 12:03, 12 December 2010 (UTC)[reply]
  191.  Oppose A.Savin 12:46, 12 December 2010 (UTC) A reason would be nice... Barts1a (talk) 22:25, 12 December 2010 (UTC)[reply]
  192.  Oppose per Alecmconroy ("Good honest effort by lots of great people, but ultimately I don't want a special policy for sexual content") --Pjacobi (talk) 12:59, 12 December 2010 (UTC)[reply]
  193.  Oppose Free sexual content is hard to find. Once found, I oppose deletion of it. On the contrary - the more, the merrier. Yonidebest Ω Talk 14:50, 12 December 2010 (UTC)[reply]
    You must be joking. Free porn is able to be found anywhere. The WMF is an educational place, and free porn on an educational site is incredibly rare. Maybe that's what you meant, that it is hard to find an educational site that debases itself by hosting free porn? Ottava Rima (talk) 17:09, 12 December 2010 (UTC)[reply]
  194.  Oppose Without having read every single word of it, I have read the reasons given by those who oppose and support. I believe this is dangerously close to culturally subjective censorship in accordance with puritanical values, rather than international liberalism, which is necessary as Wikipedia and Commons are, after all, international rather than American. I support the idea to decide the matter case by case. Pornography, if we are to discuss such images, is also a part of human history. I would also recommend moving Wikipedia from America to a more liberal country, for example in Scandinavia, to make possible a policy as liberal as necessary for the benefit of the free world of speech and expression and of international anti-censorship. I vote in opposition to puritanical censorship, and censorship everywhere.--85.226.41.42 15:15, 12 December 2010 (UTC)[reply]
  195.  Oppose Sexual content should not be treated differently than other content. --Bsm15 (talk) 15:20, 12 December 2010 (UTC)[reply]
  196.  Oppose Against any kind of censorship, or perhaps except for very serious cases, which sexual content is not in my opinion. --Floflo (talk) 15:45, 12 December 2010 (UTC)[reply]
  197.  Oppose no thanks. -- Southgeist (talk) 15:53, 12 December 2010 (UTC)[reply]
  198.  Oppose - Hoo man (talk) 15:56, 12 December 2010 (UTC)[reply]
  199.  Oppose no need of such kind of policies. Rhadamante (talk) 18:31, 12 December 2010 (UTC)[reply]
  200.  Oppose Let's continue to decide case by case. --Gerbil (talk) 19:25, 12 December 2010 (UTC)[reply]
  201.  Oppose Points made by the supporters are less compelling to me than those of opposers such as users Zyl, Rusnik and Walks on Water et al. Kaiwhakahaere (talk) 20:21, 12 December 2010 (UTC)[reply]
  202.  Oppose Really having a specific policy for sexual content is a cultural bias not appropriate for an encyclopedia especially if assessments are to be conduced on idiotic and backward grounds like the Miller Test. The scope and usefulness principles are enough, and I trust the admins to take care of the puritanical US laws not to threaten the hosting until Wikipedia can find a more open-minded hosting location. Leuenberg (talk) 20:44, 12 December 2010 (UTC)[reply]
    Real encyclopedias like Britannica don't have such images, so your rational would back banning the sexual content completely. Thus, your oppose is really a support. Ottava Rima (talk) 22:40, 12 December 2010 (UTC)[reply]
    Britannica is a general encyclopedia, specialized encyclopedias about sexual topics (examples easily found on amazon Encyclopedia of Unusual Sex Practices, Sexploration: An Edgy Encyclopedia of Everything Sexual) have illustrations and pictures about sex, and specialized medical encyclopedias have also pictures that we, for precautionary principle (and I agree with this principle), don't have and will never have, with or without this policy (i.e. nude pictures of minors for articles about en:Child development). With the current rules we already host only a subset of what can be found in specialized paper encyclopedias. --Yoggysot (talk) 03:27, 13 December 2010 (UTC)[reply]
  203.  Oppose --Toter Alter Mann (talk) 21:10, 12 December 2010 (UTC) A reason would be nice... Barts1a (talk) 22:25, 12 December 2010 (UTC)[reply]
  204.  Oppose - Abusus non tollit usum. --Reddi (talk) 23:29, 12 December 2010 (UTC) A real reason would be nice... Barts1a (talk) 00:46, 13 December 2010 (UTC)[reply]
  205.  Oppose --Valentim (talk) 00:38, 13 December 2010 (UTC) see User:Gereon K., User:Blunt and User:WMeinhart[reply]
    A reason would be nice... Barts1a (talk) 00:46, 13 December 2010 (UTC)[reply]
    If you look to the comments of the users I refered to above you will find the reasons. --Valentim (talk) 01:41, 13 December 2010 (UTC)[reply]

Comment

  •  Comment This was discussed many times. We don't have the resources to check whether the written consent is from the real model and thus it is of little value. On the other hand, serious Wikimedia users posing for sexual content would probably prefer to do so anonymously. --LPfi (talk) 07:02, 6 December 2010 (UTC)[reply]
  •  Comment It is noted above that the poll concerns whether people accept or reject this revision. So it would be better if supporters who predict a great future improvement of this proposed guideline discuss the current revision, or discuss the improvement in detail. Once it becomes a policy, it is a policy. --Tomchen1989 (talk) 09:35, 5 December 2010 (UTC)[reply]
  •  Neutral We do not need a special policy. I do not want this. Sexual images should be treated like other types of images. If it means we get fewer donations, who cares? We will survive anyway, and my integrity is not for sale. “Imagine a world in which every single person on the planet is given free access to the sum of all human knowledge”. Sexual knowledge is also knowledge. On the other hand, perhaps this policy makes it harder for new attempts to censor and delete images (as we saw earlier this year). Ankara (talk) 21:48, 5 December 2010 (UTC)[reply]
  •  Comment Reading it now, I think it will do. It does not restrict images by how explicit they are. Handing off low-grade images uploaded from Flickr, in order to protect good quality images, is a reasonable trade-off in my view. --Elen of the Roads (talk) 23:16, 5 December 2010 (UTC)[reply]
  •  Neutral I share the policy's concern that we have a problem with certain pictures that clearly shouldn't be here, but on the other hand I think that such a policy could easily be abused for censorship, even with all the disclaimers and affirmation of the contrary in place. --rtc (talk) 23:48, 5 December 2010 (UTC)[reply]
  •  Comment The terminology used looks dilettantish and garbled in some formulations. E.g., paraphilias are defined as a predisposition (a gene), not as acts which can be photographed (every type of practice can be, or not be, a manifestation of a paraphilia; no type of practice is a paraphilia itself). As regards the proposal: still at the present time, especially in the United States, an uncensored educational encyclopedia cannot be created and published. Fortunately, the most fanatical censorship of today's western power system is limited only to sexuality-related content and a few other themes. Every attempt to vindicate uncensored educational content would be welcome; if some compromises are necessary to achieve it, so be it. However, if we accept censorship on this theme, any other theme can come after. --ŠJů (talk) 06:03, 6 December 2010 (UTC)[reply]
  •  Comment Ultimately, we should have a sexual content policy. I haven't read this in detail to work out if I think the proposed is acceptable or not. However, I think consensus amongst those working here is that this is pretty good. Therefore, I'd suggest that if there's no consensus to adopt as policy, can we at least get it established as a working guideline? Then it does have some value to allowing the community to function in this area - actions based on guidelines are more likely to be accepted than actions based on a proposal.--Nilfanion (talk) 12:20, 6 December 2010 (UTC)[reply]
    I generally agree this could fly as a rough guideline. The people here put a lot of work into this, it is a valuable document that gives some insight into the issues raised by sexual content. It's good to read and to consider-- it's just not a document that should be quoted as policy. --Alecmconroy (talk) 14:53, 6 December 2010 (UTC)[reply]
  •  Neutral I've been involved in writing this policy and making it as good as I could negotiate to favor free speech, and as I've responded to various Oppose votes above, some of the objections to it should be invalid. I have accepted, however, the model that current Commons policy should reflect where Wikimedia is subject to prosecution, even though Purodha makes an appealing argument that it shouldn't be. To my way of thinking, Wikimedia is such a strong force for freedom of inquiry right now that we can't risk it for a small gain. In a political protest there are some people who are willing to be arrested but a lot more who aren't, and Wikimedia definitely belongs to the have-a-wife-and-kids group of moderates. If Wikimedia moves to a different country we can always discuss a lighter policy - certainly if that country has more restrictive laws on other topics people will be discussing those. I am moved by Walks on Water's complaint that this seems to stigmatize sex as opposed to, say, violence; this policy is in fact a scar, a response to a media and hence an administrative onslaught that should not have been; but it did happen. I can't disagree with the comments of Cyclopia and others that the mission of free knowledge is the fundamental principle. And Ruslik makes a good point against instruction creep; I've said at many points throughout the writing that most of what can be done by this as a policy can be done by it as an essay, merely guiding readers to preexisting policies.
However, I also have to ask, if people oppose this policy, what will happen instead? Having a policy means that people will have to watch it and make sure it doesn't creep its way down a slippery slope, yes. But settling everything by AfDs means a lot of content could get lost unless someone is watching every obscure discussion. It is also possible that there could be another rash of admins speedily deleting and undeleting, then arguing to de-admin one another... will people be around to defend good admins and demand public discussion? And then there's the risk that Powers That Be in WMF could impose a policy because "no consensus can be made anyway", and from what I've seen such policies tend to be ham-handed. Are people willing to make the kind of massive protest that would be necessary to reverse such an action? I say this as someone who has seen Pennsylvania lose Arlen Specter, a powerful voice for scientific research in the Senate, in favor of a hard-core Obama supporter, only to have all that support evaporate in the general election, leaving us with a rubber stamp the Republicans found in a bank in Hong Kong. If people are serious about rejecting the relatively small compromises imposed by this policy, we need their opposition to remain firm later.
My purpose here is to make this a decent policy, not to force people to accept it. I think the Commons community has a generally enlightened perspective and I'll trust whichever way it decides. Wnt (talk) 13:35, 6 December 2010 (UTC)[reply]
You absolutely can't speedy delete just for out-of-scope. Because once deleted, only admins can even see the image in order to discuss whether it's in scope or not. A speedy deletion alienates the majority of the editor population and cuts them out of the discussion. A speedy delete is a conversation killer, not a conversation starter. And for the very difficult question of scope, we _need_ that discussion to take place. --Alecmconroy (talk) 15:16, 6 December 2010 (UTC)[reply]
When we receive the 10,000th blurry picture of a bloke's penis with his hand clamped around it, do you suggest that we have a full-blown deletion discussion each and every time? We have better things to do, and I've seen deletion discussions sit there, stale, for weeks and months. If an admin does abuse the right to speedy, deleting stuff with a realistic educational value, then let other admins raise it in a community discussion. If their speedy deletions are out of line, the community can throw the book at them. --JN466 14:45, 7 December 2010 (UTC)[reply]
Whatever comes out of this poll, the next subject heading will probably be about speedy deletion of out of scope images. We'll need stronger facts to make any real headway on the issue in either direction. How many such penis pictures are there really? How much extra time does it actually take to close a normal deletion discussion rather than make a speedy delete? Aside from the great Jimbo purge, how often are such speedy deletions done improperly? Does the undeletion mechanism really work properly when it is needed? Wnt (talk) 15:05, 7 December 2010 (UTC)[reply]
As AFBorchert notes above, speedy deletions of illegal material or material falling under COM:PORN are already current practice. They will continue whether this is adopted or not. --JN466 02:14, 8 December 2010 (UTC)[reply]
correction to the above comment: it is the practice of some admins to do this, it is not a policy, or even a guideline, & has never been approved by a community vote. the fact that you feel "they will continue, whether this policy is adopted, or not" raises the question of whether such users should continue as admins, especially if this policy fails to achieve consensus (with that item intact). if the admins abusing speedy for this purpose would instead spend their time in closing the backlogged, old deletion debates, the problem would resolve itself. & frankly, i do not find it acceptable that participating in deletion debates should be moved from full community access, to an admin prerogative, which will happen for sexual content, if that part of this policy is approved.Lx 121 (talk) 03:00, 8 December 2010 (UTC)[reply]
  •  Neutral This is not censored to be a children's encyclopedia. We give info that is necessary to give, even if it's not safe for work. At the same time, starting controversy with parents and religious groups is not a fight an encyclopedia that wants to be a respectable, objective source of information should want to get into. I think, at least, that pages with sexual pictures should have some kind of warning that the content on that page is not safe for work. J390 (talk) 19:54, 6 December 2010 (UTC)[reply]
     Comment We tried to ensure that by the section about categories. People that see the file other ways, e.g. by it being used in an article (that should be handled by the Wikipedias) or through a search, cannot be handled by this policy: we would need technical infrastructure that is not in place. There is also the very problematic question about what media are "safe for work" (or even worse: "safe for children") in different cultures. How would the tags be maintained? --LPfi (talk) 09:41, 7 December 2010 (UTC)[reply]
    I feel like "safe for work" is a misnomer. More like "safe for work if your boss is an idiot". No reasonable manager should punish an employee because a Wikipedia page turned out to contain something unexpectedly explicit. Of course, the fact that the employee is browsing Wikipedia at all is another story, and an explicit image might draw unwanted attention - but the same is more true of, say, a page about a union. If we really want Wikipedia to be safe for work maybe we should implement a Rogue-style "supervisor key" to flash the window back to some work-relevant page the reader has preloaded. Wnt (talk) 15:14, 7 December 2010 (UTC)
  •  Comment i find it interesting how most of the "support" votes are one-line comments, usually just saying that they agree, without adding anything much to the debate, but the "oppose" section is full of discussions about the issues at stake Lx 121 (talk) 00:30, 7 December 2010 (UTC)[reply]
  •  Comment It's easy for (US)Americans to agree because they're indoctrinated with the (US)American POV. We likely have a large group of them, or similarly thinking persons, here, who are advocating "their" customary realms of censorship. --Purodha Blissenbach (talk) 01:47, 7 December 2010 (UTC)[reply]
    • We've sat here for months working on this; what do you want the supporters to do, recapitulate months of discussion and compromise?--Prosfilaes (talk) 01:49, 7 December 2010 (UTC)[reply]
      • um, yes actually; i was following the developments here intermittently, until it seemed like the whole thing was getting nowhere. i'd still like to know how "out-of-scope" ended up being included as a criterion for speedy deletion? Lx 121 (talk) 14:00, 7 December 2010 (UTC)
        • It was perceived that current praxis is having obviously out of scope media deleted ("blurry penises"). We did not want to forbid something that is actually going on without anybody protesting. It is better to have explicit statements in the policy than everybody turning their back when some aspects of the policy are violated. And at least I felt that the "obvious" point could be handled by the administrators, with the most serious material explicitly exempted from such deletions. If not, no policy or lack of policy will do. --LPfi (talk) 11:23, 8 December 2010 (UTC)
      • Prosfilaes nails it: I've written paragraphs and it makes little sense to just repeat them here. If anyone should be accused of a one-liner mentality, it is the people who say "NOT CENSORED" and imply that no further inquiry or nuance is needed.--Brian Dell (talk) 22:19, 7 December 2010 (UTC)[reply]
        • i'm sorry, but when you call a poll to decide this issue, you should expect to reiterate your reasoning & have it challenged. "NOT CENSORED" is a pretty clear policy, & a widely supported one; this isn't. as for "inquiry or nuance", it's up to the nominator & supporting voters to make their case for deletion; if "NOT CENSORED" is relevant to the item under consideration, then that is part of it. the default option is "keep" not "remove" Lx 121 (talk) 00:04, 8 December 2010 (UTC)
  •  Comment I'm not suggesting that the WMF leave the US of A just in order to get a perhaps better legal environment for hosting some specific content. Rather, other organizations, or individuals, in other states could do that. There are several states in the world where there are practically no restrictions at all.
    Judging content by its "educational value" is utterly silly. Take some judge's judgements, whatever they may be, and make an educational course explaining what made these judgements possible and how they are supported by both the content and the judge's educational background and expectations. Whatever the judgements, the specific content is now of some educational value. Since this is always possible, the conclusion has to be that there is no content having no educational value. Even before you use it so, you cannot dismiss content, because then you prohibit the education that would use it. --Purodha Blissenbach (talk) 01:37, 7 December 2010 (UTC)
I'd like to think this is true, but could you name the country? I'm thinking that some countries that have a good reputation about sexual material have very problematic laws that would interfere with articles about racism and Holocaust revisionism. Wnt (talk) 02:40, 7 December 2010 (UTC)[reply]
  •  Comment No opinion on this policy, but I do hope those closing this poll will properly discount all !votes that are just ranting about the US. Not liking US morals is not relevant to policy. Heimstern (talk) 11:42, 7 December 2010 (UTC)[reply]
a vote is still a vote Lx 121 (talk) 14:00, 7 December 2010 (UTC)[reply]
Heimstern: criticizing the proposal for having an American bias is a valid argument. We are a global project, not an American one. Ankara (talk) 14:13, 7 December 2010 (UTC)
Oh! So now everyone that supports this proposal is "American" with "American moral values" like "sex belongs to marriage"?!? I gave my support to this proposal even though I'm from Sweden and watch porn regularly. Thanks for offending me! Lokpest (talk) 17:28, 10 December 2010 (UTC)[reply]
Indeed! Being US-centric is a very real and valid concern, even among US editors. Wikipedia is full of copies of notable truly offensive hate-speech. Our Muslim readers, for example, are very, very troubled that we include insulting depictions of Muhammad. How can I explain to them that it's okay to have a policy for one taboo, porn, while we ignore all their cultural taboos?
This is a perfect case for a culture-neutral, user-controlled, opt-in content filtering-- like the kind recommended by the study. It would solve all our problems without ever jeopardizing our ability to host truly notable controversial material. --Alecmconroy (talk) 16:38, 7 December 2010 (UTC)[reply]
re a claim further up, votes are not just votes on Wikipedia (or Commons to my knowledge). Assessing consensus is not narrowly defined in a way that can be gamed. I note that a person voting Support says his vote is as an editor not as a very heavily involved official, and I find that more persuasive than some "drive-by" vote. re the USA, the projects have to be situated somewhere and that means the legal norms of the hosting jurisdiction are relevant. Argue that the servers be moved, fine, but complaining about the social or cultural norms of the hosting jurisdiction is off-topic. Re moving the servers, note that the United States government has been a huge enabler of the Commons by legally designating so much of its work public domain. I do not see it as an enemy of the idea of a free encyclopedia.--Brian Dell (talk) 22:35, 7 December 2010 (UTC)[reply]
Criticizing the proposal as US-centric is fine, sure, but saying "US morals suck", which is what some of the oppose !votes say, is not valid. As for the original reply to my comment, consensus is not a vote. Also agree completely with Brian Dell. Heimstern (talk) 23:32, 7 December 2010 (UTC)
"consensus is not a vote", but you still want to make sure that the votes you do not like are removed? seems to be a bit of a contradiction there... XD Lx 121 (talk) 00:04, 8 December 2010 (UTC)[reply]
No, I said I wanted !votes discounted, not votes. And not the ones "I don't like"; the ones that make no argument save "US morals suck". Or for that matter, "US law is irrelevant", since that is patently false for as long as the WMF servers remain in Florida. Consensus means opinions (not votes) are weighed in accordance with the strength of the argument, and that's all I'm asking for here. Probably some on both sides need discounting. Heimstern (talk) 04:10, 8 December 2010 (UTC)[reply]
in that case, all the "support" votes that make no valid argument & simply say "i agree with this" should be removed also. if we are weighing this (& who is doing the weighing?) on the strength of the arguments, then this proposal is already dead. Lx 121 (talk) 18:41, 8 December 2010 (UTC)[reply]
For my part, I don't believe we should be hosting any material that is illegal under US law-- so when I talk about being too US-centric, I mean US culture, not US law. (Servers are in US, so US law applies). What I want to avoid is a case where some readers have 'more rights' than others-- if we are going to let US users hide images that offend them, we should let readers all around the world have the same right to decide for themselves what content is offensive. --Alecmconroy (talk) 23:37, 7 December 2010 (UTC)[reply]
There is a difference between valid content removal and censorship; what you are proposing is definitely at the 'censorship' end of the scale. Barts1a (talk) 23:42, 7 December 2010 (UTC)[reply]
  •  Comment I'd like to point out that we would like Wikipedia/Wikimedia to be accessible to as large an audience as possible, including children and countries other than the U.S.A., and as such we want to limit the amount of censorship imposed on that access. Categorizing images likely to be deemed "offensive" or "pornographic" helps our goal of widespread access by limiting content likely to be deemed "objectionable" to well-labeled areas (which could be censored by governments and child-protection software), freeing up all of the other image areas for free and uncensored access. The alternative is to risk the entire site being censored instead of specific sections of it. — Loadmaster (talk) 17:43, 7 December 2010 (UTC)[reply]
To me, we absolutely don't want [unsupervised] children here! The world is full of information so disturbing, it makes pornography look like a lullaby. The world has murders and genocides and diseases and abuse, and adults need to document them all so that we can learn from those horrors. But stressing over the porn in the hopes of making wikipedia child-safe is like trying to babyproof a construction site-- Wikipedia is inherently not child-safe, we can't change that fact, and we shouldn't kid parents about that reality. --Alecmconroy (talk) 18:44, 7 December 2010 (UTC)[reply]
I don't see Loadmaster as insisting that Wiki projects adhere to a prescribed notion of child safe. I rather see the entirely reasonable suggestion that we decline to play activist, something entirely in keeping with the cornerstone principle of neutrality, which means selecting an appropriate hill to die on when it comes to picking fights with external norms. If Wiki is going to stubbornly take a stance that leads to it being marginalized and ostracized even in "liberal" cultures, the cause should not be trivial.--Brian Dell (talk) 22:12, 7 December 2010 (UTC)[reply]

 Neutral - I think there are legitimate concerns being addressed by this policy, and I'm glad to see it moving ahead. If it overlaps with other policies, it should link/refer to those policies, but not restate them, which might lead to confusion. I think there are other legitimate concerns about the wording, including the speedy deletion of possibly "out-of-scope" material. Keep working on it, and you'll get it right. - Themightyquill (talk) 16:49, 8 December 2010 (UTC)[reply]

  •  Comment I read the suggestion that we should label content as "safe for children". Why on earth should we? It is safe for children. Or have you ever seen a computer image stabbing a child? A defective bike may be unsafe for children to ride, but electronic images?
    Now what you indeed are talking of is that there are pictures which are subjectively deemed unsafe for adults, concerning their respective ideologies, such as the Dalai Lama in China, images of the Prophet Muhammad in some Arab countries, or what they call obscene in Florida. While I am far from advocating that unsuspecting users shall have to run the risk of being involuntarily confronted e. g. with a funny sketch of a holy figure, I think it's not our duty to judge in someone else's name. So, if the state of the Holy See wants specific content labelled as heresy, to protect Catholics against viewing it accidentally, they should supply the labour and cost of doing so. If they want us to host or serve their judgements, they should of course pay for our additional costs. --Purodha Blissenbach (talk) 20:51, 8 December 2010 (UTC)[reply]
  •  Neutral I feel that it is too US centric and many of the things in the policy already exist on the commons. Although I do feel that the "Deletion at the subject's request" section is important. Andyzweb (talk) 06:04, 10 December 2010 (UTC)[reply]
  •  Comment - Is it just me, or do most of the {{oppose}} arguments stink of I-don't-like-it-itis? Barts1a (talk) 11:42, 10 December 2010 (UTC)[reply]
Also: I see a few on the oppose side that are just signatures with no reason. Barts1a (talk) 12:03, 10 December 2010 (UTC)[reply]
respectfully disagree with the previous comment; it's pretty clear that the "oppose" section is where the commenters have spent the most time & effort making their opinions clear. the support section is full of one-line endorsements, with much repetition & without much explanation. if anything, following the logic of your argument, it's the "support" side that has "i-like-it-itis"... XD Lx 121 (talk) 16:27, 10 December 2010 (UTC)[reply]
also, the wikipedia essay cited doesn't really apply very well here. it's focussed on discussions about the possible deletion of articles on wikipedia. this discussion is about whether or not to adopt an important policy change @ wikimedia commons. how all the members of the community feel about these proposed changes is germane to the conversation. Lx 121 (talk) 16:36, 10 December 2010 (UTC)[reply]
As far as I know, the manner in which the poll is actually closed - the standard for adopting a new policy on Commons - isn't actually very well defined. But I think that this will be an important factor, and that such accounts probably won't have full weight, at least not unless they raise an argument that is taken up by others. I should say that at the moment it looks like it is going to be very difficult to gain consensus for the policy; the question then is whether the content of the vote can be used to say with certainty that the consensus on Commons is that even this policy is too restrictive. Wnt (talk) 15:05, 11 December 2010 (UTC)[reply]
  •  Comment - I am not too involved in Commons, so I will refrain from voting here. But it seems there are still several legitimate concerns open, which could use addressing beforehand. Of course the laws of the hosting country must be followed, but not necessarily its taste or the pressure of "public opinion". Especially the issues of clarity for non-lawyers, clear definition of scope and measures to prevent abuses should be addressed. With such improvements, support for such a policy would be easier for many, I believe. GermanJoe (talk) 14:33, 11 December 2010 (UTC)[reply]
  •  Neutral I believe I would be able, using the current draft of the policy, to decide what I should do when finding an image that may fall inside the scope of the policy. I don't have the background to determine whether the policy is appropriate or matches legal requirements in this area. PauAmma (talk) 20:20, 12 December 2010 (UTC)[reply]

Alerting other projects

Since other projects are dependent on commons for image hosting, they will want to be alerted to any potential changes. Is there a way we can get neutral watchlist notices up on places like EnWiki and DeWiki, like the sitenotice currently running here on commons? --Alecmconroy (talk) 18:24, 7 December 2010 (UTC)[reply]

EnWiki and DeWiki? Why on earth? Either on all projects or by individuals alerting projects by their own choice. --LPfi (talk) 11:32, 8 December 2010 (UTC)[reply]
Agreed. All projects would be even better. I just mentioned EN and DE since they're the two biggest, but any project that is reliant upon commons should be alerted. --Alecmconroy (talk)

Out-of-scope decisions should not be speedy deletion criteria for files

COM:CSD doesn't indicate that scope concerns are a legitimate speedy deletion criterion for files; only for pages. I think this is a serious flaw in the current proposal and would like to see it changed. Asking individuals, even admins, to make subjective scope decisions seems like a recipe for conflict. Of all the issues raised in the objections, this one seems the most substantial to me.

I'm asking here if there are any objections to removing "Material obviously outside of scope" from the proposed speedy deletion criteria additions so that the people who have already indicated their support for the proposal in general can indicate whether they have any objections. 71.198.176.22 02:12, 8 December 2010 (UTC)[reply]

Of course I object to changing something so significant in the middle of a vote! I'm afraid you'll have to vote accordingly and then wait until after the poll to re-discuss this. 99of9 (talk) 10:12, 8 December 2010 (UTC)[reply]

I'm not proposing that it be changed before the vote is closed, but I am proposing that the vote be closed in the next few days, and that discussion move on to the objections raised by more than one person. We can begin such discussion at once. I'd like to start by asking people to propose alternatives to this edit, which was indicated as potentially problematic by the editor in the summary and which severely changed the character of the "out of scope" section. My opinion remains to delete the remaining small "Material obviously out of scope" bullet point, which is what was left after the earlier, much larger section was moved into the next section. 71.198.176.22 11:01, 8 December 2010 (UTC)[reply]
As it says in the intro to the poll, it will be closed on the 15th of December. --99of9 (talk) 12:03, 8 December 2010 (UTC)[reply]
  •  Support unless the admins doing the speedy deletions can provide specific facts to make a compelling argument. It's clear from the poll comments that this part of the policy is objected to by many people, and favored by few. However, as I commented above, I would like to know how many speedy deletions really happen and how much time and effort it actually saves to do things that way rather than by normal AfDs, among other things. But I think at this point the burden of proof is on the admins; otherwise we can change the policy to rule out all speedy deletions for scope, and see what experimental data that provides. Wnt (talk) 12:46, 8 December 2010 (UTC)[reply]

If this is the loophole which allows deletion of blurry cell-phone camera snapshots of penises and similar (see File:What Commons Does Not Need.jpg), then removing it would disalign policy from practice (as explained above) and set the stage for future bickering... AnonMoos (talk) 03:00, 9 December 2010 (UTC)[reply]

this "practice" is undertaken by only a handful of admins who seem to have a particular interest in the subject. i have pointed out in the past that it is a violation of policy. with respect, it would be more accurate to say that the inclusion of this provision is an attempt to "align" policy with the preferred practices of those admins who have not been following established policy in this regard. it is also pretty clear from the responses here & in the "oppose" section that i am not alone in feeling that all admins should be required to follow policy on deletions, & that it is not "ok" for some admins to simply ignore written policy & make up their own rules when it suits them. Lx 121 (talk) 08:44, 9 December 2010 (UTC)[reply]
Unfortunately there is a real problem with drunken guys uploading low-quality cell-phone snaps of their penises, so it's only reasonable that there be a special policy sub-clause to deal with this special problem. I'm not sure what denying this would accomplish, other than to create additional tedious work for other people in order to comply with your personal philosophical views... AnonMoos (talk) 11:05, 10 December 2010 (UTC)[reply]
yes, i know; we've had this same debate before, repeatedly... BUT as the results of the discussion so far show, i'm clearly not the only one who has an objection to this "special sub-policy clause". it provides unlimited opportunities for abuse. this kind of deletion is a judgement call, & that kind of decision should be made in open community debate, NOT at the discretion of lone admins.
Here's the key point: deletions of this type are NOT URGENT, it's not time-critical, it's not "important" that the files get erased as quickly as possible; the work is just minor tidying. there is no legitimate reason not to allow an open, community debate before deleting the material.
when you see a penis-pic that you don't like, tag it & notify, wait 5 days to see if there's any objections, & if not, then delete. how hard is that? if the exact same number of people devote the exact same number of man-hours to the work, the rate of penis-pic deletions will remain the same, meaning you guys won't "fall behind" just by changing your s.o.p.
right now, we have a massive backlog of unclosed deletion discussions. it would be FAR more useful to work on closing those.
Lx 121 (talk) 16:16, 10 December 2010 (UTC)[reply]
Any individual deletion is probably not particularly "urgent" in itself -- but it's a fairly strong overall priority to prevent Wikimedia Commons from being gradually transformed into the kind of site where such images form a significant fraction of the content, and so far the "Nopenis" escape valve has been found to be the most smoothly-working means towards that end. Any objections to the "Nopenis" escape valve have almost always been on abstract theoretical philosophical grounds, and NOT because anyone really believes that many valuable images are being deleted. If you want to eliminate the "Nopenis" escape valve, then it seems to me that it's incumbent on you to explain what practical positive goals will be achieved on a non-theoretical non-abstract non-philosophical level (i.e. achieving something worthwhile other than creating administrative busywork over ideological views). AnonMoos (talk) 17:55, 10 December 2010 (UTC)[reply]
point 1. we have over 7 million media files @ wmc now; it is pure hyperbole to talk about commons being in danger of becoming "overwhelmed" by a tsunami of penis pics. please provide statistical data to back up this claim
point 2. the practice of admins unilaterally deleting sexual content based solely upon their own judgement of scope HAS NOT EVER been put to a community vote, until now. as things stand, at the moment, you are losing this vote.
point 3. it is impossible for an average user to examine the materials being removed by this method, or assess the validity of the decisions being made, because only admins can check the files after they are gone. as you already know
point 4. it is not "creating administrative busywork", to honour & respect the policies laid out by the community process. &, as i have already pointed out above, it does not increase the workload to process scope deletions properly, it simply allows time between the nomination & the removal, for members of the community (including the uploader) to express their opinions & discuss the matter, before action is taken. this is why we have a deletion debate process.
point 5. the mere fact that you feel it is appropriate to separate out sexual content as a category where non-debated, non-reviewable "instant deletion" for "scope" is acceptable, shows considerable bias. if we are going to allow admins to instantly delete "out-of-scope" materials, why not do it for everything?
point 6. your comment "Any objections to the "Nopenis" escape valve have almost always been on abstract theoretical philosophical grounds, and NOT because anyone really believes that many valuable images are being deleted." clearly indicates that you have received multiple objections to this activity; which, again, has never been approved of in any open community process.
point 7. i consider it "worthwhile" to respect the policies & procedures laid out by the community, & i do not consider it "abstract, theoretical, & philosophical" to object, when these same policies & procedures are violated. especially when it comes to such important questions as "who gets to judge what materials should be removed from the wmc collection".
point 8. i'm getting tired of indenting, so if you would like to continue this discussion, shall we open a new section? Lx 121 (talk) 19:09, 10 December 2010 (UTC)[reply]
Dude, I am not "losing the vote", because this is a purely non-binding side-discussion to the main poll (which is itself NOT a "vote" in the sense that one more support than oppose would mean the policy would be adopted while one more oppose than support would mean not adopted). It's not so much that Commons is being overwhelmed by an immense quantity of low-quality genital snaps as that allowing them to accumulate unchecked without any periodic trimming or pruning would eventually tend to tip the balance of Commons in a direction that most thoughtful people don't want it to be tipped... AnonMoos (talk) 12:18, 11 December 2010 (UTC)[reply]
to clarify: you are losing the vote seeking approval for a "special sub-policy" to remove materials as "obviously out of scope" by speedy deletion, without community discussion. both in this sub-discussion & in the main vote, that provision is clearly being demonstrated to be objectionable to a significant part of the community. & you still have not provided any data to back up your stated concerns, even tho you claim expertise on the problem. & you still have not provided any reasonable explanation for why the current deletion procedures are inadequate. all you keep saying, over & over again is: "this is a huge problem, we must have these powers to fight it, & it's not good enough to delete the material via the normal process (because it's too much work?)". Lx 121 (talk) 00:48, 12 December 2010 (UTC)[reply]
Dude, there simply is NO VOTE in any meaningful or relevant sense of the word "vote", and any counting of the comments in this particular subsection of the discussion will not be definitive for any policy changes -- and you're deluding yourself quite strongly if you imagine otherwise. AnonMoos (talk) 06:12, 12 December 2010 (UTC)[reply]
There are/were 1000 penis images on wmc, or 0.015% of all the wmc images. Considering all the things that one could photograph, that is a huge number. There are 250,000 beetle species, each one of which has a differently shaped penis. If wikipedians were as diligent in recording beetle penises as they are in recording their own, the number of photos of beetle penises would be over 100 million. Fortunately, biologists don't feel the need to record 1000 dicks for each beetle species, even though the size and shape is diagnostic in species identification, and in most cases they seem to get by with a single line drawing. John lilburne (talk) 20:28, 10 December 2010 (UTC)[reply]
I think that standards in biology are likely to change - the days of "the" Cryptocephalus abdominalis penis, or "the" (species) anything, are numbered. The individual species is only an approximation; the world is covered in hybrid zones. And a w:ring species will expand horizons. I doubt we'll see 250,000 species of beetle, or their various accessories, anytime soon, simply because a greater level of skill and equipment is needed, but when the day comes it will be nice to have. Wnt (talk) 14:56, 11 December 2010 (UTC)[reply]
Yet it is the approximation that forms the basic classification for the life sciences. Whilst penis shape remains a main determinant in species determination as far as biologists are concerned, I suspect that female beetles don't go about saying "Hmmm nice cock that will fit nicely", and that there are other determinants that actually decide on choice of mate, or more specifically whether any one organism is considered a suitable partner. The point remains though that in clumping individual beetles into species, biologists don't need 1000 photos of penises to a) determine what one looks like, or b) determine that the one being examined matches the norm for any particular species. 62.49.31.176 17:25, 11 December 2010 (UTC)[reply]
but if you are an entomologist compiling a comprehensive atlas of comparative beetle anatomy, you do need those images. a "line drawing" is not an adequate substitute for photographic materials. Lx 121 (talk) 00:48, 12 December 2010 (UTC)[reply]
What you don't need is 1000 of them. Line drawings are good as they only indicate the salient features. Too many poor quality photographs of botched dissections will overwhelm the good. A vast quantity of this stuff isn't needed, and an un-curated resource isn't nearly as useful as a curated one. Let's leave beetle genitalia to one side and examine, say, wing venation on dragonflies, which can be diagnostic for some species. Again, one doesn't need 1000s of images to show the difference between Onychogomphus uncatus and O. forcipatus.
So when there are 1000 photos of human penises comprising 0.015% of all images on Commons, one has to say that the number is out of proportion to their actual worth, and simply reflects the preoccupation of the uploaders. John lilburne (talk) 16:47, 12 December 2010 (UTC)[reply]
Those can be speedily deleted if they don't have an explicit declaration of consent with the reason "not permitted to host." I would like to try that for a month or twelve to see how it goes before rejecting the potential solution. Ginger Conspiracy (talk) 03:33, 9 December 2010 (UTC)[reply]
  •  Support with prejudice. Easy to abuse with no greatly necessary pragmatic purpose. Low quality cell photos of dicks should just get their own special policy if they're such a problem. The use of this policy has too great a potential to be abused and isn't critical in removing obviously unnecessary images. AerobicFox (talk)
  • I share the concerns expressed at the top of this section. See my "oppose" vote comment above. My concern focuses mostly on non-sexual pictures that might be speedy deleted for outofscopeness. Teofilo (talk) 15:04, 11 December 2010 (UTC)[reply]

Potential "Miller Test" Material-- speedy delete or not?

The current version gives conflicting instructions. First, we're told to speedy delete any material that is illegal for Wikimedia to host. Later, we're told that "Because of the complex, subjective nature of obscenity law, work should not be speedy deleted on the basis of perceived obscenity".

It's confusing to tell people to speedy delete anything illegal, then teach them all about obscenity law, only to finally inform them that obscenity is just too complex for a lay audience to decide.

Perhaps the obscenity law discussion belongs in some essay, guide, or instructional page--- something that doesn't carry the force of policy. Right now, we try to educate an international population about some US laws that hinge upon US interpretations of 'community standards' in unknown ways, thus making it utterly impossible for a non-lawyer to accurately gauge an image's potential legality.

--Alecmconroy (talk) 03:40, 8 December 2010 (UTC)[reply]

I too have concerns about the "Miller test" provision. The courts have generally been very lenient with the Miller test, rendering it nearly moot. However, the plain language of the Miller test, if applied in a reasonable way, would exclude large swaths of material. I can't really vote on this proposal as long as it states that material must comply with the language of the Miller test, considering the potential for abuse there by editors who take the Miller test at face value instead of applying as the courts have. Gigs (talk) 04:08, 8 December 2010 (UTC)[reply]
If you can cite material explaining this greater leniency, it would make a very welcome footnote to this policy. My untutored impression is that w:Mike Diana lived in Florida. Wnt (talk) 12:54, 8 December 2010 (UTC)[reply]
Regarding the impact of US laws, this is a consequence of the fundamental NOTCENSORED pillar. I've tried to prevent the policy from excluding any sort of material except for some actual reason beyond our control. We aren't picking out stuff and saying it's off limits simply because we don't like it. As a result, the face of Commons is inevitably going to reflect the tread of the boot that stepped on it. We're not going to say that we walked into a doorknob or fell down the steps. Wnt (talk) 13:01, 8 December 2010 (UTC)[reply]
Wnt -- What "Wikipedia is not censored" means is that an image is not automatically deleted just because someone considers it offensive or unsuitable for viewing by children. However, "Wikipedia is not censored" does NOT mean that we can't make a considered deliberate decision not to host certain types of material which we have concluded does not further our goals. AnonMoos (talk) 03:07, 9 December 2010 (UTC)[reply]
i'm sorry, but your argument is sophistry; when you "make a considered deliberate decision not to host certain types of material", that is censorship. once we define the project's goals (which, for wmc, would be: media repository, educational material, free and/or open-source only), then the basic considerations for inclusion/exclusion of materials are: 1. relevance (i.e.: scope) 2. legality (copyright, other legal restrictions on permitted content, etc.) if you want to add a 3rd parameter it should probably be: 3. technical limitations (storage, bandwidth, etc.) on that basis, you are welcome to argue why you feel that sexual content "does not further our goals", but please do not pretend that it's not censorship! Lx 121 (talk) 21:36, 9 December 2010 (UTC)[reply]
People are free to speak all they want, but if they want to use our servers and bandwidth to do so, then it's perfectly reasonable to require that they follow certain basic ground-rules that we have laid down. "Relevance (i.e.: scope)" means exactly and only past considered deliberate decisions which have been made not to host certain types of material which we have concluded does not further our goals -- and if we make further considered deliberate decisions in the future not to host certain other types of material which we conclude will also not further our goals, then that will redefine the exact meaning of "relevance (i.e.: scope)" on Wikimedia Commons. You can call it "censorship" if you want to, but that sounds a lot like mere somewhat superficial rhetoric to me, since 1) We're under no legal or moral obligation to indiscriminately host absolutely everything that anyone submits; 2) We're doing nothing to limit their ability to use other forums if they're not willing to respect the basic ground-rules here; and 3) There are still very few topics which are under any kind of absolute content-matter ban, and those are as a result of external laws (not Wikimedia Commons policies). AnonMoos (talk) 11:27, 10 December 2010 (UTC)[reply]

I think it's fair to say that administrators can make satisfactory judgements about anything outside of a wide gray area, and the information in the proposal helps them understand where and how wide the gray area is. We already have instructions at COM:CSD to delete some kinds of illegal material (copyright violations), and technically, it is impossible to license a lot of the material forbidden by anti-pornography statutes, so the CSD for non-free content applies too. Therefore, the proposal represents an incremental improvement in any case. 71.198.176.22 10:47, 8 December 2010 (UTC)[reply]

i have no problem with removing material that it would be illegal for wmc to host, but which statutes are you referring to? o__0 Lx 121 (talk) 21:36, 9 December 2010 (UTC)[reply]
I think everyone agrees that obviously illegal images should be immediately deleted. Many such kinds of illegality can be judged by our editors-- copyright or consent, for example. But US obscenity case law is never obvious to untrained eyes. Non-specialists cannot correctly apply the Miller test.
If you let people start using their own personal standards to apply Miller to our content, how do you resolve disputes? All editors can do is argue back and forth over whose standards best represent 'the Florida, US community standards'.
Legal questions are for office. Don't have our admins singlehandedly decide such important and complex legal questions. Don't make our editors debate over legal obscenity, each arguing based upon their own impressions of what "FL, US community standards" look like-- that will just spark divisive flamewars over religious, cultural, and philosophical differences.
If someone thinks an image is obscene, tell OTRS and let the lawyers take a peek. If it's truly obscene, you can be sure they'll be the ones best suited to actually do the deletion and take whatever other legal steps are necessary. --Alecmconroy (talk) 11:55, 8 December 2010 (UTC)[reply]
I agree; again, I think the proposal represents an improvement over the existing COM:CSD in the direction you're supporting. 71.198.176.22 12:02, 8 December 2010 (UTC)[reply]
I would say that legal questions are for lawyers, but not necessarily that legal questions are for office. If it's something where great haste is needed to avoid discrediting Wikimedia, that's one thing, but if it's a debatable case, we might as well make the decision in public. There's no reason why Wikimedia's legal counsel can't give input directly to the AfD process. Mostly, however, my hope is that the act of discussing a borderline case in AfD will goad those involved into presenting good arguments that the content actually has educational, scientific, artistic, or political importance, thereby hardening Commons against any potential prosecutions. Wnt (talk) 12:51, 8 December 2010 (UTC)[reply]
And then the closer judges whether their arguments met the Miller Test or not? No thanks. Gigs (talk) 17:33, 8 December 2010 (UTC)[reply]
I too raised Dost and Miller as a reason for an oppose. It is ironic, as I do spend a lot of time policing out images from en:wp article pages that fall within the spirit of Dost and Miller. But that is en:wp, and I can envisage occasions where even those images may have an educational content. Before I make a decision on a policy, however, I need to understand the issues completely - and from the Miller test and Dost test articles I don't get that. Just warning bells. They are only Start-class articles - if they were GAs then I might have enough information to take the risk and form a policy. At the moment I sense a Weapons of mass destruction fiasco is being engineered. So even before we take into account whether we wish to guide a global project on the perceived views of a small community somewhere on the 26th parallel, could someone work up those two articles to GA status.--ClemRutter (talk) 20:15, 8 December 2010 (UTC)[reply]
Illegal material will be deleted by administrators, with or without this policy. The Miller test is used to deem illegality, so it is relevant whether or not we mention it. But if a work, taken as a whole, has serious literary, artistic, political or scientific value, then it will not be regarded as obscene. That means, more or less, that no work that is in project scope is obscene by this test. Educational value is not mentioned, but I understood that would not be a problem.
So mentioning the Miller test is not about allowing censoring, but about ruling out some arguments about illegality in the deletion debates.
--LPfi (talk) 20:58, 8 December 2010 (UTC)[reply]
If applied properly, the Miller test won't be a problem for any of our content. If applied improperly, taking the test at face value, it could lead to lots of problems. Let's cut the current Miller test text-- nobody knows what that test even means within the context of a global project like ours. --Alecmconroy (talk) 06:52, 9 December 2010 (UTC)[reply]
one obvious point here: we already have a policy for deleting illegal materials, which makes that aspect of "commons:sexual content" irrelevant/redundant. as re: "the miller test", i don't see any way that we can realistically train all our admins on here to administer it, to a fair, even, & credible legal standard. failing that, it becomes a hopelessly subjective "bogeyman" standard, being applied differently by every admin, & the source of yet more internal strife @ commons. it's a nice idea in principle, but if even the courts can't make it work as a practical, implementable, objective standard; how are we supposed to? for materials that aren't clearly illegal for us to host, the place for discussion & consideration (especially of issues re: scope!) is in open community debate, NOT by admins in secret, with no true open review of their actions possible Lx 121 (talk) 21:46, 9 December 2010 (UTC)[reply]

other languages

Well, I know that it is a lot of work, but if I see this case correctly, people who do not have sufficient proficiency in English will have to deal with the outcome of this vote by people who do. Hm, I thought this is a multilingual project, where even non-English speakers are welcome? --Catfisheye (talk) 23:21, 8 December 2010 (UTC)[reply]

If you want to translate it, go ahead. But translations are hard, we don't have anyone volunteering to do translations of this document, and it would take several translations to even start to cover our non-English speaking users. Not to mention that disagreements are turning on delicate turns of phrase that won't translate exactly, and hence those voting based on versions other than the English one won't get an entirely accurate view of the proposal. So translation is not our first priority.--Prosfilaes (talk) 00:51, 9 December 2010 (UTC)[reply]
I considered the problems you mentioned, but knowing that, how can the result of this poll be regarded as representative, i. e. obligatory for all, if a non-negligible number of users is excluded from the beginning? That of course does not apply only to this poll, but is a general problem. Maybe a system can be adopted where translations do exist, with a caveat about the limits of what a translation can do, but the voters have to vote on the English version. But how to ensure that translations are made? Do you know which chapter of the foundation is responsible for Commons? --Catfisheye (talk) 01:47, 9 December 2010 (UTC)[reply]
This is a fair argument, except... isn't every Commons process done in English? Deletion discussions, admin noticeboards, village pump, etc. Hopefully the policy won't infringe very much on other projects --- with the notable exception of explicit consent. Handling the language issues regarding that may not be easy. Wnt (talk) 02:17, 9 December 2010 (UTC)[reply]
It's really critical that we involve non-english-speakers in this decision-- they are going to be the ones most affected, since they won't even have the opportunity to come to commons and argue for undeletion. I think I'd feel very uncomfortable about that set up, if the shoe were on the other foot and it was a german-language community deciding what can and cannot be allowed here.--Alecmconroy (talk) 06:32, 9 December 2010 (UTC)[reply]
I think the way to involve non-english-speakers is to have those understanding English report about the proposal and key points in it. In most communities there is at least a big minority knowing English. If the community finds the proposal problematic, we can hope enough people will come here also from those communities. There are easily some hundred people knowing English in every big language Wikipedia, enough to overturn any result here. Of course the USA and Western Europe are overrepresented, so we should listen carefully to those representing other cultures.
The same is true about being affected by policies. There is a village pump for all languages, at least in the Wikipedia project. Those not knowing English can ask for help there. I think translation of introductory texts is more important than translation of proposals: when new users feel at home in Commons, using their own language to upload and describe their files and discuss problems (at their village pump or user talk pages), they will know how to find somebody to help them with the English.
--LPfi (talk) 08:30, 9 December 2010 (UTC)[reply]
It's hard to argue that this isn't a concern, but I'm not volunteering to do the translation either. In the near term, if the policy passes, some latitude will be needed for people who don't speak English. If need be (and I don't think it really should) the other projects can also host their own images locally. Certainly on en. that has to be done quite often for copyright/fair use reasons. Wnt (talk) 16:26, 9 December 2010 (UTC)[reply]
The biggest problem of this poll is the language. Who is willing to participate in a poll that he can't even understand, or at least not in detail? Commons is a project that provides images for all language versions. A poll that is only held in English is not representative at all. First the translations. Then the poll. Otherwise it is like cheating on all the other projects. --Niabot (talk) 20:27, 9 December 2010 (UTC)[reply]
Whatever. I've noticed no impulse on anybody's part to start translating it, so all this amounts to is a stalling tactic, which can be used to block any new policy. Most current policies have a handful of translations; none of Commons:Blocking policy, Commons:Photographs of identifiable people, or Commons:Nudity have more than five translations, none of them into German, Russian or Arabic. There is no hope of getting this translated into every language of Wikimedia, so at the very least you're going to have to specify which languages you're agitating on behalf of.--Prosfilaes (talk) 21:57, 9 December 2010 (UTC)[reply]
Sorry, but that's not cool. Every contributor--regardless of native tongue--should be a part of all major Wikimedia-wide policy decisions. The fact that this isn't already mandated absolutely amazes me. And if finding volunteers to translate proposals is such a barrier, then we need to have the oversight to prepare a reliable plan in advance.   — C M B J   22:49, 9 December 2010 (UTC)[reply]
I agree with OP Catfisheye and others: It really is a problem in our multilingual project: confusio linguarum. But it is a problem for all polls here. It is not only relevant for this one.
The European Union has a big, big, very costly institution (en:Translation Centre for the Bodies of the European Union) to provide all the translations to all languages in the EU. But Commons has more languages than the EU and cannot pay certified, professional translators to translate 100 % correctly.
I do not know the solution to this problem right now - but it surely is a problem for Commons. It especially is a problem when it comes to polls where the language restriction creates a bias towards a specific result - as is surely the case with this topic. This poll cannot be representative for all members of the Commons community. Some parts of the community rely on our wonderful translated templates and messages and manage to upload images and do the stuff they then need to do. As nothing here is translated, this poll is not accessible to them.
This problem does not matter here as long as the poll is closed as "no consensus", but if it were not, we would have to ask whether it really is a consensus or whether it only looks like a consensus due to a strong bias in the voters' ability to vote. Cheers --Saibo (Δ) 23:09, 9 December 2010 (UTC)[reply]
Once again, you haven't responded with anything practical. All contributors are never going to be a part of all policy decisions; only a tiny percentage of contributors have responded to this. There are 276 Wikipedias; cut, fold and spindle however you want, that's still 200+ languages. Commons:Community portal is available in less than a fifth of them, and that's a lot; Commons:Project scope is available in 14, one fifteenth those languages. So far, not even one person has said "I want to spend my time and energy translating this proposed policy."--Prosfilaes (talk) 04:40, 10 December 2010 (UTC)[reply]
I believe that you are unnecessarily devaluing the points being made here. No one is saying or implying that all contributors will be a part of all policy decisions, but rather that all contributors should have the opportunity to be involved in planning the future of the project that they are part of. And the conundrum is not one of practicality; it is one of fairness and universal access; one that is akin to alt attributes and wheelchair ramps. But even from a pragmatic standpoint, there is no reason why we cannot solicit two or three active, bilingual administrators from each of those 200+ projects to translate major (sitewide) proposals.   — C M B J   05:05, 10 December 2010 (UTC)[reply]
The fact is, alt attributes and wheelchair ramps are tools for cheaply supporting the moderately challenged. Text readers will mangle many words found in Wikipedia articles; should we not dedicate ourselves to making sure that every Wikipedia article has a spoken version? The simple fact is, not all of those project have two or three active administrators, and many of them would find it too much of a call on their time or simply outside their interests to translate these policies. So far, not one person has volunteered to translate this proposal, which doesn't bode well for the theoretical possibility of getting it heavily translated.--Prosfilaes (talk) 05:25, 10 December 2010 (UTC)[reply]
When I skip through the names of the voters, I see a strong geographic bias: There are many high-profile German users on the oppose list (Elian, Markus Cyron, Catfisheye, Saibo, Jan eissfeldt, Thogo, Paramecium, DerHexer, Ukko, Sargoth, Julius1990, Widescreen, Fossa, Kmhkmh, Niabot, Adrian Suter, Chaddy, Kuebi, Gereon K., h-stt, Cartinal, Don-kun, Hybscher, Achim Raschka, Grim.fandango, Andim, Gripweed, Simplicius, Liberaler Humanist), many of them admins, all of them very active. The same cannot be said about the support list. I didn't find a single familiar German Wikipedia name on the support list. Based on this sub-poll, it seems fair to conclude that German WP would not be happy with the proposed policy. I'd expect some kind of fork in case the policy is accepted.---<(kmk)>- (talk) 02:47, 10 December 2010 (UTC)[reply]
Add me, though probably not high-profile :-) and several others that I recognize. I wonder that I did not see any Dutch colleagues on either list. I still suspect this is somewhat a European / (US)American schism. --Purodha Blissenbach (talk) 02:57, 10 December 2010 (UTC)[reply]
Thanks for your quick research KaiMartin! (By the way: you probably confused oppose and support here: "I didn't find a single familliar german wikipedia name on the oppose list.") Cheers --Saibo (Δ) 03:49, 10 December 2010 (UTC)[reply]
(corrected)---<(kmk)>- (talk) 10:49, 10 December 2010 (UTC)[reply]
There are some Germans on the support list (Cvf-ps, Singsangsung, AFBorchert, Yikrazuul and High Contrast). And even though you missed a couple of "high profile" Germans who oppose, the German opinion is clearly against the policy.
The main problem with the language is that by using English we adopt an American/British way of thought. I don't see why a policy for an international project should even mention practice conducted by the US Supreme Court. --bluNt. 07:25, 10 December 2010 (UTC)[reply]
Delete COM:L, and I'll go for it. Otherwise, we're stuck dealing with the fact we live in the real world and have to deal with the laws laid upon us.--Prosfilaes (talk) 08:18, 10 December 2010 (UTC)[reply]
If we think of linguistic fairness, Germans are no problem. English is taught as the first foreign language in Germany, German and English are related, and Germans know English better than most non-native speakers (that is my impression at least). Germany is a Western country. The problem is serious e.g. with Latin America, Russia, China, the Arabic world and most of Africa. Spanish, Russian, Chinese, French, Portuguese and Arabic are big languages though, with the potential of finding translators (but having good translations of all proposals is probably too much to ask).
We also have a lot of small languages, in countries where none of the mentioned languages is the main foreign language, or foreign languages are not commonly known. The latter is a real problem for the cultures affected, to which I see no viable solution, other than being understanding towards people using less known languages (e.g. in creating categories or asking for undeletion of their files).
--LPfi (talk) 09:33, 10 December 2010 (UTC)[reply]
I would say that Germans do not have such a big problem (statistically). But you may consider that Germany was still divided 20 years ago, and anyone older than 30 in East Germany (DDR) had Russian as their foreign language. I know many German wikipedians who suffer from this language barrier and aren't able to sit at the same table as we currently do. And yes, you are right. For other countries/regions it's even more difficult to participate. I guess that 50% of the projects don't even know that this poll is being held, or aren't able to figure out what it is about. In my opinion such a poll (which has a global effect) should at least be available in the five major languages.
Also I read: "no one cared to translate it". But I also know: "no one asked to do so". --Niabot (talk) 10:19, 10 December 2010 (UTC)[reply]
um, maybe a stable version of this proposal would make it easier to find translators? --Catfisheye (talk) 11:57, 10 December 2010 (UTC)[reply]
Maybe, yes, since I see no sense in spending much time and effort translating this proposed policy, which I do not support. This could be called stealing of time by some people who think that we need a new policy with this and that content just because some are overreacting to US sunshine press/media agitation.
@ ~"Germans do not have the biggest problems" - sure - but I and others discussing here know it best from the German community. If problems occur even there, they are worse for other language communities who do not have English as a widespread foreign language taught in school and whose language is not similar to English.
@Prosfilaes: Yes, I know, and I even wrote that my comment above was nothing practical. There is no need to give me a note about this; I know it. It was just an attempt to look a bit beyond our own nose and get ideas and impressions from areas with similar problems. Cheers --Saibo (Δ) 15:22, 10 December 2010 (UTC)[reply]

A few little queries

I've gone through some of this, copy-editing without intending to change the substantive meanings. Among little issues I've spotted (and may not understand) are these:

  • "A media file that is in use on one of the other projects of the Wikimedia Foundation is automatically considered to be useful for an educational purpose. Such a file is not liable to deletion simply because it may be of poor quality: if it is in use, that is enough. Exclusive use of a file in userspace does not constitute educational purpose." I'm surprised that no assessment would be made of whether the use of the file is educational. This is a requirement, for example, of some fair-use rules. Is material allowed on talk pages alone, and other WP space alone?
  • "Your communication will be confidential and will be viewed only by trusted volunteers who have identified themselves to the Wikimedia Foundation by name. For further information see Meta:OTRS." So WMF employees won't be able to access it? Tony1 (talk) 07:09, 9 December 2010 (UTC)[reply]
The main point is that the projects are allowed to define themselves which images are suitable and we on Commons should not delete images that are deemed useful (for an educational purpose) in the projects. We might think that an image is useless low quality porn, but if it is used on a Wikipedia article, then we should not think we know better than the editors of the Wikipedia. User page use is explicitly not considered educational use (but COM:PS has some comments about scope and user pages).
Use on talk pages is more tricky, but I think removing an image from a WP: or talk page could be very disruptive. If there is no problem with the image (other than being out of scope), I think leaving the image is certainly the right thing to do. If the number of such images is too high, then the problem is with the projects, not with Commons hosting them. Individual disturbing images might be protected this way, but there are probably many more disturbing clearly in-scope images. ("Fair-use rules" is probably referring to some individual Wikipedia? The problem is different as it can be handled locally.)
--LPfi (talk) 09:05, 9 December 2010 (UTC)[reply]
Good catch about the WMF employees - when we make a promise like that we should be able to adhere to it literally. Wnt (talk) 16:47, 9 December 2010 (UTC)[reply]

I've removed some of the mid-sentence bolding. (1) It looks messy. (2) It often excludes the critical word, which comes before the bolding. I do not believe it aids comprehension, unless perhaps bolding is used at the very start of each point, as an inline heading, as it were (such as in Speedy deletions). Tony1 (talk) 07:12, 9 December 2010 (UTC)[reply]

Images of deceased people

I don't know if this is the right place for this; if not, I'll delete it. I hate to re-open an old dispute. There need to be clearer image policies about situations where the assumed consent of a deceased person should be required.

When I joined Wikipedia completely new and inexperienced last February, I was right away immersed in a dispute regarding the nude image of a murdered child on Wikipedia. Her private area was barely visible, but it was there, and if that was someone I loved exposed that way, I would have been devastated. For the record I hate censorship, but this was a case where the rights of a person come first.

When I tried to have the image removed, editors:

  1. Gave me the old not censored disclaimer
  2. Insulted my intelligence with pseudo logic and platitudes
  3. One editor flamed me with some serious personal attacks
  4. Tried to ignore me away under the pretext of Don't Feed the Trolls when I tried to make a peace gesture

I was so intimidated but I didn't go away, I'm here. The image was finally deleted three months later for a copyright technicality, but the way that poor child was exposed for over a year is a stain on Wikipedia's reputation. Slightsmile (talk) 17:02, 9 December 2010 (UTC)[reply]

The goal of Wikicommons is to create a repository of free works which then can be used for educational purposes. If somebody's feelings get hurt in the process, it is bad of course, but if there is a policy it should be to help those people deal with their feelings rather than doing them a disservice by denying their problem and removing the images. Beta M (talk) 17:28, 9 December 2010 (UTC)[reply]
The existing COM:IDENT policy doesn't contain any word about living, dead, or deceased persons, so there's no reason to think they would be treated differently. However, if there was a copyright issue involved, the photograph may have been previously published, and that policy is addressed to the individual taking the photographs, not people uploading previously published photos. This policy follows in the same path. The idea is that if a photograph is published already, it's already public and Commons can't change that. Wnt (talk) 03:40, 10 December 2010 (UTC)[reply]
Further down in the link provided by Wnt, Moral issues, it does touch on the concerns I've expressed, regardless of whether the image is already published. I'm just saying these policies should be better defined.
I want to thank Beta M for the perfect example - if you don't like an image go see a shrink. Users shouldn't have to take this crap when they have concerns like this.
For whatever it's worth I've seen similar discussions in the Black Dahlia and Goebbels children talk pages. Slightsmile (talk) 17:42, 10 December 2010 (UTC)[reply]
I would say that the "moral issues" expressed there are based on a very idiosyncratic notion that the international "right to dignity" is actually a responsibility of general censorship. I would accept that the right to dignity means that a state actor can't lock detainees to a chair and broadcast to the Web as they pee themselves. But I don't accept that international law should be taken to prohibit republication of images by individuals. Certainly we have not heard much of it in the Abu Ghraib affair, which was as flagrant a violation as could be devised.
On reflection, there is a small omission in the policy here. My first reaction is that the policy prohibits a contributor from taking an autopsy photo on his own and posting it here without consent. But what is the "consent" of a dead person? Does it default to yes or to no? Does it matter if he donated his body to science? Maybe you can ask the next of kin, but there's a can of worms for you... you'll be arguing about gay marriage before you get to the bottom of it. Wnt (talk) 14:31, 11 December 2010 (UTC)[reply]
For what it's worth, Belgian law says that the heirs' right to control the dead person's image (with probably exceptions, like public places, or informational purposes) ceases 20 years after death[7]. This could provide ground for deleting a Belgian picture if lacking heir consent. Teofilo (talk) 16:57, 11 December 2010 (UTC)[reply]
I think it would be simple enough to identify which cases would require the Assumed Consent Test - if it was you, would you want to be seen like that? For example, in my opinion: innocent murder victims would merit the test. Executed Nazis would not merit the test. There will be grey areas, but I think most cases would be just plain old common sense. Slightsmile (talk) 18:56, 11 December 2010 (UTC)[reply]
I am not sure about that. Do you say that images of bad guys are ok, while images of good guys are not, unless they would consent? Who's going to judge who are good guys? The victor? What about en:My Lai? Requiring consent is non-problematic when we can get images where the people do consent (such as normal sexual content). In the case of historical events (such as murders and executions) we have to use other criteria. --LPfi (talk) 20:47, 11 December 2010 (UTC)[reply]
Some of the bodies in the top picture in the My Lai article would probably fail the consent test, but it is such an iconic photo - sure, there will be special cases. In the case of many Holocaust articles on Wikipedia we see bodies illustrated, but I haven't seen any images, that I remember, humiliating the victims here on Wikipedia, and there are lots of them out there. So Wikipedia does already seem to exercise some kind of discretion. It is a good idea to form some kind of policy. Not having some kind of safeguard against humiliating victims is not an option.
There should be some kind of require-consent list. For starters, start out with the most obvious cases where the test would apply - nude bodies of innocent victims seem to me a no-brainer. Add or don't add situations to the require-consent list as issues come up. Something tells me this guy wouldn't qualify. Sorry if I use this word a lot - common sense. Slightsmile (talk) 00:58, 12 December 2010 (UTC)[reply]

while we're chatting.....

there's a deletion discussion quietly going nowhere featuring two lovely people having sex as radiohead plays gently in the background. There's no indication that both parties agreed to the worldwide publication of their efforts, nor that the band members involved in the creation of the music approve either. In the charming wiki world we live in, one of those facts will at some point, maybe after a month or two, ensure that the file is removed - funny old game.... Privatemusings (talk) 22:52, 9 December 2010 (UTC)[reply]

Radiohead plays in the background = copyvio. There is no need for this rubbish to delete the flick. sугсго 08:32, 10 December 2010 (UTC)[reply]
you've sort of made my point, syrcro - imagine for a moment that this is a video of someone's ex-partner, and that one of the individuals involved would be mortified and humiliated by the worldwide publication of their shenanigans, which may have been recorded unbeknownst to them. do you really feel that the people in that circumstance we should most concern ourselves with are the multi-millionaire chaps who happened to strum a string or two in the background? Privatemusings (talk) 10:35, 10 December 2010 (UTC)[reply]
I thought our positive consent requirement would entail at least nominal permission from both partners (exes). Ocaasi (talk) 10:46, 10 December 2010 (UTC)[reply]
well it would (it's really the only substantive change within this proposal currently) - but it's not been adopted yet :-) Privatemusings (talk) 01:36, 13 December 2010 (UTC)[reply]
No, that's a porn site. In fact it's a porn newspaper called Österreich, not the federal broadcasting service. --Liberaler Humanist (talk) 20:45, 10 December 2010 (UTC)[reply]
the above commentor is either joking, or factually incorrect. http://www.oe24.at/ is quite clearly a news service Lx 121 (talk) 06:03, 12 December 2010 (UTC)[reply]
oe24.at is the website of the newspaper Österreich which has got nothing to do with the public broadcaster ORF (Österreichischer Rundfunk). However, the event itself was also featured in the ORF news broadcast ZIB 2. --84.112.138.221 13:15, 12 December 2010 (UTC) [reply]

Beating the dead horse

Expanding my voting rationale, I just want to emphasize the extent to which the proposed policy duplicates the existing sexual content policy COM:CENSOR and the guideline COM:PORN, as well as COM:PRP, all of which are mentioned in the proposed policy. Nearly all sections of COM:SEX are actually moot elaborations of the aforementioned Commons pillars with minor copyright issues. The three types of prohibited content are already covered in COM:CENSOR, COM:IDENT and COM:SCOPE respectively, and all of them are mentioned in that core section. The tandem of the famed Miller test and the relevant legal disclaimer in COM:CENSOR has for quite a long time been, and still is, a powerful and succinct remedy against undesired agendas, encapsulating all relevant policies and guidelines. Without this old tandem Commons most likely would have been flooded with porn, but, from what I see, it is not. And, as far as I know, the proposed policy was not put forward amid some rising wave of illegal or out-of-scope uploads. What we are doing now is jailing what has been already jailed. Brandmeister (talk) 13:10, 10 December 2010 (UTC)[reply]

Yes, mostly we are jailing what has been already jailed. But one reason for this effort is that the founder of the project, the board of the supporting foundation and perhaps public opinion do think that we have a problem. That they think we have a problem is itself a problem.
The mentioned policy and guideline sections are not very well written and they are not very easy to point to, when dealing with somebody not knowing Commons well. The proposal is written not to change current policy and practice, but to have it well written in one place and to clarify some issues.
I was surprised by all "this is censorship" (and "we do not need porn") oppose votes. I do not see what media will be deleted by having this policy that would not be deleted otherwise or vice versa – other than this policy hopefully reducing randomness.
--LPfi (talk) 14:24, 10 December 2010 (UTC)[reply]
If this policy becomes accepted, I think that COM:PORN and COM:NUDITY might reasonably be superseded by it. Wnt (talk) 14:06, 11 December 2010 (UTC)[reply]
If all this seeks to do is to encapsulate existing policy in one easy-to-find place and to placate the media, then it would be fine. If the two added clauses that make images deemed illegal or out of scope subject to speedy deletion rather than a regular deletion review were removed, then I, as well as many others, would probably support it. As for images that have already been speedy deleted as pornographic and out of scope, here are some: File:Édouard-Henri Avril (27).jpg, File:Franz von Bayros 016.jpg, File:Félicien Rops - Sainte-Thérèse.png. --AerobicFox (talk) 02:31, 13 December 2010 (UTC)[reply]

"Prudery" seems to me a very odd accusation

I notice that an enormous amount of the opposition to this proposal cites "prudery" as a reason to oppose. I fail to see wherein the prudery lies. Basically, except for noting that certain content is illegal for us to host in the country where our servers are located (something we are merely explaining and that we, as participants in the project cannot affect), there is nothing here saying that any subject matter is off limits: quite the contrary. The focus is on making sure that (1) identifiable people who are photographed in a sexual manner have given their consent for these photos, (2) content is accurately described and accurately categorized, in accord with the "principle of least astonishment," and (3) we are clear that Commons is not interested in hosting a bunch of poorly made penis shots (which are already routinely deleted). In particular, note the sentence, "Except for images prominently featuring genitalia or sexual activity, mere partial or total nudity is generally not considered sexual content."

As several other people have pointed out, this policy is far more a gathering in one place of how we already work than it is breaking new ground. Many have objected to it on exactly that basis. I personally think we should nonetheless have an explicit policy in this area in order to stave off rampages like the one Jimbo engaged in a few months back. But it is looking clear that we are not going to have consensus to adopt such a policy.

Can someone please clarify what, precisely, in the policy they see as "prudery"? Are the opponents to this really saying that they think we should host thousands of poor photographs of ordinary penises, allow photos taken with no consent of people having sex (or non-consensual upskirt photography), and/or allow a photo to be titled "my girlfriend's hot pussy" and described in terms that come straight out of Hustler? I, for one, fail to see how that furthers Commons' mission. And, if that is not what you are saying, what exactly are you objecting to as "prudery"? - Jmabel ! talk 17:55, 10 December 2010 (UTC)[reply]

I don't think that the majority of opposers think "we should host thousands of poor photographs of ordinary penises, allow photos taken with no consent of people having sex (or non-consensual upskirt photography), and/or allow a photo to be titled "my girlfriend's hot pussy" etc.", quite the contrary. As far as I can see from the arguments on the list, most opposers say, and I agree, that our current policies are sufficient to prevent or eventually delete those kinds of pictures. What I and a lot of other opposers are saying is that, with the vague wording on speedy deletion of "out of scope" images as well as the odd and legalese "Miller test", the proposed policy is in effect granting singular powers of judgement to individual admins that can potentially be misused in defence of "prudery", as well as a host of other non-acceptable excuses. --Saddhiyama (talk) 20:09, 10 December 2010 (UTC)[reply]
You are at least partly right. Looking again, I was wrong to say that an "enormous" amount of the opposition was on these grounds. I guess it's just odd to me that any of it is. - Jmabel ! talk 22:35, 10 December 2010 (UTC)[reply]
You and a lot of the opposers continue to mischaracterize the wording as speedy "out of scope", when it actually says speedy "obviously out of scope". Any admin who deleted files as obviously out of scope which later turned out to be IN scope would most likely be severely chided (as Jimbo was for his mistakes). --99of9 (talk) 02:26, 11 December 2010 (UTC)[reply]
2 problems with this: 1. "obviously out of scope" is a subjective judgement-call & it should NOT be left to the judgement of individual admins; there is overwhelming evidence in the records of this wikiproject (& others) to demonstrate how widely varied individual opinions can be, this is why we have community processes for such decisions. 2. if an admin does "insta-delete" files as out-of-scope inappropriately, there is no way for most members of the community to check, observe, verify, or be involved in the process of dealing with it. ONLY other admins would be able to even see the files which had been removed; without any proper notice, or record of events being available to the general community. i'm not willing to give you guys that kind of power, over the whole project, blindly. & as regards: "severely chided": i'm sorry, but that just doesn't cut it! any admin who abused their powers that severely should be automatically de-adminned permanently & at least temporarily banned from the project. Lx 121 (talk) 05:29, 11 December 2010 (UTC)[reply]
The recourse that uploaders of those images have (if they think that their now-deleted images have real value) is to take it to Deletion Review. The reason why this rarely happens is presumably because most uploaders of such material have little lasting commitment to Wikimedia Commons and its goals, other than being a convenient hosting site for drunken-cell-phone-self-penis-snaps... AnonMoos (talk) 12:35, 11 December 2010 (UTC)[reply]
one key point you are either missing, or deliberately ignoring, is that this process is NOT just a dialogue between one uploader & one admin, it is a community discussion & a community decision. when you make these decisions on your own, in secret, instead of openly, then there is no way for the community to review your actions. that is not acceptable. Lx 121 (talk) 00:34, 12 December 2010 (UTC)[reply]
No. That's not fair. There are some editors that create content, and others that have your legal mind and do all the backroom work. To be honest, there is just too much to do without learning yet another procedure - and submitting yourself to it from a point of weakness. Have I tried? - no, and I won't. I trust you to interpret the arcane rules on my behalf - mumble to m'sen and go back to Commonist. However, if I am asked to formulate policy I will attempt to point out a potential boo-boo.
I can see from your postings that low-res genitalia is an increasing problem - well, there must be technical ways to intercept it and redirect it to a water-wheel driven server in Outer Oblivistan that uploads over a 300/1200 modem.--ClemRutter (talk) 15:00, 11 December 2010 (UTC)[reply]
i would like to point out (again) that the claim about this being an "increasing problem" has still not been backed up with any data. the statement is merely an assertion/justification by certain users seeking approval of this provision. Lx 121 (talk) 00:34, 12 December 2010 (UTC)[reply]
It's not usually a major problem at any one moment, but it's a constant annoying trickle which could build up to become a major problem if there wasn't also an accompanying trimming and pruning back... AnonMoos (talk) 06:05, 12 December 2010 (UTC)[reply]

Which taboos should be respected?

We are discussing here one of the Western taboos. The discussion theme should be wider. Which of the many moral and cultural prohibitions and scruples should we respect, and which shouldn't we? Child porn - and, as it appears, many others - should be prohibited, while Muhammad images shouldn't? Why not the other way round? Should Commons be censored only from a US or Western point of view? --ŠJů (talk) 18:08, 10 December 2010 (UTC)[reply]

Child pornography is prohibited by the laws of the United States and of Florida, while depictions of Muhammad are not (and in fact there have been many images of Muhammad made in traditional Muslim cultures, some of which you can see for yourself at Category:Muslim depictions of Muhammad). And if you want to talk about "selective taboos", then images which defame Jews and Judaism are far more protected and sheltered at Commons than images which defame Muslims and Islam... AnonMoos (talk) 18:15, 10 December 2010 (UTC)[reply]
In fact we do respect the taboo on Muhammad images. A separate category Category:Depictions of Muhammad was created, with no images other than in subcategories. I removed an "Everybody Draw Mohammed Day" image from Category:Muhammad in the spring and the category seems to have remained clean (perhaps by the efforts of others).
This is the same as we do with sexual content: place it in more specific categories. Files that are illegal to host are a totally separate problem, which could be solved by somebody putting up a server somewhere else.
In any case, servers outside the control (and thus responsibility) of WMF are something that is independent of this proposal. I am not going to fork Wikimedia to be able to host child pornography, although a mirror with separate deletion decisions could be good to have – mostly for retaining politically sensitive content that might get deleted in the USA.
--LPfi (talk) 19:52, 10 December 2010 (UTC)[reply]
as regards the previous comment about depictions of muhammad: what on earth are you talking about!? there is NO policy, or precedent, or community decision about censoring religious/political materials. the muhammad-related materials are subcategories logically, for purposes of efficiency. if you have been deleting materials relating to mohammad, because they "violate a religious taboo", then please provide links to the relevant deletion debates, because i would very much like to know what has been removed, & see how consensus was reached on this! Lx 121 (talk) 05:14, 11 December 2010 (UTC)[reply]
Dude, he didn't say he deleted it, he said he moved it to a subcategory. AnonMoos (talk) 12:26, 11 December 2010 (UTC)[reply]
The subcategories were created this spring, probably in direct response to a person from WMF (I don't remember who) complaining about the issue, not because somebody found them more efficient. It was easy to create the category and seemingly nearly nobody had strong objections (that one image was probably a protest, and no, I didn't delete it). For sexual content we need a lot of categories, so it is easier to state the need in a policy than to watch every potentially problematic category. --LPfi (talk) 21:13, 11 December 2010 (UTC)[reply]
thank-you for clearing that up, & sorry if i misunderstood your original comment. Lx 121 (talk) 03:22, 12 December 2010 (UTC)[reply]
I think that my taboo about bare hands should be respected. We should also respect the taboos about the theory of evolution. Seriously: the worst thing we can do is to care about any taboos. There is a taboo about almost everything. WP ought to provide serious, not censored, content. As I mentioned the Theory of Evolution: the argument is that we have to use the Miller Test because the servers are in Florida. What would happen if Florida introduced a law against the Theory of Evolution? Would we have to replace the Theory of Evolution with "Intelligent Design"? --Liberaler Humanist (talk) 20:43, 10 December 2010 (UTC)[reply]
If Florida introduced a law against the Theory of Evolution, however absurd, illegal, and inconceivable that may be, we would have to move the servers or comply with the law. In such a case, it's possible (if again absurd and inconceivable) that every place in the world has introduced laws against the Theory of Evolution. In which case, policy or no policy, we'd have to comply with the law.--Prosfilaes (talk) 20:52, 10 December 2010 (UTC)[reply]
since i seem to be spending a great deal of time lurking on this page (at least until the vote closes), i can't resist commenting on this one! if florida ever did introduce such a stupid law: 1. it would be thrown out as unconstitutional 2. if not, then we would need to relocate our operations & servers. 3. if the whole world went crazy enough to do such a thing, it would be time to take our wikis underground, & viva la revolucion! :P Lx 121 (talk) 05:14, 11 December 2010 (UTC)[reply]
I have observed the debates about Evolution vs. Creationism for many years; there are some groups which promote a ban of the theory of evolution. I do not think that these groups are disconnected from established political parties. However, I do not think that the servers should be stored at a single place. As you have mentioned the constitution of the United States: the second party in that country is dominated by a group of very strange people. I don't want to mention all the strange positions of these very strange people, but the scenario of having a political and legal climate which would seriously disturb our work is not as unrealistic as it seems to be. The servers should not be stored at a single place. --Liberaler Humanist (talk) 17:47, 11 December 2010 (UTC)[reply]
without intending to disparage the current state of us politics, i agree, on basic principles, that we shouldn't have our servers concentrated in one country. makes it too easy to shut us down, & leaves us too vulnerable to changes in the legal climate in one host country. the wikileaks situation certainly demonstrates that! Lx 121 (talk) 03:22, 12 December 2010 (UTC)[reply]
This is a straw man - child porn is also banned in Muslim countries, as well as most countries of the world. Also, child porn is not a "taboo", it is visual rape - taking images of someone's sexual organs without consent and no ability to legally consent. The WMF does not condone the spreading of sexual violence through mass transmission of depictions of it. Ottava Rima (talk) 04:34, 11 December 2010 (UTC)[reply]
agreed with the point about the near-global illegality of child porn; but, with no personal disrespect intended to the commentor, the term "visual rape" & the stuff about "spreading sexual violence through depictions of it" i would consider as hyperbole & psychobabble; it's not a serious, scientific description of the actual effects/harms done. this is not intended to in any way minimize the considerable potential for harm by such activities, but it is unhelpful to obscure the real damage with simplistic, one-dimensional, political sloganeering Lx 121 (talk) 05:14, 11 December 2010 (UTC)[reply]
The "servers in Florida" standard actually comes from COM:CENSOR. It wasn't invented here. If WMF sets up another server outside the U.S., then after some legal consultation to consider the effect of integrating it into the sites, we could work on changing this policy and COM:CENSOR to allow for material not legal in the U.S. kept on the other server.
That said, the specific example of child porn has another problem altogether which would be difficult to deal with, even if the Foundation really wanted to set up a server in some Shangri-La specifically to allow it, which is that it is subject to one of the very few censorship laws in Western countries where people are punished merely for looking at the material, rather than posting it. By comparison, I think I recall that when the Acehnese Wikipedia was having its protest about Muhammad images, they actually put a header on the main page linking to them.
Now in saying this I should disclaim that I don't mean to support or excuse this Western taboo - several billion dollars yearly are made by abducting children to make child porn, which would presumably be much curtailed if we had the common decency to allow those abused and photographed as children to take possession of the copyright over the images should they survive to adulthood and to distribute them commercially if they wish. We have a situation in which, according to w:child prostitution, there are 162,000 child prostitutes on the streets in the U.S., and yet large numbers of highly technical prosecutions are being made against people for looking at pictures, or possessing a few pictures mixed in with their porn feeds, or kids "sexting" one another. It really isn't sane or decent - our society should be putting its resources into freeing children from a condition of literal and depraved slavery, while recognizing that the right to a free press, no matter how offensive, is God-given and inalienable. Wnt (talk) 14:02, 11 December 2010 (UTC)[reply]
"which would presumably be much curtailed if we had the common decency to allow those abused and photographed as children to take possession of the copyright over the images should they survive to adulthood and to distribute them commercially if they wish." Sick. I really hope you never get the chance to decide on these matters.--93.131.239.236 14:13, 12 December 2010 (UTC)[reply]
Real C P is illegal in any country because the subjects depicted have been psychologically harmed, and the circulation and republishing of said material is, in a way, a tacit approval of said practices, which encourages the criminals to produce more of it. Children are like sponges: they absorb information without malice or critical thinking, and then they internalize what was done and think it's normal, while it is actually wrong. They could become like Pedro Lopez and Daniel Camargo Barbosa later in life, and continue a never-ending process of harming other children. Aberforth (talk) 14:10, 11 December 2010 (UTC)[reply]
I once read Mein Kampf, but it wasn't a tacit approval. I don't believe that watching axe murderer movies makes kids into axe murderers, I don't think that watching gay pride marches makes children gay. The causes and potential cure of pedophilia is indeed an interesting medical question, and I think that there are many fairly easy neurological/fMRI research projects that would be interesting to try there (e.g. trying to use biofeedback in conjunction with psychological techniques to get patients to stimulate the neural pathways that have been found to be lesioned in some pedophiles, in order one hopes to use adult images to upgrade their animatype), but alas, this isn't the forum... Wnt (talk) 14:41, 11 December 2010 (UTC)[reply]
The laws of the United States and Florida ban child pornography, and that's pretty much the end of the matter as far as Wikimedia Commons is concerned. AnonMoos (talk) 16:56, 11 December 2010 (UTC)[reply]
Agreed. I brought up the argument only to demonstrate that we are making this decision on account of the law - Commons is not making a choice to censor. Wnt (talk) 03:43, 12 December 2010 (UTC)[reply]
There is nothing in this decision that has anything to do with the law. Current policy already says that all material deemed illegal will be deleted. Under the new proposed policy a single admin with no legal knowledge can determine whether they think something is illegal or irrelevant and delete it without discussion. Legal areas are blurry: is it child porn if it involves clothed anime characters in erotic positions and not real people? Is anything featuring naked children child porn? If so, are depictions of the nude child angels known as cherubs from the Medieval period forms of child porn? Obvious cases are already speedy deleted, but more nuanced cases need to be handled on a case-by-case basis. The same goes for relevancy: an admin should not be able to decide by themselves whether something is relevant enough to stay on Wikimedia, except in the most obvious cases (which under current policy they already do). --AerobicFox (talk) 05:01, 12 December 2010 (UTC)[reply]

It concerns me to see so many participants here basing their thinking on whether this is a question of taboos. If it were, I would agree that we should strongly resist censorship. But it isn't. It's about laws, specific laws, those of Florida and the US. The laws don't have to be "right" in the opinions of Wikipedians. And no amount of talk here will change those laws. But the servers, in their present location, are governed by them. If, instead, the servers were in (to take an example) Saudi Arabia, we would be looking at a very different situation indeed. It's fair to discuss whether moving the servers would better serve Wikimedia's mission of free information, but that's not what's on the table here. --Tryptofish (talk) 20:30, 12 December 2010 (UTC)[reply]

The current situation

The proposal suggests that it is not possible to delete half-pornographic waste. However, it is possible to delete useless or pornographic pictures; the problem seems to be an administrative problem. I have asked for the deletion of the terrible things uploaded by User:Toilet (Nomen est Omen!). They are still here. Is it possible that the problem could be solved without introducing questionable US laws? --Liberaler Humanist (talk) 17:47, 11 December 2010 (UTC) Example: [thumbnail image captioned "Really useful"][reply]

His pictures are really useful, maybe they could be printed and given to kids so they can color them. However they aren't suited for this project. Aberforth (talk) 18:55, 11 December 2010 (UTC)[reply]
For this image, I am in agreement with Liberaler Humanist, but it could be improved by cropping off the lower 40% - then you could derive an image from it that your kids could colour in and use as Christmas cards. But even according to the rules above, the image concerned could not be deleted - it is tasteless but in scope as an illustration of a medical condition, and it is not sexual content - the anal region is not the genital region, and there is no sign of a sexual act. Checking w:en:Miller test, it does not fit any of the three prongs, and the subject is too old to invoke the w:en:Dost test. --ClemRutter (talk) 21:17, 11 December 2010 (UTC)[reply]
One problem is pictures of blow-jobs and the like that lack any indication that the person shown is aware of the upload and has consented to it. They are often imported from Flickr, and a few weeks later, the Flickr account shows as "no longer active". Such cases are difficult to address via DR. Someone will say it is educational. Another will say it is in use. Another will say you don't see the whole person's face. The upshot is that one way or another, the image is kept. This was one of the things this policy was meant to address. The Foundation has had several complaints from women whose images were used without their consent, yet were offered to the world as free content for any type of re-use, including commercial re-use. --JN466 21:46, 11 December 2010 (UTC)[reply]
And what is the point of having more than one picture for a given subject? How many blow jobs do you need here to illustrate the relevant page? As for Flickr, the site should be blacklisted for being unstable and being a hive of copyright violations. Most users on that site don't understand the licenses. Aberforth (talk) 21:54, 11 December 2010 (UTC)[reply]
ok. so to summarize the key points in the preceding conversation:
1. wikimedia commons does not need more than 1 image of any subject. (therefore we should delete the rest of our material as redundant?)
2. flickr should be banned as a source of materials for commons.
3. we need to make it easier to suppress undesirable materials on here.
i would suggest writing that up as a policy proposal & submitting it to the community for a vote.
Lx 121 (talk) 00:23, 12 December 2010 (UTC)[reply]
How do you feel about sexual images uploaded without the consent of the people depicted? Do you feel this is something we should worry about? --JN466 15:43, 12 December 2010 (UTC)[reply]

This seems to me to be a much broader question of personality rights, that has nothing to do with sex per se. But no one seems to want to broach that one because it would require us to delete most of our pictures of people. Gigs (talk) 21:02, 12 December 2010 (UTC)[reply]

Change to summary of poll arguments

Under the Oppose section it lists:
"Not strong enough at keeping out pornography"
I have not found one person who has opposed the proposed policy and cited this as a reason. While there are people on the support side who think the proposed policy doesn't go far enough, there is nobody in the opposition who is opposing it on those grounds. This section may mislead newcomers into thinking that is a significant argument of the opposition when overwhelmingly the opposition thinks it goes too far.
Does anybody else opposing this policy want this removed as it doesn't seem to accurately portray our arguments against it? --AerobicFox (talk) 04:50, 12 December 2010 (UTC)[reply]

agreed it is not the best wording (& could be improved), but i do think that a few of the oppose votes (definitely a minority of them) were predicated on the basis of wanting a more restrictive policy. sometimes politics makes for strange bedfellows. Lx 121 (talk) 06:00, 12 December 2010 (UTC)[reply]

(edit conflict)

No - I read all the posts yet again - no opposer has made that claim. It only occurs as part of the rhetoric of Supporter #7, so it presumably stuck in the mind of the editor drawing up the list of supposed arguments. The claim needs to go. In reading the posts, I am more convinced about how unfair this whole process has become to anyone whose native language is not English. If I had to write a summary of the arguments used, I think it would be:
Support: Those who think that Commons is a photo collection for en:Wikipedia
Those who think any policy is better than none
Those who think that Florida state law overrides everything else
Oppose: Those who think that Commons is a photo source for many Wikiprojects- including those not yet in existence
Those who think that a badly drawn policy is harmful
Those who think that Commons must not be more restricted than Florida state law requires.
Finally, the most quoted and supported oppose post was that of Ruslik (Oppose #6), which more or less sums up the oppose argument. I quote it here for reference.
Oppose. The proposed policy is just an example of m:instruction creep. Approximately one third of it only repeats what is already in other guidelines and policies (illegal content, copyvios and BLP violations). The other third is devoted to vague and useless interpretations of various US federal laws. These interpretations are not written by lawyers and may be misleading. The remaining third contains some rather explicit descriptions of sex acts, which the authors of the policy think are inappropriate for Commons. I am interested whether this proposed policy itself should be marked by some kind of a "sexually explicit" banner?
--ClemRutter (talk) 10:35, 12 December 2010 (UTC)[reply]

(end of edit conflict- good that it has gone)

Actually, if you read the opposes, some find the implicit support of COM:CENSORED and COM:NUDITY and COM:SEX to be too permissive, and the educational scope is too broadly defined for non-encyclopedic purposes. It seems contradictory that the policy which promotes greater regulation would be seen as insufficiently strict, but that is indeed how some read it... I looked fairly closely and the oppose votes which invoked this were 53 and 143. If it's confusing, I can take it out, but I think it's accurate. Ocaasi (talk) 10:04, 12 December 2010 (UTC)[reply]
I took it out, because it was making the list seem contradictory and confusing (which might be a reflection of the policy stances). If someone can think of a way to point out that some opposes were based on the policy not being strict enough, go for it. Ocaasi (talk) 10:08, 12 December 2010 (UTC)[reply]
Note: vote numbers were changed after someone voted at the top of the list. The relevant Opposes are now 1, 54, and 144, all of which clearly identify this policy as too lenient. Add to that 194, 195, and 196, as well as any which identified the policy as biased towards conservative US attitudes about sex or as practicing censorship. It's there. Ocaasi (talk) 12:20, 12 December 2010 (UTC)[reply]

IP votes

Can someone indent the IP votes? It is bad enough that we have brand new accounts with no editing histories voting, but the IP votes are just outright silly. Can we have some semblance of proper practices? Ottava Rima (talk) 17:11, 12 December 2010 (UTC)[reply]

p'raps we should have a chat about a sensible standard for being 'enfranchised' - I had a click through a few of the voting accounts above, and found some with rather.... um... shallow editing histories, and obviously there's the IP issue ottava mentions (personally, I'm ok with an IP voting, but prefer that they have some significant contributions prior to the vote) - obviously this needs to reflect the 'community voice' as clearly as possible, so what best standards should we consider to apply to the data above? - something like registered for x long and made x contributions? or is there a better way? Privatemusings (talk) 22:30, 12 December 2010 (UTC)[reply]
If we start this game, we can also point out editors that have been indef blocked on any wiki. --KrebMarkt (talk) 22:41, 12 December 2010 (UTC)[reply]
well we could, I suppose - but personally I don't think that's a good metric really - generally speaking trials and tribulations on individual projects don't really follow editors around - and certainly not in terms of whether or not they're enfranchised. I think there are probably some more sensible markers we could lay down - would you agree? Privatemusings (talk) 00:16, 13 December 2010 (UTC)[reply]

Global notices needed - Farsi, Chinese, etc

I find it odd how en wiki and Germanic wikis get notices, but the majority of cultures and countries in the world that coincidentally happen to have conservative views are not getting notices. I see dozens of users from the German and Dutch wikis, but those from Chinese, Farsi, Arabic etc would definitely have far more people and far more in support. Isn't that odd and inappropriate? Ottava Rima (talk) 22:38, 12 December 2010 (UTC)[reply]

Maybe it has to do with the language barrier. Younger Germans, like me, usually have no big problems reading and understanding English texts and are a bit familiar with the US legal system. But other regions can't even read the policy (English is not the major second language there). How can we expect them to vote? If they aren't even notified, then this stinks like rotten fish. --Niabot (talk) 23:02, 12 December 2010 (UTC)[reply]
Yeah, lemme just go ask all the kids in my high school who took Farsi/Arabic to come translate. AerobicFox (talk) 23:09, 12 December 2010 (UTC)[reply]
Oh sorry, Ottava Rima, that "Germanic" users do notice the notice in English. Ouch. Btw, if those Farsi/Chinese-speaking people do not understand the site notice, how could you expect them to understand the proposal? ... --Catfisheye (talk) 23:58, 12 December 2010 (UTC)[reply]
We can translate it. It isn't fair that people can claim cultural bias while simultaneously excluding 70% of the world that has a much more conservative view on sexual content. It is hypocrisy. Ottava Rima (talk) 02:26, 13 December 2010 (UTC)[reply]
That isn't the point. Such a global decision should be available in at least three or four major languages. Otherwise the will of many other projects is basically ignored. Ask Chinese or Russian people which languages they speak, and which second language they learned at school. Most likely none of them was English. So even if they got informed, how should they be able to vote? Throwing dice? As long as that is the case, I can't consider this poll fair or balanced. --Niabot (talk) 01:21, 13 December 2010 (UTC)[reply]

When you think it can't get any whorse...

Commons:Deletion requests/File:Wikipe-tan Cartoon - Something is missing.png Anyway a good example why i voted with oppose. --Niabot (talk) 03:02, 13 December 2010 (UTC)[reply]