Commons talk:Sexual content/Archive 6


Archive 1 (Dec 2008-Apr 2009) | Archive 2 (April 2010) | Archive 3 (May 2010) | Archive 4 (June 2010) | Archive 5 (June - October 2010)

Recent Changes

I did a pretty major copy-edit to the whole policy. No content changes of any kind were intended. Here's a summary of what I did.

  • Fixed typos and obvious grammar errors
  • Shortened phrases for brevity
  • Rewrote some wikilinks for flow and simplicity
  • Cleaned up header and list formatting
  • Added H2 header to combine File Descriptions and Categorization under 'Working with sexual content'
  • Added H2 header to combine 2257 and obscenity law under 'Legal issues'
  • Removed duplicate policy links
  • Rearranged sections to put most important info first
  • Added italics to two key areas
  • Refactored legal discussion to separate Wikipedia procedure from historical background

If there are any questions, shoot. The diff won't be very helpful, since there was a lot of shuffling and it happened over 10 different edits, but there are no content changes. I'd just give it a top to bottom read to make sure that it didn't change anything substantive, and hopefully it's a cleaner, shorter version of an important policy. Ocaasi (talk) 05:20, 2 November 2010 (UTC)

Limits on what 2257 applies to

Should we add a note that {{2257}} should not be used indiscriminately and never on pictures of children? I removed 2257 from a lot of pictures of nude children; pictures of children that 2257 applied to would need deletion, not tagging. In the larger sense, I don't want to see this indiscriminately spammed across a bunch of files in Category:Nudity when it only applies to sexual activity.--Prosfilaes (talk) 07:29, 2 November 2010 (UTC)

That seems like an important point: record keeping is not for illegal images; deletion is for illegal images. I don't know exactly where the line is drawn, but if you have that sense, it sounds good. Ocaasi (talk) 09:36, 2 November 2010 (UTC)
That's not my point at all; my point is that record keeping is not for mere nudity.--Prosfilaes (talk) 11:07, 2 November 2010 (UTC)
Alright, misread then. I think they're both good points. I will add yours as a counterweight, explaining both ends of the 2257 spectrum. Check in and see if it works. Sorry I identified you as privatemusings in the edit comment; I had just written on that talk page. Ocaasi (talk) 12:45, 2 November 2010 (UTC)
I tried this: Content identified as a sexual depiction of a minor should not be tagged with a 2257 template or amended to include 2257 information; it should be deleted. Ocaasi (talk) 10:21, 2 November 2010 (UTC)
New version: 2257 requirements apply specifically to sexual content involving subjects of legal age. If content is of child nudity that does not qualify as sexual, then 2257 notifications are not needed. On the other hand, if child nudity is identified as sexual, then it should not be tagged with a 2257 template or amended to include 2257 information; it should be deleted and reported to the WMF. Ocaasi (talk) 12:55, 2 November 2010 (UTC)
Yes - I can say unequivocally that 2257 should never be used on images of minors (unless the image happens to also include adults to whom it applies). Explicitly sexual images of minors should be deleted, not tagged - mere nudity is generally not a reason for 2257 tagging. I've added brief notes to the tag regarding this. Dcoetzee (talk) 21:44, 4 November 2010 (UTC)

How about artworks? They shouldn't be counted as sexual just because someone is naked (e.g. a god) unless they really contain sexual content. 221.126.247.208 10:08, 12 December 2010 (UTC)

What does this mean?

Can someone tell me what this sentence is trying to say:

By sporadic application of the First Amendment obscenity prosecution for adult sexual content or for sexual artwork, depicting apparent minors has become rare and unpredictable, but explicit sexual content depicting actual minors has not been protected by the courts and its prohibition is strictly enforced. Ocaasi (talk) 01:16, 4 November 2010 (UTC)
The Feds could sue a site hosting w:lolicon claiming that the material was legally obscene, but given that obscenity requires that it be without artistic, etc. value, it's never cut and dried, so they rarely do so. However, material showing actual children doesn't have such requirements, and will get slammed down quickly and hard. (In practice, I understand a lot of cases are stopped by the fact the FBI won't generally go after teenage CP unless they can get a confirmed ID and date of birth. But that's not really relevant for us.)--Prosfilaes (talk) 01:45, 4 November 2010 (UTC)

Check please:

Obscenity prosecutions of adult sexual content, sexual art, or caricatures depicting apparent minors have become rare and unpredictable; however, explicit sexual content depicting actual minors has received no protection and prohibitions against it are strictly enforced.

--Ocaasi (talk) 09:57, 4 November 2010 (UTC)

I read that, and it still doesn't make sense, because there is nothing qualifying 'protection' - the sentence above uses the term 'modification', and 'protection' does not obviously apply when what is actually being referred to is a loosening of restrictions on the depiction of such materials. How about Obscenity prosecutions of adult sexual content, sexual art, or caricatures depicting apparent minors have become rare and unpredictable; however, there has been no such modification where explicit sexual content depicting actual minors is concerned, and prohibitions against it are strictly enforced. --Elen of the Roads (talk) 22:50, 5 December 2010 (UTC)
Ah, I see what you mean. The perils of a la carte fixes. It will probably have to wait until after the poll, but that should be an uncontroversial improvement. I think we should start a list of to-fix-afterwards edits, since all of the new eyeballs will be wasted while the draft is !vote-locked. Maybe I'll do that. Ocaasi (talk) 18:14, 6 December 2010 (UTC)

Archive

Would someone mind archiving the old threads on this page? I'm a bit inexperienced in the archive business...Ocaasi (talk) 01:16, 4 November 2010 (UTC)

I've given it a go - although we may not even require 4 / 5 months of threads above? cheers, Privatemusings (talk) 02:19, 4 November 2010 (UTC)
Excellent...Ocaasi (talk) 07:37, 4 November 2010 (UTC)

Question about content without explicit consent

Edit went from:

This shall not in itself be a reason to delete them, but the community will discuss whether or not consent can be assumed in individual cases.

to

The community is free to discuss whether or not consent can be assumed in individual cases.

Do we want to remove the statement that lack of explicit consent is not sufficient for deletion? Is 'community is free to discuss' weaker than 'community will discuss'? Which is intended... Ocaasi (talk) 07:48, 4 November 2010 (UTC)

I think we need the statement, because it's been such a controversial question in this discussion and is likely to arise over and over. A small but persistent minority have wanted a strict requirement of explicit consent for everything. I think it's important that we make it clear that is not the policy we are proposing. - Jmabel ! talk 15:08, 4 November 2010 (UTC)
Agreed. There are many works for which explicit consent is unnecessary and unreasonable (e.g. notable artwork that happens to include a nude person). Dcoetzee (talk) 21:39, 4 November 2010 (UTC)
That sentence is not about "everything". There is already a list of cases where explicit consent is not necessary (including non-photographic artwork..., and old photographs). That sentence is only about old uploads. We compromised that there would be no mass deletion of old untagged uploads, but that individual discussions would proceed. So, how does the new wording not convey that? --99of9 (talk) 09:40, 5 November 2010 (UTC)
Logically, it shouldn't matter if we say explicit consent is necessary or not if we're leaving it to case by case discussion anyway. Even so, I agree that I'd like to keep the extra statement anyway, just to be clear about it. Wnt (talk) 09:37, 5 November 2010 (UTC)
I think there's a risk of being overly verbose, a bit confusing, and perhaps even mildly prejudicial, to be honest. The intent of my edit (as requested) was to keep it simple - a deletion discussion where an older file is deleted due to perceived consent issues is ok, as is a deletion discussion where an older image is kept despite consent issues being raised. If our intention here is not to mandate things either way, which I think it is, then I think it's prolly best the way it is :-) Privatemusings (talk) 23:25, 7 November 2010 (UTC)
I combined the bullet point with the other point about this to reduce confusion (I hope). Looking at the text and messing about with it this way and that, I didn't see any obvious way to make the text clearer without making it more confusing. ;) I think that as long as we have agreement here that we're not somehow intending this to be interpreted to allow deletion by default of old material (which I don't think is how it reads), I suppose we can spare readers of the policy the extra verbiage. Wnt (talk) 04:02, 8 November 2010 (UTC)
I kept the combination, but reworded it a bit, dropping the future tense and instead stating it like this:
Uploaders of self-produced sexual content are expected to make a positive assertion that the subjects of the photos consented to the photograph and to its public release. Provision of further evidence of consent is welcome (using the OTRS process) but is not normally required. Content which lacks assertions of consent, usually from files uploaded before implementation of this policy, should not be automatically deleted. The community should instead discuss whether or not consent can be assumed in each case.
I think that covers both issues ok, while keeping it as straightforward as possible. Ocaasi (talk) 17:42, 8 November 2010 (UTC)

I tried again, because I felt that 'your' version kind of implied that future uploads of sexual content which are missing explicit assertions of consent might be ok - whereas I think we've agreed that they're necessary. Hopefully it's even clearer now :-) cheers, Privatemusings (talk) 23:45, 8 November 2010 (UTC)

I see how your version was an improvement in affirming the present requirements. I tightened the phrasing just a touch, with no change to content (except 'discuss' to 'determine', if you want to be really technical). On the other sentence, you went from 'should not be automatically deleted' to 'free to discuss', which, though correct, is a somewhat weaker way of saying that prior content without consent 'shouldn't be deleted without discussion'. I think in cases where people get deletion-happy, 'free to discuss' doesn't set a requirement for them to make that effort. How about both changes? Ocaasi (talk) 05:55, 9 November 2010 (UTC)
I thought that version sounded a little too favorable toward random proposals for deletion of old content, while being too restrictive toward proposals to delete new content that seems very suspicious despite an assertion of consent. (A different issue, mentioned only briefly in the previous discussions, was that you might come up with some content that seems intrinsically suspicious despite an assertion - a flash snapshot of a startled couple in a hotel bed, etc. The assertion should generally be a good defense, but we weren't going to require editors to be completely blind, either.) I'm not sure my last edit is the best way either... Wnt (talk) 09:14, 9 November 2010 (UTC)

I've made a small edit, but I'm still concerned by this new wording. One thing that bugs me is that old random Flickr uploads will now be treated differently to new random Flickr uploads (without assertions). The old need reasonable suspicion to delete; the new ones should not be uploaded. --99of9 (talk) 09:55, 9 November 2010 (UTC)

In general I think we should have some hint of positive trustworthiness (e.g. trusted uploader) to keep the old stuff, rather than just no negatives. --99of9 (talk) 10:00, 9 November 2010 (UTC)
Looks ok to me. Maybe 'preferred' instead of 'encouraged' just as a word choice. Ocaasi (talk) 14:11, 9 November 2010 (UTC)
I thought the last draft (99of9) was OK also. I just added a footnote linking an article about the Tyler Clementi case as general background - but if someone doesn't like that I won't insist on keeping it. Wnt (talk) 03:55, 11 November 2010 (UTC)
That case is really significant (or it will be soon), but I have a few thoughts. I'd like to tighten up the summary a bit just for brevity. Also, do we need to consider victimization issues? This case is receiving ample attention right now, and the family hasn't been avoiding publicity, but maybe until this actually winds up in a court ruling we should just speak generally about it. I could go either way. Thoughts? Ocaasi (talk) 20:13, 11 November 2010 (UTC)
I've seen several news cycles about Clementi, and in the U.S. his case is being used as something of a rallying cry against bullying. I don't think it would be inappropriate to name him. The legal case against the people involved in the videotaping is likely to be a significant test of a 2003 "invasion of privacy" law — as I understand it, they were operating their own webcam in their own (shared) dorm room. It may be more important in social terms, however — before this case, I think that many people would think of the consent issues more in the context of actresses suing to try to keep sex tapes off the Internet, whereas now some are going to think of it as a life and death issue. I don't think that's really fair — it was prejudice and bullying, not video, that caused this tragedy — but I can't pretend it won't be true, and editors (many of whom may not be from the U.S. and never heard of the case) should have some warning.
As people here know by now, I believe in the "not censored" principle of Wikipedia, and even now I wouldn't have acquiesced to demands for an explicit assertion of consent for future content without a tangible reason. If scenes from the Clementi video had been posted to Commons without an explicit assertion of consent, policy wouldn't touch Clementi (we wouldn't know who he was), nor most likely the uploader (who would be breaking the law and existing policy with or without an explicit statement about it, and probably wouldn't need a notice to tell him this). But it might have protected people "distributing" the video, which I think might be interpreted to affect various types of innocent editors by various arguments which it is probably not productive to detail here. By having an explicit statement to fall back on, they would have a firm defense, whereas without it prosecutors might well end up Wikilawyering in the courtroom.
I am not at all happy to have yet another way for it to be illegal to copy content you find on the Internet, but from what I've read (which questions whether the video was viewed by anyone) I don't know if the current prosecution could lead to constitutional review of that part of the law; if it did, this is not the test case I'd want to see used to make the decision. Wnt (talk) 21:46, 11 November 2010 (UTC)

Commons Study on Controversial Content

I noticed there's no mention of, not even a link to, the recent study on controversial content. Is that intentional? Does this policy assume that those recommendations have no teeth or force until the foundation decides to do something about them? Does it make sense to link to them in a 'see also' section, perhaps? Ocaasi (talk) 10:01, 4 November 2010 (UTC)

See also seems a good idea, but the study is just a study, not a policy mandate. I think we ought to adopt this proposal as policy on our own rather than by being forced to via a WMF mandate, though. ++Lar: t/c 11:44, 4 November 2010 (UTC)
Yes, it's always nicer to self-legislate, like when McDonald's added apple slices in lieu of FDA regulation. I'll try a see also link, which seems to be the right fit between informing and capitulating to the (thoughtful) meddling of the outsiders. Ocaasi (talk) 11:56, 4 November 2010 (UTC)
(I really don't get the apple-slices/McDonald's reference. Was this a US problem?) Archolman (talk) 00:43, 6 December 2010 (UTC)
(There were rumblings about Big Food becoming the next Big Tobacco, facing multi-billion dollar obesity lawsuits, congressional hearings, etc., with documents showing execs tried to market to kids and get them addicted to fatty, salty, hormone-ridden snacks. So the industry took a hint and voluntarily adopted milk, apples, more salads, and a few other things to health-wash their image. Which uncharitably suggests by analogy that that is what we're doing, but there's either a) no truth to that or b) nothing wrong with it.) Ocaasi (talk) 18:20, 6 December 2010 (UTC)
The study makes few specific recommendations, and those it makes largely apply to other projects (image hiding on Wikipedia) or are not limited to sexual content as defined here (semi-nude women, which are not considered sexual content by our definition). In any case, I'm not very favorable about it. Wnt (talk) 09:31, 5 November 2010 (UTC)
If the recommendations are passed on by the foundation, we can then consider whether to expand our definition of sexual content. The good thing is that what we've developed here is a framework for how to deal with whatever is considered sexual. --99of9 (talk) 09:44, 5 November 2010 (UTC)
Yes - it is important though that we adopt this (in some form) as policy before the Foundation considers the study, or else I fear they might swoop in, say we're not taking clear action, and institute something more conservative. Dcoetzee (talk) 20:53, 5 November 2010 (UTC)
It's premature to speculate on how to deal with a hypothetical arbitrary policy, which might affect anything. But I think we've designed this policy to deal with a very specific class of content, based to a large degree on American legal distinctions. If the WMF imposed some policy about bare breasts I think we would be best off (at least) to try to keep it as something separate rather than trying to integrate it here. The whole point of an imposed policy is that it is imposed on you, so how can you edit it? So how and why would you try to unify it with any other? Wnt (talk) 03:47, 8 November 2010 (UTC)
Agree with that, except that when something is 'imposed' on you, it can still be in your interest to adapt it to your purposes, or at least to integrate it with them. But it's premature to bring it up while those issues are just floating through the proposals-and-recommendations cloud. Ocaasi (talk) 17:45, 8 November 2010 (UTC)

Do we need assertions of consent from unidentifiable people?

The way the last part of the "Prohibited content" section is currently written, we "must" have an assertion of consent even for close-up photos showing, e.g., a particular kind of STD rash. Firstly, is there any reason to require an assertion of consent when the subject of the photograph isn't identifiable? Furthermore, wouldn't requiring an assertion of consent in such cases hinder the ability to illustrate educational medical sexuality articles? 71.198.176.22 16:44, 16 November 2010 (UTC)

were this proposal adopted, yes - sexual content, as defined on the proposal page, would require an explicit assertion of consent. If you read the note related to the 'close up' type of image you use as an example, I think it's likely such an image would fall outside of the purview of this policy. Personally, I think an assertion of consent would be desirable anywhooo - don't you? cheers, Privatemusings (talk) 20:10, 16 November 2010 (UTC)
No, I don't think it would be desirable to have to get consent for close up, unidentifiable images because what is the point of such a hassle? I'm not sure what the note you linked to has to do with images of unidentifiable people, either. Is that the note you meant to link to? 71.198.176.22 04:26, 18 November 2010 (UTC)
I see there was a "grammar tweak" from "with sexual content all are subject" to "all sexual content is subject".[1] This is an example of the perils of copy editing. Additionally, there was some effort to define "identifiable" quite expansively before, including e.g. EXIF information, which has since been taken out; I think the feeling was that it was best to have one single definition of identifiability under the main policy about it.
However, on consideration, I think we should think through carefully just what we want or need to say about this. For example, Wikimedia might want to steer clear of unpublished photos from a secret locker room camera, no matter how carefully cropped and anonymous they may be. On the other hand, we would want to rule in non-identifiable photos from a retired surgeon who has taken a collection of photos for personal use and in medical publications, but never obtained consent to publish them here. My first thought on processing this is that we need to exclude even non-identifiable photos if the photography itself was an illegal violation of privacy, but that we don't need to exclude them otherwise. For example, in the locker room example, it is possible that the act of investigating the dissemination of the "anonymous" photos would end up revealing that they were of a particular person, and put Wikimedia in an entirely undesirable position. But before trying to implement such a zig-zag line, I should see what others here have to say. Wnt (talk) 13:27, 18 November 2010 (UTC)
I think if you re-read the copy-edit closely, it actually removed a redundancy and did not change the meaning at all. Generally, though I respect legal writing, policy, constitutions, etc., there is something wrong if a single word or phrase can obscure the intent of the guidance. So we should make it more plain and robust if that is what happened.
I think your 'illegal itself' notion is a good distinction, but I don't know how it would be incorporated. It seems more than reasonable that content which appears to be illegally acquired should be excluded, even if it does not violate other provisions (identifiability, sexual content with children). But how would we know what 'looks illegal'? I think the only question is whether those situations should be a) speedy deleted, with the possibility of discussion afterwards; b) discussed per regular deletion, with the burden on showing it is legal; or c) discussed per regular deletion, with the burden on showing it is illegal.
My gut instinct is that where there is identifiability or sexual content involving children, a) is the obvious choice; where the content is blatantly sexual or exposes body parts (or if there are subtle identifiers that could end up as clues), b) is the best option; and if there is just a generic body part but no children, sex act, or obvious identifiers, c) is okay. That's not very simple, but it's my best stab at it. Thoughts? Ocaasi 02:20, 19 November 2010 (UTC)
There's really no way to verify that consent was really obtained for photography under any scenario - even if we kept 2257 paperwork, which we shouldn't, it would be child's play to forge an electronic copy of a driver's license and a signature for which we don't have a valid exemplar. Here we're just trying to figure out a required standard, without much thought for serious enforcement beyond the assertion of consent. However, I think it's possible to be pretty sure in some cases that no consent was obtained, allowing AfDs to be raised. Even so, I don't think that speedy deletion will usually be appropriate for suspected lack of consent, simply because it is going to be a complex conclusion to draw from media that don't explicitly say that they are violating policy. I think that the current guidance that speedy deletion is only for uncontroversial cases is sufficient. As for burden of proof, I don't think that any AfD on Commons truly has a "reasonable doubt" standard; it's more a "preponderance of the evidence", which puts b and c as pretty much the same option. Disturbing the current language on the point could be trouble, since we've had enough trouble just agreeing that there should be a reason to suspect lack of consent... Wnt (talk) 12:00, 19 November 2010 (UTC)
I've added yet another bullet point to the section, trying to deal with this issue out in the open; it should make any other rewording unnecessary.[2] Provided I didn't screw it up... we're diving into some rather fine detail nowadays. Wnt (talk) 12:25, 19 November 2010 (UTC)
No no no. Why are we suddenly reopening something we've hammered out so hard already? I thought we had agreed to stick fairly closely to copyedits until we had a clean stable page to vote on. This "close crop" stuff allows someone to put their ex-girlfriend all over commons as long as he cuts her up small enough. Consent is important. I will revert; please obtain wide consensus before making such significant changes since we've already found a compromise point. --99of9 (talk) 13:27, 19 November 2010 (UTC)
What would be the point of a vindictive ex putting unidentifiable images up? Anything intended to embarrass is necessarily going to be identifiable. I don't think this is a legitimate point of compromise, since we already have dozens of close-cropped images of genitalia and breasts, several of which are currently being used to illustrate Wikipedias. The proposal as worded would require the deletion of those images unless the uploaders, many of whom are long gone, come up with assertions of consent. I don't believe anyone actually wants that. Do they?
I think Wnt's bullet point was a reasonable solution, so I'm replacing it. 71.198.176.22 03:37, 20 November 2010 (UTC)
Hmmm, I don't want to break down a fragile consensus, but at the same time I'm responding to a November 4 "grammar tweak" as explained above. I don't really want to go back to that rather peculiar phrasing either, especially since we apparently had more difference of opinion about what we were reading there than we imagined. I'd hoped that my wording might be acceptable, but is there a way we can fix this? I should emphasize that if "identifiable" is interpreted narrowly enough, 99of9 does have a point about the cut-up photo sequences. I originally considered some sentences that tried to define "identifiable" strictly, e.g. specifying that a photo can be identifiable from context, clothing, tattoos, etc. That should include cut-up photo sequences where one end has the head and one end has some other part. The problem is, I see this really as a question for the photographs of identifiable people policy - we shouldn't have two different definitions of "identifiable" knocking around or chaos will ensue. Wnt (talk) 05:20, 20 November 2010 (UTC)
Wnt, I don't understand how you could have otherwise interpreted that sentence before the grammar tweak. Can you please spell out what you thought it meant? The diff you provided seems like a straightforward improvement on the English, and I can't for the life of me see any change of meaning, even implied meaning... --99of9 (talk) 12:09, 20 November 2010 (UTC)
The pre-Nov. 4 version read "All media is subject to Commons:Photographs of identifiable people.... In addition with sexual content, all [media subject to COM:IDENT] are subject to the following requirements:" It seems to me that the following points were meant to modify the normal requirements of the COM:IDENT policy. Since the consent issue has previously been part of COM:IDENT, this is what was to be expected. Wnt (talk) 13:51, 20 November 2010 (UTC)
IP address, your statement "The proposal as worded would require deletion of ..." appears completely false to me. We have a dotpoint that explicitly addresses files uploaded prior to activation of this policy. Since that was the justification for your revert, I trust that you will restore to correct your error. Further, my point above was that "I think ... is a reasonable solution" is no longer a good enough reason to make functional changes to the text. We have a carefully agreed compromise, and one or two people is no longer sufficient consensus to make that kind of change. We are supposed to be copyediting. (PS, please consider logging in and backing your opinions with the reputation of your username.) --99of9 (talk) 12:09, 20 November 2010 (UTC)
I should agree with 99of9 that this part of the policy isn't about the lack of explicit assertions of consent, which would affect the old files, but only about actual lack of consent, which is a much narrower matter. What I'm interested in preserving here is the use of photos taken with consent, but not with explicit consent for use here: primarily medical and surgical photos.
To give an example of the sort of things I'm thinking about, consider an NIH-funded paper about a surgery which is published in a private journal, but which by law is required to be under an open license. The paper may contain a low resolution black and white photo of the procedure, which is published in the journal, but include "supplemental data" at the journal's, author's, or department's web address with a wider array of high-resolution color photos and video. Such photos can be used on Commons as far as copyright is concerned, but may not be regarded by some as having been published (especially since it is not that uncommon to update supplemental data after the paper is out...). The patient would have consented to a medical publication and other uses by the author, but probably not by us.
I think there is a way to write this to clearly rule out the cut-up ex scenario, and clearly rule in the medical images scenario. Wnt (talk) 23:18, 20 November 2010 (UTC)

I hadn't realized that the text currently includes "Explicit assertions of consent are encouraged but not required for sexual content uploaded prior to implementation of this policy" which does indeed address my concerns about earlier images above. However, I really do think requiring an assertion of consent for unidentifiable images is much less desirable than allowing them. The chance that an uploader is going to try to embarrass someone with an unidentifiable image is tiny, and there's no way it would work if they tried, by definition. That means requiring consent for unidentifiable images is needless instruction creep. Also I hope Wnt can tweak his text to address the external publication issue he describes. I don't understand why we shouldn't still be trying to make improvements to the proposal; we improve policies even after they're adopted. 71.198.176.22 06:51, 21 November 2010 (UTC)

Want to bet? Send me a full frontal nude image of yourself, your full name, and the copyright holder's release, and I'm sure I can embarrass you on commons without breaking existing policy. --99of9 (talk) 08:42, 21 November 2010 (UTC)
If you cut an image in two, where one half is identifying and the other is embarrassing, then the existing policy would allow publishing the embarrassing part here. We do not want that, as the identifying part can be published later or elsewhere. That is why we want consent for most sexual content. Truly unidentifiable images and images from serious sources are no problem. --LPfi (talk) 11:48, 22 November 2010 (UTC)
As I said, 99of9 has a good point if we define identifiability too narrowly. But this scenario extends to many things beyond this policy. After all, a full frontal nude image of a typical person, especially if female, can be quite embarrassing even if the actual genitals are obscured, and it isn't within the scope of this policy at all. Nor do I think we should consider a separate COM:BREASTS policy to deal with that; after all, it might be embarrassing for a man to have his belly cropped out and used to illustrate the obesity article and so forth, if the picture still gets linked back to his name.
My feeling is that we can't force our policies on someone with truly malicious intent - he can always falsely assert consent for the whole image, even fake "evidence" such as it is, so why focus on a later malicious release of a second half of an image? But we might warn editors against doing this unintentionally. Based on this discussion my thought is to add some redundant warning about identifiability, though trying not to set up a separate definition. Wnt (talk) 15:41, 22 November 2010 (UTC)
I agree; it doesn't matter how close-cropped an image is if it has been, or is likely to be, associated with someone's name or identity. In that case an assertion of consent should be required for new uploads, and if the person associated with an image uploaded any time wants to withdraw consent or claims that there never was consent, we should respect that and delete the image. I am not sure those points are in the policy, and they should be. I'm going to try to put them in. 71.198.176.22 21:54, 22 November 2010 (UTC)
best way forward in my view, and the one currently written up, is for the sexual content policy on consent to apply to all sexual content - it's easier, clearer, and just plain better that way :-) Privatemusings (talk) 20:26, 22 November 2010 (UTC)
"All sexual content" includes both images and text, a substantial subset of which there is a wide consensus to allow instead of delete. Which version are you referring to as "the one currently written up?" 71.198.176.22 21:54, 22 November 2010 (UTC)
not by our definition :-) - take another look..... Privatemusings (talk) 22:36, 22 November 2010 (UTC)
I'm sure that whoever wrote "media depicting" didn't intend to include textual media or textual descriptions, but text is a kind of media, and according to two of the dictionaries I just checked, depictions include text as well as pictures. Do you think it would be a good idea to resolve such ambiguities before asking people to approve the draft? 71.198.176.22 23:48, 22 November 2010 (UTC)
  • "Media" here is clearly in the sense that Commons is WMF's media repository. We can't just say "images" because it also includes audio, video, etc. Still, in the context of Commons content, when we refer to "media" we mean the media files, distinct from descriptive text. I believe this is pretty consistent across policies & guidelines. - Jmabel ! talk 05:15, 23 November 2010 (UTC)
Perhaps I was going about this the wrong way. I'm trying a different edit here [3] explicitly stating that "previously published" includes certain websites. Some of my other concerns are actually not prohibited by the current language of the first section, which only says "public release" rather than release on Commons. I'm not going to work myself up just now over the right to take other people's snapshots for personal use then release anonymized cropped bits on Wikipedia for fun; no doubt there is some case in which this is terribly important but I'm not thinking of it today. An aberration in this edit which might raise some hackles is that I refer to a Wikipedia policy in a link - unfortunately, I didn't see a Commons policy to cite about reliable sources! Wnt (talk) 19:57, 23 November 2010 (UTC)

Really not ready for prime time

I think we've got a reasonable consensus on the meaning of the policy we will present, but it's really not yet all that well written. If I have time (and focus), I may take a shot at this myself, but I'd welcome another good writer trying in my stead.

For the most part, I would hope any rewrite would be shorter, not longer, but there is one thing I think is missing: an explanation of why sexual content raises certain issues that are not found, or are found only to a lesser degree, in other content. E.g.

  1. Sexual depictions of minors can trigger criminal legal concerns in many countries.
  2. Sexual depictions of identifiable living or recently deceased people have enormous potential to prove embarrassing to those people or their associates, and therefore it is more important than usual to be confident that the subject gave any necessary consent for a photo to be distributed.
  3. To a greater degree than in other subject-matter areas, hosting numerous sexual images with little or no educational value is likely to be detrimental to Commons' reputation. Few people will consider it a serious problem if Commons has superfluous pictures of domestic cats or of the west face of St. Paul's Cathedral. Far more will consider it a problem if Commons has superfluous amateur pornographic images.

If someone wants to rework that, great, but I think something like this is needed.

Also, there is currently one very confusing passage in the lede: "This policy addresses how Wikimedia Commons should handle materials outside Commons' project scope..." Is that really what we mean to say? It seems to me that the policy is, instead, about determining the boundaries of that scope. "Materials outside Commons' project scope" are routinely deleted. - Jmabel ! talk 21:02, 16 November 2010 (UTC)

I'm not sure if writing style is justification for holding up a vote. After all, it's a policy, not a literary masterpiece, and having a policy that we agree on the meaning of already seems like an accomplishment. We keep having gaps of up to a week when no one edits the policy, and waiting for major revisions could take a very, very long time. And every change, however small, seems to risk reopening disputes. I see you've made improvements, and so far we've been delaying the vote for copy-editing; still, it has to end sometime. Wnt (talk) 11:38, 17 November 2010 (UTC)
I think Jmabel is getting at the fact that we have a good document that needs one more paragraph in the lead. It can be done. Points 1 and 2 are clearly in the article in different sections. Point 3 is somewhat more controversial because it gets into the why of this policy, and whether there are extra-legal, extra-Commons concerns about mere reputation or social responsibility involved. I can't address the last point, but I'll incorporate the first two into the intro and see if we can discuss the third.
Sentences added: Sexual content must be handled differently from other content, because sexual depictions of minors can trigger legal issues in many countries; depictions of identifiable living or recently deceased people may be embarrassing or violate their privacy; and hosting sexual images can be done in a way which protects the reputation of Commons among the public to safeguard their continued support.
Question: Is point three accurate: are we writing this policy out of any consensus-concern about the reputation of Commons or about some general social concern?
I have to say that I oppose this paragraph altogether. The policy as written really does not treat sexual content differently: Commons has never had a policy of hosting illegal content, it has long had the Commons:Photographs of identifiable persons policy, and it does not censor photographs based on perceived external or even internal opposition to their content. Furthermore, it is not actually contributing to the policy in any identifiable way - it's exactly the sort of excessive and confusing verbiage that we should be trying to chop out. Wnt (talk) 00:58, 18 November 2010 (UTC)
I do not like the paragraph either. There are other things that can trigger legal issues. Depictions of identifiable people may be embarrassing. Sure sexual content is extra sensitive, but I think people know that. And the third sentence is strange; it sort of says the thing backwards. We want to have these things straight, so why do we say we do it to safeguard our support? --LPfi (talk) 13:53, 19 November 2010 (UTC)
I've offered a compromise (?) sentence about the above.[4] But I don't know if this will satisfy Jmabel, nor would I object to its deletion since it is more or less redundant. Wnt (talk) 01:08, 18 November 2010 (UTC)
I see what you're getting at, but if identifiability and illegality are sufficient, what is the point of this policy at all? By your description, either sexual content is already covered by existing policy and a sexual content policy is redundant (or merely guidance for applying the main policies), or else sexual content is handled differently than other content. I don't have a preference either way, but I think we should know which is which. I'm going to try rephrasing the added sentence to say the same thing with less perceived distinction. See if it gets closer.
Sentences added (2): Sexual content tends to need closer attention because of the legal and privacy-related issues it raises. In order to ensure that such content does not violate U.S. law, or other Commons policies, this policy addresses processes to handle sexual material that falls within Wikimedia Commons' project scope and to identify and remove material that does not.
Re: Wnt's point about voting despite style, I agree. Policy on Wikipedia is rephrased constantly, but it doesn't affect the basic guidance. User:Ocaasi 13:25, 17 November 2010 (UTC)
I'd be perfectly glad to see us head toward a vote at this time.
As for the matter of reputation: that's essentially what started this all in the first place. Jimbo launched a bit of a crusade, deleting even some images that were clearly significant works of art. His actions were almost certainly in response to sensationalistic news stories besmirching Commons' reputation. - Jmabel ! talk 16:19, 17 November 2010 (UTC)
I recall that his edits generally said "out of project scope", and I would like to think that, whatever external pressures were brought to bear on him, he would not have launched an all-out attack on a category of content if much of it were not actually at odds with existing policy. We know some files were deleted that should not have been, but many of his deletions stuck based on preexisting policy. Wnt (talk) 01:15, 18 November 2010 (UTC)

Wikimedia leadership response to controversial content

Linking these because they may be of interest to people watching this page, and to inform policy discussion.

  • en:Wikipedia:Wikipedia_Signpost/2010-11-15/News_and_notes#Controversial_content_and_Wikimedia_leadership
  • Sue Gardner: Making change at Wikimedia: nine patterns that work: "we’re the only major site that doesn’t treat controversial material –e.g., sexually-explicit imagery, violent imagery, culturally offensive imagery– differently from everything else. The Board wanted –in effect– to probe into whether that was helping or hurting our effectiveness at fulfilling our mission."
  • [Ting Chen] writes: "the ongoing controversial content discussion is a result of our strategic planning (development and adaption in the nonwestern cultures) and the response of the changes in public policy and in our responsibility."
  • meta:Talk:2010_Wikimedia_Study_of_Controversial_Content:_Part_Two#Recommendations_discussion: Inviting feedback on the 11 recommendations of the study, which are:
    1. no changes be made to current editing and/or filtering regimes surrounding text
    2. no changes or filters be added to text on current Wikimedia projects to satisfy the perceived needs of children
    3. creating Wikijunior
    4. review images of nudity on Commons
    5. historical, ethnographic and art images be excluded from review
    6. active curation within controversial categories
    7. user-selected shuttered images (NSFW button)
    8. no image be permanently denied with such shuttering
    9. allow registered users the easy ability to set viewing preferences
    10. tagging regimes that would allow third-parties to filter Wikimedia content be restricted
    11. principle of least astonishment to be codified

Feel free to visit the last link and provide your own feedback. Dcoetzee (talk) 04:03, 17 November 2010 (UTC)

Should we allow subjects to withdraw consent for images uploaded prior to adoption?

Is there any reason not to allow subjects associated (correctly or incorrectly) with a potentially embarrassing image to withdraw consent for its inclusion, as a BLP-strength support for deletion? I remember people suggesting that there was, but I'm having a hard time remembering why. 71.198.176.22 22:03, 22 November 2010 (UTC)

the granting of, and subsequent withdrawal of, consent is a tricky issue. I'd be happy to continue discussions about the best policies in this area whilst the current draft is adopted. Privatemusings (talk) 22:37, 22 November 2010 (UTC)
I'd rather we put forth a draft addressing this issue, because it is tricky, instead of delaying addressing it before asking people whether they want to support the draft. But I'm not opposed to addressing the issue in a wider proposal instead, since it's not specific to sexual content. Commons:Living persons would seem to apply, but that has only been edited by one person, about a year ago, and it's in pretty bad shape. Can we use foundation:Resolution:Biographies of living people instead? It doesn't say anything about images, but the spirit of that Resolution seems very clear to me here. Alternatively, should this issue be addressed in Commons:Photographs of identifiable people instead of here? Or do we want to limit withdrawal of consent to sexual content? 71.198.176.22 23:48, 22 November 2010 (UTC)
Currently the photographs of identifiable people policy is the only place where consent issues come from; depending on ongoing discussions above, things may or may not stay that way.
My position is that the point of Commons is to put images permanently into the public domain. So consent should not be revocable, just as the copyright terms are not revocable. That said, I think we should heavily weight the subject's statement about consent: if he says he never consented, or never consented to public distribution of the image, this outweighs the photographer's assertion of consent. Only hard evidence like a verifiable prior publication of the image is likely to outweigh the subject's statement. It is true that in many cases subjects can abuse this to effectively revoke consent, but at least we're not committing ourselves to the principle that Commons images are only temporarily public. I should note that a subject should only be able to challenge an image if he asserts that it is correctly associated with him; dispelling false rumors isn't our responsibility. Lastly, in the interests of defending the public domain, we must not remove published photos that have reached the public domain, no matter how the subject objects; otherwise we fall behind the archives of the original magazine and other copyrighted resources. Wnt (talk) 18:52, 23 November 2010 (UTC)
2 remarks, though I'm not sure either affects the policy we are discussing here:
  1. I presume you mean "public domain" in a metaphorical rather than a legal sense. I've posted tens of thousands of my photos to the Commons, but I have not placed them in the public domain.
  2. As a matter of practice, we often remove photos as a courtesy to the subject of the photo. If the picture is readily replaceable for all current uses, and there doesn't seem to be anything particularly notable about it, and if the subject of the photo objects to its presence here, we usually do this. For example, if we have a routine photo of someone around the threshold of encyclopedic notability, and they don't like the particular photo, and they provide us with a better one (or we already have a better one), we almost always accede to their wishes. - Jmabel ! talk 05:32, 24 November 2010 (UTC)
As a matter of practice, not policy, and when the image is replaceable. I would much rather have deleting photos on request be a matter of practice when it's convenient or reasonable, rather than policy.--Prosfilaes (talk) 06:33, 24 November 2010 (UTC)
This is worth a separate policy. Generally, regardless of legality, copyright, or current policy, I can't really see a good reason not to permit someone to change their mind for any reason at any time. If an image is harmful to a person's reputation, it's not worth hosting it, and we should probably be able to replace it anyway. The only situation I can see where this might not apply is mass uploads of non-sexual content (e.g. trains), where the author wakes up one day and wants to make a profit; that might be too late. Thoughts? Also, where would be the place to discuss this aspect of policy? Ocaasi (talk) 05:25, 25 November 2010 (UTC)
This discussion is getting off topic but, yes, there are lots of reasons not to allow people to revoke their permissions. None of the following has particularly to do with sexual imagery (which is why this is off topic) but...
Is there a better place where this discussion could happen?
  1. When you upload a photo you took, you grant an irrevocable license. If someone uploads a bunch of useful images, then later stalks out of the project in a fit of pique, they don't get to revoke those permissions and remove their pictures from Wikipedia.
  2. Example rather than abstraction here: the images from Abu Ghraib are certainly embarrassing to their subjects (both Iraqi and American) but we certainly wouldn't remove them because Lynndie England found an image of herself embarrassing.
  3. Example again: a U.S. Senator might object to us hosting anything other than their official photo, but it would still be entirely legitimate (for example) for us to host their high school yearbook photo or a photo contributed by someone who photographed them at a public event.
  4. Similar example, less obvious: if I take a photo of someone about whom we have a Wikipedia article, and I take it entirely in accord with the usual rules in the relevant country (e.g. photographed in a public space in the U.S.), and that person doesn't like Wikipedia having their photo, frankly, unless we decide we want to do a courtesy to them, it's their problem, not ours. Now, if it's really not a good photo, and especially if we have a better one, I'll probably support doing that courtesy, but it's a courtesy, not a matter of policy.
Jmabel ! talk 06:36, 25 November 2010 (UTC)
I think we're discussing different cases. Generally, I was only referring to pictures that the subject took themselves or granted permission themselves to use, not other public domain images which someone is merely the subject of. Re Lynndie England, she's a public figure at this point, and the photograph wasn't produced or uploaded by her--same with the Senator, who presumably didn't upload the content that s/he is seeking to take down. I was trying to think of counterexamples where we should not honor a request for someone to take down a photo they took themselves or granted permission themselves. Can you think of any for those? Ocaasi (talk) 06:48, 25 November 2010 (UTC)
One of the most frustrating things Commons does is going around deleting images in use, for good reasons or bad. In an optimal world, we would never delete an image used in a current or historical version of any page on a Wikimedia project. We already give photographers much more rights then authors of encyclopedia articles, who have their names buried in the history and don't get to choose their license; there's no need to privilege them further. A photographer shouldn't get to rip their photo out of an article any more than an author could rip their text out of a WP article. We have to treat the portrayed with all the respect of a professional scholar or journalist, which usually coincides with their own desires; when it doesn't, we destroy our own quality as a news source and encyclopedia by letting them dictate to us.
Sexual content is complex, and I will freely acknowledge that the issues surrounding it mean it will get deleted much more freely then most content. I still think it important that policy demand that uploads to Wikimedia projects are not revokable, and that people think before uploading, instead of coming back and demanding that users on 14 Wikimedia projects fix up 47 pages because images that they assumed--and should have had every right to assume--were permanent are now being deleted.--Prosfilaes (talk) 07:09, 25 November 2010 (UTC)[reply]
What you're saying makes sense about the problems caused by revoked images. As long as the warnings to the uploader are sufficiently clear, I think they should at least need a 'good reason' and maybe an OTRS ticket. On the other hand, what about: a photo of a girl in a bikini which ends up on a very popular article like 'beach'; a masturbating woman who decides 5 years later that for employment, personal, or family reasons that the image is harming her; a picture of someone's pierced genitals which has become un-anonymized from cross-publishing in a body-modification mag; a topless photo from an 18 year old which 20 years later doesn't seem like such a good idea. I'm stretching credulity on some of those, but I'm looking for what the precise rationale is. We definitely shouldn't let people revoke content for permissions, except perhaps, when we should. Ocaasi (talk) 07:23, 25 November 2010 (UTC)[reply]
I'm happy to let all of those be dealt with under practice, not policy. But what about some other examples: someone writes the definitive biography of a Stalinist lackey that, while its positive spin has been toned down, stands as the Wikipedia article 10 years later, when he wants it deleted because it will hurt his political career. Or a women who contributes extensively to a Wikiversity project on polyamory who's now worried about the effects on her marriage. Or someone who rewrote masturbation to portray it as dangerous, and filled a talk page with arguments for keeping it; in any of those cases, do we shred articles or talk page to preserve the modesty of the author?--Prosfilaes (talk) 16:27, 25 November 2010 (UTC)[reply]
We're obviously well into hypothetical-world, which is fine with me. I think the major difference between our examples is that photographs are different than text. In both cases, there is a contribution (photo, writing). In both cases, the contributions can be linked to a contributor (uploader, writer). But in the case of the photograph the uploader (or consent giver) is the subject--the contribution bears the mark of the person who uploaded it right out in the open. In the case of an article writer, a screen name could easily be anonymous; and even if the screen name was a Real Name, the text itself would not obviously reveal who did it--you'd have to dig through the history to find the user and then check the diffs, etc. A photograph gives pretty much all of that information without any effort or words, which is why I think the comparison doesn't fit for me. I agree that text contributions like the ones you described should not be revoked, but I don't think that settles the photograph situation.
You're probably right that policy can avoid this issue, but it might be worth beefing up warnings on the upload page to make clear that consent is non-revocable. Once this COM:SEX policy is set, it might be worth linking to or summarizing as well. Something like: "You can upload sexual content to commons if: it is of your body and you understand it will be publicly seen in any context under an open license which you cannot revoke; it is of someone else who consented to and understood those terms; it is of content already released under compatible permissions." Currently the COM:SEX policy is geared towards Commons curators rather than uploaders, which is understandable but not ideal for a broad issue that should address the concerns of both 'users' and 'editors'. Ocaasi (talk) 02:17, 27 November 2010 (UTC)[reply]
So far as I know the upload page has always said that the consent is non-revocable. The point is, once an image goes up on Commons, and especially once it is used to illustrate Wikipedia, it is going to get mirrored all over the world. Publishing a photo here isn't much different than publishing in a sex magazine. Now Wikimedia does recognize a "right to vanish", and the uploader's account name, and presumably any mention of the subject's name in the image, could be removed from the image on this basis. But bear in mind that if we make a policy decision to delete a photo, we're also making a policy decision to delete its derivative works. We're telling the contributor who has taken that picture and put it beside another and labelled up the comparative anatomy between glans and clitoris, labia and scrotum and so forth that all the work he did was for nothing. Now if we were returning the original uploader to secrecy and anonymity we might be tempted, but knowing we're not? Wnt (talk) 18:58, 29 November 2010 (UTC)[reply]
I see what's going on here and the cat's-out-of-the-bag nature of a non-revocable, open license. I think where it's pushing me is that we need to be a little more explicit about what exactly placing an image, especially on Commons, means. Basically, I think we should spell it out: "If you upload an image of your naked body or sexual act onto Wikimedia Commons, you are granting a license for anyone in the world to use the image anywhere they choose so long as they follow the terms of the license. Also, you can never revoke your consent once it has been granted." Maybe it doesn't have to be as scary as that, but it's basically just the facts, and it is a little scary. That's what I'm getting at. We shouldn't change policy regarding revoking consent (although I bet OTRS does it quietly on occasion), but we should be abundantly clear that this content is not just for a single Wikipedia article and that uploaders have absolutely no control over content once it is uploaded. For-Ev-Er. Ocaasi (talk) 05:28, 30 November 2010 (UTC)[reply]

Does the upload form create a binding contract when images are submitted? If so, what is the consideration of that contract? Does the notation that the license is non-revocable carry any weight when the uploader isn't reimbursed? 71.198.176.22 14:26, 2 December 2010 (UTC)[reply]

I'm not a lawyer, but similar terms are stated by quite a few prominent sites on the Internet. On Facebook, for example, once you upload content you give them a pretty nearly unlimited license in that content, and it is not revocable either. - Jmabel ! talk 02:10, 3 December 2010 (UTC)[reply]
Armchair analysis (also not a lawyer): Consideration is the mutual exchange of something of value, either money, goods, or a promise not to do something else (e.g. I can contractually pay you $100 not to paint your car red). In the case of cc-by-sa licenses, there is obviously no two-way exchange of value or goods (unless one considers Commons to be exchanging a service by hosting and allowing the distribution of these materials, which is nice but legally unlikely); there is also no exchange of money. Is there a promise 'not' to do something that applies? Well, many copyleft licenses require attribution, so there is a promise not to distribute the content without attribution and without a compatible license. Still, I don't think this is what the courts have in mind, since the license itself should probably be separate from the consideration which makes it binding. There are things which have no contractual power, but can still not be taken back. They're called gifts, and once you give them, you lose your claim of property over them. Although copyleft licenses are couched in the terminology of contract, they are more just gifts with a few strings attached (that's how I personally think of them; copyleft lawyers probably have more nuanced terminology). This 2006 article discussed consideration and the GNU GPL, seemingly coming down against contract designation (on consideration grounds) but for license status, however those differ. There's more of this out there if you want to poke around the Google or the Google Scholar, or check out http://www.eff.org or ask at the RefDesk, where they are wicked smart. Ocaasi (talk) 09:10, 3 December 2010 (UTC)[reply]
I spend some time at the RefDesk, and I wouldn't trust it for legal advice (which is specifically banned there anyway... ). The main problem here is that if uploads are revocable, it applies to every single image on Commons, so it's not specifically relevant to this policy. I would also worry that any change to the site upload notice might risk the status of all uploads and should be done only with formal legal input. The only question is whether some special courtesy deletion policy is required here, which I don't see. Wnt (talk) 15:39, 3 December 2010 (UTC)[reply]

Simulation resolution[edit]

What is the resolution between a diagram and a simulation? 71.198.176.22 20:36, 24 November 2010 (UTC)[reply]

are you asking for thoughts on whether or not / how this policy should differentiate a diagram (drawn, or computer generated I presume) from a simulation (presumably photographed?) - I'm not 100% sure how to kick off a response, so a clarification would be helpful to me :-) Privatemusings (talk) 00:30, 25 November 2010 (UTC)[reply]
Yes, if we are going to prohibit photographs and simulations both, then wouldn't it be a good idea to provide some guidance about what is a simulation and what is merely a diagrammatic drawing? 71.198.176.22 22:40, 25 November 2010 (UTC)[reply]
do you have trouble discerning the difference - aren't they different just by definition? :-) Privatemusings (talk) 01:21, 26 November 2010 (UTC)[reply]
This policy isn't about prohibiting photographs, just keeping them within Commons policies. Diagrams are specifically exempted from speedy deletion because certain uncontroversial reasons for deletion can't exist for them: they can't be child pornography, and they can't be created without consent, and the act of drawing them probably implies a certain educational purpose. Simulations aren't specifically defined here, and might be regarded as diagrams; but it is also possible to imagine looking at a "simulation" and not being sure if it is a drawing or a photograph run through a few photoshop filters, or more controversially (for example) a nude rendition of a particular person recreated from X-ray, terahertz, or other forms of scanning. In any case, no one should claim a simulation can be speedy deleted if it doesn't fall into one of the prohibited content categories at all. So I'm not going to stickle on having simulations specifically exempted from uncontroversial speedy deletion when it will probably just raise more arguments. The term is just too prone to varying interpretations to use for policy purposes. Wnt (talk) 18:46, 29 November 2010 (UTC)[reply]

I asked the question poorly. I'm referring to "simulated sexually explicit conduct" -- where is the boundary between that and diagrammatic drawings? For example, are the cartoon illustrations used for sex acts in the English Wikipedia simulated sexually explicit conduct or diagrams, and why? 71.198.176.22 12:01, 30 November 2010 (UTC)[reply]

The policy doesn't try to define a difference between simulations and diagrams. At the beginning it defines sexual content to include "simulated" images, which aren't really defined. The main dividing lines are (1) what is prohibited content, which very likely does not include them, and (2) what must go through a full deletion review, with the additional comment that "Some categories of material which are generally useful for an educational purpose include: diagrams, illustrations..." (which reiterates that diagrams and illustrations probably aren't going to be prohibited by the policy).
So to take the example of File:Wiki-analsex.png, the file probably counts as "simulated sexually explicit conduct", so it gets sorted through the policy. Then you ask if it's illegal (probably not; fortunately even photos aren't counted as "obscene" nowadays), or if it's taken without consent (nobody to consent), or if it's out of the project scope (the policy specifically says that illustrations are generally educational). And if someone wants to argue for its deletion, it has to go through a proper discussion.
This may be a roundabout logic, but it's the result of people trying to come up with a policy from different points of view, perhaps regarding situations we haven't thought of yet. And to be fair, a photo like that isn't all that different from the sort of "animated/live action" image that you could find in the film A Scanner Darkly, which arguably would need (for example) the consent of the original participant. Wnt (talk) 01:00, 1 December 2010 (UTC)[reply]

Time to take it to a poll?[edit]

No, it's not perfect. But few things in this world are, and it's been pretty stable, and I'd like to see us adopt this before we are overtaken by events from the Foundation level. - Jmabel ! talk 02:26, 25 November 2010 (UTC)[reply]

I think that timeline is important. The remaining questions are:
  • Should a user be able to revoke sexual content including themselves? (If it was uploaded by them; if it was not uploaded by them; if they gave prior consent in either case)
  • How to handle 'close-up' images. What qualifies as identifiable? Is consent required? What is the process or 'burden' regarding identifiability and consent in these cases?
  • How to handle what could be an illegally acquired image? What does illegally acquired 'look like'? What is the process or 'burden' in these cases?
These are increasingly minor and should only be resolved first if they have the potential to change the outcome of the vote. Ocaasi (talk) 07:02, 25 November 2010 (UTC)[reply]

Here are my answers fwiw :-)

  • Revoking consent will remain outside the purview of this policy, to be handled through practice. The principle of irrevocable consent is important to all wmf content generation, so the best thing for this policy to do is to invite contact with the office, which is suitably in place.
  • Close-up images which are sexual content, as defined, require assertions of consent, as does all sexual content. We've left the burden to be decided through community discussion - I'm fine with that.
  • We've stated 'Provision of further evidence of consent is welcome (using the OTRS process) but is not normally required.' - in the event that an image has a reasonable basis for being illegally acquired (as discussed and established through community discussion) this would, in my view, qualify as 'not normal', hence the community is empowered to insist on further evidence, the nature of which is left to the community discussion to evaluate. I'm ok with that for now too.

that's my thoughts. Privatemusings (talk) 22:09, 25 November 2010 (UTC)[reply]

At a quick read, I agree completely with Privatemusings here. And none of this entails any further rewordings. - Jmabel ! talk 02:05, 26 November 2010 (UTC)[reply]
I like those answers too. Maybe they'd be useful for a COM:SEX FAQ. In fact, an FAQ, even an informal one, might help explain the intention of some of the policies and let voters browse through issues. That's not to say the policy shouldn't be self-explanatory or that the FAQ would be binding--just that a guide to a new policy might help introduce the policy itself (see: W:Wikipedia:Neutral_point_of_view/FAQ for example). Ocaasi (talk) 05:30, 26 November 2010 (UTC)[reply]
I don't actually understand what you mean, but it looks like people want a poll, so I'll start the section, closely based on the preview text above (which I'll archive now to avoid any misplaced votes). Wnt (talk) 13:56, 4 December 2010 (UTC)[reply]
  • maybe we are ready for the polls, but not yet, i feel, till the question of "irrevocable consent" is well settled. "sign" jayan 05:51, 9 December 2010 (UTC)

Summary of poll arguments[edit]

This section is intended to be a brief, neutral summary of opinions raised in the policy poll below. Please make NPOV edits for completeness, concision, neatness, or accuracy, but not for persuasion (that's what the poll comments are for). Also, !nosign! any changes to keep it tidy. Thanks, Ocaasi (talk) 15:50, 7 December 2010 (UTC)[reply]

Thank you for doing this, it's very helpful! -- Phoebe (talk) 00:26, 12 December 2010 (UTC)[reply]
Gladly, I think every poll should have one! Ocaasi (talk) 12:16, 12 December 2010 (UTC)[reply]

Like it...[edit]

  • Policy is needed
  • Good enough for a start
  • Represents compromise
  • Not having the policy is worse
  • Not a major change of existing policies, just a focused reorganization
  • Can help avoid social backlash and panic
  • Needed for legal reasons
  • Needed for ethical reasons
  • Protects the WMF
  • Prevents more strict censorship
  • Legally sound
  • Prevents becoming a porn repository
  • Better to have a centralized policy than multiple deletion rationales
  • Preempts WMF from imposing a less nuanced sexual content policy

Don't like it...[edit]

  • Slippery slope to more censorship
  • Educational purpose is not clearly defined
  • Doesn't explicitly protect content from deletion
  • Policies on illegality, privacy, pornography, and nudity already cover the material
  • Better to handle on a case by case basis
  • Instruction creep
  • Policy addresses legal issues but is not written by lawyers
  • Policy addresses legal issues but commons users are not legal experts
  • Sexual content should not be treated differently than other content
  • Vague wording
  • The phrase 'low-quality'
  • Out-of-scope deletions should be normal not speedy
  • US-Centric legal concerns
  • Legal concerns that are unresolved by courts
  • Addresses sex but not offensive or violent content
  • After the Wales deletions, implementation cannot be trusted
  • Biased against Western cultural taboos (it's too conservative)
  • Biased against non-Western cultural taboos (it's too liberal)
  • Better as a guideline
  • Needs more specific criteria for inclusion and exclusion
  • Better to solve this on the user end (personal image settings/filters)
  • Insufficient protection for erotic art
  • Only available in English, which makes fair overall understanding and voting impossible

Questions[edit]

  • What does 'low-quality' mean?
  • What qualifies sexual content as educational?
  • What content cannot be deleted?
  • Would it be better as a guideline?
  • Can the legal discussion be rephrased in normal language?
  • Should out-of-scope decisions be normal rather than speedy deletions?

Tweaks[edit]

  • Shorter
  • Less legalese
  • Clarity of legal interpretation
  • More specific criteria for inclusion/exclusion
  • Out-of-scope deletions to be handled by regular process, not speedy

Second poll for promotion to policy (December 2010)[edit]

This is a poll to adopt Commons:Sexual content as a Wikimedia Commons policy.

Please give your opinion below with {{Support}}, {{Oppose}}, or {{Neutral}}, with a brief explanation.

The poll concerns whether people accept or reject the November 26 revision. Edits made to the policy during the polling period may be temporarily reverted, and will need to be adopted by consensus, like changes to the other established policies.

Voting on the poll will proceed until 06:08, 15 December, ten days after the time the poll was first advertised at MediaWiki:Sitenotice.

A previous poll was closed as "no consensus" at Commons talk:Sexual content/Archive 4#Poll for promotion to policy. Wnt (talk) 07:47, 23 October 2010 (UTC)[reply]


Comments added after the poll officially closed[edit]

  1.  Support I agree, it's a good start, and it does seem reasonable. Lamougue (talk) 08:26, 15 December 2010 (UTC)[reply]

Result[edit]

Out-of-scope decisions should not be speedy deletion criteria for files[edit]

COM:CSD doesn't indicate that scope concerns are a legitimate speedy deletion criterion for files, only for pages. I think this is a serious flaw in the current proposal and would like to see it changed. Asking individuals, even admins, to make subjective scope decisions seems like a recipe for conflict. Of all the issues raised in the objections, this one seems the most substantial to me.

I'm asking here if there are any objections to removing "Material obviously outside of scope" from the proposed speedy deletion criteria additions so that the people who have already indicated their support for the proposal in general can indicate whether they have any objections. 71.198.176.22 02:12, 8 December 2010 (UTC)[reply]

Of course I object to changing something so significant in the middle of a vote! I'm afraid you'll have to vote accordingly and then wait until after the poll to re-discuss this. 99of9 (talk) 10:12, 8 December 2010 (UTC)[reply]

I'm not proposing that it be changed before the vote is closed, but I am proposing that the vote be closed in the next few days, and discussion move on to the objections raised by more than one person. We can begin such discussion at once. I'd like to start by asking for people to propose alternatives to this edit, which was indicated as potentially problematic by the editor in the summary and severely changed the character of the "out of scope" section. My opinion remains that we should delete the remaining, small "Material obviously out of scope" bullet point, which is what was left of the earlier, much larger section after most of it was moved into the next section. 71.198.176.22 11:01, 8 December 2010 (UTC)[reply]
As it says in the intro to the poll, it will be closed on the 15th of December. --99of9 (talk) 12:03, 8 December 2010 (UTC)[reply]
  •  Support unless the admins doing the speedy deletions can provide specific facts to make a compelling argument. It's clear from the poll comments that this part of the policy is objected to by many people, and favored by few. However, as I commented above, I would like to know how many speedy deletions really happen and how much time and effort it actually saves to do things that way rather than by normal AfDs, among other things. But I think at this point the burden of proof is on the admins; otherwise we can change the policy to rule out all speedy deletions for scope, and see what experimental data that provides. Wnt (talk) 12:46, 8 December 2010 (UTC)[reply]

If this is the loophole which allows deletion of blurry cell-phone camera snapshots of penises and similar (see File:What Commons Does Not Need.jpg), then removing it would disalign policy from practice (as explained above) and set the stage for future bickering... AnonMoos (talk) 03:00, 9 December 2010 (UTC)[reply]

this "practice" is undertaken by only a handful of admins who seem to have a particular interest in the subject. i have pointed out in the past that it is a violation of policy. with respect, it would be more accurate to say that the inclusion of this provision is an attempt to "align" policy with the preferred practices of those admins who have not been following established policy in this regard. it is also pretty clear from the responses here & in the "oppose" section, that i am not alone in feeling that all admins should be required to follow policy on deletions, & it is not "ok" for some admins to simply ignore written policy, & make up their own rules, when it suits them Lx 121 (talk) 08:44, 9 December 2010 (UTC)[reply]
Unfortunately there is a real problem with drunken guys uploading low-quality cell-phone snaps of their penises, so it's only reasonable that there be a special policy sub-clause to deal with this special problem. I'm not sure what denying this would accomplish, other than to create additional tedious work for other people in order to comply with your personal philosophical views... AnonMoos (talk) 11:05, 10 December 2010 (UTC)[reply]
yes, i know; we've had this same debate before, repeatedly... BUT as the results of the discussion so far show, i'm clearly not the only one who has an objection to this "special sub-policy clause". it provides unlimited opportunities for abuse. this kind of deletion is a judgement call, & that kind of decision should be made in open community debate, NOT at the discretion of lone admins.
Here's the key point: deletions of this type are NOT URGENT, it's not time-critical, it's not "important" that the files get erased as quickly as possible; the work is just minor tidying. there is no legitimate reason not to allow an open, community debate before deleting the material.
when you see a penis-pic that you don't like, tag it & notify, wait 5 days to see if there's any objections, & if not, then delete. how hard is that? if the exact same number of people devote the exact same number of man-hours to the work, the rate of penis-pic deletions will remain the same, meaning you guys won't "fall behind" just by changing your s.o.p.
right now, we have a massive backlog of unclosed deletion discussions. it would be FAR more useful to work on closing those.
Lx 121 (talk) 16:16, 10 December 2010 (UTC)[reply]
Any individual deletion is probably not particularly "urgent" in itself -- but it's a fairly strong overall priority to prevent Wikimedia Commons from being gradually transformed into the kind of site where such images form a significant fraction of the content, and so far the "Nopenis" escape valve has been found to be the most smoothly-working means towards that end. Any objections to the "Nopenis" escape valve have almost always been on abstract theoretical philosophical grounds, and NOT because anyone really believes that many valuable images are being deleted. If you want to eliminate the "Nopenis" escape valve, then it seems to me that it's incumbent on you to explain what practical positive goals will be achieved on a non-theoretical non-abstract non-philosophical level (i.e. achieving something worthwhile other than creating administrative busywork over ideological views). AnonMoos (talk) 17:55, 10 December 2010 (UTC)[reply]
point 1. we have over 7 million media files @ wmc now; it is pure hyperbole to talk about commons being in danger of becoming "overwhelmed" by a tsunami of penis pics. please provide statistical data to back up this claim
point 2. the practice of admins unilaterally deleting sexual content based solely upon their own judgement of scope HAS NOT EVER been put to a community vote, until now. as things stand, at the moment, you are losing this vote.
point 3. it is impossible for an average user to examine the materials being removed by this method, or assess the validity of the decisions being made, because only admins can check the files after they are gone, as you already know
point 4. it is not "creating administrative busywork", to honour & respect the policies laid out by the community process. &, as i have already pointed out above, it does not increase the workload to process scope deletions properly, it simply allows time between the nomination & the removal, for members of the community (including the uploader) to express their opinions & discuss the matter, before action is taken. this is why we have a deletion debate process.
point 5. the mere fact that you feel it is appropriate to separate out sexual content as a category where non-debated, non-reviewable "instant deletion" for "scope" is acceptable, shows considerable bias. if we are going to allow admins to instantly delete "out-of-scope" materials, why not do it for everything?
point 6. your comment "Any objections to the "Nopenis" escape valve have almost always been on abstract theoretical philosophical grounds, and NOT because anyone really believes that many valuable images are being deleted." clearly indicates that you have received multiple objections to this activity; which, again, has never been approved of in any open community process.
point 7. i consider it "worthwhile" to respect the policies & procedures laid out by the community, & i do not consider it "abstract, theoretical, & philosophical" to object, when these same policies & procedures are violated. especially when it comes to such important questions as "who gets to judge what materials should be removed from the wmc collection".
point 8. i'm getting tired of indenting, so if you would like to continue this discussion, shall we open a new section? Lx 121 (talk) 19:09, 10 December 2010 (UTC)[reply]
Dude, I am not "losing the vote", because this is a purely non-binding side-discussion to the main poll (which is itself NOT a "vote" in the sense that one more support than oppose would mean the policy would be adopted while one more oppose than support would mean not adopted). It's not so much that Commons is being overwhelmed by an immense quantity of low-quality genital snaps as that allowing them to accumulate unchecked without any periodic trimming or pruning would eventually tend to tip the balance of Commons in a direction that most thoughtful people don't want it to be tipped... AnonMoos (talk) 12:18, 11 December 2010 (UTC)[reply]
to clarify: you are losing the vote seeking approval for a "special sub-policy" to remove materials as "obviously out of scope" by speedy deletion, without community discussion. both in this sub-discussion & in the main vote, that provision is clearly being demonstrated to be objectionable to a significant part of the community. & you still have not provided any data to back up your stated concerns, even tho you claim expertise on the problem. & you still have not provided any reasonable explanation for why the current deletion procedures are inadequate. all you keep saying, over & over again is: "this is a huge problem, we must have these powers to fight it, & it's not good enough to delete the material via the normal process (because it's too much work?)". Lx 121 (talk) 00:48, 12 December 2010 (UTC)[reply]
Dude, there simply is NO VOTE in any meaningful or relevant sense of the word "vote", and any counting of the comments in this particular subsection of the discussion will not be definitive for any policy changes -- and you're deluding yourself quite strongly if you imagine otherwise. AnonMoos (talk) 06:12, 12 December 2010 (UTC)[reply]
There are/were 1000 penis images on wmc, or 0.015% of all the wmc images. Considering all the things that one could photograph, that is a huge number. There are 250,000 beetle species, each one of which has a differently shaped penis. If Wikipedians were as diligent in recording beetle penises as they are in recording their own, the number of photos of beetle penises would be over 100 million. Fortunately, biologists don't feel the need to record 1000 dicks for each beetle species, even though the size and shape is diagnostic in species identification, and in most cases they seem to get by with a single line drawing. John lilburne (talk) 20:28, 10 December 2010 (UTC)[reply]
I think that standards in biology are likely to change - the days of "the" Cryptocephalus abdominalis penis, or "the" (species) anything, are numbered. The individual species is only an approximation; the world is covered in hybrid zones. And a w:ring species will expand horizons. I doubt we'll see 250,000 species of beetle, or their various accessories, anytime soon, simply because a greater level of skill and equipment is needed, but when the day comes it will be nice to have. Wnt (talk) 14:56, 11 December 2010 (UTC)[reply]
Yet it is the approximation that forms the basic classification for the life sciences. Whilst penis shape remains a main determinant in species determination as far as biologists are concerned, I suspect that female beetles don't go about saying "Hmmm nice cock that will fit nicely", and that there are other determinants that actually decide the choice of mate, or more specifically whether any one organism is considered a suitable partner. The point remains, though, that in clumping individual beetles into species, biologists don't need 1000 photos of penises to a) determine what one looks like, or b) determine that the one being examined matches the norm for any particular species. 62.49.31.176 17:25, 11 December 2010 (UTC)[reply]
but if you are an entomologist compiling a comprehensive atlas of comparative beetle anatomy, you do need those images. a "line drawing" is not an adequate substitute for photographic materials. Lx 121 (talk) 00:48, 12 December 2010 (UTC)[reply]
What you don't need is 1000 of them. Line drawings are good as they indicate only the salient features. Too many poor-quality photographs of botched dissections will overwhelm the good ones. A vast quantity of this stuff isn't needed, and an un-curated resource isn't nearly as useful as a curated one. Let's leave beetle genitalia to one side and examine, say, wing venation on dragonflies, which can be diagnostic for some species. Again, one doesn't need 1000s of images to show the difference between Onychogomphus uncatus and O. forcipatus.
So when there are 1000 photos of the human penis comprising 0.015% of all images on Commons, one has to say that the number is out of proportion to their actual worth, and simply reflects the preoccupation of the uploaders. John lilburne (talk) 16:47, 12 December 2010 (UTC)[reply]
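A quick sanity check of those figures, assuming the "over 7 million media files" count cited earlier in this discussion (the exact total is not stated in either comment):

$1000 / 7{,}000{,}000 \approx 0.014\%$

which is consistent with the quoted 0.015%.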
Those can be speedily deleted if they don't have an explicit declaration of consent with the reason "not permitted to host." I would like to try that for a month or twelve to see how it goes before rejecting the potential solution. Ginger Conspiracy (talk) 03:33, 9 December 2010 (UTC)[reply]
  •  Support with prejudice. Easy to abuse, with no greatly necessary pragmatic purpose. Low-quality cell photos of dicks should just get their own special policy if they're such a problem. The use of this policy has too great a potential to be abused and isn't critical in removing obviously unnecessary images. AerobicFox (talk)
  • I share the concerns expressed at the top of this section. See my "oppose" vote comment above. My concern focuses mostly on non-sexual pictures that might be speedy deleted for outofscopeness. Teofilo (talk) 15:04, 11 December 2010 (UTC)[reply]

✓ Done Removed the speedy deletion criteria proposal for material out of scope. 71.198.176.22 11:02, 15 December 2010 (UTC)[reply]

Potential "Miller Test" Material-- speedy delete or not?[edit]

The current version gives conflicting instructions. First, we're told to speedy delete any material that is illegal for Wikimedia to host. Later, we're told that "Because of the complex, subjective nature of obscenity law, work should not be speedy deleted on the basis of perceived obscenity".

It's confusing to tell people to speedy delete anything illegal, then teach them all about obscenity law, only to finally inform them that obscenity is just too complex for a lay audience to decide.

Perhaps the obscenity law discussion belongs in some essay, guide, or instructional page--- something that doesn't carry the force of policy. Right now, we try to educate an international population about some US laws that hinge upon US interpretations of 'community standards' in unknown ways, thus making it utterly impossible for a non-lawyer to accurately gauge an image's potential legality.

--Alecmconroy (talk) 03:40, 8 December 2010 (UTC)[reply]

I too have concerns about the "Miller test" provision. The courts have generally been very lenient with the Miller test, rendering it nearly moot. However, the plain language of the Miller test, if applied in a reasonable way, would exclude large swaths of material. I can't really vote on this proposal as long as it states that material must comply with the language of the Miller test, considering the potential for abuse there by editors who take the Miller test at face value instead of applying it as the courts have. Gigs (talk) 04:08, 8 December 2010 (UTC)[reply]
If you can cite material explaining this greater leniency, it would make a very welcome footnote to this policy. My untutored impression is that w:Mike Diana lived in Florida. Wnt (talk) 12:54, 8 December 2010 (UTC)[reply]
Regarding the impact of US laws, this is a consequence of the fundamental NOTCENSORED pillar. I've tried to prevent the policy from excluding any sort of material except for some actual reason beyond our control. We aren't picking out stuff and saying it's off limits simply because we don't like it. As a result, the face of Commons is inevitably going to reflect the tread of the boot that stepped on it. We're not going to say that we walked into a doorknob or fell down the steps. Wnt (talk) 13:01, 8 December 2010 (UTC)[reply]
Wnt -- What "Wikipedia is not censored" means is that an image is not automatically deleted just because someone considers it offensive or unsuitable for viewing by children. However, "Wikipedia is not censored" does NOT mean that we can't make a considered deliberate decision not to host certain types of material which we have concluded does not further our goals. AnonMoos (talk) 03:07, 9 December 2010 (UTC)[reply]
i'm sorry, but your argument is sophistry; when you "make a considered deliberate decision not to host certain types of material", that is censorship. once we define the project's goals (which, for wmc would be: media repository, educational material, free and/or open-source only), then the basic considerations for inclusion/exclusion of materials are: 1. relevance (i.e.: scope) 2. legality (copyright, other legal restrictions on permitted content, etc.) if you want to add a 3rd parameter it should probably be: 3. technical limitations (storage, bandwidth, etc.) on that basis, you are welcome to argue why you feel that sexual content "does not further our goals", but please do not pretend that it's not censorship! Lx 121 (talk) 21:36, 9 December 2010 (UTC)[reply]
People are free to speak all they want, but if they want to use our servers and bandwidth to do so, then it's perfectly reasonable to require that they follow certain basic ground-rules that we have laid down. "Relevance (i.e.: scope)" means exactly and only past considered deliberate decisions not to host certain types of material which we have concluded does not further our goals -- and if we make further considered deliberate decisions in the future not to host certain other types of material which we conclude will also not further our goals, then that will redefine the exact meaning of "relevance (i.e.: scope)" on Wikimedia Commons. You can call it "censorship" if you want to, but that sounds like somewhat superficial rhetoric to me, since 1) We're under no legal or moral obligation to indiscriminately host absolutely everything that anyone submits; 2) We're doing nothing to limit their ability to use other forums if they're not willing to respect the basic ground-rules here; and 3) There are still very few topics which are under any kind of absolute content-matter ban, and those are as a result of external laws (not Wikimedia Commons policies). AnonMoos (talk) 11:27, 10 December 2010 (UTC)[reply]

I think it's fair to say that administrators can make satisfactory judgements about anything outside of a wide gray area, and the information in the proposal helps them understand where and how wide the gray area is. We already have instructions at COM:CSD to delete some kinds of illegal material (copyright violations), and technically, it is impossible to license a lot of the material forbidden by anti-pornography statutes, so the CSD for non-free content applies too. Therefore, the proposal represents an incremental improvement in any case. 71.198.176.22 10:47, 8 December 2010 (UTC)[reply]

i have no problem with removing material that it would be illegal for wmc to host, but which statutes are you referring to? o__0 Lx 121 (talk) 21:36, 9 December 2010 (UTC)[reply]
I think everyone agrees that obviously illegal images should be immediately deleted. Many such kinds of illegality can be judged by our editors-- copyright or consent, for example. But US obscenity case law is never obvious to untrained eyes. Non-specialists cannot correctly apply the Miller test.
If you let people start using their own personal standards to apply Miller to our content, how do you resolve disputes? All editors can do is argue back and forth about whose standards best represent 'the Florida, US community standards'.
Legal questions are for the office. Don't have our admins singlehandedly decide such important and complex legal questions. Don't make our editors debate over legal obscenity, each arguing based upon their own impressions of what "FL, US community standards" look like-- that will just spark divisive flamewars over religious, cultural, and philosophical differences.
If someone thinks an image is obscene, tell OTRS and let the lawyers take a peek. If it's truly obscene, you can be sure they'll be the ones best suited to actually do the deletion and take whatever other legal steps are necessary. --Alecmconroy (talk) 11:55, 8 December 2010 (UTC)[reply]
I agree; again, I think the proposal represents an improvement over the existing COM:CSD in the direction you're supporting. 71.198.176.22 12:02, 8 December 2010 (UTC)[reply]
I would say that legal questions are for lawyers, but not necessarily that legal questions are for office. If it's something where great haste is needed to avoid discrediting Wikimedia, that's one thing, but if it's a debatable case, we might as well make the decision in public. There's no reason why Wikimedia's legal counsel can't give input directly to the AfD process. Mostly, however, my hope is that the act of discussing a borderline case in AfD will goad those involved into presenting good arguments that the content actually has educational, scientific, artistic, or political importance, thereby hardening Commons against any potential prosecutions. Wnt (talk) 12:51, 8 December 2010 (UTC)[reply]
And then the closer judges whether their arguments met the Miller Test or not? No thanks. Gigs (talk) 17:33, 8 December 2010 (UTC)[reply]
I too raised Dost and Miller as a reason for an oppose. It is ironic, as I do spend a lot of time policing images out of en:wp article pages that fall within the spirit of Dost and Miller. But that is en:wp, and I can envisage occasions where even those images may have educational content. Before I make a decision on a policy, however, I need to understand the issues completely - and from the Miller test and Dost test articles I don't get that. Just warning bells. They are only Start-class articles - if they were GAs then I might have enough information to take the risk and form a policy. At the moment I sense a Weapons of mass destruction fiasco is being engineered. So even before we take into account whether we wish to guide a global project on the perceived views of a small community somewhere on the 26th parallel, could someone work up those two articles to GA status?--ClemRutter (talk) 20:15, 8 December 2010 (UTC)[reply]
Illegal material will be deleted by administrators, with or without this policy. The Miller test is used to deem illegality, so it is relevant whether or not we mention it. But if a work, taken as a whole, has serious literary, artistic, political or scientific value, then it will not be regarded as obscene. That means, more or less, that no work that is in project scope is obscene by this test. Educational value is not mentioned, but I understood that would not be a problem.
So mentioning the Miller test is not about allowing censoring, but about ruling out some arguments about illegality in the deletion debates.
--LPfi (talk) 20:58, 8 December 2010 (UTC)[reply]
If applied properly, the Miller test won't be a problem for any of our content. If applied improperly, taking the test at face value, it could lead to lots of problems. Let's cut the current Miller test text-- nobody knows what that test even means within the context of a global project like ours. --Alecmconroy (talk) 06:52, 9 December 2010 (UTC)[reply]
one obvious point here: we already have a policy for deleting illegal materials, which makes that aspect of "commons:sexual content" irrelevant/redundant. as re: "the miller test", i don't see any way that we can realistically train all our admins on here to administer it, to a fair, even, & credible legal standard. failing that, it becomes a hopelessly subjective "bogeyman" standard, being applied differently by every admin, & the source of yet more internal strife @ commons. it's a nice idea in principle, but if even the courts can't make it work as a practical, implementable, objective standard; how are we supposed to? for materials that aren't clearly illegal for us to host, the place for discussion & consideration (especially of issues re: scope!) is in open community debate, NOT by admins in secret, with no true open review of their actions possible Lx 121 (talk) 21:46, 9 December 2010 (UTC)[reply]

other languages[edit]

Well, I know that it is a lot of work, but if I see this case correctly, people who do not have sufficient proficiency in English will have to deal with the outcome of a vote by people who do. Hm, I thought this is a multilingual project, where even non-English speakers are welcome? --Catfisheye (talk) 23:21, 8 December 2010 (UTC)[reply]

If you want to translate it, go ahead. But translations are hard, we don't have anyone volunteering to do translations of this document, and it would take several translations to even start to cover our non-English-speaking users. Not to mention that disagreements are turning on delicate turns of phrase that won't translate exactly, and hence those voting based on versions other than the English one won't get an entirely accurate view of the proposal. So translation is not our first priority.--Prosfilaes (talk) 00:51, 9 December 2010 (UTC)[reply]
I considered the problems you mentioned, but knowing that, how could the result of this poll be regarded as representative, i.e. obligatory for all, if a non-negligible number of users is excluded from the beginning? That of course does not apply only to this poll, but is a general problem. Maybe a system could be adopted where translations exist, with a caveat that they are limited by what a translation can do, but the voters have to vote on the English version. But how do we ensure that translations get made? Do you know which chapter of the foundation is responsible for Commons? --Catfisheye (talk) 01:47, 9 December 2010 (UTC)[reply]
This is a fair argument, except... isn't every Commons process done in English? Deletion discussions, admin noticeboards, village pump, etc. Hopefully the policy won't infringe very much on other projects --- with the notable exception of explicit consent. Handling the language issues regarding that may not be easy. Wnt (talk) 02:17, 9 December 2010 (UTC)[reply]
It's really critical that we involve non-English speakers in this decision-- they are going to be the ones most affected, since they won't even have the opportunity to come to Commons and argue for undeletion. I think I'd feel very uncomfortable about that setup, if the shoe were on the other foot and it was a German-language community deciding what can and cannot be allowed here.--Alecmconroy (talk) 06:32, 9 December 2010 (UTC)[reply]
I think the way to involve non-English speakers is to have those who understand English report on the proposal and its key points. In most communities there is at least a big minority that knows English. If a community finds the proposal problematic, we can hope enough people will come here from those communities as well. There are easily a few hundred people who know English in every big-language Wikipedia, enough to overturn any result here. Of course the USA and Western Europe are overrepresented, so we should listen carefully to those representing other cultures.
The same is true about being affected by policies. There is a village pump for all languages, at least in the Wikipedia project. Those not knowing English can ask for help there. I think translation of introductory texts is more important than translation of proposals: when new users feel at home in Commons, using their own language to upload and describe their files and discuss problems (at their village pump or user talk pages), they will know how to find somebody to help them with the English.
--LPfi (talk) 08:30, 9 December 2010 (UTC)[reply]
It's hard to argue that this isn't a concern, but I'm not volunteering to do the translation either. In the near term, if the policy passes, some latitude will be needed for people who don't speak English. If need be (and I don't think it really should) the other projects can also host their own images locally. Certainly on en. that has to be done quite often for copyright/fair use reasons. Wnt (talk) 16:26, 9 December 2010 (UTC)[reply]
The biggest problem with this poll is the language. Who is willing to participate in a poll that he can't understand, or at least not in detail? Commons is a project that provides images for all language versions. A poll that is held only in English is not representative at all. First the translations, then the poll. Otherwise it is like cheating all the other projects. --Niabot (talk) 20:27, 9 December 2010 (UTC)[reply]
Whatever. I've noticed no impulse on anybody's part to start translating it, so all this amounts to is a stalling tactic, which can be used to block any new policy. Most current policies have a handful of translations; none of Commons:Blocking policy, Commons:Photographs of identifiable people, or Commons:Nudity have more than five translations, none of them into German, Russian or Arabic. There is no hope of getting this translated into every language of Wikimedia, so at the very least you're going to have to specify which languages you're agitating on behalf of.--Prosfilaes (talk) 21:57, 9 December 2010 (UTC)[reply]
Sorry, but that's not cool. Every contributor--regardless of native tongue--should be a part of all major Wikimedia-wide policy decisions. The fact that this isn't already mandated absolutely amazes me. And if finding volunteers to translate proposals is such a barrier, then we need to have the oversight to prepare a reliable plan in advance.   — C M B J   22:49, 9 December 2010 (UTC)[reply]
I agree with OP Catfisheye and others: It really is a problem in our multilingual project: confusio linguarum. But it is a problem for all polls here. It is not only relevant for this one.
The European Union has a big, big, very costly institution (en:Translation Centre for the Bodies of the European Union) to provide all the translations to all languages in the EU. But Commons has more languages than the EU and cannot pay certified, professional translators to translate 100 % correctly.
I do not know the solution to this problem right now - but it surely is a problem for Commons. It is especially a problem when it comes to polls where the language restriction creates a bias towards a specific result - as is surely the case with this topic. This poll cannot be representative of all members of the Commons community. Some parts of the community rely on our wonderful translated templates and messages, and manage to upload images and do the stuff they then need to do. As nothing here is translated, this poll is not accessible to them.
This problem does not matter as long as the poll is closed as "no consensus", but if it is not, we have to ask whether it really is a consensus or whether it only looks like a consensus due to a strong bias in the voters' ability to vote. Cheers --Saibo (Δ) 23:09, 9 December 2010 (UTC)[reply]
Once again, you haven't responded with anything practical. All contributors are never going to be a part of all policy decisions; only a tiny percentage of contributors have responded to this. There are 276 Wikipedias; cut, fold and spindle however you want, that's still 200+ languages. Commons:Community portal is available in less than a fifth of them, and that's a lot; Commons:Project scope is available in 14, one fifteenth those languages. So far, not even one person has said "I want to spend my time and energy translating this proposed policy."--Prosfilaes (talk) 04:40, 10 December 2010 (UTC)[reply]
I believe that you are unnecessarily devaluing the points being made here. No one is saying or implying that all contributors will be a part of all policy decisions, but rather that all contributors should have the opportunity to be involved in planning the future of the project that they are part of. And the conundrum is not one of practicality; it is one of fairness and universal access; one that is akin to alt attributes and wheelchair ramps. But even from a pragmatic standpoint, there is no reason why we cannot solicit two or three active, bilingual administrators from each of those 200+ projects to translate major (sitewide) proposals.   — C M B J   05:05, 10 December 2010 (UTC)[reply]
The fact is, alt attributes and wheelchair ramps are tools for cheaply supporting the moderately challenged. Text readers will mangle many words found in Wikipedia articles; should we not dedicate ourselves to making sure that every Wikipedia article has a spoken version? The simple fact is, not all of those projects have two or three active administrators, and many of them would find it too much of a call on their time or simply outside their interests to translate these policies. So far, not one person has volunteered to translate this proposal, which doesn't bode well for the theoretical possibility of getting it heavily translated.--Prosfilaes (talk) 05:25, 10 December 2010 (UTC)[reply]
When I skim through the names of the voters, I see a strong geographic bias: There are many high-profile German users on the oppose list (Elian, Markus Cyron, Catfisheye, Saibo, Jan eissfeldt, Thogo, Paramecium, DerHexer, Ukko, Sargoth, Julius1990, Widescreen, Fossa, Kmhkmh, Niabot, Adrian Suter, Chaddy, Kuebi, Gereon K., h-stt, Cartinal, Don-kun, Hybscher, Achim Raschka, Grim.fandango, Andim, Gripweed, Simplicius, Liberaler Humanist), many of them admins, all of them very active. The same cannot be said about the support list. I didn't find a single familiar German Wikipedia name on the support list. Based on this sub-poll, it seems fair to conclude that German WP would not be happy with the proposed policy. I'd expect some kind of fork in case the policy is accepted.---<(kmk)>- (talk) 02:47, 10 December 2010 (UTC)[reply]
Add me, though probably not high-profile :-) and several others that I recognize. I wonder that I did not see any Dutch colleagues on either list. I still suspect this is somewhat of a European / (US-)American schism. --Purodha Blissenbach (talk) 02:57, 10 December 2010 (UTC)[reply]
Thanks for your quick research, KaiMartin! (By the way: you probably confused oppose and support here: "I didn't find a single familiar German Wikipedia name on the oppose list.") Cheers --Saibo (Δ) 03:49, 10 December 2010 (UTC)[reply]
(corrected)---<(kmk)>- (talk) 10:49, 10 December 2010 (UTC)[reply]
There are some Germans on the support list (Cvf-ps, Singsangsung, AFBorchert, Yikrazuul and High Contrast). And even though you missed a couple of "high profile" Germans who oppose, German opinion is clearly against the policy.
The main problem with the language is that by using English we adopt an American/British way of thought. I don't see why a policy for an international project should even mention practice conducted by the US Supreme Court. --bluNt. 07:25, 10 December 2010 (UTC)[reply]
Delete COM:L, and I'll go for it. Otherwise, we're stuck dealing with the fact we live in the real world and have to deal with the laws laid upon us.--Prosfilaes (talk) 08:18, 10 December 2010 (UTC)[reply]
If we think of linguistic fairness, Germans are not the problem. English is taught as the first foreign language in Germany, German and English are related, and Germans know English better than most non-native speakers do (that is my impression at least). Germany is a Western country. The problem is serious e.g. with Latin America, Russia, China, the Arabic world and most of Africa. Spanish, Russian, Chinese, French, Portuguese and Arabic are big languages though, with potential for finding translators (but having good translations of all proposals is probably too much to ask).
We also have a lot of small languages, in countries where none of the mentioned languages is the main foreign language, or foreign languages are not commonly known. The latter is a real problem for the cultures affected, to which I see no viable solution, other than being understanding towards people using less known languages (e.g. in creating categories or asking for undeletion of their files).
--LPfi (talk) 09:33, 10 December 2010 (UTC)[reply]
I would say that Germans don't have such a big problem (statistically). But you should consider that Germany was divided until 20 years ago, and anyone older than 30 in East Germany (DDR) had Russian as their foreign language. I know many German Wikipedians who suffer from this language barrier and aren't able to sit at the same table as we currently do. And yes, you are right: for other countries/regions it's even more difficult to participate. I guess that 50% of the projects don't even know that this poll is being held, or aren't able to figure out what it is about. In my opinion such a poll (with global effect) should at least be available in the five major languages.
Also I read: "no one cared to translate it". But I also know: "no one asked to do so". --Niabot (talk) 10:19, 10 December 2010 (UTC)[reply]
um, maybe a stable version of this proposal would make it easier to find translators? --Catfisheye (talk) 11:57, 10 December 2010 (UTC)[reply]
Maybe, yes, since I see no sense in spending much time and effort to translate this proposed policy, which I do not support. This could be called stealing of my time by people who think that we need a new policy with this and that content, just because some are overreacting to US sunshine press/media agitation.
@ ~"germans do not have the biggest problems" - sure - but I and others discussing here know it best from German community. If even there occur problems the problems are worse for other language communities who do not have English as a widespread foreign language thought in school and whose language is not similar to English.
@Prosfilaes: Yes, I know, and I even wrote that my comment above was nothing practical. There is no need to give me a note about this; I know it. It was just a look a bit beyond our own noses, to get ideas and impressions from areas with similar problems. Cheers --Saibo (Δ) 15:22, 10 December 2010 (UTC)[reply]

A few little queries[edit]

I've gone through some of this, copy-editing without intending to change the substantive meanings. Among little issues I've spotted (and may not understand) are these:

  • "A media file that is in use on one of the other projects of the Wikimedia Foundation is automatically considered to be useful for an educational purpose. Such a file is not liable to deletion simply because it may be of poor quality: if it is in use, that is enough. Exclusive use of a file in userspace does not constitute educational purpose." I'm surprised that no assessment would be made of whether the use of the file is educational. This is a requirement, for example, of some fair-use rules. Is material allowed on talk pages alone, and other WP space alone?
  • "Your communication will be confidential and will be viewed only by trusted volunteers who have identified themselves to the Wikimedia Foundation by name. For further information see Meta:OTRS." So WMF employees won't be able to access it? Tony1 (talk) 07:09, 9 December 2010 (UTC)[reply]
The main point is that the projects are allowed to define themselves which images are suitable and we on Commons should not delete images that are deemed useful (for an educational purpose) in the projects. We might think that an image is useless low quality porn, but if it is used on a Wikipedia article, then we should not think we know better than the editors of the Wikipedia. User page use is explicitly not considered educational use (but COM:PS has some comments about scope and user pages).
Use on talk pages is more tricky, but I think removing an image from a WP: or talk page could be very disruptive. If there is no problem with the image (other than being out of scope), I think leaving the image is certainly the right thing to do. If the number of such images is too high, then the problem is with the projects, not with Commons hosting them. Individual disturbing images might be protected this way, but there are probably many more disturbing clearly in-scope images. ("Fair-use rules" is probably referring to some individual Wikipedia? The problem is different as it can be handled locally.)
--LPfi (talk) 09:05, 9 December 2010 (UTC)[reply]
Good catch about the WMF employees - when we make a promise like that we should be able to adhere to it literally. Wnt (talk) 16:47, 9 December 2010 (UTC)[reply]

I've removed some of the mid-sentence bolding. (1) It looks messy. (2) It often excludes the critical word, which comes before the bolding. I do not believe it aids comprehension, unless perhaps bolding is used at the very start of each point, as an inline heading, as it were (such as in Speedy deletions). Tony1 (talk) 07:12, 9 December 2010 (UTC)[reply]

Images of deceased people[edit]

I don't know if this is the right place for this; if not, I'll delete it. I hate to re-open an old dispute. There need to be clearer image policies about situations where the assumed consent of a deceased person should be required.

When I joined Wikipedia completely new and inexperienced last February, I was right away immersed in a dispute regarding the nude image of a murdered child on Wikipedia. Her private area was barely visible, but it was there, and if that were someone I loved exposed that way, I would have been devastated. For the record, I hate censorship, but this was a case where the rights of a person come first.

When I tried to have the image removed, editors:

  1. Gave me the old "not censored" disclaimer
  2. Insulted my intelligence with pseudo logic and platitudes
  3. One editor flamed me with some serious personal attacks
  4. Tried to ignore me away under the pretext of Don't Feed the Trolls when I tried to make a peace gesture

I was intimidated, but I didn't go away; I'm still here. The image was finally deleted three months later on a copyright technicality, but the way that poor child was exposed for over a year is a stain on Wikipedia's reputation. Slightsmile (talk) 17:02, 9 December 2010 (UTC)[reply]

The goal of Wikicommons is to create a repository of free works which can then be used for educational purposes. If somebody's feelings get hurt in the process, that is bad of course, but if there is a policy, it should be to help those people deal with their feelings rather than doing them a disservice by denying their problem and removing the images. Beta M (talk) 17:28, 9 December 2010 (UTC)[reply]
The existing COM:IDENT policy doesn't say a word about living or deceased persons, so there's no reason to think they would be treated differently. However, if there was a copyright issue involved, the photograph may have been previously published, and that policy is addressed to the individual taking the photographs, not to people uploading previously published photos. This policy follows the same path. The idea is that if a photograph is already published, it's already public and Commons can't change that. Wnt (talk) 03:40, 10 December 2010 (UTC)[reply]
Further down in the page Wnt linked, under Moral issues, it does touch on the concerns I've expressed, regardless of whether the image is already published. I'm just saying these policies should be better defined.
I want to thank Beta M for the perfect example - if you don't like an image, go see a shrink. Users shouldn't have to take this crap when they have concerns like this.
For whatever it's worth, I've seen similar discussions on the Black Dahlia and Goebbels children talk pages. Slightsmile (talk) 17:42, 10 December 2010 (UTC)[reply]
I would say that the "moral issues" expressed there are based on a very idiosyncratic notion that the international "right to dignity" is actually a responsibility of general censorship. I would accept that the right to dignity means that a state actor can't lock detainees to a chair and broadcast to the Web as they pee themselves. But I don't accept that international law should be taken to prohibit republication of images by individuals. Certainly we have not heard much of it in the Abu Ghraib affair, which was as flagrant a violation as could be devised.
On reflection, there is a small omission in the policy here. My first reaction is that the policy prohibits a contributor from taking an autopsy photo on his own and posting it here without consent. But what is the "consent" of a dead person? Does it default to yes or to no? Does it matter if he donated his body to science? Maybe you can ask the next of kin, but there's a can of worms for you... you'll be arguing about gay marriage before you get to the bottom of it. Wnt (talk) 14:31, 11 December 2010 (UTC)[reply]
For what it's worth, Belgian law says that the heirs' right to control the dead person's image (probably with exceptions, such as public places or informational purposes) ceases 20 years after death[16]. This could provide grounds for deleting a Belgian picture that lacks heir consent. Teofilo (talk) 16:57, 11 December 2010 (UTC)[reply]
I think it would be simple enough to identify which cases would require the Assumed Consent Test - if it were you, would you want to be seen like that? For example, in my opinion: innocent murder victims would merit the test; executed Nazis would not. There will be grey areas, but I think most cases would be just plain old common sense. Slightsmile (talk) 18:56, 11 December 2010 (UTC)[reply]
I am not sure about that. Are you saying that images of bad guys are OK, while images of good guys are not, unless they would consent? Who is going to judge who the good guys are? The victor? What about en:My Lai? Requiring consent is unproblematic when we can get images where the people do consent (such as normal sexual content). In the case of historical events (such as murders and executions) we have to use other criteria. --LPfi (talk) 20:47, 11 December 2010 (UTC)[reply]
Some of the bodies in the top picture of the My Lai article would probably fail the consent test, but it is such an iconic photo - sure, there will be special cases. Many Holocaust articles on Wikipedia illustrate bodies, but I haven't seen any images that I remember humiliating the victims here on Wikipedia, and there are lots of them out there. So Wikipedia does already seem to exercise some kind of discretion. It is a good idea to form some kind of policy; not having some kind of safeguard against humiliating victims is not an option.
There should be some kind of "require consent" list. For starters, start with the most obvious cases where the test would apply - nude bodies of innocent victims seem to me a no-brainer. Add situations to the "require consent" list, or not, as issues come up. Something tells me this guy wouldn't qualify. Sorry if I use this word a lot - common sense. Slightsmile (talk) 00:58, 12 December 2010 (UTC)[reply]

while we're chatting.....[edit]

there's a deletion discussion quietly going nowhere featuring two lovely people having sex as radiohead plays gently in the background. There's no indication that both parties agreed to the worldwide publication of their efforts, nor that the band members involved in the creation of the music approve either. In the charming wiki world we live in, one of those facts will at some point, maybe after a month or two, ensure that the file is removed - funny old game.... Privatemusings (talk) 22:52, 9 December 2010 (UTC)[reply]

Radiohead playing in the background = copyvio. There is no need for this rubbish in order to delete the flick. sугсго 08:32, 10 December 2010 (UTC)[reply]
you've sort of made my point, syrcro - imagine for a moment that this is a video of someone's ex-partner, and that one of the individuals involved would be mortified and humiliated by the worldwide publication of their shenanigans, which may have been recorded unbeknownst to them. do you really feel that the people in that circumstance we should most concern ourselves with are the multi-millionaire chaps who happened to strum a string or two in the background? Privatemusings (talk) 10:35, 10 December 2010 (UTC)[reply]
I thought our positive consent requirement would entail at least nominal permission from both partners (exes). Ocaasi (talk) 10:46, 10 December 2010 (UTC)[reply]
well it would (it's really the only substantive change within this proposal currently) - but it hasn't been adopted yet :-) Privatemusings (talk) 01:36, 13 December 2010 (UTC)[reply]
No, that's a porn site. In fact it's a porn newspaper called Österreich, not the federal broadcasting service. --Liberaler Humanist (talk) 20:45, 10 December 2010 (UTC)[reply]
the above commenter is either joking or factually incorrect. http://www.oe24.at/ is quite clearly a news service Lx 121 (talk) 06:03, 12 December 2010 (UTC)[reply]
oe24.at is the website of the newspaper Österreich which has got nothing to do with the public broadcaster ORF (Österreichischer Rundfunk). However, the event itself was also featured in the ORF news broadcast ZIB 2. --84.112.138.221 13:15, 12 December 2010 (UTC)[reply]

Beating the dead horse[edit]

Expanding my voting rationale, I just want to emphasize the extent to which the proposed policy duplicates the existing sexual content policy COM:CENSOR and the guideline COM:PORN, as well as COM:PRP, all of which are mentioned in the proposed policy. Nearly all sections of COM:SEX are actually moot elaborations of the aforementioned Commons pillars with minor copyright issues. The three types of prohibited content are already covered in COM:CENSOR, COM:IDENT and COM:SCOPE respectively, and all of them are mentioned in that core section. The tandem of the famed Miller test and the relevant legal disclaimer in COM:CENSOR has for quite a long time been, and still is, a powerful and succinct remedy against undesired agendas, encapsulating all relevant policies and guidelines. Without this old tandem Commons would most likely have been flooded with porn, but, from what I see, it has not been. And, as far as I know, the proposed policy was not put forward amid some rising wave of illegal or out-of-scope uploads. What we are doing now is jailing what has already been jailed. Brandmeister (talk) 13:10, 10 December 2010 (UTC)[reply]

Yes, mostly we are jailing what has already been jailed. But one reason for this effort is that the founder of the project, the board of the supporting foundation, and perhaps public opinion do think that we have a problem. Them thinking we have a problem is a problem.
The mentioned policy and guideline sections are not very well written and they are not very easy to point to, when dealing with somebody not knowing Commons well. The proposal is written not to change current policy and practice, but to have it well written in one place and to clarify some issues.
I was surprised by all "this is censorship" (and "we do not need porn") oppose votes. I do not see what media will be deleted by having this policy that would not be deleted otherwise or vice versa – other than this policy hopefully reducing randomness.
--LPfi (talk) 14:24, 10 December 2010 (UTC)[reply]
If this policy becomes accepted, I think that COM:PORN and COM:NUDITY might reasonably be superseded by it. Wnt (talk) 14:06, 11 December 2010 (UTC)[reply]
If all this seeks to do is encapsulate existing policy in one easy-to-find place and placate the media, then it would be fine. If the two added clauses that make allegedly illegal and out-of-scope images subject to speedy deletion rather than a regular deletion review were removed, then I, as well as many others, would probably support it. As for images that have already been speedy deleted as pornographic and out of scope, here are some: File:Édouard-Henri Avril (27).jpg, File:Franz von Bayros 016.jpg, File:Félicien Rops - Sainte-Thérèse.png. --AerobicFox (talk) 02:31, 13 December 2010 (UTC)[reply]

"Prudery" seems to me a very odd accusation[edit]

I notice that an enormous amount of the opposition to this proposal cites "prudery" as a reason to oppose. I fail to see wherein the prudery lies. Basically, except for noting that certain content is illegal for us to host in the country where our servers are located (something we are merely explaining and that we, as participants in the project, cannot affect), there is nothing here saying that any subject matter is off limits: quite the contrary. The focus is on making sure that (1) identifiable people who are photographed in a sexual manner have given their consent for these photos, (2) content is accurately described and accurately categorized, in accord with the "principle of least astonishment," and (3) we are clear that Commons is not interested in hosting a bunch of poorly made penis shots (which are already routinely deleted). In particular, note the sentence, "Except for images prominently featuring genitalia or sexual activity, mere partial or total nudity is generally not considered sexual content."

As several other people have pointed out, this policy is far more a gathering in one place of how we already work than it is breaking new ground. Many have objected to it on exactly that basis. I personally think we should nonetheless have an explicit policy in this area in order to stave off rampages like the one Jimbo engaged in a few months back. But it is looking clear that we are not going to have consensus to adopt such a policy.

Can someone please clarify what, precisely, in the policy they see as "prudery"? Are the opponents to this really saying that they think we should host thousands of poor photographs of ordinary penises, allow photos taken with no consent of people having sex (or non-consensual upskirt photography), and/or allow a photo to be titled "my girlfriend's hot pussy" and described in terms that come straight out of Hustler? I, for one, fail to see how that furthers Commons' mission. And, if that is not what you are saying, what exactly are you objecting to as "prudery"? - Jmabel ! talk 17:55, 10 December 2010 (UTC)[reply]

I don't think that the majority of opposers think "we should host thousands of poor photographs of ordinary penises, allow photos taken with no consent of people having sex (or non-consensual upskirt photography), and/or allow a photo to be titled 'my girlfriend's hot pussy'" etc. - quite the contrary. As far as I can see from the arguments on the list, most opposers say - and I agree - that our current policies are sufficient to prevent or eventually delete those kinds of pictures. What I and a lot of other opposers are saying is that, with the vague wording on speedy deletion of "out of scope" images as well as the odd and legalese "Miller test", the proposed policy is in effect granting singular powers of judgement to individual admins that can potentially be misused in defence of "prudery" as well as a host of other unacceptable excuses. --Saddhiyama (talk) 20:09, 10 December 2010 (UTC)[reply]
You are at least partly right. Looking again, I was wrong to say that an "enormous" amount of the opposition was on these grounds. I guess it's just odd to me that any of it is. - Jmabel ! talk 22:35, 10 December 2010 (UTC)[reply]
You and a lot of the opposers continue to mischaracterize the wording as speedy "out of scope", when it actually says speedy "obviously out of scope". Any admin who deleted files as obviously out of scope which later turned out to be IN scope would most likely be severely chided (as Jimbo was for his mistakes). --99of9 (talk) 02:26, 11 December 2010 (UTC)[reply]
2 problems with this: 1. "obviously out of scope" is a subjective judgement-call & it should NOT be left to the judgement of individual admins; there is overwhelming evidence in the records of this wikiproject (& others) to demonstrate how widely varied individual opinions can be; this is why we have community processes for such decisions. 2. if an admin does "insta-delete" files as out-of-scope inappropriately, there is no way for most members of the community to check, observe, verify, or be involved in the process of dealing with it. ONLY other admins would be able to even see the files which had been removed, without any proper notice or record of events being available to the general community. i'm not willing to give you guys that kind of power, over the whole project, blindly. & as regards "severely chided": i'm sorry, but that just doesn't cut it! any admin who abused their powers that severely should be automatically de-adminned permanently & at least temporarily banned from the project. Lx 121 (talk) 05:29, 11 December 2010 (UTC)[reply]
The recourse that uploaders of those images have (if they think that their now-deleted images have real value) is to take it to Deletion Review. The reason why this rarely happens is presumably because most uploaders of such material have little lasting commitment to Wikimedia Commons and its goals, other than being a convenient hosting site for drunken-cell-phone-self-penis-snaps... AnonMoos (talk) 12:35, 11 December 2010 (UTC)[reply]
one key point you are either missing, or deliberately ignoring, is that this process is NOT just a dialogue between one uploader & one admin; it is a community discussion & a community decision. when you make these decisions on your own, in secret, instead of openly, then there is no way for the community to review your actions. that is not acceptable. Lx 121 (talk) 00:34, 12 December 2010 (UTC)[reply]
No. That's not fair. There are some editors that create content, and others that have your legal mind and do all the backroom work. To be honest, there is just too much to do without learning yet another procedure - and submitting yourself to it from a point of weakness. Have I tried? No, and I won't. I trust you to interpret the arcane rules on my behalf - mumble to m'sen and go back to Commonist. However, if I am asked to formulate policy I will attempt to point out a potential boo-boo.
I can see from your postings that low-res genitalia is an increasing problem - well, there must be technical ways to intercept it and redirect it to a water-wheel-driven server in Outer Oblivistan that uploads over a 300/1200 modem. --ClemRutter (talk) 15:00, 11 December 2010 (UTC)[reply]
i would like to point out (again) that the claim about this being an "increasing problem" has still not been backed up with any data. the statement is merely an assertion/justification by certain users seeking approval of this provision. Lx 121 (talk) 00:34, 12 December 2010 (UTC)[reply]
Gloves off. I could point you to hundreds of flickr collections brimming with photos of men or women sucking on a cock or with jizz over their face, some good quality, most blurry or underexposed. Regardless of whether one likes a 'cumly' lad or lass, there rapidly comes a point where each new image imparts no new information, and the growing collection simply becomes fetish. John lilburne (talk) 14:17, 13 December 2010 (UTC)[reply]
gloves off: this is wikimedia commons, not flickr. how much of this material is being imported from there to here? Lx 121 (talk) 02:13, 14 December 2010 (UTC)[reply]
It's not usually a major problem at any one moment, but it's a constant annoying trickle which could build up to become a major problem if there wasn't also an accompanying trimming and pruning back... AnonMoos (talk) 06:05, 12 December 2010 (UTC)[reply]
therefore, we should use the tools for "pruning" (i.e.: normal deletion procedures) & not a flamethrower Lx 121 (talk) 02:13, 14 December 2010 (UTC)[reply]
In other words, you insist on making additional administrative busywork for other people in order to satisfy your purely abstract theoretical philosophical views. Funny thing how your position doesn't involve creating any extra work for yourself -- only for other people... AnonMoos (talk) 11:00, 14 December 2010 (UTC)[reply]
It is only in your interpretation that this is a question of "purely abstract theoretical philosophical views". I think it is naive to believe that this would be the first time in human history that a vaguely worded clause was misused and misinterpreted. In fact, in my experience this happening is something of a natural law, so in a way your position could be construed as the purely abstract theoretical philosophical one; but since name-calling will not get us anywhere, I will refrain from doing that. --Saddhiyama (talk) 16:19, 14 December 2010 (UTC)[reply]

Which taboos should be respected?[edit]

We are discussing here one of the Western taboos, but the theme of the discussion should be wider. Which of the many moral and cultural prohibitions and scruples should we respect, and which shouldn't we? Child porn - and, as it appears, many other things - should be prohibited, but Muhammad images shouldn't? Why not the other way around? Should Commons be censored only from a US or Western point of view? --ŠJů (talk) 18:08, 10 December 2010 (UTC)[reply]

Child pornography is prohibited by the laws of the United States and of Florida, while depictions of Muhammad are not (and in fact there have been many images of Muhammad made in traditional Muslim cultures, some of which you can see for yourself at Category:Muslim depictions of Muhammad). And if you want to talk about "selective taboos", then images which defame Jews and Judaism are far more protected and sheltered at Commons than images which defame Muslims and Islam... AnonMoos (talk) 18:15, 10 December 2010 (UTC)[reply]
In fact we do respect the taboo on Muhammad images. A separate category Category:Depictions of Muhammad was created, with no images other than in subcategories. I removed an "Everybody Draw Mohammed Day" image from Category:Muhammad in the spring and the category seems to have remained clean (perhaps by the efforts of others).
This is the same as we do with sexual content: place it in more specific categories. Files that are illegal to host are a totally separate problem, which could be solved by somebody putting up a server somewhere else.
In any case, servers outside the control (and thus responsibility) of the WMF are something independent of this proposal. I am not going to fork Wikimedia to be able to host child pornography, although a mirror with separate deletion decisions could be good to have – mostly for retaining politically sensitive content that might get deleted in the USA.
--LPfi (talk) 19:52, 10 December 2010 (UTC)[reply]
as regards the previous comment about depictions of muhammad: what on earth are you talking about!? there is NO policy, or precedent, or community decision about censoring religious/political materials. the muhammad-related materials are in subcategories logically, for purposes of efficiency. if you have been deleting materials relating to muhammad because they "violate a religious taboo", then please provide links to the relevant deletion debates, because i would very much like to know what has been removed & see how consensus was reached on this! Lx 121 (talk) 05:14, 11 December 2010 (UTC)[reply]
Dude, he didn't say he deleted it, he said he moved it to a subcategory. AnonMoos (talk) 12:26, 11 December 2010 (UTC)[reply]
The subcategories were created this spring, probably in direct response to a person from WMF (I don't remember who) complaining about the issue, not because somebody found them more efficient. It was easy to create the category and seemingly nearly nobody had strong objections (that one image was probably a protest, and no, I didn't delete it). For sexual content we need a lot of categories, so it is easier to state the need in a policy than to watch every potentially problematic category. --LPfi (talk) 21:13, 11 December 2010 (UTC)[reply]
thank-you for clearing that up, & sorry if i misunderstood your original comment. Lx 121 (talk) 03:22, 12 December 2010 (UTC)[reply]
I think that my taboo about bare hands should be respected. We should also respect the taboos about the theory of evolution. Seriously: the worst thing we can do is to care about any taboos. There is a taboo about almost everything. WP ought to provide serious, not censored, content. As I mentioned the Theory of Evolution: the argument is that we have to use the Miller Test as the servers are in Florida. What would happen if Florida introduced a law against the Theory of Evolution? Would we have to replace the Theory of Evolution with "Intelligent Design"? --Liberaler Humanist (talk) 20:43, 10 December 2010 (UTC)[reply]
If Florida introduced a law against the Theory of Evolution, however absurd, illegal, and inconceivable that may be, we would have to move the servers or comply with the law. In such a case, it's possible (if again absurd and inconceivable) that every place in the world has introduced laws against the Theory of Evolution. In which case, policy or no policy, we'd have to comply with the law.--Prosfilaes (talk) 20:52, 10 December 2010 (UTC)[reply]
since i seem to be spending a great deal of time lurking on this page (at least until the vote closes), i can't resist commenting on this one! if florida ever did introduce such a stupid law: 1. it would be thrown out as unconstitutional; 2. if not, then we would need to relocate our operations & servers; 3. if the whole world went crazy enough to do such a thing, it would be time to take our wikis underground, & viva la revolucion! :P Lx 121 (talk) 05:14, 11 December 2010 (UTC)[reply]
I have observed the debates about Evolution vs. Creationism for many years; there are some groups which promote a ban on the theory of evolution. I do not think that these groups are disconnected from established political parties. However, I do not think that the servers should be stored at a single place. As you have mentioned the constitution of the United States: the second party in that country is dominated by a group of very strange people. I don't want to mention all the strange positions of these very strange people, but the scenario of having a political and legal climate which would seriously disturb our work is not as unrealistic as it seems. The servers should not be stored at a single place. --Liberaler Humanist (talk) 17:47, 11 December 2010 (UTC)[reply]
without intending to disparage the current state of us politics, i agree, on basic principles, that we shouldn't have our servers concentrated in one country. it makes it too easy to shut us down, & leaves us too vulnerable to changes in the legal climate of one host country. the wikileaks situation certainly demonstrates that! Lx 121 (talk) 03:22, 12 December 2010 (UTC)[reply]
This is a straw man - child porn is also banned in Muslim countries, as well as most countries of the world. Also, child porn is not a "taboo"; it is visual rape - taking images of someone's sexual organs without consent and with no ability to legally consent. The WMF does not condone the spreading of sexual violence through mass transmission of depictions of it. Ottava Rima (talk) 04:34, 11 December 2010 (UTC)[reply]
agreed with the point about the near-global illegality of child porn; but, with no personal disrespect intended to the commenter, the term "visual rape" & the stuff about "spreading sexual violence through depictions of it" i would consider hyperbole & psychobabble; it's not a serious, scientific description of the actual effects/harms done. this is not intended to in any way minimize the considerable potential for harm in such activities, but it is unhelpful to obscure the real damage with simplistic, one-dimensional political sloganeering Lx 121 (talk) 05:14, 11 December 2010 (UTC)[reply]
The "servers in Florida" standard actually comes from COM:CENSOR. It wasn't invented here. If WMF sets up another server outside the U.S., then after some legal consultation to consider the effect of integrating it into the sites, we could work on changing this policy and COM:CENSOR to allow for material not legal in the U.S. kept on the other server.
That said, the specific example of child porn has another problem altogether which would be difficult to deal with, even if the Foundation really wanted to set up a server in some Shangri-La specifically to allow it: it is subject to one of the very few censorship laws in Western countries under which people are punished merely for looking at the material, rather than posting it. By comparison, I think I recall that when the Acehnese Wikipedia was having its protest about Muhammad images, they actually put a header on the main page linking to them.
Now in saying this I should disclaim that I don't mean to support or excuse this Western taboo - several billion dollars yearly are made by abducting children to make child porn, which would presumably be much curtailed if we had the common decency to allow those abused and photographed as children to take possession of the copyright over the images, should they survive to adulthood, and to distribute them commercially if they wish. We have a situation in which, according to w:child prostitution, there are 162,000 child prostitutes on the streets in the U.S., and yet large numbers of highly technical prosecutions are being made against people for looking at pictures, or possessing a few pictures mixed in with their porn feeds, or kids "sexting" one another. It really isn't sane or decent - our society should be putting its resources into freeing children from a condition of literal and depraved slavery, while recognizing that the right to a free press, no matter how offensive, is God-given and inalienable. Wnt (talk) 14:02, 11 December 2010 (UTC)[reply]
"which would presumably be much curtailed if we had the common decency to allow those abused and photographed as children to take possession of the copyright over the images should they survive to adulthood and to distribute them commercially if they wish." Sick. I really hope you never get the chance to decide on these matters.--93.131.239.236 14:13, 12 December 2010 (UTC)[reply]
Real C P is illegal in any country because the subjects depicted have been psychologically harmed, and the circulation and republishing of said material is, in a way, a tacit approval of said practices, which encourages the criminals to produce more of it. Children are like sponges: they absorb information without malice or critical thinking, and then they internalize what was done and think it's normal, while it is actually wrong. They could become like Pedro Lopez and Daniel Camargo Barbosa later in life, and continue a never-ending process of harming other children. Aberforth (talk) 14:10, 11 December 2010 (UTC)[reply]
I once read Mein Kampf, but it wasn't a tacit approval. I don't believe that watching axe-murderer movies makes kids into axe murderers, and I don't think that watching gay pride marches makes children gay. The causes and potential cure of pedophilia are indeed an interesting medical question, and I think that there are many fairly easy neurological/fMRI research projects that would be interesting to try there (e.g. trying to use biofeedback in conjunction with psychological techniques to get patients to stimulate the neural pathways that have been found to be lesioned in some pedophiles, in order, one hopes, to use adult images to upgrade their animatype), but alas, this isn't the forum... Wnt (talk) 14:41, 11 December 2010 (UTC)[reply]
The laws of the United States and Florida ban child pornography, and that's pretty much the end of the matter as far as WIkimedia Commons is concerned. AnonMoos (talk) 16:56, 11 December 2010 (UTC)[reply]
Agreed. I brought up the argument only to demonstrate that we are making this decision on account of the law - Commons is not making a choice to censor. Wnt (talk) 03:43, 12 December 2010 (UTC)[reply]
There is nothing in this decision that has anything to do with the law. Current policy already says that all material deemed illegal will be deleted. Under the new proposed policy, a single admin with no legal knowledge can determine whether they think something is illegal or irrelevant and delete it without discussion. Legal areas are blurry: is child porn illegal if it involves clothed anime characters in erotic positions and not real people? Is anything featuring naked children child porn? If so, are depictions of the nude child angels known as cherubs, from the medieval period, forms of child porn? Obvious cases are already speedy deleted, but more nuanced cases need to be handled on a case-by-case basis. The same goes for relevancy: an admin should not be able to decide by themselves if something is relevant enough to stay on Wikimedia, except in the most obvious cases (which under current policy they already do). --AerobicFox (talk) 05:01, 12 December 2010 (UTC)[reply]

It concerns me to see so many participants here basing their thinking on whether this is a question of taboos. If it were, I would agree that we should strongly resist censorship. But it isn't. It's about laws, specific laws, those of Florida and the US. The laws don't have to be "right" in the opinions of Wikipedians. And no amount of talk here will change those laws. But the servers, in their present location, are governed by them. If, instead, the servers were in (to take an example) Saudi Arabia, we would be looking at a very different situation indeed. It's fair to discuss whether moving the servers would better serve Wikimedia's mission of free information, but that's not what's on the table here. --Tryptofish (talk) 20:30, 12 December 2010 (UTC)[reply]

If this policy only covered illegality, we'd be fine. I think there's near-universal consensus for the already-established policy that "US servers must comply with US law".
But Commons:Sexual Content, as currently written, goes way, way beyond what is required by US law and enters the realm of taboo. US law doesn't say anything about our projects' scope, and it certainly doesn't require "out of scope" speedy deletion. US law requires us to comply with orders to remove material once such material has been ruled obscene by a court of law, but it doesn't require us to have amateur volunteers preemptively delete material just on the grounds that it might be, in their untrained eyes, potentially or theoretically obscene.
If we'd just stuck to the letter of the law, as our status quo policies do, I think this would have reached consensus. The controversial parts are the parts that go above and beyond merely complying with US law-- complying with US laws is obviously okay, it's what we do now. Doing more than required by law, however, starts to look more concerned with public relations and cultural taboos, not actual legality concerns. --Alecmconroy (talk) 04:42, 13 December 2010 (UTC)[reply]
Please consult my previously-posted message of "11:27, 10 December 2010" above. Having a site which indiscriminately accommodated absolutely everything that was not definitely illegal might be a noble experiment in theoretical libertarianism, but it's never been the way that Wikimedia Commons has worked... AnonMoos (talk) 18:56, 13 December 2010 (UTC)[reply]
But that is no excuse for introducing a vaguely worded clause that has the potential to be used for God knows what. That Wikimedia Commons is not a noble experiment in theoretical libertarianism does not mean that we should make it a noble experiment in flexible clauses. If it instead stated clearly what should not be allowed, much of this would not be a problem at all. --Saddhiyama (talk) 16:13, 14 December 2010 (UTC)[reply]

The current situation[edit]

The proposal suggests that it has not been possible to delete half-pornographic waste. However, it is possible to delete useless or pornographic pictures; the problem seems to be an administrative one. I have asked for the deletion of the terrible images uploaded by User:Toilet (nomen est omen!). They are still here. Is it possible that the problem could be solved without introducing questionable US laws? --Liberaler Humanist (talk) 17:47, 11 December 2010 (UTC) [thumbnail image, captioned "Really useful"][reply]

His pictures are really useful; maybe they could be printed and given to kids so they can color them. However, they aren't suited for this project. Aberforth (talk) 18:55, 11 December 2010 (UTC)[reply]
For this image, I am in agreement with Liberaler Humanist, but it could be improved by cropping off the lower 40% - then you could derive an image from it that your kids could colour in and use as Christmas cards. But even according to the rules above, the image concerned could not be deleted - it is tasteless but in scope as an illustration of a medical condition. It is not sexual content - the anal region is not the genital region, and there is no sign of a sexual act. Checking w:en:Miller test, it does not fit any of the three prongs, and the subject is too old to invoke the w:en:Dost test. --ClemRutter (talk) 21:17, 11 December 2010 (UTC)[reply]
One problem is pictures of blow-jobs and the like that lack any indication that the person shown is aware of the upload and has consented to it. They are often imported from Flickr, and a few weeks later, the Flickr account shows as "no longer active". Such cases are difficult to address via DR. Someone will say it is educational. Another will say it is in use. Another will say you don't see the whole person's face. The upshot is that one way or another, the image is kept. This was one of the things this policy was meant to address. The Foundation has had several complaints from women whose images were used without their consent, yet were offered to the world as free content for any type of re-use, including commercial re-use. --JN466 21:46, 11 December 2010 (UTC)[reply]
And what is the point of having more than one picture for a given subject? How many blow jobs do you need here to illustrate the relevant page? As for Flickr, the site should be blacklisted for being unstable and being a hive of copyright violations. Most users on that site don't understand the licenses. Aberforth (talk) 21:54, 11 December 2010 (UTC)[reply]
ok. so to summarize the key points in the preceding conversation:
1. wikimedia commons does not need more than 1 image of any subject. (therefore we should delete the rest of our material as redundant?)
2. flickr should be banned as a source of materials for commons.
3. we need to make it easier to suppress undesirable materials on here.
i would suggest writing that up as a policy proposal & submitting it to the community for a vote.
Lx 121 (talk) 00:23, 12 December 2010 (UTC)[reply]
How do you feel about sexual images uploaded without the consent of the people depicted? Do you feel this is something we should worry about? --JN466 15:43, 12 December 2010 (UTC)[reply]
consent by a person for the publication of their image/likeness is a separate issue from sexual content rules. we should have a policy worked out to cover all the variables of the consent-to-publish issue, & then apply it across the board, for sexual & non-sexual content. Lx 121 (talk) 02:01, 14 December 2010 (UTC)[reply]

This seems to me to be a much broader question of personality rights, one that has nothing to do with sex per se. But no one seems to want to broach it because it would require us to delete most of our pictures of people. Gigs (talk) 21:02, 12 December 2010 (UTC)[reply]

Change to summary of poll arguments[edit]

Under the Oppose section it lists:
"Not strong enough at keeping out pornography"
I have not found one person who has opposed the proposed policy and cited this as a reason. While there are people on the support side who think the proposed policy doesn't go far enough, there is nobody in the opposition who is opposing it on those grounds. This section may mislead newcomers into thinking that is a significant argument of the opposition when overwhelmingly the opposition thinks it goes too far.
Does anybody else opposing this policy want this removed as it doesn't seem to accurately portray our arguments against it? --AerobicFox (talk) 04:50, 12 December 2010 (UTC)[reply]

agreed it is not the best wording (& it could be improved), but i do think that a few of the oppose votes (definitely a minority of them) were predicated on wanting a more restrictive policy. sometimes politics makes for strange bedfellows. Lx 121 (talk) 06:00, 12 December 2010 (UTC)[reply]

(edit conflict)

No - I read all the posts yet again - no opposer has made that claim. It only occurs as part of the rhetoric of Supporter #7, so presumably it stuck in the mind of the editor drawing up the list of supposed arguments. The claim needs to go. In reading the posts, I am more convinced of how unfair this whole process has become to anyone whose native language is not English. If I had to write a summary of the arguments used, I think it would be:
Support: Those who think that Commons is a photo collection for en:Wikipedia
Those who think any policy is better than none
Those who think that Florida state law overrides everything else
Oppose: Those who think that Commons is a photo source for many Wikiprojects - including those not yet in existence
Those who think that a badly drawn policy is harmful
Those who think that Commons must not be more restricted than Florida state law requires.
Finally, the most quoted and supported oppose post was that of Ruslik (Oppose #6), which more or less sums up the oppose argument. I quote it here for reference.
Oppose. The proposed policy is just an example of m:instruction creep. Approximately one third of it only repeats what is already in other guidelines and policies (illegal content, copyvios and BLP violations). The other third is devoted to vague and useless interpretations of various US federal laws. These interpretations are not written by lawyers and may be misleading. The remaining third contains some rather explicit descriptions of sex acts, which the authors of the policy think are inappropriate for Commons. I am interested whether this proposed policy itself should be marked by some kind of a "sexually explicit" banner?
--ClemRutter (talk) 10:35, 12 December 2010 (UTC)[reply]

(end of edit conflict - good that it has gone)

Actually, if you read the opposes, some find the implicit support of COM:CENSORED, COM:NUDITY, and COM:SEX to be too permissive, and the educational scope too broadly defined for non-encyclopedic purposes. It seems contradictory that a policy which promotes greater regulation would be seen as insufficiently strict, but that is indeed how some read it... I looked fairly closely, and the oppose votes which invoked this were 53 and 143. If it's confusing, I can take it out, but I think it's accurate. Ocaasi (talk) 10:04, 12 December 2010 (UTC)[reply]
I took it out, because it was making the list seem contradictory and confusing (which might be a reflection of the policy stances). If someone can think of a way to point out that some opposes were based on the policy not being strict enough, go for it. Ocaasi (talk) 10:08, 12 December 2010 (UTC)[reply]
Note: vote numbers changed after someone voted at the top of the list. The relevant opposes are now 1, 54, and 144, all of which clearly identify this policy as too lenient. Add to that 194, 195, and 196, as well as any which identified the policy as biased towards conservative US attitudes about sex or as practicing censorship. It's there. Ocaasi (talk) 12:20, 12 December 2010 (UTC)[reply]

IP votes[edit]

Can someone indent the IP votes? It is bad enough that we have brand new accounts with no editing histories voting, but the IP votes are just outright silly. Can we have some semblance of proper practices? Ottava Rima (talk) 17:11, 12 December 2010 (UTC)[reply]

p'raps we should have a chat about a sensible standard for being 'enfranchised' - I had a click through a few of the voting accounts above, and found some with rather.... um... shallow editing histories, and obviously there's the IP issue ottava mentions (personally, I'm ok with an IP voting, but would prefer that they have some significant contributions prior to the vote). obviously this needs to reflect the 'community voice' as clearly as possible, so what standards should we consider applying to the data above? something like registered for x long and made x contributions? or is there a better way? Privatemusings (talk) 22:30, 12 December 2010 (UTC)[reply]
If we start this game, we can also point out editors that have been indef blocked on any wiki. --KrebMarkt (talk) 22:41, 12 December 2010 (UTC)[reply]
well we could, I suppose - but personally I don't think that's a good metric really - generally speaking trials and tribulations on individual projects don't really follow editors around - and certainly not in terms of whether or not they're enfranchised. I think there are probably some more sensible markers we could lay down - would you agree? Privatemusings (talk) 00:16, 13 December 2010 (UTC)[reply]
Like people "Not so Neutrally Canvassing" in other wikis, for example the French Wiki? --KrebMarkt (talk) 05:17, 13 December 2010 (UTC)[reply]
Why do you hate French people? AerobicFox (talk) 06:33, 13 December 2010 (UTC)[reply]
I'm French myself :p
There is non-neutral canvassing going on there. --KrebMarkt (talk) 08:37, 13 December 2010 (UTC)[reply]
That explains how you could read that. ^_^ --AerobicFox (talk) 16:46, 13 December 2010 (UTC)[reply]
nice idea. if you do not like the outcome of this poll, why not change the voting policy whilst the poll is still open? oO there is no stable version of this proposal, and no translation at all. maybe this poll should be closed, rethought, and then maybe reproposed. I read that someone hoped this proposal could represent a en:consensus. At the moment there is not even a consensus that this poll is legitimate. :/ --Catfisheye (talk) 08:14, 13 December 2010 (UTC)[reply]
IP editors have always been disregarded in "votes". IP voting is so easy to game that the votes become pointless. Ottava Rima (talk) 14:12, 13 December 2010 (UTC)[reply]
You don't get the point, right? First create rules, then follow them. Don't change the rules at any moment you like and then say they were that way from the beginning. --Catfisheye (talk) 14:22, 13 December 2010 (UTC)[reply]
The rule has been in place since the beginning - IP votes are always disregarded. I'm asking for enforcement of the standard. Ottava Rima (talk) 19:06, 13 December 2010 (UTC)[reply]
In which place? :D Nothing is written here, and yeah, it should have been written there, or a link given. What do you think of the idea that votes of users blocked indef elsewhere should not be allowed? I think this would be fair, because with blocking IP votes we leave AGF without proof, just because of prejudice, but the indef-blocked voters have already shown that they could not work in an encyclopedic and/or cooperative way. --Catfisheye (talk) 11:21, 14 December 2010 (UTC)[reply]
Your argument shows that you concede and instead wish to perform a distracting measure. IPs, as we all know, can be changed, rotated, and easily gamed with proxies. If someone does that with user accounts they will be banned, so such shenanigans are less likely to happen. This is about one vote per person. Your unwillingness to address the actual problem is a problem. Ottava Rima (talk) 22:19, 14 December 2010 (UTC)[reply]
My argument that rules should be made before voting is distracting? Sorry, I'll keep it short: have you any proof of what you're saying, i.e. that "IP votes are always disregarded"? And that this rule(?) applies here? --Catfisheye (talk) 00:02, 15 December 2010 (UTC)[reply]
Are you serious or are you kidding? You must be kidding. You don't appear to be new here, so I will assume you know our standards and practices. Ottava Rima (talk) 16:12, 15 December 2010 (UTC)[reply]
Hej, actually your style of chatting does not distract me from the fact that you still offer no proof. :D So give me links and I will stop asking for them. ^^ Btw, the proposal is closed, just in case you did not notice. --Catfisheye (talk) 17:28, 15 December 2010 (UTC)[reply]
I suppose plenty of the IP users and new accounts are users from the Wikipedias who don't have SUL or Commons accounts. Decisions here are important for the other projects, so voicing an opinion should be allowed. I have no idea what weight they should be given when judging consensus, though. --LPfi (talk) 11:18, 13 December 2010 (UTC)[reply]
I just want to support the idea that, if IP votes don't count, votes of users indef blocked on other Wikimedia projects should not be counted either. ^^ --Catfisheye (talk) 15:15, 13 December 2010 (UTC)[reply]
I would give the same weight to an IP vote as to the vote of a user indef blocked on other Wikimedia projects. It's a matter of trust: how much can you give to an IP, and how much can you give to someone who managed to get indef blocked on a Wikimedia project? --KrebMarkt (talk) 22:27, 13 December 2010 (UTC)[reply]
In this instance, the question is moot: support and oppose are roughly equal -- so this proposal is as divisive as possible (solely in a mathematical sense, of course :) ). But I agree that in the future we need to make stronger efforts to involve editors from projects beyond the European languages -- supporters of this proposal are right to point out that 'non-western' values have been underrepresented in this discussion, and if those voices were here to take part, perhaps some clarity would be gained.
Of course, I wish I personally could facilitate such a discussion, but alas, I'm a US American-- I wasn't allowed to learn a foreign language until my brain was no longer young and spongy. :) --Alecmconroy (talk) 11:10, 14 December 2010 (UTC)[reply]
"'non-western' values have been underrepresented in this discussion"
This discussion was never about values, but about legality and educational usage. Non-western users can and should still be involved, but debate about censorship based on their values, and not on any type of law, would not have been appropriate. --AerobicFox (talk) 20:22, 14 December 2010 (UTC)[reply]

Global notices needed - Farsi, Chinese, etc[edit]

I find it odd that en wiki and the Germanic wikis get notices, but the majority of cultures and countries in the world, which coincidentally happen to have conservative views, are not getting notices. I see dozens of users from the German and Dutch wikis, but the Chinese, Farsi, Arabic etc. wikis would definitely have far more people and far more in support. Isn't that odd and inappropriate? Ottava Rima (talk) 22:38, 12 December 2010 (UTC)[reply]

Maybe it has to do with the language barrier. Younger Germans, like me, usually have no big problems reading and understanding English and are a bit familiar with the US legal system. But other regions can't even read the policy (no English as a major second language). How can we expect them to vote? If they aren't even notified, then this stinks like rotten fish. --Niabot (talk) 23:02, 12 December 2010 (UTC)[reply]
Yeah, lemme just go ask all the kids in my high school who took Farsi/Arabic to come translate. AerobicFox (talk) 23:09, 12 December 2010 (UTC)[reply]
Oh sorry, Ottava Rima, that "Germanic" users do notice the notice in English. Ouch. Btw, if those Farsi/Chinese-speaking people do not understand the site notice, how could you expect them to understand the proposal? ... --Catfisheye (talk) 23:58, 12 December 2010 (UTC)[reply]
We can translate it. It isn't fair that people can claim cultural bias while simultaneously excluding 70% of the world that has a much more conservative view on sexual content. It is hypocrisy. Ottava Rima (talk) 02:26, 13 December 2010 (UTC)[reply]
So do translate it. You can talk about hypocrisy, but I mentioned the problem already some threads above. So maybe read first, then talk. You are more than impolite. --Catfisheye (talk) 04:04, 13 December 2010 (UTC)[reply]
Before you post uncivil attacks like the above, why not pay attention to the fact that this is asking for notices on those other-language wikis, and your thread has nothing to do with that. Rudeness is not an appropriate response. Ottava Rima (talk) 14:10, 13 December 2010 (UTC)[reply]
Ouch - who complained first that the wrong persons participate because of their knowledge of English? Because there are _no_ extra site notices for "them". And who then talks of hypocrisy? You can't get more personal than that, I guess.
Of course your demand has something to do with what I have already said above, because it won't be easy for people without a certain proficiency in English to vote if there is no translation, and I want them to vote as much as you do. But I ask you: what should they do with site notices they do understand, but proposals they don't?
I do not care whether you are for or against this proposal, but please try to understand what I am trying to say to you, ok? Have you found someone willing to translate the notices? Maybe we could convince them to translate the proposal as well. --Catfisheye (talk) 14:37, 13 December 2010 (UTC)[reply]
Your post has nothing to do with mine, so if you expect a response, please correct it. Furthermore, there are many people on Meta who translate things into Farsi. We even have a Farsi-speaking steward. Ottava Rima (talk) 22:51, 13 December 2010 (UTC)[reply]
That isn't the point. Such a global decision should be available in at least three or four major languages. Otherwise the will of many other projects is basically ignored. Ask Chinese or Russian people which languages they speak, and which second language they learned at school. Most likely none of them was English. So even if they got informed, how should they be able to vote? By throwing dice? As long as that is the case, I can't consider this poll fair or balanced. --Niabot (talk) 01:21, 13 December 2010 (UTC)[reply]
That would mean English, Chinese, Arabic, Spanish, Russian and French? That is already six, and it probably shows my ignorance about most of the world that 2/3 of them are Indo-European (but en:List of languages by total number of speakers supports the choice, if Hindi/Urdu speakers using the Internet are supposed to know English). It might be doable, but only when a proposal has a chance to remain stable and get consensus. It seems that this proposal should be translated only after some changes. --LPfi (talk) 12:24, 14 December 2010 (UTC)[reply]
Strongly agree that we should solicit participation from as much of the Wikimedia community as possible. Non-English speakers should be, insofar as possible, equal participants in making these kinds of "big decisions" that will affect us all. --Alecmconroy (talk) 05:49, 13 December 2010 (UTC)[reply]

Why Farsi and Chinese specifically? Neither seems to be in the top ten Wikipedias... AnonMoos (talk) 18:48, 13 December 2010 (UTC)[reply]

It says "etc" after it. Did you want a really long title? I added Arabic above, and you can add many others - Russian, French, Spanish, etc. It would be good if -all- the Wikis were given a notice. That is the point. Commons is connected to every Wiki after all. Ottava Rima (talk) 19:05, 13 December 2010 (UTC)[reply]
However, as a practical matter, it's not going to be translated into every language, and failing to realistically prioritize your request does not make it more likely to be fulfilled. AnonMoos (talk) 10:40, 14 December 2010 (UTC)[reply]
So, you are saying that it is perfectly fine to have a completely skewed minority dominate the policy proposal while claiming everyone else is biased? It is hypocrisy of the highest form going on in the opposes above. The majority of the world and of WMF users do not hold the views of the opposers, and it seems obvious that it is only through skewed canvassing that we get a close result above instead of a complete landslide for the policy. Ottava Rima (talk) 22:17, 14 December 2010 (UTC)[reply]
Please keep your hypocrisy judgement to yourself. Repetition of personal opinions does not create truth. The people who voted oppose did so for their own reasons, not because they were trying to bias this poll. It is not their fault that there are no site notices in any language other than English, nor a translation of this proposal.
Instead of judging around, you should have tried to find somebody who is able to translate. But all I see is that you have been chatting here for days now. --Catfisheye (talk) 00:16, 15 December 2010 (UTC)[reply]
It is interesting how your statement is more applicable towards those claiming some awful US bias. Ottava Rima (talk) 16:09, 15 December 2010 (UTC)[reply]
Whatever, dude -- You can be as theoretically ideologically pure as you want, but the simple fact is that in the real world this simply ain't gonna be translated into every single language, and therefore when you refuse to prioritize any particular language over any other language, it demonstrates the purity of your theoretical ideology much more than it does any effort by you to offer a practical concrete constructive proposal... AnonMoos (talk) 02:18, 15 December 2010 (UTC)[reply]
So, in the real world it is fine to have a small minority canvass support to interfere with a necessary policy? Nice. I bet if you were on the other side you'd be one of the first to cry foul. Ottava Rima (talk) 16:09, 15 December 2010 (UTC)[reply]

When you think it can't get any worse...[edit]

Commons:Deletion requests/File:Wikipe-tan Cartoon - Something is missing.png Anyway, a good example of why I voted oppose. --Niabot (talk) 03:02, 13 December 2010 (UTC)[reply]

I just don't see how this qualifies as "sexual"; is there some innuendo I'm missing? AerobicFox (talk) 04:49, 13 December 2010 (UTC)[reply]
If anything, that cartoon would make you want to support because it makes fun of all the porn. Ottava Rima (talk) 14:08, 13 December 2010 (UTC)[reply]
If you look at it from different angles, also regarding this poll, it is really a good reason to oppose. The deletion request itself makes clear how uncontrolled deletion of offending material could end up. One really good reason to stop this nonsense. --Niabot (talk) 14:22, 13 December 2010 (UTC)[reply]
No, it doesn't. The deletion request is done on a pro-porn agenda because the image makes fun of users like you who are trying to justify the unjustifiable. Stop trying to spin it into exactly the opposite of what it is. The image makes fun of you. It doesn't support you in any way. Ottava Rima (talk) 22:49, 13 December 2010 (UTC)[reply]
ok, this quote got my attention: "The deletion request is done on a pro-porn agenda because the image makes fun of users like you who are trying to justify the unjustifiable"
o__0
i think that's going a little bit over the top; it certainly violates agf. Lx 121 (talk) 01:50, 14 December 2010 (UTC)[reply]
Odd, because it is a direct reversal of the original claim in the first entry, with evidence that the reverse of the first post is true. If you want to make accusations about good faith, please direct them to the appropriate party. Making such claims as above can be seen as uncivil and is not appropriate to proper discourse, especially when done to try to justify the first post of this thread, which was highly inappropriate and without factual basis. Ottava Rima (talk) 04:16, 14 December 2010 (UTC)[reply]

Dealing with rejection[edit]

With the raw vote count very close to the 50% point, it looks like we're not going to have "consensus" behind the policy as proposed. It is possible that we could strip a few controversial provisions and try again, but I don't think the community is going to want to deal with another wiki-wide vote on a new policy anytime soon. Therefore I propose that after the poll ends and there is a formal failure to achieve consensus, we should do the following:

  • Evaluate the vote carefully. Specifically, we should get a fairly solid number as to how many people find even this policy too restrictive, to use preemptively against other proposals.
  • Process the draft and underline all mandates (changes to existing policy or guidelines).
  • Take all the unmarked text, which gives instructions about policies, guidelines and laws without enacting change, and put it at Help:Sexual content with relatively little alteration.
  • Take all the underlined text and boil it down into small, self-contained proposals.
  • Propose those changes one by one at the COM:NUDITY guideline and see if any get strong consensus.
  • In order to keep the readership at COM:NUDITY high, create a link from the vote result to that guideline's talk page.
Agreed. Definitely split the proposed policy changes away from the explanatory instructional text! This is a wonderful idea that would make discussion a lot more effective. The proposed policy changes will get the focus they deserve without all the distractions and disputes over the instructional text. Meanwhile, the instructional text doesn't need to be policy itself, it can still serve as an 'informational guide' for our readers and editors. --Alecmconroy (talk) 09:39, 14 December 2010 (UTC)[reply]
(edit conflict)
Good idea to think about after the poll. In the last week I have followed up all the leads and viewed images in categories I really don't want to enter again. In a previous post I summed up the attitudes displayed on the two sides of the argument, and while I applaud the intention to find a way forward, it is written with an underlying assumption that a new policy is required. I think that is disputed by many of the *oppose* posts.
I think the proposers should evaluate the results and their own cultural conditioning that has led them to believe that sexual content should be treated differently to other images. I think the proposers should consider how the instruction creep happened; they should definitely expand the en:wiki articles to which they refer to GA standard before even including a link, and then ensure that there is an equivalent article in other wikis: nl, de, fr, es etc.
And talking about revolting images: I have just visited Commons:Nudity and suggest that the page is restored to something useful. I note this page is not available in German, Spanish, Polish or Russian. It would be more sensible to be positive and give examples there of borderline images that have been discussed and accepted than to edit in such a culturally conditioned way.--ClemRutter (talk) 16:12, 13 December 2010 (UTC)[reply]
(end of edit conflict)

Another idea[edit]

I'd propose a different course of action:

Definitely

  • Make deletions of out-of-scope content follow normal, not speedy, processes
  • Have this policy supersede COM:PORN and COM:NUDITY (and demote those to guideline status if keeping them at all)
  • More explicitly clarify the procedures and protections as requested in numerous Oppose votes
  • De-legalese the legal section, translating all instructions into plain English with specific Commons-relevant guidance
  • Revote within 3 months, pending a survey of non-dogmatic oppose voters who have a chance to reevaluate a draft of the new policy.

Possibly

  • Clarify the phrase 'low-quality' to deal specifically with random junk shot photos.
  • Address erotic art/art erotica specifically.
  • Solicit feedback from the Foundation regarding possible implementation of user-side image-filtering capabilities. If affirmative, anticipate these changes with a stance to support user-side selection without limiting the core collection.
  • Clarify that decisions regarding child pornography and privacy violations are made irrespective of the location of WMF servers, and that though these rules must necessarily be followed, the direction is rooted in broader principles related to not doing harm.
  • Draft a separate or sub-policy explaining the Commons stance on other controversial content, including Violence and Culturally Sensitive Images.
  • Work with chapters and/or Google Translate to post local language drafts in at least the top 5 languages represented on Commons no less than one week before the start of the poll. Ocaasi (talk) 16:05, 13 December 2010 (UTC)[reply]

I'm one of the proposers, and I don't particularly believe that sexual content should be treated differently from other images, with the exception that consent of the person photographed becomes more of an issue. If someone happens to snap an unauthorized photo of me sitting in my living room, no great harm in uploading it. If someone happens to snap an unauthorized photo of me having sex in my living room, that's a rather different matter. I believe that is more or less what the proposed policy says.

If we don't come up with something here, folks - at least a guideline - I can more or less guarantee that WMF will impose something, and that it will be tremendously more restrictive than what we have proposed. - Jmabel ! talk 02:47, 14 December 2010 (UTC)[reply]

We are already deleting illegal media.
Regarding the threat of a stricter policy: that's the idea behind it: self-censor or we will get censored. Great idea. Pure freedom!
I have heard the idea of making a fork several times now, and for some reason I think we do not need a WMF trying to impose rules on us. --Saibo (Δ) 03:20, 14 December 2010 (UTC)[reply]

Specific guidelines for speedy deletion[edit]

Who here would also support this if the guidelines for speedy deletions were as follows?

  • Clear and unambiguous child porn:

Defined as photographs or photographic video of children engaging in sexual intercourse, where sexual intercourse is defined as vaginal or anal penetration, or oral sex.

Questionable material featuring children that is not photographic (animated, painted, etc.), or of children engaged in apparent sexual conduct that is not intercourse, should undergo a normal review process.

  • Redundant Images of lower quality:

These include images whose function is already clearly fulfilled by another image of higher quality. Images that show the same subject but have any clear differences (apart from quality) are not subject to speedy deletion and should undergo a normal deletion review. Redundant images typically include, but are not limited to:

  1. low-quality images of genitalia (especially male genitalia)
  2. works of erotic art photographed with a disposable camera or cell phone where a higher-quality photograph of the same piece has already been uploaded
  • Deletion at the subject's request:
If Commons hosts a media file of you that:
  1. was taken in a private, non-public context;
  2. shows you wholly or partially nude and/or engaged in sexually explicit conduct; and
  3. you would like removed from Commons for privacy reasons;
please contact the Wikimedia Foundation. Your communication will be confidential and will be viewed only by trusted volunteers who have identified themselves to the Wikimedia Foundation by name. For further information see Meta:OTRS.

Due to the complex nature of issues concerning copyright, such cases should undergo a normal deletion review process.

Some more could be added later (libel, previously deleted images, mass spamming of images, etc.), but does this look good to others on the opposing side as well? AerobicFox (talk) 20:39, 13 December 2010 (UTC)[reply]

honestly: no, it doesn't improve the current proposal. as stated by many commenters in the oppose section, speedy deletion as "out-of-scope" is unacceptable. it opens the door to potentially unlimited admin abuse, with no way for the average commons user to assess, review, block, or reverse this action. it would change "scope" decisions on sexual material into an admin prerogative.
also, as a whole, this text doesn't represent much change from the proposal that is currently being voted on.
finally, this part: "Some more could be added later (libel, previously deleted images, mass spamming of images, etc.)"
really doesn't sound good; it seems like you are planning on getting the door open "a crack" on expanded admin powers for speedy deletion, & then later you want to see how much more you can force through it.
i'm sorry & no personal disrespect intended to the poster of the comment, but i'm not "ok" with that, & i think that there are a lot of other people on here who wouldn't be "ok" with that either. Lx 121 (talk) 01:28, 14 December 2010 (UTC)[reply]
1. You do realize that I did not list "out of scope" as a reason for speedy deletion. o_O
2. I am in the oppose category myself; this represents a compromise that I would be okay with. The "additional criteria could be added later" was meant to appease the support crowd; it wasn't my attempt to "get my foot in the door". :/ --AerobicFox (talk) 06:01, 14 December 2010 (UTC)[reply]
with respect, 2 points:
1. "Redundant Images of lower quality" IS a scope decision; it requires subjective opinion & should be the subject of open community debate, not the prerogative of individual admins.
2. whatever your intentions in "appeasing" the support crowd, the net effect is still the same: to open the door to incremental restrictions being imposed over time. censorship through the back door is still censorship.
Lx 121 (talk) 00:15, 15 December 2010 (UTC)[reply]
That was a brave attempt and is obviously much better, but we have to accept that the whole policy is a dog's breakfast. A change of tone would make it so much better. If it highlighted the areas that are acceptable and then pointed out areas where there may be problems, many editors would be less suspicious. For instance, the level headings: we have
* 1 Definition
* 2 Prohibited content
* 3 Deletion of prohibited content
o 3.1 Speedy deletions
o 3.2 Normal deletions
o 3.3 Deletion at the subject's request
* 4 Working with sexual content
o 4.1 File descriptions
o 4.2 Categories
* 5 Legal Issues
o 5.1 The Child Protection and Obscenity Enforcement Act
o 5.2 Obscenity law
* 6 Notes
* 7 Previous proposals
* 8 See also

But it could easily be changed, and be so much more welcoming. It would better reflect what the supporters are trying to say and remove the tank traps that the opposers detect. Try this:

* 1 Definition
* 2 Permitted content
* 3 Prohibited content
o 3.0 Deletion of prohibited content
o 3.1 Speedy deletions
o 3.2 Normal deletions
o 3.3 Deletion at the subject's request
* 4 Working with sexual content
o 4.1 File descriptions
o 4.2 Permitted Categories
o 4.3 Categories requiring further discussion
* 5 Legal Issues
o 5.0 The American legal framework and Wikipedia obligations
o 5.1 The Child Protection and Obscenity Enforcement Act
o 5.2 Obscenity law
* 6 Notes
* 7 See also
* 8 External Links
Genuine editors usually know if they are uploading a potentially controversial image, so let's have a tag we can use to inform the admins why our image has educational value, for instance << {{EdVal}} reason=For work on History of Manga >>. Encourage self-certification rather than spitting threats. If the mobile-phone artist does display his crown jewels and can give a sensible reason then it stays; if he doesn't then he is not serious and deletion would not be a problem. A simple undo.
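A minimal sketch of how such a hypothetical {{EdVal}} tag might work. Nothing here exists on Commons: the template name, its parameter, and the tracking category are inventions for illustration only.

    <!-- On the file description page, the uploader self-certifies: -->
    {{EdVal|reason=For work on [[:en:History of manga|History of manga]]}}

    <!-- A minimal Template:EdVal could display the rationale and file the
         image into a tracking category for admin review; the includeonly
         tags keep the template page itself out of that category: -->
    This file is self-certified as having educational value: {{{reason|no reason given}}}
    <includeonly>[[Category:Self-certified educational value]]</includeonly>

An admin patrolling the tracking category could then either accept the rationale or open a normal deletion request, matching the "simple undo" workflow described above.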
I have said before that the text must be de-regionalised and a clear description of the Florida legal system must be given in language that is NPOV and assumes that the reader is encountering it for the first time - and the links checked so that the articles are to GA standard. If this is done then most of Ruslik's objections have been covered.--ClemRutter (talk) 00:09, 14 December 2010 (UTC)[reply]
no strong opinion on the merits of making the wording sound friendlier, but i don't think i like the idea of having a tag for "controversial" & requiring the uploader to "prove" that the item belongs on wmc, in order for it to be kept. that sets a really bad precedent. the onus has always been on the deletion nom to prove that an item should be removed, NOT the other way around. Lx 121 (talk) 01:28, 14 December 2010 (UTC)[reply]
as a secondary observation, doing things this way would also massively increase the file-processing workload. workload has been a key complaint by some of the users pushing for expanded use of speedy deletions. if it's already too much work to do a standard deletion nom, instead of a speedy, then it's going to be a whole lot more work having to process every "controversial" file. Lx 121 (talk) 01:28, 14 December 2010 (UTC)[reply]
You'll probably want to include bestiality images too.
There are images of preteen kids posed in sexual positions where they are clothed only in thongs, with the pubic area covered by a strip of cloth that barely covers the genitals. Is it necessary to debate whether those should be kept? Additionally, there are the photos of young boys with erections tenting out underwear; is a debate needed for those? Then there are the photos of kids, boys and girls, dressed in tight-fitting pants so as to emphasize a cameltoe; do you want to debate keeping those? Then there are all the photos of naked brown-skinned kids where the focus of attention is on the groin area and where, if the kids were white-skinned, they'd be deleted on sight. John lilburne (talk) 00:25, 14 December 2010 (UTC)[reply]
yes, it is necessary for the community to discuss & debate openly what material should & should not be included as a part of "scope", when that material is not actually illegal for commons to host. that is the whole point of having an open, wiki-community process. the way the preceding commenter has built up an incremental list of things that "of course" all right-thinking people must object to, & that therefore should not be debated, just deleted on sight, illustrates the problem. it doesn't take much brains to see how the preceding list of "obvious speedy deletions" could be abused, or just interpreted radically differently, by different admins. Lx 121 (talk) 01:07, 14 December 2010 (UTC).[reply]
You want to debate each and every example of a 10yo ass in a thong, why? If it's your 10yo in a thong posing as a porn star then cool, upload it; just make sure that all the documentation is in place to verify that it is your child or that you have the permission of the child's parents. John lilburne (talk) 10:01, 14 December 2010 (UTC)[reply]
Photographs of nude children are legal and are permitted at Commons. See Category:Nude children. The question of whether a photograph is sexual enough to be subject to child pornography law is rather subjective, at least when some type of intercourse is not occurring. I'd support deletion reviews for these ambiguous cases, as proposed above. Dcoetzee (talk) 01:43, 14 December 2010 (UTC)[reply]
Legality isn't the issue. Few have problems with you or anyone else taking and making available compromising photographs of their OWN underage children, siblings, grandchildren, nieces, or nephews; if they happen to be illegal then one takes the consequences. The issue is whether you or anyone else should be making available a collection of photographs of nude children that one does not have any relationship with. If someone wants to upload photos of their 12yo in speedos sporting an erection, fine, or if they want to take close-up shots of their 14yo's cameltoe, then OK. If what they are doing is staking out the local swimming pool and photographing unsuspecting kids, then that is not cool at all. That sort of photography is PREDATORY, should not be condoned, and should not be given hosting space by any responsible organisation. John lilburne (talk) 09:53, 14 December 2010 (UTC)[reply]
I do have a problem with people making available compromising photographs of their own relatives. The children are not their property.
But, as Lx 121 writes, the list is very problematic. As you present it, few would support keeping the images, but somebody else might honestly describe the images as "children playing on the beach" or something like that.
So, I think we might have to use something like AerobicFox's list, if we do want to allow the speedy deletions and think the judgement of admins cannot be trusted.
--LPfi (talk) 13:30, 14 December 2010 (UTC)[reply]
Children are more the property of their relatives than the children of strangers are. Simply put, if one is not prepared to upload photographs of one's own underage relatives in the nude, in provocative poses, or in an embarrassed state, then one really shouldn't be uploading or voting to keep similar photos of other people's kids. There is Junge_Hamer_in_Südäthiopien.jpg on Commons, of someone who appears to be under 21, which was proposed as an illustration for the page on Primates. Do you think that as she walked through town she was agreeing to that, or that if she knew about the image here she wouldn't object? John lilburne (talk) 20:25, 14 December 2010 (UTC)[reply]
i'm not even going to touch the straw-man argument about the merits of uploading images of your own relatives vs strangers. the line of reasoning that gets you there is a little too "out there" for my tastes. this is clearly something where the commenter has strong personal feelings, & attempting to debate the matter would be unproductive.
my key points of objection to the original post were:
a) how it managed, very nicely, to build a list of "obvious speedy deletions" incrementally; first, one thing that must clearly be objectionable, then add another, then add something that's not quite as absolute, but make sure to throw in some nice graphic examples to achieve an emotional response, etc. using such tactics, this "perfectly reasonable little list" could easily grow by "creep" to include just about anything where one can get a strong emotional response by citing a few graphic examples. that is a dangerous & unwise way to make policy.
b) setting aside the commenter's descriptive skills in providing graphic examples, what they are proposing to ban-&-speedy would be:
1. "bestiality" (with no clear definitions provided).
2. pretty much any image of a minor less than fully clothed (with the only real definition provided being whether the pic "feels dirty").
3. "photos of kids, boys and girls, dressed in tight-fitting pants so as to emphasize a cameltoe", so basically any image of children in tight-fitting clothing (again, the only standard of judgement provided being purely subjective).
4. "Then there are all the photos of naked brown-skinned kids where the focus of attention is on the groin area and where, if the kids were white-skinned, they'd be deleted on sight" (without even commenting on the prejudices implicit in that statement, the standard of judgement is again subjective opinion).
user also makes no provisions to set aside historical materials, anthropology, etc. it's all just "bad" & clearly needs to go.
i can accept that this represents the user's honest, personal opinions, but that does not change the fact that it makes a terrible basis for policy!
try & imagine a world, where each & every admin on commons is doing speedy instant-deletions, based entirely on their own, subjective judgments, without community debate or review, on the basis of this set of "examples".
Lx 121 (talk) 00:15, 15 December 2010 (UTC)[reply]
How is relatives vs non-relatives "too out there"? If one isn't prepared to make an image of one's own kid to provide some encyclopaedic illustration, then why use one of some other kid? It's not as if taking photos is particularly hard.
1. Bestiality refers to people having sex with animals: penetrating or being penetrated by, or engaging in oral sex with, an animal.
2. If the picture feels dirty then it probably is; go with your instinct. That is why I said that such photos should only be uploaded by relatives. Most parents don't post photographs online of their 10yo with just a ribbon of cloth covering their genitals.
2a. The act of collecting and categorizing can alter the perception of an image. A photo of a woman in stockings sitting on a chair is one thing; hundreds of such images have an entirely different feel. A photo, such as the one I referenced above, categorized as "African girl" is one thing; categorized as "Topless adolescent" or "Primate" it is something else.
3. RH gave an example of a nude woman on a beach standing in front of a grand piano, where the nude woman part was pretty much superfluous. The same is true of a 10-15yo girl in a swimsuit with a cameltoe. Airbrush out the cameltoe and what do you have left? A 12yo girl in a swimsuit, and what is so educational about that?
4. Why is it that the only modern images of nude children on Commons are dark-skinned, and most of the old ones are too? I'll suggest that the reason is a) that it is still acceptable to show images of nude dark-skinned kids, whilst it is not acceptable to show images of nude white-skinned kids, and b) that their parents are least likely to find out and object. Of course any of those in this thread who have white-skinned kids are welcome to address the imbalance with photos of their own kids.
Nothing that I've said addresses historic images. I'm simply referring to uploads of new modern material. 194.193.183.253 15:28, 15 December 2010 (UTC)[reply]
To respond to Lx 121's comment above: I made the list incredibly clear. Photographs of children engaged in sexual intercourse are very unambiguous, and currently I believe Commons does not host a single file like that. Historic drawings, anime, erotic art, children acting sexual, etc., would not be affected by this. Such a policy would be extremely difficult to abuse. Files without sexual intercourse should undergo a normal deletion review if any attempts are being made to delete them.
"but we have to accept that the whole policy is a dogs breakfast."
True.
I like your idea about adding more material about things that do not qualify for deletion; moving forward (as I'm sure this will), that should be worked out. One such criterion to disallow speedy deletions could be material that was uploaded by a regular user, or that is part of a series. Well, until then. (6 months from now?) --AerobicFox (talk) 06:01, 14 December 2010 (UTC)[reply]
again with respect, it is disingenuous to cite the inclusion of "photographs of children engaged in sexual intercourse" on the proposed list as an example of the merits of the proposal. child pornography is already banned @ wmc, & is already speedy-deleted. inclusion or exclusion here has NO EFFECT on that. the issue is the inclusion of out-of-scope as a basis for unilateral speedy deletions, by individual admins, acting on their own recognizance, without any effective community review possible. why do so many of the people who are trying to push this objective keep dancing around that? Lx 121 (talk) 00:15, 15 December 2010 (UTC)[reply]

Encouragement and thinking ahead, from a "semi-dogmatic opposer"[edit]

So, the current draft probably isn't going to become policy-- but that's okay. In fact, it's worth considering which principles were, as far as I can tell, utterly non-controversial.

For example, I think there is universal agreement on the principle that "Wikimedia doesn't host illegal content". Sure, there may be debates about what precise text we should provide in order to help editors understand the law. But there isn't any real dispute over the idea that Wikimedia is a law-abiding organization, where illegal content is unwelcome and deleted on sight.

Similarly, I think everyone agrees on principles like "Porn models need to be adults and they need to have given consent". There may be debates about the details of how to apply that, but the basic principle is solid.

And I think everyone basically agrees that "Wikimedia is not censored". Our scope is wide; we cover the whole of human knowledge. Wiki(p/m)edia has always had porn, it has porn now, and it will continue to have it in the future. Notable images are notable images-- regardless of whether they're sexual in nature or not. And in this round of discussion, I think everyone was on board with that-- in the past there had been talk of a massive deletion project to rid Wikimedia of all sexual content, notable or not-- but that idea is dead, it wasn't what was being advocated here, and such a radical idea would never reach consensus.

The important lesson, I think, is that everything we wanted to accomplish with this proposal is, more or less, being accomplished. We do delete illegal content, we do delete images without consent, etc. So, even though this document isn't getting support, the most important principles behind it _are_ in effect and being carried out anyway.

Moving forward, I'd suggest boiling this policy down to its essentials. Ask yourself:

  • What policy changes from the status quo do we think are needed?
  • Do those changes have consensus?

In the current draft, the biggest change from the status quo was allowing deletion for 'out-of-scope'-- a change that was very controversial and probably should be excluded from future drafts. As for the majority of the document-- the parts that didn't actually try to change policy but instead tried to explain how we apply existing policy to existing content-- those parts scream out to be accorded "guideline" or "group essay" status.

The real answer, I think, is to start looking ahead, not to new policies, but to user-controlled filters and user-run tagging systems. After much discussion over the course of a decade, there simply is no consensus to delete controversial images (assuming legality, etc.). That debate's basically over, and the conclusion is always-- we're not going to delete them--

We will not delete them in a box,
We will not delete them because of Fox,
We will not delete them here or there,
We will not delete them anywhere!

But we'd be happy to help users voluntarily shutter them. So let's unleash the developers and let's work on a culture-neutral tagging scheme. --Alecmconroy (talk) 09:00, 14 December 2010 (UTC)[reply]

Please name any category containing sexual content where the majority of images have a clear statement of the subject's consent to distribute the images. --99of9 (talk) 12:45, 14 December 2010 (UTC)[reply]
Well, there's always going to be debate about what kind of evidence is needed to establish whether a subject consented. Maybe those standards of evidence need to be tightened, or perhaps we need more consistent enforcement. But the good news is-- everyone basically agrees on the principle that we shouldn't be hosting images for which the subject did not give consent.
But questions about consent don't need to be lumped together with issues of legal obscenity. Nor do consent concerns have to go together with new criteria for speedy deletion. Those are all very different policy issues-- they don't have to be handled all together in a special "Sexual Content" policy. Instead, frame each individual change as a logical expansion of existing principles. --Alecmconroy (talk) 15:57, 14 December 2010 (UTC)[reply]
Everyone basically agrees on the principle that we shouldn't be hosting images in which the subject did not give consent. - are you sure? - my reading of the current proposal is that it bangs on a bit about the current legal requirements (i.e. it's got to be legal in the US), explains some of the existing policies we've got, and the only new thing it adds is the requirement that sexual content must contain assertions of consent. If you look at our 'females performing fellatio' category, do you think consent either is or should be present for any of those images? cheers, Privatemusings (talk) 22:16, 14 December 2010 (UTC)[reply]
Firstly, I'm shocked there are only 11 images in the category-- I would have thought we'd have far far more.
Secondly, I think there's universal consensus that if we learned any image lacked consent, we should remove it immediately.
So then the only remaining question is finding the right balance on assuming good faith on the part of the uploading user, photographer, and publisher. I remain completely agnostic on whether we've found the right balance or not-- I personally tend to assume that our users are, in fact, moral libertines rather than criminals uploading illegal images, but perhaps we have a duty to be even more stringent about ensuring consent.
But none of these issues represents a fundamental division-- on the contrary, the community is all on the same page, more or less-- non-notable porn images distributed without the subject's consent are not welcome at Wikimedia. If we need to be doing more (or less) than we currently are, that's an important discussion that should take place without the distractions of the whole "out-of-scope deletion" question.
--Alecmconroy (talk) 07:53, 15 December 2010 (UTC)[reply]
I'm not sure why you feel good faith would be required on the part of the uploader currently, alec - we simply don't require consent from the subjects in blowjob pics - do you think we should? Further, you sort of imply in your response that you feel consent is in fact present in some (or all?) of those images - would you mind explaining why you feel that's the case? I don't mean to bug you, but I think your responses in this matter could really help illuminate the issues :-) Privatemusings (talk) 12:22, 15 December 2010 (UTC)[reply]
There is no "universal consensus that if we learned any image lacked consent, we should remove it immediately." If there is only one free head and shoulders shot of, say, a prominent politician available, I would oppose deletion based solely on the grounds that the subject of the photo does not consent. If the photo is sexual in nature, that is, in my opinion, another matter entirely, but unfortunately a sizable number of Wikipedians refuse to agree that there should be a significant distinction between sexual and non-sexual material.--Brian Dell (talk) 20:31, 16 December 2010 (UTC)[reply]
Thing is, there is not universal agreement that "Wikimedia doesn't host illegal content". I saw at least half a dozen people opposing this on the grounds that Commons should not be bound by US law. --Carnildo (talk) 00:10, 15 December 2010 (UTC)[reply]
I think the crux of what those participants are saying is that "If US laws truly conflicted with our mission, it'd be valid to consider relocating". It's a rhetorical point-- reminding us that US values aren't necessarily and automatically the Wikimedia movement's values.
But I think we're all on board with the idea that we're currently US-flagged, our servers are in the US, we obey US laws. Supporters and opposers DID have consensus on that, and there aren't any proposals on the table to change that.
It's a good thing. One particular proposed policy text wasn't popular-- but it's not as if anarchy has broken out. :) --Alecmconroy (talk) 07:25, 15 December 2010 (UTC)[reply]
It seemed like they thought that US law was too restrictive (or could be interpreted too restrictively) already. But if we want to move forward on that, I see two points:
* The Foundation is bound by US laws. It is not enough to move illegal content out of the USA; that content would have to be administered by some other entity, for whose actions the WMF cannot have any responsibility.
* Regardless of whether US laws are too restrictive, something may happen in the USA that forces the WMF to delete much valuable material by office action. If that happens, I hope we have good mirrors from which we can continue to make that material available (URAA might be such a restrictive law).
Setting up a new site outside the USA is not a small undertaking, especially if we want to smoothly move the culture and the contributors of the WMF projects over there. Probably we should have some intricate cooperation system, so that good contributions to either project would more or less automatically show up in the other (unless affected by specific laws or policy). We should be able to re-evaluate deletions done after the restrictive practice commences, even if they are not noticed right away and even if setting up the non-US site takes significant time.
--LPfi (talk) 13:02, 15 December 2010 (UTC)[reply]
I know we have servers in the Netherlands. That group could, I suppose, be restructured into a 'legally independent' organization.
Going off on a tangent that has nothing particularly to do with sexual content-- if any other nations offer more hospitable laws for our mission of information-sharing, then yeah, let's move! WMF would just become the "US-chapter" and the new umbrella organization would be hosted out of that more-hospitable nation.
Only question is-- _is_ there a nation better suited to our mission than the US? I don't know of one, but I can't say I've researched the matter. I read Iceland may become the "Switzerland of Bits", but thus far it's all speculation. In short, I think we could leave the US if we wanted to-- but I don't know why we'd want to. Maybe some future draconian copyright laws? --Alecmconroy (talk) 04:16, 16 December 2010 (UTC)[reply]
That is not the only question. Moving to another country will also affect donors, contributors and staff. We should not be afraid if there is an important reason for the move, but I think the reason has to be quite important. And future draconian laws may pop up anywhere, so having a distributed infrastructure, also in terms of legislation, would be good. --LPfi (talk) 10:36, 16 December 2010 (UTC)[reply]
I strongly oppose moving everything to a jurisdiction that does not recognize fair use to the extent that the United States does, which means almost everywhere else. More than 300,000 files on en.wikipedia use the non-free media template. Having said that, I would admit that some people would rather move than strictly obey US obscenity laws.--Brian Dell (talk) 20:54, 16 December 2010 (UTC)[reply]
With all due respect to those participants, I think following the law is a decision that has already been made for us by the Foundation - we do not have a say in it, no matter how strongly we disagree. At the very least, an informational page that summarizes the relevant law should not be the least bit contentious. It is true that we don't seem to all agree about anything - but the goal is to establish consensus, not unanimity. I think consent is and will remain a very contentious condition, not to mention the very subjective question of distinguishing sexual content from "mere nudity." @Alecmconroy, I don't believe there's any such thing as a "culture-neutral tagging scheme." Dcoetzee (talk) 02:15, 15 December 2010 (UTC)[reply]
No one tagging scheme can be neutral, but infinitely customizable collaborative tagging could be. We can't have one single tagging scheme, but we can have as many 'controversial' lists as we want. We're a wiki, after all-- we're set up for wide-scale collaboration. If we want to make a tagging system, let anyone create their own lists of images or categories they want shuttered. To use the ur-example, the same kind of system we set up for shuttering sexual content should be open to Muslim users so they can create a list of Muhammad images they want shuttered-- this would make our Muslim readers very happy, and the sooner something like that gets set up, the better. --Alecmconroy (talk) 09:57, 15 December 2010 (UTC)[reply]
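Purely as an illustration of that idea, here is a minimal sketch of what a user-run shutter list might look like. The subpage name, the entries, and the gadget that would read such lists are all hypothetical, not an existing Commons feature.

    <!-- User:Example/shutter-list: a personal filter list kept in user space.
         A hypothetical opt-in gadget would read this page and collapse any
         matching image behind a click-to-view placeholder: -->
    [[:Category:Sex]]
    [[:Category:Depictions of Muhammad]]
    [[:File:Example controversial image.jpg]]

Because the lists would live in user space, anyone could maintain one, copy someone else's, or ignore them all, which is what would keep the scheme culture-neutral: the project hosts the mechanism, not any single judgment about what is objectionable.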
One of the problems with the preceding proposed text was that it confused policy with an a priori legal situation. It is wrong to say that the Foundation chose to adopt US law - and even more wrong to try to tie that in with the policy. As I understand it, US law has the concept of extra-territoriality, and the WMF has to comply with US law no matter where the act is committed. I also fear that they will try to impose that law on non-nationals who happen to relate to a company regulated by US law. Trying to live your life in remote Oblivistan is no protection.
So "At the very least, an informational page that summarizes the relevant law should not be the least bit contentious". Well, no. It must be obligatory and it will be contentious - that's how lawyers get rich. It is hard to evaluate an issue if you don't understand the locals' archaic little customs! Even harder when the locals confuse the legal framework with policy.
Practically, I can see the legal framework being put in a protected template, which can then be transcluded into the Notes section of a policy page.--ClemRutter (talk) 10:41, 16 December 2010 (UTC)[reply]
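A minimal sketch of those transclusion mechanics, assuming a hypothetical template name; this only illustrates standard MediaWiki transclusion, not anything that currently exists on Commons.

    <!-- Template:Sexual content legal framework (fully protected, so only
         admins can edit it and the legal summary stays consistent): -->
    The following summary of the relevant United States law is maintained
    centrally and transcluded wherever it is needed. ...
    <noinclude>[[Category:Legal framework templates]]</noinclude>

    <!-- In the Notes section of any policy page, one line pulls it in: -->
    == Notes ==
    {{Sexual content legal framework}}

Any later correction to the legal summary would then propagate automatically to every page that transcludes the template.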