Commons talk:Sexual content

From Wikimedia Commons, the free media repository
Revision as of 20:47, 27 May 2010 by Adam Cuerden (talk | contribs) (→‎Archiving: new section)

Archive 1 (Apr 2009) | Archive 2 (April 2010) | Archive 3 (May 2010)

See also User talk:Jimbo Wales, Village pump discussions on May 6, May 7, mailarchive:foundation-l/2010-May/057789.html, mailarchive:foundation-l/2010-May/057791.html, m:Requests for comment/Remove Founder flag, m:Petition to Jimbo, Commons:Undeletion requests/Current requests, Commons:Deletion requests/2010/05, User talk:Tiptoety, m:User talk:Jan-Bart, w:en:Wikipedia:Wikipedia Signpost/2010-04-12/Sanger allegations.

Proposal clarifications

Targeting sex?

I'm sorry, but "Commons is not censored" is now becoming "no images of sex whatsoever"? And no exemption for artwork, so we'd better go and delete the Dream of the Fisherman's Wife, despite its being one of the best-known ukiyo-e works.

This is an ill-thought-out, knee-jerk proposal, which fails to make important exemptions and tries to claim censoring isn't censoring. We don't need to be Wikiporn, but at the same time, saying things are never appropriate is going too far. Adam Cuerden (talk) 16:21, 6 May 2010 (UTC)

Hear, hear. Either the WMF board as an entity needs to make a pronouncement, or Jimbo needs to, ahem, step away from the keyboard. Random dictatorial decrees from the little god-king don't and should not have any more effect than anything said by any other contributor. Particularly when WMF legal counsel has said 2257 doesn't apply to us. Roux (talk) 16:25, 6 May 2010 (UTC)
No one is arguing that 2257 applies to us legally. Please remember not to engage in personal attacks; it is entirely unhelpful. I expect the Foundation board and/or staff to make a formal statement about this within a few days. The question is not going to be whether we are going to be a porn server.--Jimbo Wales (talk) 17:30, 6 May 2010 (UTC)
Free means free... just because a few people don't like some of the content produced by broad freedom doesn't mean we should scale back the right. Remember, "I may not agree with what you have to say, but I will defend to the death your right to say it." Voltaire --waltmarkers
Indeed, it was just one of the content criteria that were being suggested for determining what type of content would no longer be allowed in the project. TheDJ (talk) 17:36, 6 May 2010 (UTC)
And, you know, it's not like I even object to deleting images which are redundant and low quality compared to others of the same type, or which serve no encyclopedic purpose. It's the fact that blanket prohibitions, without even any wiggle room for sensible exceptions, are being thrown down. Adam Cuerden (talk) 16:30, 6 May 2010 (UTC)
Adam, please take another look at what I wrote. This is preliminary. It is not ill-thought-out, nor is it knee-jerk. What it is - is a rough start.--Jimbo Wales (talk) 17:30, 6 May 2010 (UTC)
I've edited it a bit. Others have, why not me? Adam Cuerden (talk) 16:37, 6 May 2010 (UTC)
I think you should edit it. Take a look at User:Wikipr0n first. This is not a fight about historical images. It is not about images that have educational value. It's about really stupid stuff that is going to get deleted. Please help formulate a policy that does the right thing.--Jimbo Wales (talk) 17:30, 6 May 2010 (UTC)
I've edited COM:PORN. Tell me what you think. --TwoWings * to talk or not to talk... 17:36, 6 May 2010 (UTC)
It is inaccurate and misleading. It says: "For the policy, see Commons:Sexual content, COM:PS#Censorship and COM:PS#Must be realistically useful for an educational purpose", but Commons:Sexual content does not actually have policy status. There is hope. Please remove. --187.40.204.204 02:39, 8 May 2010 (UTC)
I've edited it. (Also, fixing indentation) --187.40.235.122 12:51, 9 May 2010 (UTC)

Jimbo: I reacted so strongly because of the edits by TheDJ, which I thought were your words: [1] and [2]. A blanket prohibition, with no wiggle room, which includes even spanking or images of the vagina that one single person thinks is sexy? No. That's horrible policy to be held to. Adam Cuerden (talk) 18:27, 6 May 2010 (UTC)

Frankly, the problem with things like these is that deletion will always go as far as the most prudish admin's preference, which means that much of what should, in any reasonable judgement, be acceptable, will get deleted. That addition is a good example: it's possible for a picture to be sexual and encyclopedic, especially if it's illustrating sexual cultures, but his additions explicitly forbid so much, claiming it's out of scope, that it would become impossible to illustrate many encyclopedic articles. For instance, if S&M is always forbidden as being supposedly "out of scope", how would one illustrate an article on S&M? I'm pretty sure one can make encyclopedic images that illustrate it, but that addition would say they'd have to be deleted. Adam Cuerden (talk) 18:34, 6 May 2010 (UTC)
I would like to remind you that I'm personally in the exact opposite camp. I have held back these changes for a very long time. It is only per the direction of Jimbo and the board that I'm working within those directives. On a personal level, I heavily oppose them; I think they are childish and typical American prudishness. But my personal opinion is no longer relevant here. TheDJ (talk) 18:54, 6 May 2010 (UTC)
  • I think the targeting of sexual images does seem a little biased. If we are going to persecute images deemed unnecessary, of low quality, etc., it should be done equally regardless of content. If we're doing it on the basis of what may offend people's moral beliefs then perhaps we should remove Muhammad. If we're removing what might gross people out then let's remove over nine thousand penises. If we're removing depictions of illegal acts then let's remove the JFK assassination or the World Trade Center bombings; we don't want to embiggen assassins or terrorists. Ty (talk) 03:07, 13 May 2010 (UTC)

Some other questions

I do have other questions:

  1. How was File:Mammary intercourse with dildo.jpg connected to one of the 5 cases of "actual or simulated acts"?
  2. Isn't the word "lascivious" too subjective to be applied? (I know the law states that, but I wonder if they really thought about it.)
  3. How is ejaculation understood by the law? Strictly speaking, it seems to fall under none of the 5 cases.
  4. Isn't it exaggerated to say that any simulated act is outlawed? Such as this one?
  5. What about non-human masturbation?!

--TwoWings * to talk or not to talk... 17:47, 6 May 2010 (UTC)

  1. That one was more likely deleted for being low quality.
  2. US courts use the w:Miller test in these cases, so we would probably have deletion discussions about such images in the future. But if we have a non-lascivious image, the lascivious image is redundant, and can be deleted for that reason.
  3. Ejaculation is sexual conduct and would fall under the deletion criteria in my reading.
  4. That one would also probably require discussion indeed.
  5. A good question.
TheDJ (talk) 18:02, 6 May 2010 (UTC)
Ceeeeennnnnsorship. Adam Cuerden (talk) 18:52, 6 May 2010 (UTC)
Correct, the foundation is applying censorship of content by limiting the scope of the project. TheDJ (talk) 18:55, 6 May 2010 (UTC)
Enforcing the scope of content to be consistent with the Foundation's mission is not censorship. But if you want to view it that way, I'm fine with that. The main thing is that policy will be enforced, and the pornography will be deleted.--Jimbo Wales (talk) 12:19, 7 May 2010 (UTC)
I thought that the Foundation's mission was to create a free encyclopedia, and to make all knowledge accessible to all. Apparently I was wrong. American sexual morality is obviously more important than knowledge and truth. Many so-called pornographic images can have a value for an encyclopedia. --Ankara (talk) 16:08, 7 May 2010 (UTC)

Record keeping

One question I do have here: is there any method by which we could obtain records sufficient for the purposes of 18 USC 2257? There is potential educational value in some explicit imagery, after all, and a 2257-specific template for those images we have records for would easily allow tracking of them. If it's technically feasible to arrange this, then it's something to look into - that is, unless the board says "no explicit images".--Nilfanion (talk) 18:20, 6 May 2010 (UTC)

Reading the words of Jimbo on his talk page, the latter is the intention. We will not start keeping records or require others to keep records, as far as I can derive. TheDJ (talk) 18:56, 6 May 2010 (UTC)
Then Jimbo is an idiot. Adam Cuerden (talk) 20:26, 6 May 2010 (UTC)
Personally I'd feel happier if the feasibility of record keeping were looked into: can the WMF do this at all, and if so, how? I can certainly accept that some explicit imagery that would trigger 2257 requirements is in scope. If it's relatively easy, we could then do record keeping for the stuff that's in the borderline area even if the explicit imagery is all deleted - if it's not possible, we know that, and that would make the deletions slightly easier to accept.--Nilfanion (talk) 20:45, 6 May 2010 (UTC)
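For what it's worth, the data side of the record-keeping idea discussed above is simple; the hard parts are legal and organizational. The sketch below is purely illustrative — the `PerformerRecord` class, its field names, and the sufficiency check are all invented for this discussion, and the statute's actual record-keeping requirements (inspection, indexing, a named custodian of records, etc.) are considerably more extensive:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class PerformerRecord:
    """One hypothetical record for a depicted performer (illustrative only)."""
    legal_name: str
    date_of_birth: date
    id_document: str       # e.g. "passport" - the photo ID that was examined
    production_date: date  # when the depiction was produced
    custodian: str         # who keeps the record

def age_at_production(rec: PerformerRecord) -> int:
    """Performer's age in whole years on the production date."""
    d, b = rec.production_date, rec.date_of_birth
    return d.year - b.year - ((d.month, d.day) < (b.month, b.day))

def record_is_sufficient(rec: PerformerRecord) -> bool:
    """Illustrative check: an ID was examined and the performer was 18+."""
    return bool(rec.legal_name and rec.id_document) and age_at_production(rec) >= 18
```

A tracking template on the file page could then point at wherever such a record is held; whether the WMF could lawfully and practically act as custodian is exactly the open question in this thread.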

@ Adam: This is vandalism! You do not have the right to insult Jimbo. --UAltmann (talk) 20:27, 6 May 2010 (UTC)

It's accurate. I feel that he's acting directly against the good of the encyclopedia, supporting prudery, and encouraging the deletion of encyclopedic content. All wrapped up in Orwellian doublethink about how blanket bans on nudity and sex don't mean we're censoring anything.
I support cleaning up non-encyclopedic content. Blanket bans on whole classes, especially with weasel words like "anything lascivious", will, by necessity, include lots of things that are encyclopedic. Adam Cuerden (talk) 20:29, 6 May 2010 (UTC)
It's unacceptable to move a page that way to make a point. Do it again and you may find yourself blocked until you agree not to do it. We want civil discussion, not grandstanding. ++Lar: t/c 11:15, 7 May 2010 (UTC)


Sadistic or masochistic abuse

This needs to be better defined. As it stands, it includes the removal of commonplace demonstrations of BDSM activities, such as those in Category:Folsom Street Fair - removals which are in fact what we're trying to avoid. Bastique ☎ appelez-moi! 20:43, 6 May 2010 (UTC)

Most of those files you mentioned do not show explicit sexual conduct and are thus not subject to deletion as per Commons:Sexual content. --UAltmann (talk) 22:03, 6 May 2010 (UTC)
The wording that is currently at Commons:Sexual content does not require "explicit sexual conduct", but merely depiction of "actual or simulated acts". Black Falcon (talk) 22:16, 6 May 2010 (UTC)
You're correct, of course, though most images in the cat. given don't feature actual or simulated acts either :-) Privatemusings (talk) 22:18, 6 May 2010 (UTC)
Good thought; I re-edited the section a little. I am curious about the reactions. --UAltmann (talk) 22:25, 6 May 2010 (UTC)
I think there is a difference between BDSM in general and simulated or actual abuse.... TheDJ (talk) 23:38, 6 May 2010 (UTC)
Certainly true, but I'm not sure I know where the line is. For example, if a BDSM outfit is uncomfortable, is that abuse? How about ropes? Uncomfortably tight ropes? Hanging from tight ropes? Etc. I can think of things that probably are abuse and things that almost certainly aren't, but I don't know where the dividing line is. Dragons flight (talk) 23:49, 6 May 2010 (UTC)
I'm unsure about this as well. Clearly the intent of the foundation directive is "think of the children", and as such I think that excessive collections of BDSM are partly deletable under the 2nd part of the proposal: "repetitive of other content accepted for educational value can be deleted." However, where the line of 'abuse' is in regard to BDSM is very much unclear to me as well. TheDJ (talk) 01:03, 7 May 2010 (UTC)
Looking at Delinker for likely files, I see that File:Spanking_on_Bondage_Furniture.png - A not particularly graphic image, with fully-clothed participants - was deleted and delinked, despite being in use for educational purposes.
So, this is going exactly as I thought: Encyclopedic content is being destroyed for reasons of prudery, and because of bad advice. For god's sake, when images of fully clothed people which are in use to illustrate articles related to BDSM are getting deleted, something has gone horribly wrong.
Also, I really, really object to being forced to scan through pornography just to find out if this idiotic policy is having the effect anyone with two brain cells could've predicted. Adam Cuerden (talk) 01:28, 7 May 2010 (UTC)
COM:UNDEL and further complaints to be filed with Jimbo. TheDJ (talk) 01:48, 7 May 2010 (UTC)
You have a very peculiar definition of fully clothed. I don't think most people consider a woman with an exposed labium to fall under that description. Undomelin (talk) 02:41, 11 May 2010 (UTC)

Question, and I promise this is not rhetorical: does this proposed policy mean losing most of the images in Category:Abu Ghraib prisoner abuse? None of the exceptions spelled out so far would allow these images, and they certainly are examples of sadism, and non-consensual sadism at that, from the sadist's point of view. Perhaps there is simply another exception needed, something related to historical importance or newsworthiness? - Jmabel ! talk 04:51, 7 May 2010 (UTC)

I'd add 'sexual' to the description; 'Sadistic or masochistic sexual material - or....' - that clarifies why it wouldn't apply to prisoner abuse, no? Privatemusings (talk) 04:57, 7 May 2010 (UTC)
No. Why is a photograph of a masturbating prisoner not sexual material? Btw, the Abu Ghraib pictures will be deleted in any case, sooner or later, as copyvios; maybe it's a poor example. Trycatch (talk) 05:13, 7 May 2010 (UTC)
The masturbating prisoner is a rather horrific image which would indeed qualify under my proposed criteria. Most of the images in the cat wouldn't - I don't know anything about the copyvio aspect, I'm afraid. Privatemusings (talk) 05:33, 7 May 2010 (UTC)
This is a widely known picture, and it would be a serious loss if it were removed. Ok, what about this picture? Trycatch (talk) 05:45, 7 May 2010 (UTC)
I'd certainly say there should be room for reasonable discussion - I suppose 'notability' (as in a cultural / historic context for the image already established outside of this project, that type of thing) would be a reasonable basis on which to allow media inclusion despite meeting the criteria for deletion. On that basis, it's likely that latter (again rather horrific) image would also be ok - as would things like the 'virgin killer' album cover (I know this isn't on this project, but you take the point :-) - or that famous photo of Brooke Shields. I'd prefer all media to have the facility for ICRA ratings too, mind... thoughts? Privatemusings (talk) 06:05, 7 May 2010 (UTC)
And how do you determine what level things meet? Oh, and as to your idea for ICRA ratings, absolutely not. It is incredibly culture-centric and not NPOV to let the ICRA, which is essentially a US and UK organization, have any say about general content. JoshuaZ (talk) 15:37, 7 May 2010 (UTC)

I find it depressing, although unsurprising, that most if not everyone contributing to this portion of the conversation has no idea how BDSM works. Example: "If a BDSM outfit is uncomfortable, is that abuse? How about ropes? Uncomfortably tight ropes? Hanging from tight ropes?". Answer: none of them. It's not abuse when the people in question have willingly consented to take part in those activities. Oh ho ho, I hear you cry, how do we know they consented? We don't. How do we know any subject of any photograph has consented? We don't. There is no difference. Roux (talk) 06:21, 7 May 2010 (UTC)

I suspect the statute intends "abuse" in an objective sense such as "the infliction of physical injury or harm" rather than a subjective definition such as "wrongful or improper treatment". What the parties themselves consider to be "abuse" is probably less important than what type of content the law's authors intended to cover. I don't think consent matters, just the type of content being photographed. Dragons flight (talk) 06:54, 7 May 2010 (UTC)
Isn't that glorious? This kind of POV definition would allow you to delete at least 80% of the images in question. It might be a brilliant idea to read en:BDSM before jumping to easy solutions. It's obvious that consent is one of the critical factors to be considered here. Nemissimo (talk) 14:26, 7 May 2010 (UTC)
In the commentary to CFR 2257 [3], the Justice Department explicitly addresses this question (albeit very briefly). They state that "actual sexually explicit conduct depends on the content of what is being displayed, not on whether the content is subjectively considered to be abusive", and reject the view that "safe and consensual bondage is not abuse". As before, the Government does not appear to consider consent to be a factor in determining whether section 2257 applies to BDSM images. I agree that this appears to target a large swath of BDSM images, and I personally agree that this is bad, but if we are going to use these laws as our starting point then that appears to be their effect. Sadly, I did not get to write the laws. Dragons flight (talk) 17:05, 7 May 2010 (UTC)
This entire category is deeply disturbing. Aside from the fact that it seems to be hideously ill-defined and has all the problems laid out by Roux above, it needlessly focuses on a specific set of forms of sexualized activity. Thus, for example, other fetishistic imagery would not be covered (other common fetishes include, for example, fetishes for specific clothing types such as leather, latex or rubber, which might frequently involve clothed or fully clothed individuals). I see no good reason to single out a specific category of sexualized imagery like this. JoshuaZ (talk) 15:34, 7 May 2010 (UTC)

The headline Sadistic or masochistic abuse shows that someone who clearly lacks even a basic knowledge of facts related to en:sadomasochism has mixed up sadism, masochism, sadomasochism and BDSM based on his personal POV. The current rules are perfect for getting rid of most of the related content. What a POV mess and a real shame to the intellectual standards of our project. It might be wise to get advice from en:Wikipedia:WikiProject Sexology and sexuality. Nemissimo (talk) 16:16, 7 May 2010 (UTC)

That headline is quoted straight from the US definition of pornography; it does not define BDSM, it defines what should be considered pornography. You may not like the definition, but it is the one that was quoted by 2257, the starting point of this discussion. TheDJ (talk) 16:34, 7 May 2010 (UTC)
There's no good reason to slavishly adhere to 2257, since none of this is being triggered by anything in that law. We're not subject to it, and the fact that the law might not be well thought out doesn't mean we need to follow it (although I'm inclined to argue that, given what 2257 is actually designed to do, the law's singling out of BDSM makes some sense. That doesn't mean it makes any sense for us at all). JoshuaZ (talk) 17:07, 7 May 2010 (UTC)
Somewhere in the various discussions Jimbo said his intent was to exclude everything that could trigger record keeping requirements under 2257, whether or not those requirements actually apply to Wikimedia specifically. If you want to argue the point with him, feel free, but I think most of us are working from the position that this is our baseline (whether or not we actually agree with it). Dragons flight (talk) 17:16, 7 May 2010 (UTC)
So he's just ignoring Godwin on this. Great... why listen to experts when you've made up your mind? (Especially fascinating, given Jimbo's pre-WMF internet career). Why should we work from the position that this is the baseline when that baseline is fundamentally wrong, anti-education, neo-puritanical Americentrism? Roux (talk) 17:24, 7 May 2010 (UTC)
Simply put, that's a really bad standard. If he's going to push this sort of policy he might as well try to construct a reasonable form. Even with Jim pushing for this, and even if the WMF backs him on this in a general way, there's no good argument that it has to go exactly as he planned. 2257 isn't made for our purpose, and using it this way is going to produce inconsistent and unproductive results (even before we get to the fact that 2257 is legal code that has few active precedents for what it means). It is one thing to use 2257 as a very rough starting point. But that's a very different situation. JoshuaZ (talk) 17:34, 7 May 2010 (UTC)
As I see it, that is a very rough starting point, and editors here should feel free to improve on it. I'd much prefer to see people start with the areas where there is clear consensus.
In general I agree with you that 2257 is less than ideal, but short of regicide, I'm not sure one would be able to effectively push back at this point. If you've visited User talk:Jimbo Wales lately, you'll see he is being very aggressive about this general effort. Dragons flight (talk) 17:44, 7 May 2010 (UTC)
Eh, he still needs to work with the community. Even if he wants 2257 to be the starting point, us peons are still going to need to construct some sort of workable solution. If that requires removing one section of what would be covered under 2257, so be it. JoshuaZ (talk) 18:04, 7 May 2010 (UTC)

Child-friendly

Commons:Project scope, which is policy, states that "a lawfully-hosted file, which falls within Commons' definitions of scope, will not be deleted solely on the grounds that it may not be 'child-friendly' or that it may cause offense to you or others, for moral, personal, religious, social, or other reasons." I am still trying to understand how this new proposed policy is supposed to square with that. In particular, is there an intent to change that portion of Commons:Project scope? Because quite a few people who have been weighing in have suggested, or even presumed, that only 'child-friendly' content should be on Commons. Is that part of the proposed change in policy or not? - Jmabel ! talk 05:40, 7 May 2010 (UTC)

I don't read that here. This proposal has two parts - the first is simply a reiteration of existing policy, slightly emphasizing cracking down on low-res / redundant / non-educational images that are also sexual in nature, and encouraging people to make existing images more educational by adding labels and better descriptions. The second, marked as controversial, is a new stricture on media that would trigger USC 2257. Both of these changes might make Commons more 'child-friendly', but they do not address a great deal of material on Commons that most children's librarians would exclude from their reading rooms. SJ+ 08:30, 7 May 2010 (UTC)
you mean you refuse to think of the children, sj? ;-) Is it fair to read from your comments that you personally don't believe commons is, or should be, 'child friendly' - using a reasonable interpretation of that phrase of your choosing... Privatemusings (talk) 07:07, 7 May 2010 (UTC)
We do need a separate reference project for children, with its own content standards. (The most significant would likely not be related to media, but would be different language-level and vocabulary than one finds on Wikipedia!) But Commons must serve both that sort of project and all other media needs across Wikimedia. Like any large public library, it is guaranteed to have material that some parents or teachers would consider unfriendly.
We should explain this frankly to our [re]users and describe what knowledge is available. And we should help reusers who need custom snapshots (for themselves, their children, or their [school]) to share their practices with one another. (In my experience, current categories suffice for this. There are already many groups that create school-friendly snapshots of Wikipedia, a fine place for any school to start.) SJ+ 08:30, 7 May 2010 (UTC)
hear hear. I'll press you for your views on a rating system which would enable commons to serve both child-friendly and less-child-friendly projects at a later date - I would point out at this stage, though, that your approach here mightn't tally exactly with what jimbo described as commons' new policy. I see someone has already re-named it a 'proposal'. Just like a British election, we don't know where we stand! ;-) cheers, Privatemusings (talk) 09:30, 7 May 2010 (UTC)
should just note that you clarified your statement a bit after I 'hear hear'd you :-) - what experience leads you to feel that current categories suffice for facilitating schools managing content access appropriately? - this doesn't tally with my experience really, nor the anecdotal issues I've discussed with others. Perhaps it's possible that the status quo prior to jimbo's pronouncements is actually ok? - Is this the view you're tending towards? Privatemusings (talk) 10:02, 7 May 2010 (UTC)
bah! I need to learn to read! (and multi-task - I'm glued to the UK election special, as I have been throughout!) - my apologies for misreading / failing to read what you'd written - you were clearly saying that the current categories suffice for the creation of a snapshot. What do you think about a system which would allow ratings on the 'live' site, so schools could interact with that? cheers, Privatemusings (talk) 10:11, 7 May 2010 (UTC)
I think this probably is the crux of the matter in many ways. I dunno what the answer is, but I'll reiterate my support for a system which allows downstream users like schools to easily make commons 'child friendly', my support for a policy which restricts underage editors and administrators from accessing or working with what could reasonably be termed pornography, and a rating system which allows commons to host all the freely licensed, legal content it can! Privatemusings (talk) 05:57, 7 May 2010 (UTC)
What's the definition of child-friendly? I support a system that makes it harder for children to access data that isn't child-friendly, but in order to do that we need to define what child-friendly means. Does it mean banning access to articles about genitalia? What about sexual education (in many countries taught from 11 years of age)? What about violence? I have never understood the double standard that beating the shit out of other people is kind of ok, but showing a female nipple is horrible. Chances are, more children will eventually have sex than murder another human being. Also, all children saw a female nipple every day during their first year of life. I hope we will never see the day when the American Christian right's definition of child-friendly is the standard on Wikipedia (even if it's just an optional setting). Wikipedia should not contain porn, but porn is not synonymous with educational images of people having sex. Better to have youths be taught about real sex here than to have their first experience of it on porn sites. There's a reason men think their penises are too small and women think their breasts are... /grillo (talk) 11:38, 8 May 2010 (UTC)
SJ wrote: "We do need a separate reference project for children, with its own content standards. (The most significant would likely not be related to media, but would be different language-level and vocabulary than one finds on Wikipedia!)". I agree with that. Such projects actually exist in Dutch (w:nl:WikiKids and Wikiweet), French and Spanish (w:Vikidia), and they work quite well. See also the page m:Wikikids. Astirmays (talk) 17:36, 8 May 2010 (UTC)
Thank you for pointing this out, Astirmays! I agree with Privatemusings that a Wikipedia for children would be useful, but think ratings would be counterproductive. If I understand m:Wikikids correctly, it's intended as a fork rather than a rated subset within Wikipedia, which is exactly what I think would work. Lusanaherandraton (talk) 08:55, 11 May 2010 (UTC)
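On the "current categories suffice for a snapshot" point raised above: mechanically, a downstream reuser such as a school only needs category membership data to build a filtered copy. A toy sketch of that idea follows — the file records and the excluded category names are invented for illustration, and a real tool would read them from a Commons database dump or the API rather than a hard-coded list:

```python
# A reuser-chosen exclusion list; the category names are invented examples,
# not an endorsed "child-friendly" standard.
EXCLUDED_CATEGORIES = {"Sex", "Nudity"}

def snapshot(files, excluded=EXCLUDED_CATEGORIES):
    """Keep only files whose categories don't intersect the exclusion list."""
    return [f for f in files if not set(f["categories"]) & set(excluded)]

files = [
    {"title": "Sunflower.jpg", "categories": ["Flowers"]},
    {"title": "Example_anatomy.jpg", "categories": ["Nudity", "Human body"]},
]
```

Whatever one thinks of on-site ratings, the filtering in this model happens downstream, in the snapshot, not on the live site.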

Please slow down and discuss

Delete first, discuss later?

While I am glad that something is done about content on Commons that serves no educational purpose (though I am a little surprised at the sense of urgency and suddenness that seems to characterize the initial discussion at User talk:Jimbo Wales), I do have a question.

From the beginning, one of the key points in this new policy was that "things should just be speedy deleted and argued about later" (Jimbo Wales, 01:52, 6 May 2010 (UTC)). I suppose this works for Commons sysops, who are able to view and assess the usefulness of deleted files, but how exactly are editors of other Wikimedia projects supposed to argue about or discuss images which they can no longer view and for which no record of discussion exists?

I have no desire (nor, really, the ability) to stand in the way of this change, but I ask that the manner of its implementation be considered a little more and be a little more considerate of the limited technical abilities of editors of other WMF projects. Thank you, Black Falcon (talk) 17:34, 6 May 2010 (UTC)

I had the exact same question. Indeed, it's impossible for non-admins to argue about a file they can't see! --TwoWings * to talk or not to talk... 17:37, 6 May 2010 (UTC)
Consensus doesn't matter when Jimbo (just another user unless he's speaking on behalf of the board, which he isn't) has decreed it shall be thus. And in this case, 'thus' means 'bulldozing all other cultural mores in favour of narrow American conservative viewpoints.' Bad form for a global and educational project. Roux (talk) 17:39, 6 May 2010 (UTC)
The inability of the wider community to assess these speedy deletions is also compounded by the nature of these images. Even those admins that can access these images might not be in a position to do so due to concerns about being seen viewing some of them. I wouldn't want to look at these images at work, so it just makes it even less likely that proper scrutiny of the speedy deletions will be possible. Adambro (talk) 18:00, 6 May 2010 (UTC)
Whether or not Jimbo's concept becomes the final policy, and whether or not the "speedy-first/undelete-later" policy becomes our standard for sexual content, I think we should not forget that it is not our policy just yet. Don't start mass-deleting before we've got it all figured out; there's no rush! As Jimbo said himself: Relax!
I'm mostly neutral in this matter; I'm not very active in this part of Commons, but for the part that I have been, I didn't actually notice any problem with the way it was: files are nominated regularly and, after a short discussion, deleted as appropriate.

For one I'd like to refer to our Speedy-policy, and two I support Deletion requests per image or per set, but certainly not to leave it up to the admins to decide wether an image should be deleted. I've seen quite a few files come by in logs with deletions on Commons and watchlists entries with unlinkings on wiki's – because some admins are already acting on this draft.
About that COM:SPEEDY: Last I checked this page: unless a file is corrupted, previously deleted, fair use or pure vandalism/spamming it should not be speedy deleted. Out of scope has been a reason for a deletion nomination, not speedy deletion. If you, as an admin, believe about a dozen files all together are very alike and should be deleted because of sexual content that is out of our scope, just nominate them in a one bigger nomination; wait it off for a seven days and delete it then as appropiate. If you feel the need to do this quick, then it might be that angel from inside telling others will oppose it's deletion. Althemore reason to nominate. My 2 cents, –Krinkletalk 18:56, 6 May 2010 (UTC)[reply]
FYI: Since Jimbo's announcement 17 hours ago, approximately 300 files have been speedy deleted citing Commons:Project scope and/or Commons:What Commons is not. Black Falcon (talk) 19:15, 6 May 2010 (UTC)
  • Just my 2 cents. As far as I know, speedy deletion is for the following things only: copyrighted or fair-use material, badly named files (uploader request), vandalism/attacks, corrupted files, and previously deleted material. Anything else, anything, should have a deletion request opened and be discussed by the community. If the image is truly objectionable, then the community decision will be "delete". If the community finds it acceptable, it will be "kept". This is the only way files that are not in the aforementioned speedy categories can be deleted. They cannot be deleted because an admin considers them "lascivious", etc. If they are illegal to view or host, or they are not freely licensed, then of course speedy deletion is required. Otherwise, a DR needs to be opened, and the "lasciviousness" must be discussed with the community. Then it can be deleted.
  • Making a special case for certain kinds of images is against this – who is to judge the merits of the image? The one speedying admin? We would not tolerate a radical homophobe deleting images of a gay pride march claiming that they were offensive (they can of course open a DR and get shot down, though), so why should we tolerate admins speedying things they deem lascivious?
  • Furthermore, Jimbo Wales's comment that "Remember, there is no hurry to undelete things - nothing is permanently lost - and things mistakenly deleted in the cleanup can be undeleted in the fullness of time after a calm discussion." is totally out of order. The DR process exists so that people can view the image (any people) and decide. If the image has been speedied, only admins can see it, and no meaningful discussion can be had by the wider community.
  • To be clear, my objection is not to the deletion of low-quality, pointless or offensive content; my objection is that admins are speedying things based on a half-baked comment from Jimbo Wales, without discussion. Cleaning up the dreck on Commons is a huge task, but this is not how it is done, and I am shocked at how the founder condones such irresponsible actions by admins. Inductiveload (talk) 21:34, 6 May 2010 (UTC)
Well unfortunately, Jimbo seems determined to have things done in a rush: "admins are requested and supported in efforts to be vigorous and get this done in a timely fashion" [4]. –Tryphon 21:18, 6 May 2010 (UTC)[reply]
I've been tempted to begin deletion reviews for as many pictures as I can to protect them from being speedied, but that would be making a point. KTo288 (talk)
Thank you for not doing that KTo288 - but do feel free to work on that as a project if you like, in the fullness of time. I think at this point, it is premature to do a lot of deletion review, unless something was clearly done in error.
I am hopeful to see a lot more speedy deletion today and tomorrow. This is a cleanup project.--Jimbo Wales (talk) 12:18, 7 May 2010 (UTC)[reply]
Fine. You clean up then. Kameraad Pjotr 12:32, 7 May 2010 (UTC)[reply]
It seems that something was clearly done in error. --187.40.235.122 12:55, 9 May 2010 (UTC)[reply]

Slow down!

Having followed the discussions on this page and at Commons:Village pump, I am of the opinion that the majority of editors who have voiced objections are not opposed to the deletion of "amateur porn cruft", but rather to one or more of the following:
  1. The imposition of a new policy by fiat, rather than consensus-building;
  2. The specific limits of unacceptable sexual content defined by Jimbo; and/or
  3. The deletion of sexual content (and non-sexual content depicting nudity) without discussion and citing a policy which still lacks clearly-defined criteria.
I believe that it will be possible to implement Jimbo's decision quickly and to avoid needless drama and ill will within the community by simply crafting a delayed-action speedy deletion template and category (and perhaps a criterion) for images depicting actual or simulated sexual acts, such as {...}, which are low-quality, unused, redundant to higher-quality images, and/or have little or no educational value. An image is tagged and deleted in, say, 2–7 days if no one contests the reason for deletion; if someone contests the reason for deletion, then the image can be deleted via a DR.
If editors take the time to consider the current and potential value and usefulness of individual images instead of mass-tagging any image that depicts nudity or sex, then I think that this system would remove most of the cruft in a short period of time and with a minimum of drama. Black Falcon (talk) 02:40, 7 May 2010 (UTC)[reply]
What he said. Exactly my view! This solution would be an extremely effective one in my opinion. I am all for removing cruft, and totally against the arbitrary introduction of new policy and silent deletions, even by the founder of the WMF. This way will alert people to what is being nuked and give them a chance to protest. Although, I think a regular DR would suffice too. Just add "salacious pornography" (or maybe a more rigorous definition, to avoid arguments) to the list of DR reasons, and if the image is deemed to be so, it will go after the DR closes. If the community judges it acceptable, it will stay. Job's a good 'un, and in concordance with normal process! Inductiveload (talk) 07:23, 7 May 2010 (UTC)
I agree. I don't object to the deletion of bad-quality images - of any sort - but I object to things being deleted as out of scope by speedy deletion, and to things being imposed on us from on high. -mattbuck (Talk) 07:29, 7 May 2010 (UTC)
What They Said. How this was done isn't helping the situation. If Jimbo wants to change policy, he needs a statement from the board BEFORE changing policies, not after. Changing policy, urging instant implementation, and justifying with "cause I said so" is like pouring gasoline on fire-- no matter HOW acceptable the underlying principle. --Alecmconroy (talk) 15:30, 7 May 2010 (UTC)[reply]

Not a bad idea; I agree there's no call for silent deletion. Has something like proposed deletion been tried before on Commons? SJ+ 08:52, 7 May 2010 (UTC)[reply]

I agree: The only problem I have with this is the section that requires ALL photographs of sexual activity to ALWAYS be deleted - the controversial section. Everything else is fine. Adam Cuerden (talk) 09:19, 7 May 2010 (UTC)[reply]

I think Sj's proposal to implement/apply Proposed Deletion is sound. It allows rapid cleanup with minimal collateral damage. --Kim Bruning (talk) 12:32, 7 May 2010 (UTC)[reply]


Foundation Announcement

"We do immediately remove material that is illegal under U.S. law, but we do not remove material purely on the grounds that it may offend."
We don't intend to create new policy

The way I read this announcement, there is no new policy from the foundation at this time, and all of the "out of scope" deletions look like they were done without consent of the community OR the board.

So, uhh, would it be too much to hope that this was all just a big misunderstanding, wires got crossed, and we can now handle deletion discussions per existing policy? --Alecmconroy (talk) 20:07, 7 May 2010 (UTC)

Yes, per this statement I think especially deletion of images being used on other Wikimedia projects needs to be halted and reversed. There's no justification for speedy deletions like this in anything the Board has said. JoshuaZ (talk) 20:09, 7 May 2010 (UTC)[reply]
I agree. Please take your time and discuss deletions that affect any images in wide use on other projects. Those are clearly educational, and not the central target of this proposed change. As I said above, I see two pieces to this proposed policy change; and the controversial piece (using usc 2257 as a guideline) should be discussed carefully -- if implemented, that might mean replacing some widely-used images with others. But there's no need to leap into a major change overnight, and a number of people (Black Falcon and Jmabel) have made other sensible proposals that bear discussion. SJ+ 09:10, 8 May 2010 (UTC)[reply]


It is time to undelete all the files now, there is no consensus and no justification for speedy deletions and administrators should follow the consensus here, and not Jimbo. --Ankara (talk) 20:11, 7 May 2010 (UTC)[reply]
I'll request further clarification on foundation-l. --Kim Bruning (talk) 22:54, 7 May 2010 (UTC)[reply]




Suggested improvements

Over categorization

For the nudity and other sexual content images that are kept on Commons, we need to establish a sane way to categorize the content that does not unexpectedly bring a reader to it. Currently, if someone clicks a link to Commons on a sister project in an article about a non-sexual topic, they might be brought to a category with sexual content in it. Also, searches on non-sexual topics return images of sexual content. Jimbo links to one of the recent examples that I found. If you look through my contributions you will find other examples.

I support the wording of the draft that Jimbo wrote. "In the past we have had a problem with images being placed into inappropriate categories, so that viewers were exposed in unexpected ways to sexual content. Image categorization should be done in such a way that readers are not exposed unexpectedly to content that may be offensive. (Editing note from Jimbo so you have an idea what I'm driving at here... this is an example - this image used to be in Category:People eating)" FloNight♥♥♥ 20:29, 6 May 2010 (UTC)

Another example: there are a lot of erotic/porn images in Category:Second Life. The problem is that this category is included by {{Second Life}}. --Leyo 23:32, 6 May 2010 (UTC)
The Second Life template is a sourcing template, and thus should categorize into a separate hidden sourcing category, as all our sourcing/copyright-status categories are hidden. That seems the easiest way to avoid the problem. TheDJ (talk) 00:58, 7 May 2010 (UTC)


Content tagging

I guess policy reform is a bit like my embarrassing early sexual history.... you wait ages and ages, then it's all a bit of a rush ;-) - for what it's worth there are still some blowjob pics, and assorted other sexual images as seen here - when this has all calmed down a bit, I'd like to encourage the development of the technology to support the ICRA rating system - or equivalent. I'd then likely support the addition / undeletion of media featuring explicit content - a la flickr for example. Dunno where this places me in the spectrum of things - probably in the middle? cheers, Privatemusings (talk) 22:00, 6 May 2010 (UTC)[reply]

I think more than a few people have the feeling this particular policy change was a bit on the non-consensusal side... FYI: see bugzilla:982. (Thanks to TheDJ for providing the link.) Black Falcon (talk) 22:14, 6 May 2010 (UTC)[reply]
Agree with Privatemusings. I'd like explicit content to be on Commons and in Wikipedia, but only once a content rating and filtering system is in place. (Let's hope it won't take as long as flagged revisions on en:WP.) --JN466 22:30, 6 May 2010 (UTC)[reply]
We no longer need tagging now, do we? I don't see why anyone would invest the time to develop a software feature that would be used by just a few images. It's not worth the effort. Though if anyone does, I welcome the deployment of such software. TheDJ (talk) 22:41, 6 May 2010 (UTC)
In principle ICRA is wide ranging, covering sex, drugs, bad language, violence, etc. in both images and text. It could certainly find uses, though I don't know if those uses provide enough incentive to drive development of it. Dragons flight (talk) 23:33, 6 May 2010 (UTC)[reply]
I don't know much about ICRA, but in general, such content tagging schemes are not public domain — if a porn site rates itself 100% family-friendly, they reserve some legal method of acting against this. If so, such tags are not appropriate for Wikimedia Commons because they are not free content. Even if not, the tags aren't really appropriate for Wikimedia Commons for the same reason that the amateur porn isn't - because they don't serve an educational purpose. You could say (at best) that they could increase the reach of this site toward children, but only if they are all but universally and correctly applied. Just the vandals alone would make sure this didn't happen. Wnt (talk) 00:05, 9 May 2010 (UTC)[reply]

I believe that once things here calm down a bit, we should get back to the topic of content tagging. I'm not looking to specifically use ICRA, but I think that it would be useful to have a certain hierarchy of tags that would more easily allow users or mirror sites to set preferences that would allow them to filter Commons content in a way that they consider appropriate for their workplace or school. I would not have a comparable concern about text, but the thing with images is that someone simply walking by someone else's computer is subjected to seeing the image, and there are certainly images that are not appropriate in certain settings. Sexual images are certainly not the only ones for which this arises; for example, I can easily see a valid reason why an elementary school would wish to be able to filter out images of dead bodies. - Jmabel ! talk 00:52, 10 May 2010 (UTC)[reply]

Descriptions

Something that hasn't yet been touched upon: in many cases, the way an image is perceived is heavily influenced by the description. One look at File:Leda and the Swan 1510-1515.jpg - an image I hope no one is going to suggest we need to lose - indicates how much of this is context. One could easily give such a picture a lascivious description (and, I suspect, be entirely true to the artist's intent in doing so). Conversely, I suspect that there are some images that are liable to be deleted in the current purge that really only need their description reworded. - Jmabel ! talk 00:38, 7 May 2010 (UTC)[reply]

It is a notable piece of art and thus allowed per the suggested exceptions, isn't it? TheDJ (talk) 00:56, 7 May 2010 (UTC)
Absolutely. That's part of why I picked it as an example: it is unlikely to be deleted. But if someone had described it as "young woman after sex with bird, in front of naked kiddies" (and I'm hardly going to the extreme of how it could be described in porn-speak) it would present a rather different tone than it does with the current description. What I'm suggesting is that, conversely, there are probably images that are likely to be deleted in the current purge not because of what the image shows, but because someone with more of that attitude uploaded it and wrote the description. I'm suggesting that people who are trying to clean things up keep their eyes out for images that merely need a more appropriate description, rather than deletion. - Jmabel ! talk 04:37, 7 May 2010 (UTC)[reply]
you're right of course, but I don't really think this is a problem - I've had a look to see if I can find such an image, but it remains in the mind's eye as far as I can tell.... Privatemusings (talk) 04:46, 7 May 2010 (UTC)[reply]
The trouble is that it potentially clashes with two rather different sorts of law. The presence of the swan is a minor issue--you have to know the legend to know what that signifies--so it comes down to a depiction of a naked person, combined with the depiction of naked children. That raises the spectre of child porn, which is less strictly defined and often makes mere possession an offence. I'm sure people could rant at length on the stupidity of such laws; they've often been extended in ways which don't make sense; but it maybe makes this a less useful example. And Botticelli seems to have been careful. Still, a naked woman coming out of the sea at a Mediterranean beach...88.110.3.50 10:41, 10 May 2010 (UTC)

Policy guidelines

Whatever we come up with for policy here, we are going to need guidelines giving examples of acceptable and unacceptable content. (Presumably, the acceptable can be illustrated by images on Commons and the unacceptable would only be verbally described or by things like "if this painting were a photo, it would be a problem.") Otherwise, this situation is going to get a lot uglier than it currently is, and some of what has been going back and forth between some contributors is already getting pretty ugly. - Jmabel ! talk 00:51, 7 May 2010 (UTC)[reply]


Different accounts for minors?

Neutral description of sexuality, including images, is important for (minor-aged) users; pornography is not. So I ask: why not extend the functions of the software? It would be helpful to declare accounts as "minor-aged" (e.g. school access) and to flag images as "sexual content" (with three or four levels, like movie ratings). That would enable users, or the parents of young users, to switch off such content by creating a minor-age account, and would allow us to keep images which are fine for youths but not for children. Antonsusi (talk) 12:48, 7 May 2010 (UTC)

I am sympathetic to ideas like this. I think the software developers should look into what can be done.--Jimbo Wales (talk) 13:07, 7 May 2010 (UTC)[reply]
While I would support a safe-search option, I see no reason it should require separate accounts. If you have an account, the flag persists; if you don't, it defaults to on, but you can flip it off.
The majority of readers do not have a user account. So, are you suggesting we differentiate between under-aged readers and adults? --Leyo 13:42, 7 May 2010 (UTC)
Indeed, though I think Antonsusi's concept sounds good and seems the obvious choice. How to handle IP addresses? It would need support for IP ranges, of course. But more importantly, I think a better solution would be not to set it in the account or IP (privacy-related as well) but to do it like beer and adult-content websites: simply prevent viewing of such images without a confirmation. That would not only help adults in schools otherwise unable to view it, it would also make it harder for 18+ users at work to accidentally view something (because there's a short confirmation message asking to either filter or view that content). That setting could be saved client-side in a cookie (like sitenotices), with the difference that there is a button to re-hide as well. –Krinkletalk 14:32, 7 May 2010 (UTC)
The majority of users do not have an account? So disable sexual content for logged-out users; if they want sexual content, they'll create an account...? Esby (talk) 15:38, 7 May 2010 (UTC)
That goes against the very principle of COM:NOTCENSORED. And who decides what's sexual content? Is nudity sexual? Powers (talk) 15:43, 7 May 2010 (UTC)[reply]
So deleting them is better than hiding them, in the end? Flickr has been doing that for a few years now... Obviously some threshold has to be decided; from the instant an image can be considered problematic, it should be hidden, if we want to apply the law to the letter... Esby (talk) 15:47, 7 May 2010 (UTC)
  • With a content rating system, we could implement something like google's safe search as default, with a notice that a complete search including adult content requires registering an adult account. Schools and libraries could block registration or log-in of adult accounts. --JN466 19:07, 7 May 2010 (UTC)[reply]
That's my vote: creating different levels of access. Accounts for children, accounts for youths and accounts for adults. The other part of such a system is a flag on images to control their visibility to these accounts. Because of different laws in different states around the world, such a system should be separate for each project. Antonsusi (talk) 23:06, 7 May 2010 (UTC)
I think this is a terrible idea. Wikipedia has thrived for a decade giving everyone who visits the site the full story. Now you're saying you want only those who sign up as adults to have real access? The restriction is absolutely pointless if kids can just lie about their age, so I suppose you want some fancy ID card system for all the editors? Are you thinking fingerprints or a credit card account?
And if kids can't see the whole article, I suppose that means they can't edit it either. I mean, how else can they even tell what their edits look like? And of course, I suppose that means IPs can't edit either, right?
Also, once you identify kids as kids, who is going to take them seriously? Why should they be allowed to see, say, AN/I discussions, or BLP debates, or information about explosives, or articles about racist ideas? You'll never run out of things that you feel obligated to wall them off from.
If you go this way, the only way to properly participate in the encyclopedia will be by logging in for a new account. With some other organization. Wnt (talk) 00:16, 9 May 2010 (UTC)[reply]
Sounds bad to me too, Wnt. First, it is censorship. Second I don't see how it would be at all enforceable. Third, just because a project is language-specific doesn't make it jurisdiction-specific: e.g. German is the official language of more countries than just Germany (some not even in the EU), to say nothing of speakers of German in the US and other countries. I might add in response to Jayen466 that Google does not require signing up for an account to see unfiltered searches! Lusanaherandraton (talk) 09:28, 11 May 2010 (UTC)[reply]
At some point, this issue might come up for a lot of stuff. Arguably, it's a general Internet problem. If some school wants to access data on the Internet, can the data flow be controlled to safeguard the children? It's something that's been argued about for as long as I've been on the Internet, and the only answers that seem to stand any chance of working seem to depend on a combination of user accounts and task-specific software. At least with physical books, there used to be child-safe encyclopedias. (And when I was at school, there were topics on which I knew more than some of the teachers: a general Wikipedia problem there, spotting the ignorant edit.)88.110.3.50 11:00, 10 May 2010 (UTC)[reply]

Monitor brand new accounts uploading explicit stuff

While we're at it.. I always found it pretty awful that we allow brand new accounts to upload porn and sexually explicit images with next to no information provided (apart from "Own work" or something like that). For instance, practically every picture at Category:Penises by size, male human: micropenis fits that description, and I don't believe for a second that those really are freely usable images uploaded with the consent of the model. Shouldn't we have higher requirements for such images, especially when they come from newbies who never uploaded anything else? --Conti| 13:33, 7 May 2010 (UTC)[reply]

Is anyone surprised that panicky actions like this draw trolls like flies to a corpse? /grillo (talk) 17:34, 7 May 2010 (UTC)
You mean trolls who upload random explicit stuff? Nah, never gonna happen. --Conti| 17:49, 7 May 2010 (UTC)[reply]


A separate sister project on sexuality?

It seems like most of the wording is targeted toward Wikipedia. But Wikimedia Commons hosts multimedia for all of the sister projects as well (Wikibooks, Wikisource, et al.), which also makes me think that it might even make sense for the community to create an entire sister project focused on Human Sexuality, where detailed, in-depth, educational discussions of such issues could take place. I personally don't have an interest in creating such a project, but I bet there are plenty of people out there (as well as within the community) who would jump at the opportunity. --Joshuagay (talk) 20:47, 7 May 2010 (UTC)

So you're basically saying that Wikipedia shouldn't cover sexuality in the same fashion as for example geography? I think any modern sexologist would argue with that... The best way towards a healthy sex life is not to make it a big deal... That includes allowing Wikipedia articles and images about it. /grillo (talk) 20:50, 7 May 2010 (UTC)[reply]
My suggestion was that, in addition to whatever articles are created on Wikipedia, and whatever policies are created on Wikipedia, it might be worth having a sister project on the topic of sexuality. Consider, for example, the fact that there are thousands of articles on Wikipedia about various species. In addition, there is also a sister project called Wikispecies. --Joshuagay (talk) 21:34, 7 May 2010 (UTC)
As I look at any particular image, I ask myself: would this have a place in a brilliantly-curated, comprehensive Encyclopedia of Sex and Sexuality? A lot of them are simply too poor, non-representative, or repetitive to be valuable for anything. But many certainly would be - and those clearly deserve to be kept. If we discover that there are social and community-health (or worse yet, legal?) reasons not to have all of this media in the same place, or to create new specialized projects on Sex (or violence, for that matter -- we have a similar but much smaller issue regarding our images of corpses, torture and the like) we can work out a better solution, or even spin off a daughter project. But we should not shy from our primary goal of gathering the best possible resources to illustrate all of human knowledge. SJ+ 09:10, 8 May 2010 (UTC)
I approve of this eloquent defense of our mission-- even if all the images in question were top-quality, I can't imagine why we'd need a separate project. Wikispecies is not merely restricted in scope, but fundamentally different in form and function from Wikipedia. Any Encyclopedia of Sex and Sexuality would still be an encyclopedia, and as such, be redundant within the Wikimedia projects. Lusanaherandraton (talk) 09:38, 11 May 2010 (UTC)[reply]

Jimbo's intervention and policy change

Thread not related to building consensus guidelines - click on arrow on right to view

I think it fair to say that when Jimbo attempts to write policy off-the-cuff by fiat, the result is usually either hugely controversial or has serious omissions or problems, even though the idea behind it is usually quite good. We've seen this happen on en-wiki many times.

One would wish Jimbo would stop making statements by fiat that are meant to be implemented immediately, and instead state that things need to be done, give his initial thoughts, and ask the community to craft a policy.

I propose that we suspend all deletions related to this for one week, and work out a good policy that doesn't threaten encyclopedic content, while getting rid of the amateur porn cruft. Adam Cuerden (talk) 01:41, 7 May 2010 (UTC)[reply]

Adam, I claim no special expertise in writing the details of policy, and so I welcome your help in crafting the language of the policy to be more precise and to allow for appropriate discrimination between random porn and historically important images, etc. However, this is a cleanup project, and I want the speedy deletions to continue. I hope, in fact, you'll help with them. We can undelete things later if errors are made - but I want us to start from a position that pornography on commons is unacceptable, full stop.--Jimbo Wales (talk) 12:26, 7 May 2010 (UTC)[reply]
We should definitely work out a good policy regardless :-) - have you had any thoughts on that, adam? Privatemusings (talk) 01:44, 7 May 2010 (UTC)[reply]

The Jimbo church of 'My Morality is Best' strikes again

did you mean something else?

So much for finding rational and productive consensus. Just delete until the community organizes a revolt, bollocks! 71.139.2.187 (talk) 03:23, 7 May 2010 (UTC)

calm down, calm down :-) Privatemusings (talk) 03:48, 7 May 2010 (UTC)[reply]

Another abuse of power from Jimbo Wales

The law specifies "photographs and films", so there should be no problem with drawings or other such illustrations. And what do I discover today? Jimbo Wales has proceeded to the speedy deletion of File:Félicien Rops - Sainte-Thérèse.png, a caricature made by a famous 19th-century artist/caricaturist! This is perfectly within the scope and not outlawed. Commons is going crazy now! There seem to be no rules anymore because of the way everything has been decided! --TwoWings * to talk or not to talk... 15:25, 7 May 2010 (UTC)

Jimbo means well, but I sometimes really don't know what he's thinking. My best guess, after talking with Cary, is that, well, I don't think he actually edits much, so when he steps in, he does so rather naively, not really checking what he's doing. Adam Cuerden (talk) 17:04, 7 May 2010 (UTC)[reply]

Jimbo should not do his stupid actions without community consensus. --Dezidor (talk) 19:32, 7 May 2010 (UTC)[reply]

There is a very important reason for his deletions: he and the WMF are being attacked hard by Fox News and others. In the past, Fox News started a campaign of correspondence with important sponsors of the WMF to turn off the flow of money. See http://www.foxnews.com/scitech/2010/05/07/wikipedia-purges-porn/ for more. Antonsusi (talk) 23:28, 7 May 2010 (UTC)

Outside the US, nobody cares about Fox News; it's their POV. Did I miss the point when NPOV was deleted from the principles of Wikipedia and its sister projects? I don't know what Jimbo's intentions are, but one thing I know for sure: he has lost the trust he formerly held. He might in future still be accepted – forced or unforced – as the founder of the project, but many people are losing their respect. --Matthiasb (talk) 11:10, 8 May 2010 (UTC)
Those in the USA probably don't realise that Fox News is owned by Rupert Murdoch, notorious in the UK for The Sun, a newspaper he made infamous by publishing semi-pornographic images on Page 3. Under current law, some of the pictures of semi-clad models would be counted as child porn. Dear me, purveyor of child porn complains about competitor... Oh, the irony. 88.110.3.234 15:44, 11 May 2010 (UTC)
See also http://foxnewsporn.com/ /Pieter Kuiper (talk) 16:04, 11 May 2010 (UTC)[reply]
(in response to the IP) Maybe my fellow Americans don't know Fox is owned by Murdoch, but US response to anything broadcast on Fox News falls into one of three categories: (1) it must be true! (the smallest group, roughly the same people who thought George W. Bush is the best president ever); (2) it obviously must be a bald-faced lie! (a larger group, who can't believe W. was re-elected for a second term); & (3) So? (the largest group; I wouldn't be surprised if more Americans watch/listen to the US edition of the BBC than to Fox). Honestly, worrying about what Fox says about Wikipedia is less productive than worrying about what The Register says about Wikipedia: the PTB at Fox have decided long ago that we are a bunch of un-American brie-eating Liberuls who must be stopped, & nothing Wales or anyone else can do will change that. If anything, being attacked by Fox is something of an honor, like making Nixon's enemies list was in its day. -- llywrch (talk) 21:22, 11 May 2010 (UTC)[reply]

Jimbo the vandal

Jimbo has now decided to delete what he considers pornographic artworks, by notable artists.

All of them.

I confronted him on IRC, and he said he intends for all of them to be deleted, then, maybe, sometime in future, we can discuss undeleting some of them by sifting through the remains.

After Commons delinker has removed them from use in every project.

When there's no way to easily tell a deleted artwork from deleted amateur porn cruft except by looking through every single deletion, and maybe getting lucky with some file names.

Despite no one, literally no one but him, advocating for deleting artworks at Commons:Sexual content.

Jimbo is a vandal, he should be blocked, and I refuse to be part of a project that thinks it knows art better than art historians. That's heathenish behaviour.

Further, he engages in Orwellian doublethink. What does the policy which he wrote and under which he's doing these deletions say? Although there is a common saying that "Wikimedia Commons is not censored," this statement should not be interpreted to imply that we do not make editorial judgments about the appropriateness of content.

Yes, it fucking is censored: you're deleting artworks by major artists from the 19th century which you consider pornographic. That's the damn definition of censorship.

Fuck you, Jimbo. I'm off, and not returning here. Adam Cuerden (talk) 18:17, 7 May 2010 (UTC)[reply]

I'm out, too. This neopuritan American conservative bullshit needs to stop, but the mighty god-king has decreed. Interesting given how his career online started, don't you think? Roux (talk) 18:25, 7 May 2010 (UTC)[reply]
Did these actions happen before, or after I talked with him on irc today? (this was at 12:50) --Kim Bruning (talk) 19:38, 7 May 2010 (UTC)[reply]
After kim, way after. TheDJ (talk) 19:44, 7 May 2010 (UTC)[reply]
Wales is out of line here. He is not the chairman of the Wikimedia Foundation; Michael Snow is. He's not the executive director; Sue Gardner is. The board of the Wikimedia Foundation could order this by resolution, but they have not done so. Wales is the founder, but he is no longer part of Wikimedia management. He's gone on to other things. He needs to realize that. --Nagle (talk) 19:48, 7 May 2010 (UTC)[reply]
In general, I would say: don't panic. I'm going to respond here first -- a lot has happened since I logged off for work this morning! -- and then on the lists. SJ+ 09:10, 8 May 2010 (UTC)[reply]


+1 --Melanom (talk) 20:48, 7 May 2010 (UTC)[reply]


Wow. WTF? Jimbo Wales: who do you think you are?

As did someone above me, I will introduce myself: I am just a regular contributor to the French WP and an occasional contributor to the English WP. I learned about this topic by chance, and thought: Wow. WTF? As far as I am aware, Commons only discriminates on licensing grounds. I understand, as it is obvious, that it also discriminates on legal grounds. It seems obvious to me that child porn must be avoided, for instance. In any case, any content policy should be discussed by the community.

However, as far as I have understood (correct me if I have missed something, and I'll apologize), some obscure WP founder, namely Jimbo Wales, considered that some images were too offensive in his opinion, and deleted them, without any discussion. Were he some head of state, I might understand; in any case, he should only be a contributor like you and me, or at most an admin, shouldn't he?

This is the second "WP founder", within a few days, to prove he is a fool. So here is the question: where does one sign up to remove his admin rights? Count me in. Having one good idea in one's life and then acting with nonsense is far from uncommon. Skippy le Grand Gourou (talk) 22:41, 7 May 2010 (UTC)[reply]

Sign up here: m:Requests for comment/Remove Founder flag. -- Adrignola (talk) 02:43, 8 May 2010 (UTC)[reply]
He is a trustee of the Wikimedia Foundation, and is maybe liable in the event of some scandal, or will at least have his public image damaged. But yes - his opinion is placed above the opinion of the majority of other members of the community, and his opinions are not always brilliant. While I think he is a great leader, I would support some exchange of power at the Foundation. This, of course, will not happen, for the same reason Hugo Chavez will not leave power in Venezuela. (Not a reference to dictatorship - just saying he is popular and unwilling to leave power.) --187.40.204.204 02:54, 8 May 2010 (UTC)[reply]
Thanks for the link. Skippy le Grand Gourou (talk) 07:45, 8 May 2010 (UTC)[reply]

The reason for the deletions

There is a very important reason for the deletions: Jimbo Wales and the WMF are being attacked hard by Fox News and others from the "sex is dirty" league. In the past, Fox News started a campaign of "creating pressure" against important sponsors of the WMF to cut off the flow of money. See http://www.foxnews.com/scitech/2010/05/07/wikipedia-purges-porn/ for more. Antonsusi (talk) 23:33, 7 May 2010 (UTC)[reply]

Is it reasonable to void solid policies under coercion from mass media? :/ --187.40.204.204 03:15, 8 May 2010 (UTC)[reply]
No, it is not. And that is not what is happening here. Fox obnoxiously spammed some Wikimedia donors, but none responded negatively. SJ+ 09:10, 8 May 2010 (UTC)[reply]
Better to lose a sponsor than to follow their every word, in this case based on scare tactics from Fox News. Isn't that the entire point of the NPOV principle...? What if Sony decided to donate several million if we just deleted all info about Sony controversies. Would that be OK...? /grillo (talk) 08:42, 8 May 2010 (UTC)[reply]
The question is—can it ever stop? Today FOX News is crowing about how it single-handedly cleaned up Wikipedia. Tomorrow they can write another article, and another - and with every cave, Wikipedia becomes easier to cow. You might as well give FOX News the founder bits. This isn't Jimbo's problem alone, it's an American problem - there's never going to be another American in Saudi Arabia again, because Osama bin Laden said so. Wnt (talk) 20:53, 8 May 2010 (UTC)[reply]

Deletion logs

I've started work on a deletion log showing the recent activity - of particular interest to me are the links which are now 'blue' once more. This change is far from having been accepted by this community (or more specifically, some administrators) - mellow or not, admins here are undoing each other's actions. Privatemusings (talk) 05:57, 7 May 2010 (UTC)[reply]

Most (all?) of the bluelinks I see in your list at the moment seem to have been undeleted by the same admin who originally deleted them, after being informed that the files in question were in use on other Wikimedia projects (which, under COM:SCOPE as currently written, makes them automatically presumed to be educationally useful). —Ilmari Karonen (talk) 08:48, 7 May 2010 (UTC)[reply]
thanks for clarifying, ilmari - if I can work out how to read the logs properly, I'll annotate the list. Bringing the issue of 'in use' images into line with what jimbo described as our new policy is an outstanding problem - discussed above in part.... Privatemusings (talk) 09:32, 7 May 2010 (UTC)[reply]


Delinked content

For what it's worth, here's a link to the toolserver with a list of delinked images: http://toolserver.org/~delinker/index.php . It's possible to use that to generate a list of all delinks since before the purge started. It would be useful to transcribe the relevant bit of that, and the deletion log, so that we can track this and work out what to restore when the situation settles down. Keeping a decent log of the actions related to this is important if we are going to follow the shoot-first-ask-questions-later approach.--Nilfanion (talk) 20:26, 7 May 2010 (UTC)[reply]

And there have been a couple of offers on foundation-l to write bots to relink content based on these logs, after review of recent deletions.

Undeletion requests?

First of all I'd like to introduce myself: I have been an admin on the Swedish Wikipedia since 2005, and I'd consider myself fairly well respected. In these times it's easy to regard every user name you haven't seen before as a troll, so I just want to avoid that.

I was looking at this list of Commons delinking, and it's really impossible to know whether some of those images were good deletions or not. For example, one of them has already been restored. In cases like that, the damage is already done, since there is no Commons "relinker" that puts an image back when it has been restored. Also, it's impossible for non-Commons admins to check the content of deleted images. ALSO, it's impossible to know in advance which images will be deleted when they don't go through the normal process of deletion review (this is a clear case of admin abuse of powers, and yes, that includes Jimmy Wales, who I don't consider to have more power than any other user). Also, I don't want to nominate images that could very well deserve their deletion by mass-requesting images for undeletion. That would also be kind of POINTy... So, how can a lowly non-Commons admin know which images should be undeleted or not? Could at least someone go through the CommonsDelinker edits in my link and check whether any of the images should be undeleted?

I'm very sorry that this forces Commons admins to look at potentially crappy images, but at least I know that it's not my fault... /grillo (talk) 17:41, 7 May 2010 (UTC)[reply]

If this is abuse of admin power, is there any good reason why the community shouldn't request that Jimbo be desysopped? Roux (talk) 17:43, 7 May 2010 (UTC)[reply]
Not really, but realistically that will never happen. I don't know which mandate Jimmy Wales is claiming whenever he takes these kinds of actions. Has he even ever been nominated for adminship on any project? Maybe just for show he should do that...? By the way, why is female masturbation not ok while male is? /grillo (talk) 17:49, 7 May 2010 (UTC)[reply]
For example, File:DoublePenetration.svg is obviously an illustration... /grillo (talk) 17:55, 7 May 2010 (UTC)[reply]
Grillo - just so you know, I don't think there's any reason to discriminate between male versus female pornography. It all needs to go.--Jimbo Wales (talk) 18:17, 7 May 2010 (UTC)[reply]
It's fine for you to have that opinion, but what makes your opinion so much more important than anyone else's? When I was new on Wikipedia I thought that the main principle was that it is the edits that count, not the person making them. What right do you have to go against consensus? /grillo (talk) 18:28, 7 May 2010 (UTC)[reply]

I would support desysopping. He should start deletion requests, not delete things without consensus like a vandal. --Dezidor (talk) 19:52, 7 May 2010 (UTC)[reply]

There is no local sysop right that could be removed. --Leyo 19:55, 7 May 2010 (UTC)[reply]
The founder flag gives Jimbo sysop rights on all wikis. However individual Projects can still develop local policies about what is and is not appropriate (in terms of such interventions) -- for instance en:wp, where this comes up most often, has w:Wikipedia:Role of Jimmy Wales. SJ+



Keep images that are in use

This is (or should be) Commons SOP for all images anyway. Images that are being used to illustrate Encyclopedic content are obviously being used to illustrate Encyclopedic content, and thus are within scope, IMHO. I've added that caveat to the page. This should be fairly uncontroversial (famous last words ;-) )

--Kim Bruning (talk) 12:28, 7 May 2010 (UTC)[reply]

I don't agree. Actual depictions of sexual activity, pornography, if in use in some language version of Wikipedia, should be deleted, as the problem is much worse if it's actually in use than if it's buried here on commons.--Jimbo Wales (talk) 12:34, 7 May 2010 (UTC)[reply]
I'm confused by your logic. If an illustration is being used in an encyclopedia, we can err on the side of such images being (used as) encyclopedic and educational in nature and intent, as opposed to pornographic. Also, the wikipedias are much larger and have stronger "checks and balances" than commons does AFAIK, to ensure that this is -in fact- the case.
Are we talking about the same kinds of images? --Kim Bruning (talk) 12:40, 7 May 2010 (UTC)[reply]
We might be talking about different kinds of images. Consider obviously juvenile home-made pornography. If it is being used somewhere in Wikipedia, that's almost certainly a bad thing. On the other hand, if something is borderline, it can make sense to look at what context it is being used in.--Jimbo Wales (talk) 13:05, 7 May 2010 (UTC)[reply]
I suppose that makes some kind of sense.
To clarify my position: if an image is in-use, that's a big red flag indicating that things might not be clear cut, one might get into trouble either way, and to investigate closer; rather than an incentive to delete it even quicker. tl;dr: Proceed with caution, if an image is in-use. --Kim Bruning (talk) 13:28, 7 May 2010 (UTC)[reply]

I've added language that says one should proceed with caution on in-use images, because there's likely something going on there that's not clear cut. Folks should always proceed with caution, but a little extra caution can't hurt. Is this much uncontroversial? --Kim Bruning (talk) 13:51, 7 May 2010 (UTC)[reply]

It seems to me it does no good to confuse images out of scope with something like unacceptable images. While we can imagine a child-pornography image being used in a Wikipedia article, and we would still want to delete it, the reason would be, above all, that it is illegal, and therefore completely unacceptable. Copyright violations are not deleted because they are “out of scope”, but because they are illegal, and therefore unacceptable (similarly, non-free (e.g. cc-by-nc) images are not deleted as “out of scope”, but as “against COM:L”). If an image is (properly, vandal edits aside) used in a Wikipedia article, it is definitely in scope. This does not preclude the image being deleted as a copyright violation or illegal pornography. But saying that e.g. a century-old illustration by a notable author (used in 3 articles on enwp) is out of scope seems a bit ridiculous to me (and I can’t think of any other reasons to delete it, anyway).
P.S. OK, this specific image has just been restored by Jimmy; who will now restore its uses in Wikimedia articles? And how? As soon as an image is removed from the articles, there is no way we can find out where it had been used… This is why additional caution is really required when deleting images in use.
--Mormegil (talk) 14:12, 7 May 2010 (UTC)[reply]
I just restored an image that was deleted by Tiptoety for being "amature porn". Though the image might not be educational in that it teaches how the penis works or what it looks like (there are better images for that), it was a good image illustrating how sexuality in art continues into the digital era. There are God knows how many classic paintings of nudity; are they educational in a sexual way? No. But they are educational in illustrating the history of and change in art, which is by all means within our scope. As stated above, whenever an image is in use, always nominate unless it's clear vandalism. Other people might know things about it that you as an admin may not know. That's the whole point of a deletion request: everybody can participate, and local wikis are usually notified about it on the talk pages of articles. For example, the image I am talking about here (File:Staff Of Life Or Horn Of Man.JPG) was in use on nl.wikipedia (Geschiedenis van de erotische afbeelding) (I know because the CommonsDelinker edited the page and delinked it) - but I don't know where else it was in use. There was no notification on that article's talk page by CommonsTicker because it was speedied. –Krinkletalk 14:26, 7 May 2010 (UTC)[reply]
FloNight convinced me that an admin or OTRS needs to have some leeway when dealing with legal or moral issues. In those cases, it's better to advise that they at least proceed with caution, and better yet consult other admins (and, in regular cases where it isn't detrimental, also the community), rather than running roughshod over everyone. The way you've worded the caution paragraph now, you're backing everyone right back into the respective corners I was carefully trying to coax them out of. I think we both would like to see people consulting as many people as is wise, rather than backed into a corner. Would you like to revise your edits? --Kim Bruning (talk) 14:35, 7 May 2010 (UTC) Incidentally, your position and actions are almost exactly what I predicted to people an hour or two ago. I certainly don't blame you for them! :-)[reply]

Images used in any featured article in any major project should be kept. It's kind of painful to see how any image with a sexual connotation (by WASP conservative standards) is suddenly, worldwide, put into question. It might be a great idea to realize that there are fundamental differences between the conservative moral mainstream in the US and practically all European countries (I'm not talking about hardcore pornography).
It is highly necessary to find a set of rules which will respect cultural views outside the US. The French and German projects have been cited several times by Jimbo as examples of highly serious and exemplary standards.
The current situation is highly unsatisfactory. Even if the language barriers block the reception of related European reactions, don't get it wrong: this lawnmower-style rapid action, without any transparent rules or community process, is viewed highly critically by a huge group of highly motivated authors in other language projects. Nemissimo (talk) 14:44, 7 May 2010 (UTC)[reply]

This is an excellent point. In many places outside the US for example it is acceptable for females to be topless. In the US, generally not. Similarly, in some Middle-Eastern countries, even what most of us consider perfectly normal day to day clothing is completely unacceptable. However, the emphasis on genitalia in the proposed restrictions may handle this problem to some extent. JoshuaZ (talk) 15:43, 7 May 2010 (UTC)[reply]
I could keep writing long messages about individual cases, but I strongly suggest that this insane clean-up project be paused until the draft is in its final state, has reached consensus, and has been published as official policy on Commons, or has been mandated by the Wikimedia Foundation and published by them. Currently it's a loose, uncheckable loop.
@ Jimbo/Bidgee: Another set: how about Category:Wiki-Sexuality Images? Those are hand-drawn educational visualisations of sexual terms etc. that pretty much all have an article on one or more wikis. The individual image (which I restored) is Wikibukkake.png. How else would one illustrate an article such as Bukkake, given that the viewer is 18+ and Commons is uncensored? I would (obviously) accept and abide by an official policy, but I fail to see under which currently active official policy that image would have to be speedy deleted. –Krinkletalk 16:51, 7 May 2010 (UTC)[reply]

Policy wording for images in use

  • There have been significant changes in the "in use" warning since it was first authored. User:Thryduulf has changed it from:[5]

"* WARNING If the material is currently in use on one or more Wikipedias, and was not just added to those Wikipedias recently, then you should proceed with caution. Use the regular full deletion review process, if at all possible."

It is now[6]:

"*If an image or other media is in use on at least one Wikimedia project for educational purposes, the full deletion process must be used. Legal issues are the only exception."

  • The latter statement could be used to disregard other guidelines based on an image's in-use status. I can see a user uploading an image with high shock value, like erotic coprophagia, and having another user insert it into several articles in order for it to remain visible for several weeks. I think "recently added" needs to be defined as a certain time period, perhaps one or two months. - Stillwaterising (talk) 05:59, 12 May 2010 (UTC)[reply]

Other discussions

Stating the obvious: Larry Sanger, FBI & WikipediaReview.com

Let's state the obvious ...

This long-overdue Damascus Road conversion has come about due to Dr Larry Sanger's letter to the FBI, since forwarded to other agencies and media internationally, and the ongoing discussion at Wikipedia Review. Let us hope the current flurry of activity is more than just a face-saving exercise.

However much such a clean-up is to be welcomed, it still won't address the dissonance in intentions between those of us who seek to make Wikipedia an educational resource suitable for young people, and the "No Censorship" cultists seeking to promote whatever their current indulgence is. Nota bene, it has been clearly evidenced that such indulgence can include, e.g., outright pro-pedophiliac agendas or employment by the pornography industry, both seeking to shape public consensus and promote their interests: on one hand, the justification and normalization of sexual exploitation and abuse and, on the other hand, the commercialization of sexuality, invariably involving the former.

  • Given the semblance of a volte-face, could Jimmy Wales and the Wikimedia Foundation clarify which intention they intend to pursue in the future?

If it is indeed the former, a global encyclopedia suitable for children and educational institutions, there would seem to be more far-reaching reforms required, such as age verification, model release verification, prevention of child access to hard-core pornography, and so on.

  • Would the Wikimedia Foundation care to mention in their statement what their intentions are and how they are going to go about achieving them?

It also would seem respectful for the Foundation to credit the initiators of this reform.

Thank you. --State the obvious (talk) 00:19, 7 May 2010 (UTC)[reply]

I disagree. While removing low-quality non-encyclopedic content is a good thing, the first part of the statement, which pretty much turned our sexual content policy into "all sexual content, no matter how mild, is completely banned", was an appallingly HORRIBLE idea. Adam Cuerden (talk) 01:36, 7 May 2010 (UTC)[reply]
I am sure you can find plenty of other sources and repositories for your masturbation material without forcing it down the throats of children, without due and proper limitations or warnings, under the guise of it being "educational".
It is your choice. You change or the Wikipedia is going to end up being increasingly censored out of all the places it wants to be and portrayed for what it is. --State the obvious (talk) 03:03, 7 May 2010 (UTC)[reply]
"you can find plenty of other sources and repositories [etc.]" is rather ad hominem. To suggest that the only reason anyone would include material about human sexuality is as masturbation fodder is, frankly, rather insulting. I would suggest that people refrain from such remarks about other contributors. I'm quite certain it is not Jimbo's or the board's intention with the proposed changes to declare open season on all participants in WMF projects who believe that human sexuality is part of the appropriate scope of an encyclopedia. I believe that I am not out of line as an admin to suggest that such ad hominem attacks are no more appropriate now than they were a couple of days ago, before Jimbo weighed in. - Jmabel ! talk 04:32, 7 May 2010 (UTC)[reply]
I can assure you that "other sources and repositories for your masturbation material" was not an ad hominem attack aimed at the one individual above.
It was an objective critique aimed at the entire cabal pushing their amateur hard-core pornographic agenda, and at subsidiary self-interest groups such as the pedophiliac, pederastic and commercial pornographic industries working on the Wikipedia projects collectively.
I think people should wake up to the fact that this issue is going to break the Wikipedia unless it is dealt with and removed; and what is left of the Wikipedia's credibility is going to be destroyed. --State the obvious (talk) 10:06, 7 May 2010 (UTC)[reply]
I don't edit such pages, I don't look at such pages, and I don't think low-quality porn needs to be stored here. But I also don't think that just because something contains some amount of sexual content it can't be encyclopedic. If you actually read the damn arguments being made, instead of making stuff up about the other side, you might be able to move things forward, so the porn cruft could get deleted but material with merit be kept, and I could go back to doing what I want to be doing: revamping the entire Media of the day project from something that hasn't been touched since Commons' first year and is woefully inadequate. I will further point out that I have... about a hundred or so featured pictures on Commons. You are a new user whose only contribution to Commons is to show up here and accuse me and others of masturbating. Adam Cuerden (talk) 11:14, 7 May 2010 (UTC)[reply]


No, I am accusing you, and an irresponsible minority, of stuffing explicit masturbation material down the throats of children under the guise of it being "educational" ... and using a tax-exempt charitable entity to do so ... simply because it has a high ranking with Google. Find your own free hosts for it, and earn your own page ranking, elsewhere.

I am sorry. We have all read the discussions by pedophiles, or so-called "pederasts", intent on using Wikipedia to subtly alter public consensus about their activities and directly influence young and immature minds.

Those in support of the pornography industry ... some in the pay of it ... have been doing exactly the same thing. They are little different.

Masturbation material and a subtle but persistent corruption of morals. The statistics for the most highly viewed pages prove the point that it has been going on for years. --State the obvious (talk) 20:39, 7 May 2010 (UTC)[reply]

I just hope jimbo does not agree with you. --187.40.204.204 03:09, 8 May 2010 (UTC)[reply]
I think it is a question of everyone agreeing; otherwise Wikipedia is going to be blocked from at least every school and library.
Which is the greater priority for a registered 501(c)(3) non-profit: education, or hosting a free mirror for all the porn on Flickr? --State the obvious (talk) 10:23, 8 May 2010 (UTC)[reply]
Since when do we let Wikipedia Review run this project? Since when do we want our articles to look like FOX News? Have you no respect for what the people here have managed to do in the past ten years? Wikipedia Review, FOX News, they can't produce such a resource - they couldn't even with unlimited resources, because it is something above them. Wnt (talk) 00:22, 9 May 2010 (UTC)[reply]

Foundation-l

For the record, I started a discussion at foundation-l about this issue. [7] Dragons flight (talk) 19:53, 7 May 2010 (UTC)[reply]

Thank you, DF.


Fox News article

http://www.foxnews.com/scitech/2010/05/07/wikipedia-purges-porn --Leyo 20:01, 7 May 2010 (UTC)[reply]

Wow. The amount of blatantly wrong information you can put into one article is truly staggering. --Conti| 20:07, 7 May 2010 (UTC)[reply]
Note that Fox was not reporting about the material on Commons; they were in the process of contacting WMF's funders and supporters to get their position on the sexual content. That's not so much reporting as pressure tactics designed to force the subject into acting in a given fashion. Tabercil (talk) 21:37, 7 May 2010 (UTC)[reply]
Were the publicly reported claims of meritless porn a baseless allegation by Sanger, there would be nothing to clean up. The fact that Jimbo waded into the issue to personally clean up what he characterized as "'trolling' images of people's personal pornography collections" lends substantial credence to Sanger's allegation. Moulton 13:51, 8 May 2010 (UTC)[reply]

Another Jimbo vs Larry Sanger and Fox News drama that everyone else gets to fix

Last week, FoxNews.com revealed that Wikipedia co-founder Larry Sanger had sent a letter to the FBI expressing his concerns that Wikimedia was distributing child pornography. http://www.foxnews.com/scitech/2010/05/07/wikipedia-purges-porn/#discussion-form

Eloquence says[8]:

“It’s a false claim related largely to some historic early 20th century drawings, as described in the summary published by the Wikipedia Signpost. The Wikimedia Foundation’s General Counsel examined the drawings and concluded that they do not violate federal laws; we have not received any communication from the FBI to the contrary, and when and if we are asked by authorities to remove images that are illegal, we will do so.”

Is this true? If so, does this mean Jimbo is doing all those arbitrary deletions merely as a public relations response to a false claim?

I am starting to think that Larry Sanger is acting in bad faith, just in order to point at supposed flaws of openness and self-regulation. Mere drawings are hardly illegal in a democracy, and he knows that. --187.40.204.204 03:32, 8 May 2010 (UTC)[reply]

The intentions of Sanger don't really matter. His action resulted in some very bad PR for the WMF, which may have serious repercussions with donors. Jimbo's current action may in part be motivated by a desire to counter that. And who knows whether the FBI has called already. --Túrelio (talk) 07:55, 8 May 2010 (UTC)[reply]




On pornography

Illustrating articles about pornography

How are we to encyclopedically illustrate articles about pornography without displaying examples of modern pornography? Powers (talk) 15:42, 7 May 2010 (UTC)[reply]

Not everything can be illustrated. Guess why child pornography is not illustrated. However, independent of what is legally possible, there is, according to Jimbo Wales, a strong statement to be expected from the WMF to the effect that media with explicit sexual content will be quite restricted in comparison with our earlier practice. In the end this comes down to the point that this site is owned by the WMF, and the WMF is dependent on its donors, who are possibly disturbed if we host all this material. And, yes, this was already covered by the press. --AFBorchert (talk) 16:07, 7 May 2010 (UTC)[reply]
See also diff. TheDJ (talk) 16:09, 7 May 2010 (UTC)[reply]
The most pressing arguments I've heard for improving the tone and balance of images on Commons (reducing the chance of coming across unwanted erotica) are that we are driving away parents, classes of schoolchildren, and women with an excessive focus on uneducational porny images. Issues of donors being disturbed may be popular in the press (from the latest Fox article it seems that they spammed donors to try to get a rise out of them), but that hasn't been the source of the discussions here to date. So the real issue it should come down to is: how can we make our Projects comfortable spaces for all audiences (of both readers and editors)? What are the benefits and losses (in terms of usefulness to our wide spectrum of audiences) involved in adding an explicit image to facial sex act? SJ+ 09:10, 8 May 2010 (UTC)[reply]
The proponents of censorship have already won their little victory, such as the courts allow them, with CIPA. They have public money going to a special class of little men who go through pages and decide which to censor to schoolchildren. We don't need to do this, and we wouldn't be any good at it if we did. Wnt (talk) 00:24, 9 May 2010 (UTC)[reply]

What's really sad...

Is that when images deemed "pornography" in articles about sexual acts are deleted, an image like File:Harmony Rose at Erotica LA 2006 3.JPG might have to be used as an educational image in the article about intercourse. That, if anything, would be sad... In this day and age, it's really disturbing that there are people who don't think images depicting sexual acts can be educational. /grillo (talk) 20:48, 7 May 2010 (UTC)[reply]

I can only agree with my compatriot above. I've been a participant in a number of Wikimedia projects for many years and I have uploaded dozens upon dozens of pictures here on Commons. Although none of these particular pictures are likely to fall within the scope of the present deletion campaign, I've always been happy and proud to participate in a project that has so far had the guts to stand up and be really free of censorship in the middle of a world so full of political correctness. With the recent actions taken by Jimmy Wales and some others I can only with great sadness - no, not sadness but anger - see that even Wikimedia Commons is now starting to sink into the swamp of hysterical puritan antisexual hypocrisy which to me - and probably a vast majority of Europeans - is one of the very worst trademarks of the American mentality.
Maybe it is time that we create an alternative Commons picture database - hosted in Europe? A place where those of us who are enlightened and grown-up enough to know that there is nothing dangerous in seeing a picture of human genitalia could host valuable, educational and otherwise useful pictures that, tragically enough, are not fit to be shown in "the land of the free"? /FredrikT (talk) 22:33, 7 May 2010 (UTC)[reply]
If this turns out to be a legal issue, that might be a fine resolution. To date, legal issues have nothing to do with this discussion. SJ+ 09:10, 8 May 2010 (UTC)[reply]

My $0.02

I'm someone who's actively involved in this field - I'm a member of the Pornography project on EN. I've gotten OTRS permission for images that certainly fall under this discussion, such as File:Angelina Ash 4.jpg (mentioned by Jimbo on the main page), as well as for a number of other porn stars. So I'd like to chime in with my thoughts.

First up, do we need a policy? In an ideal world probably not, but given the actions and attention being placed on the adult sides of things by outside groups (e.g., that Fox "News" report) we need to have some form of fig leaf in place just so we can tell those folks they're barking up the wrong f'n tree because we have rules in place, and please go chase some other ambulance.

Now on the one hand, I do think there should be a definite trimming of a lot of the low-quality explicit imagery, and I had started to slowly peel stuff away (e.g., Commons:Deletion requests/File:Human Male masturbation (Uncircumcised).GIF and Commons:Deletion requests/File:Penis.15.JPG). So the opportunity to see the explicit sections culled of low-quality stuff is welcomed.

On the other hand we're an international concern, and what's offensive in one place is not so in another (see Alecmconroy's Muhammad comment above). TV shows in Europe frequently have explicit nudity; in contrast American broadcast shows generally stop at what could be considered a tame two piece bathing suit. And I have no idea how much skin would be shown in extremely conservative places such as Saudi Arabia or Iran. Whatever we do for a policy should reflect that.

Absolutely. Bearing these related issues in mind will help keep the discussion sane. SJ+ 09:10, 8 May 2010 (UTC)[reply]

Now I do have one specific image which I'm aware of which I personally will be using as a benchmark to see how the new policy acts, specifically File:Keeani Lei 6.jpg. This image was donated to us by the subject Keeani Lei, who is a notable porn star. We have clear OTRS consent in place for this from the subject. The image is professionally shot, so it doesn't have the "amateur porn" look of a lot of stuff. Lastly, the subject of the image is notable in her own right. So what I would be looking for (if we ignore it being in use on the other wikis) is an answer to the question "would the image survive under the new rules?" Personally I would hope so.

Now, I've not gotten anything that could be considered "harder" than that Keeani Lei image to date (nor have I asked for such), but I do suspect I will eventually get one porn star who will send me an explicit sexual image. So in the event that that happens, would the image be hostable on Commons? Tabercil (talk) 22:13, 7 May 2010 (UTC)[reply]

That's a great question, and one that our community policy makers need to work out. I like Jmabel's approach below. SJ+


Sigh

America... shows violent movies and praises guns, but when it comes to love, and making love: "PLEASE, someone! Think of the children!" Disgusting. 90.230.163.139 22:16, 7 May 2010 (UTC)[reply]

There's some truth to this, but this isn't just about America. The audiences who are presumably most strongly impacted by the prevalence of sexual content are in countries that censor parts of the Internet, not the US. SJ+ 09:10, 8 May 2010 (UTC)[reply]
That's prudish America... In good old Europe we are beyond this (partly)... Chaddy (talk) 22:26, 7 May 2010 (UTC)[reply]
Well, them killing themselves without reproducing might be salutary… Just thinking. Skippy le Grand Gourou (talk) 22:44, 7 May 2010 (UTC)[reply]
Man, I can't agree more with you. :) --187.40.235.122 10:28, 9 May 2010 (UTC)[reply]
I'd like to correct you about that: not all of the US has that point of view. In my state of Oregon, there is no legal definition of obscenity -- due to rights of expression in the state constitution -- so community standards here are even beyond European norms. Portland is known for its numerous strip clubs, for example, which from what little I know are superior to those in San Francisco or the Reeperbahn of Hamburg. (No, I've never been to the latter; the one time I could have visited, my reaction was "I can see naked women dancing in a bar at home any time I want, so why bother?") And Portland was where the junkies refused to hang with Courtney Love because they thought she was a poser, not a real heroin addict. In short, not all Americans are prudes, so please don't stereotype us. -- llywrch (talk) 20:48, 12 May 2010 (UTC)[reply]

A lack of clarity about intention

In the following I don't speak for anyone but myself, but I would hope that as one of the more active contributors to Commons—certainly a more active participant on this particular project than Jimbo—it would count for something.

What I have found most frustrating in this process so far is a lack of clarity about intention of the proposed changes. Is this being done for legal reasons? To make Wikipedia "child-safe"? As a PR move? It is impossible to reach an appropriate policy when intentions are not spelled out and agreed upon.

From my perspective: it is to cut back on a chronic issue of people uploading 'more useless porn', which doesn't help the projects, makes Commons feel unnecessarily seedy, and may be limiting our audience (more than having beautiful, high-quality images and illustrations of human sexuality might already do - something we absolutely support). See below. SJ+ 09:10, 8 May 2010 (UTC)[reply]

Second most frustrating: that so many people are addressing this by resigning or, worse, demanding that others resign. That is no way to get to any sort of agreement and move forward.

Third most frustrating: that Jimbo went off half-cocked. He could easily have canvassed around to a few of the most active people on Commons, clarified his intentions, and tried to come up with a coherent proposal that the community could discuss. Instead, he came forward with a half-baked proposal that had no initial consensus behind it at all.

The claim that Jimbo is merely proposing a "clarification" of existing policy does not seem plausible to me. Clearly many people who were/are part of the consensus behind the policy as it now stands are opposed to this. To call a controversial change a "clarification" is casuistry.

The uncontroversial part of this proposal does seem to be a clarification. The controversial part is not, and should be discussed further. SJ+

On the other hand if, as several contributors (but not Jimbo) have suggested, the intention is to restrict Commons content to material that would generally be considered appropriate for children, the current proposal and the current round of deletion does not go nearly far enough. I don't think that is the intention of any but a handful of active participants, but I think the position needs to be explicitly rejected, especially because some of the advocates of this view have been going around requesting the resignation of admins (myself included) who think otherwise, or have been accusing anyone who thinks that some aspects of human sexuality fall within commons scope of wanting it there as their own personal masturbatory material. I suspect that the intention falls somewhere between the last two, and has not really been very clearly stated.

See my comments under #Child-friendly above. This is not the intention. A child-friendly site is a valid goal, with many other considerations -- likely as its own Project. SJ+ 09:10, 8 May 2010 (UTC)[reply]


Again, I will reiterate my own position, which differs considerably from Jimbo's. It probably lies somewhere between the status quo ante and Jimbo's position. My intentions are based on the current Commons:Scope, and I would be interested to know if anyone thinks any of this is inconsistent with that page.

  • It should go without saying in all of the following that I am only talking about images where intellectual property rights are in order and where hosting the image does not raise legal issues. I would not mind seeing the latter expanded to "where hosting the image on a commercial site would not raise legal issues," since it is normally our intention to allow commercial reuse.
  • Commons is not restricted only to images that are appropriate for children. However, it might be very useful to come up with a system of tagging that would allow easy filtering out of categories of images that would generally not be considered "child-safe".
  • Mere nudity should not be a reason to remove images from Commons, though it may raise the bar on the provenance of images.
  • Work by notable artists should almost always be acceptable regardless of its subject matter. (Given some of Jimbo's recent deletions, this is apparently a place where he and I disagree.)
  • Technical standards for photographs and user-made images should be higher in the area of sexuality than in general. While someone's random snapshot of a semi-notable building may be acceptable content, someone's random snapshot of a couple in a sexual situation is not.
  • Images related to human sexuality should be described objectively and in a neutral tone. Descriptions should avoid "porn speak".
  • There should be no photographs of underage humans engaged in sexual activity. I would have no problem restricting images of underage sexual activity to notable art (example: any of a number of Greek vases), but disallow non-notable user-made non-photographic images.
  • There should be no photographs of humans engaged in even vaguely erotic sexual activity with animals. ("even vaguely erotic" is there because, for example, I don't see any problem with an image of industrial semen extraction from a bull.) Again, we might well disallow non-notable user-made non-photographic images. In this case, though, there is a lot of notable art that we should have: just think of all of the representations of Leda and the Swan, for example.
  • This next one is where I clearly most differ from Jimbo: I don't think any aspect of consensual adult human sexuality should be inherently off-limits, even for photographs. I do think we should provide some easy way to filter these out, and I won't quit Commons if it is decided that we should not host such images, but I think that there can easily be legitimate educational uses for well-made, well-described photos or other images of sexual intercourse in its various varieties, BDSM activity, etc. I've mentioned the Foundation for Sex Positive Culture (FSPC) in Seattle a few times as an example of an organization considered by the U.S. Internal Revenue Service to be a bona fide 501(c)(3) educational institution in the area of human sexuality. I presume there are others; it's the one I'm familiar with because I live in Seattle. Clearly, the arguments (not on this page; I'm referring to recent deletion discussions) that images of sexual intercourse and BDSM inherently have "no educational value" ignore the existence of such institutions. Again: if such images are to be banned from Commons, I would very much like to see a rationale explicitly stated. As I remarked above, "not child-safe" doesn't seem to me to be an adequate rationale: the images of abuse at Abu Ghraib aren't child-safe either, nor is a lot of notable art that I presume we would want to keep. My own feeling is that the intention to blanket-ban such images stems from prudery, pure and simple, but perhaps there is some other rationale. I'd like to know what it is.

- Jmabel ! talk 23:55, 7 May 2010 (UTC)[reply]

I agree largely with the above. Good description of the situation! --Leyo 00:21, 8 May 2010 (UTC)[reply]
I agree as well, but will not return until it is implemented. TheDJ (talk) 00:34, 8 May 2010 (UTC)[reply]
Any policy or action that sees File:Lynching-of-woman-1911.jpg as acceptable but File:Masturbating hand.jpg as unacceptable is grossly offensive to me. There is far more harm in graphically illustrating the depravity of man toward his fellow man than there is in depicting human sexuality. Powers (talk) 01:02, 8 May 2010 (UTC)[reply]
  • Powers, I think it is a question, again, of goals and intentions. If it is a matter of "child safety", etc., I would agree with you. If, however, it is an issue of public relations and of setting a tone for the site, the decision might be different. A case could be made that the lynching picture, however revolting, is an irreplaceable historical document and that the very fact that photos like this of lynchings exist is of historical interest; conversely, a case could be made that we only need a relatively small number of such revolting, and largely similar, images. A photograph of a woman masturbating is (in my view, and apparently yours) much more harmless, but also of much less inherent importance. In the lynching case, I could see us saying "yes, these images are important, but they are so offensive that we only want to host a very limited number". In the masturbation case (with less "weight" on either side of the scale), I could see us saying, "We ought to have a few images like this for illustrative and educational purposes, but for those purposes the dozen or so best such images we can get will suffice, because we don't want to accidentally build a category page that looks like a porn site." But, if we take the latter course in either case, I'd like to see us own up to it. In particular, this would mean removing any claim that "Commons is not censored" and instead owning up to what the rules of self-censorship would be. - Jmabel ! talk 01:34, 8 May 2010 (UTC)[reply]
Should public relations be above COM:CENSOR? It is a shame. --187.40.204.204 03:01, 8 May 2010 (UTC)[reply]
    • But that's just it -- no one has defined why these actions are being taken, except for Jimbo saying "we don't need porn here." What is porn? Apparently, today, it's what Jimbo thinks it is. If this is a Public Relations issue, someone in the know needs to say so P.D.Q.; if it's really an issue of project scope, then that scope needs to be better defined and Jimbo needs to stop wheel-warring. Powers (talk) 01:43, 8 May 2010 (UTC)[reply]
OMFG... a sensible suggestion??!? A wholehearted YES as a simple clear possible way out - though I would quibble with you on one bit: "Technical standards for photographs and user-made images should be notably higher" (emphasis mine; this way if we do host an image which falls into this category, we host the best ones available). And would what you're saying give a clear path to being able to remove images when superior equivalents get uploaded later?? Tabercil (talk) 01:50, 8 May 2010 (UTC)[reply]
Of course I'd be open to working out shadings and details. Again, though, I see clarification of intent as step one. The lack of that is the main respect in which I say that Jimbo went off half-cocked here. - Jmabel ! talk 06:20, 8 May 2010 (UTC)[reply]
In broad lines, this is something I can agree with, much more than what is now on the page (which leaves only a small crack in the wall for any sexuality-related photographs), but I still think it's too restrictive. I would wholeheartedly agree on setting high quality standards where we have enough material anyway, but not on deleting our possibly only photograph on a subject for such reasons. Deletion because of poor quality should in my opinion only take place if we can point to another image or other images, of the same or higher quality, that could be used instead in any sensible application of the image in the projects. - Andre Engels (talk) 22:04, 9 May 2010 (UTC)[reply]

Jmabel - your input counts as much as anyone's does in these discussions; thank you for writing eloquently. I don't see this new policy as hinging on a clear definition of porn, nor are we, as a set of reference works, against the subject of pornography. Do the Projects cover information about the porn industry? Yes. About most every aspect of human sexuality? Yes. We have COM:PORN and proposed policies such as this one because we have a problem with random uploads of what could be amateur porn shots (and presumably video, once we become more welcoming to videographers) -- as wwwwolf says below, we don't need more useless porn.

I think your intentions as listed above are quite reasonable, and hope you can make more progress in improving this proposed policy page. We should set general policy about defining 'appropriate content', and most guidelines about how to scrutinize potentially-controversial content for educational value will apply across the board. And we should pay special attention to becoming better curators for sexual content for many reasons --

  • it is extraordinarily popular, so really excellent material about sexual topics will inform and improve more people's lives than material on almost any other topic.
  • it is highly controversial among many audiences, providing an interesting tension between offending some and informing others. [truly excellent images manage the latter with almost none of the former]
  • it is often cited as a reason for restricting access to Wikipedia, which is counter to our mission
  • it has recently been cited as a reason some communities feel unwelcome in our community, and when handled poorly may contribute to systemic bias
  • it inspires more than its fair share of inappropriate uploads (from vanity galleries to unauthorized photos of subjects in compromising positions to sneaky copyvios).

I hope we can continue to build a truly excellent body of work about human sexuality, as part of the Projects, without interfering with the development of their other facets. SJ+ 09:10, 8 May 2010 (UTC)[reply]

SJ, your points lay out the issues well. FloNight♥♥♥ 20:05, 8 May 2010 (UTC)[reply]

Two test cases from my own work

I am still extremely unclear about where people are proposing to draw the lines here. I've uploaded somewhere over 20,000 images to Commons, mostly (~90%) my own photos, mostly (85%) documenting Greater Seattle and its culture. I'm guessing that somewhere between 150 and 250 of these images show someone less than fully clothed. As far as I can recall, those are always in a public situation (usually a performance, a parade, etc.), which I suppose someone could construe as "exhibitionism". A very small number of these photos could also be construed as bondage, and in the case of the first image I'm using as an example here, I suppose someone could consider it to relate to masochism. I'd like to bring two of them up as "test cases" to try to understand where we are going.

File:Julie Atlas Muz - You Don't Own Me 01B.jpg. Julie Atlas Muz is somewhere on the border between a burlesque performer and a performance artist. This photo was taken at a performance during Bumbershoot, an arts festival that often provides a rare opportunity to photograph performers who would normally not be performing in places where cameras are permitted. The performance piece was basically a combination rope escape and striptease to the music of Lesley Gore's "You Don't Own Me". My understanding of the piece is that, with some element of paradox or irony, an escape from what is restricting her leaves her naked. The performance took place in the Bagley Wright Theatre, one of Seattle's leading theater venues. It clearly met "community standards" in my city. If I remember correctly, it was an "18 and over" performance, but only rather loosely (I don't remember anyone checking IDs). Still, it could be "read" as a bondage image, and that was clearly one of its many intended aspects. So, are people proposing that this should now not be in scope?

File:Fremont naked cyclists 2007 - 21.jpg. The annual Solstice Parade associated with a street fair in Seattle's Fremont neighborhood has long been one of Seattle's most popular parades. One of the informal traditions of the parade is an unofficial relaxation of ordinary rules about indecent exposure, in terms of (adult) nudity. A very small number of people participate absolutely naked, and I'm sure I have a couple of photos of those, but this is more typical: people in body paint. The example shown is typical. Again, clearly this is entirely within "community standards" in Seattle: people of all ages attend the parade. The ratio of kids in the crowd might be slightly lower than at an average parade, but not by much. But, arguably, this could be called an image of "exhibitionism". So, are people proposing that this should now not be in scope? - Jmabel ! talk 19:49, 8 May 2010 (UTC)[reply]

Setting policies about "appropriate content" has already been done. It's called W:WP:NOTCENSORED. Once you start making these restrictions, you'll have ten thousand things to argue about, and I don't mean just images, and some discussions by chance will go the wrong way, and with every deletion you'll lose more open-minded editors, so every other day you'll be deleting one ten-thousandth of the database until it's all gone. People had the strength somehow in the early stages of this project to resist that - it is time to call on that resolve once again! Wnt (talk) 00:31, 9 May 2010 (UTC)[reply]

On censorship

The Muhammad Comparison

You know, Commons has a lot of images of Muhammad, which are, of course, about a million bazillion times more offensive to our Muslim readers than pornography could EVER be to our Christian readers.

At those deletion debates, we've spent a _LOT_ of time explaining to our Muslim editors that we can't just delete an image because somebody somewhere doesn't like it. We have rules and processes, we decide what's notable and not notable. We form consensus.

Except, the proposed "All Pornography Must Go!" policy, if implemented, basically means we were lying to our Muslim readers when we told them why Muhammad has to stay.

Turns out, it's not free speech, it's not notability, it's not usability. The reason Muhammad stays while nude artwork is getting deleted is because Jimbo isn't a Muslim. --Alecmconroy (talk) 19:14, 7 May 2010 (UTC)[reply]

Oh that poor nail, it must be hurt from being hit on the head so fucking hard. This is one person's neopuritanical morality being imposed on the rest of the world, and nothing more. Roux (talk) 19:19, 7 May 2010 (UTC)[reply]
Ditto! Well put by Alec and Roux. Maedin\talk 17:08, 8 May 2010 (UTC)[reply]
While we are at it, we should alter the Tiananmen Square article to meet Jimbo's BLP standards. We make out the living persons of China's 27th Army to be some kind of mass murderers based on hints and rumors from foreign intelligence agencies. Maybe it is time to show our respect for the most powerful nation in the world, the world's unquestioned leader that sets the standards for speech and politics, by changing that article to adhere strictly to the facts as the People's Party relays them to us. Wnt (talk) 00:35, 9 May 2010 (UTC)[reply]

Utter nonsense!

I must say this makes me really sad. I always loved WP:CENSOR and how it protects depictions of the human body from being gratuitously deleted. Most sources available to the general population will not have such courage - I don't know of many encyclopedias with so many naturism articles, with so much sexual exposure (including some nude children in a family context - see!), or that will host controversial content even after a block and some outcry. I think that in many ways Wikipedia helps humanity to move forward, and the position on nudity is one of them.

I know Commons and Wikipedia are different projects, but the policy of "delete first, then debate" (cited on wikinews) is utter nonsense. Commons is a repository of free content, and there is no illegality on sexual content per se. They are as relevant as any other kind of content, and should be subject to the same kind of rigour.

I think Jimbo is damaging the project by overreacting to unverified allegations of child pornography. I understand the liability concerns, but the Wikimedia Foundation should stick to deleting illegal images on a case-by-case basis, without degrading into hysteria against the human body and human activities. The risk of hosting illegal pictures for a short amount of time shouldn't prevent Commons from building a comprehensive compilation of free images, which of course includes depictions of the human body.

But who am I..? Maybe I am losing my time exposing my opinion here.

I'm just an irrelevant IP, after all.. --187.40.204.204 02:26, 8 May 2010 (UTC)[reply]

It looks like Commons has a similar policy. Shouldn't COM:CENSOR ban this attempt at creating a policy that restricts sexual content? I will quote it:
The policy of "Commons is not censored" means that a lawfully-hosted file, which falls within Commons' definitions of scope, will not be deleted solely on the grounds that it may not be "child-friendly" or that it may cause offense to you or others, for moral, personal, religious, social, or other reasons.
--187.40.204.204 02:31, 8 May 2010 (UTC)[reply]
It should, but when overzealous reporters start threatening the funding of the Foundation by painting Wikipedia as a pornographers' den, action must be taken! — raeky (talk | edits) 02:48, 8 May 2010 (UTC)[reply]
The funding of the foundation is not at risk. We have a greater funding buffer now than we have ever had, and are in no danger of losing our primary supporters -- our editors and small donors -- thanks to this sort of debate. The recent media and on-project attention to the topic has convinced various people who have been frustrated in their efforts to make headway in this area to focus their energies on initiatives like this, some of which has gone a bit overboard :) Please help turn this energy and attention into useful improvement of Commons policies. SJ+ 09:10, 8 May 2010 (UTC)[reply]
And this means undermining the anti-censorship foundations of the project? --187.40.204.204 02:52, 8 May 2010 (UTC)[reply]
No, there is no call for censorship. There have been calls for changing "Commons' definitions of scope" -- a discussion which I encourage you to take part in. SJ+
You forgot:
The counterpoint to this, is that the statement "Commons is not censored" is not a valid argument for keeping a file that falls outside Commons' defined scope, as set out above
See how that informs the other point? Think about the mission. Education. What's the educational value of yet another camera phone picture of a cunt dripping cum? Guy 18:21, 8 May 2010 (UTC)[reply]
(I'm that IP) I'm not wikilawyering, I was just voicing my opinions. Quite frankly: I see no harm in hosting porn, and I think even unused porn has a place in a free content repository. Maybe tagged and separated, but not banned. The porn itself is not the reason for actual and potential scandals - it's the revulsion and prejudice against the human body that part of society currently holds. This same revulsion will surely be triggered by "valid" content too, such as this article - or do you think that a puritan grandma will be happy to see her granddaughter seeing this? Low-quality porn does not damage the project, unlike arbitrary mass deletions of so-called porn cruft. In short: I think even unused porn may find a use in the future. If not by the Wikimedia Foundation, then by third parties - remember that Commons is holding free content to be reused. If something is so useless that it is not thought to be usable, then it should be deleted on the basis of usefulness alone, under the same criteria as other kinds of material. There is no need for a policy against porn - if the porn is useful, it should be kept; if it's as useless as a lot of things that are commonly deleted, it should be deleted. If its status isn't well defined, it should default to keep. (I maybe should have trimmed this message; it's only at the end that I say I'm OK with deletion in general. I also didn't like the tone, much less the tone of other messages on this page. Oh well :P) --187.40.235.122 10:26, 9 May 2010 (UTC)[reply]
I agree completely. However, we should also be honest about the fact that those are not the only types of files which were deleted and that those are not the only types of files which are proposed to be deleted. Black Falcon (talk) 18:34, 8 May 2010 (UTC)[reply]

--

I was browsing the Commons would-be 'porn galleries'. There are some pictures so poor, compared to others, that they may actually hamper the search for a good picture. This is a point. Unless maybe one wants to illustrate an article on, say, amateur porn.

But I would say that many of them could prove to be useful resources, if you give it time, especially if someone puts good captions on them. Human variance (skin color, sizes, etc.), plus different positions, body art, etc., can be used to illustrate something noteworthy. By deleting pictures just because they are considered 'bad taste', one may actually decrease the usefulness of the collection as a whole. Of course the ideal is that all images are clear, high-resolution, and good in other technical respects. But I've seen many low-res sexual images serving a useful purpose on English Wikipedia. It's just a matter of their being discernible and fitting the context.

I recognize that someone just creating an account and uploading random porn [as someone said above] may cause real damage (copyright, plus maybe some disruption). But the so-called cruft shouldn't all be purged at once :/ If the community agrees on a particular deletion, then it's fine. If porn is considered an issue sufficiently different from other things, then it may be OK to have different rules. It frightens me, since the current content of this draft is unacceptable. But that's how laws are born, anyway. But reacting just because Fox News said so? Jimmy wants to act 'timely'. And indeed he is in a hurry. Why? Just to please Fox News? Is it all about PR? Maybe money? Will the Wikimedia Foundation pervert the essence of consensus building and community governance just for money and image? Is money an end in itself?

I completely understand if someone wants to do regular cleanup, but to me it sounds like a lot more than that. It is as if porn were inherently bad, something to be purged, and, somehow, a large collection of porn is something that should not be associated with Commons. That is what I called utter nonsense. I can't imagine a solid and healthy repository of free content without a growing gallery of porn. I could say human depictions, or depictions of human activities, or other euphemisms (as I said above), but this is not precise enough. It should be called porn, and, happily, Commons has a little porn by now - so that Wikipedia and other sister projects can use it to advance their educational agenda.

Let's expand it. --187.40.235.122 13:31, 9 May 2010 (UTC)[reply]

Interestingly, Jimbo deleted this: [9]. It was reverted because it is being used on the article nl:Amateurpornografie, or "amateur pornography" in Dutch. So 'amateur porn' images might actually be on topic on Commons, and deleting currently unused images prevents other projects from using them (it's not a big deal if there are a lot of them, but still worth a note). --187.40.235.122 13:41, 9 May 2010 (UTC)[reply]
Good call, 187.40.235.122. There is, after all, such a thing as high-quality amateur porn. I won't dispute the very existence of porn cruft, though. I don't think there's an article on "Crappy porn" that needs to be illustrated, nor would such likely be encyclopedic! Lusanaherandraton (talk) 10:18, 11 May 2010 (UTC)[reply]

Commons' responsibility to local projects

As I said at Commons:Village_pump#Commons.27_responsibility_to_local_projects, Commons has a responsibility to local projects. Our collaboration works on the basis that Commons' inclusion and deletion policies are stable and consistent, and that in case of changes, affected projects are informed in advance so they can take appropriate measures, mostly uploading locally (as happened for changes in licensing, I think). So if you adopt a new inclusion policy, you must inform all projects before applying it. Cenarium (talk) 06:09, 8 May 2010 (UTC)[reply]

That's a good point. Hopefully we can simply stop deleting media in use on other projects while this discussion plays out. SJ+ 09:10, 8 May 2010 (UTC)[reply]


Don't turn this into a bunch of legalese, please!

Okay, so let me reiterate the problem in Commons: People upload sexual content, and we already have plenty of that, thank you very much.

So why do we need a policy that's full of stuff that doesn't specifically address this issue?

Let's put it another way: I support efforts to stop people from uploading more useless porn. I don't support new policies that would make it harder for people to do other things we already thought were good ideas (informative and artistic depictions). Yes, the current version explicitly allows those, but mark my words, people will throw this legalese around.

I know this, because people will forget.

Do I need to remind you what happened with the "attack sites" policy? The en.wikipedia Arbitration Committee decided "okay, linking to Encyclopedia Dramatica might not be good because they spread lies about a user." People started wikilawyering and interpreting that decision in a "wider sense". Somewhere along the line, people forgot why that rule was put in in the first place, and Arbcom had to say "okay, that didn't go too well".

So don't make this an awesome, all-encompassing, lawyer-proof policy that can only be interpreted with an advanced degree in wikilawyering. Addressing bigger issues is commendable, but above all, this thing needs to address the actual problems that led to the creation of the policy clearly and specifically... or it will be abused. --Wwwwolf (talk) 07:18, 8 May 2010 (UTC)[reply]

Can you suggest specific ways to simplify this proposal while getting that point across? SJ+ 09:10, 8 May 2010 (UTC)[reply]

Is USC 2257 a helpful standard?

Discussion needed: is USC 2257 a helpful standard here? There are advantages to using a detailed external standard, when it comes from a body trusted to assess the specific problem at hand -- for instance, relying on ISO codes to differentiate languages from dialects. But this US law is a standard for acts which in the US require clearly established consent of the participants, slightly different from our educational goal.

I sympathize with wwwwolf's comments above; perhaps there is a less legalistic and simpler set of guidelines that would achieve the desired end. As this isn't a legal issue, there is no particular reason to be guided by a particular US law. [on the other end of the spectrum, we might want a much stronger validation of model release/approval for any pictures that might be considered embarrassing by the 'model' if published without their consent -- something certainly not covered by the above law, but necessary for us to be conscientious curators] SJ+ 11:06, 8 May 2010 (UTC)[reply]

If we are so worried about uploading images without the model's consent, why don't we simply raise the bar for such images, instead of forbidding them entirely? Let OTRS handle the verification of such images, make it a requirement. That should take care of most of the random porn being uploaded here while retaining the images that are actually valuable one way or another. --Conti| 11:16, 8 May 2010 (UTC)[reply]
Require a statement from the depicted model in every image that shows a sexual act and/or someone naked. Shouldn't be that hard to implement. /grillo (talk) 11:41, 8 May 2010 (UTC)[reply]
Yes. The harder part would be to find a way to properly verify that statement, I would imagine. --Conti| 11:44, 8 May 2010 (UTC)[reply]
Well, if OTRS works for copyright concerns it should work for privacy concerns too... In the case of established professional models, the email will have to come from the official source. In case of amateur images, if they are of low quality they should be deleted. High quality amateur images are hopefully modeled by WM users themselves, in that case we have no issue. I really don't know what should be done about images that don't show any faces but just genitalia though. Tecnically they could belong to anyone, so I don't know if privacy concerns are applicable there. /grillo (talk) 11:53, 8 May 2010 (UTC)[reply]
The images that do not show faces are still a problem because they often come from a cropped image, and the fuller image may or may not have the model's consent to be used in the viral way that our license permits. Because these images are taken in private, and most often held privately or hosted behind paid websites, it would be difficult to determine whether the person gave consent or not. So we need to assume that the image is not available unless we have a firm indication that it is. FloNight♥♥♥ 12:08, 8 May 2010 (UTC)[reply]
Isn't it simpler to ignore USC 2257? --187.40.235.122 13:35, 9 May 2010 (UTC)[reply]
This thing sounds like a nightmare. Read 2257, focusing especially on this "secondary producers" abuse. This isn't just a matter of getting an email, but of keeping copies of pictures of photographic identification cards and some other rigmarole, which seems at heart intended to allow authorities to nitpick and find something wrong with any site they don't like.
But there is hope. The Sixth Circuit appeals court just barely ruled in favor of this thing, and it hasn't gone before the Supreme Court yet. In the case the Sixth Circuit ruled on, they said that the harmful impact of the law on anonymous speech was not a factor for the Connections company. Now on Wikipedia, this law is such a nightmare because it would appear to burden so many types of legitimate speech:
  • Public domain files from years past, which obviously have no paperwork
  • Files from foreign contributors
  • Files self-taken by contributors before the law was passed, if they need to re-upload them for some reason (like somebody deleted them improperly...)
  • Files from contributors who are legitimately concerned about being harassed or victimized if someone tracks them down who dislikes (or too much likes) the article or any of their other contributions under the username
It is clear that this law as amended is intended as a vehicle for ultimate censorship of the internet — to parlay an ill-considered approach against child sexual abuse into a general mechanism to take down any site they disagree with, by claiming that two humans are "simulating" sexual activity and then demanding to see records you don't have — to force labeling of any risque content, which then provides a later excuse to demand that children not see it by requiring the adults to show identification, which then can be used to track and punish their expressions. But the censors have always sought such things and always will, as detesters of truth, exploiters of men, defilers of beauty, and servants of evil. We will have to go out and defeat them once again. Wnt (talk) 18:49, 10 May 2010 (UTC)[reply]
There's another discussion on USC 2257 below, linked here - Stillwaterising (talk) 16:42, 13 May 2010 (UTC)[reply]

How many files are we talking about?

This is a first list of speedy-deleted images (cfr. #Deletion logs). Looks like they're only about 400. Was this mess really necessary to delete such a handful of files? --Nemo 14:51, 8 May 2010 (UTC)[reply]

Here's a toolserver search (courtesy of vyznev) : http://toolserver.org/~vyznev/temp/commons_2010-05-06_scope_deletions.txt


Users that did deletions in this scope (need checking):


 Info I just ran a new version of the toolserver query above and put the results in wikified form at User:Ilmari Karonen/queries/May 2010 deletions. Even with all the false positives, this should be somewhat more convenient than digging through individual deletion logs. —Ilmari Karonen (talk) 20:39, 10 May 2010 (UTC)[reply]

Child filtering system

I'm new, so not in the know. Why has the deletion of sexual depictions been a better alternative to placing automatic child filtering systems, as existing in many sites these days? Or, is it feared that such a feature may serve to 'prove' the claim of "Commons hosting porn?" W00pzor (talk) 20:59, 8 May 2010 (UTC)[reply]

The reasons for that I see are the following:
  1. No one has been willing to develop that software feature so far
  2. It is a lot of work to maintain such ratings
  3. Different people have different interpretations of words like "nude", and different cultures even more so
  4. All content rating systems proposed have seen the argument: "as long as we can also use it to tag images of Mohammed, the caricatures of the Thai king, kissing gay people, Tiananmen Square" etc... Ergo if we are rating, we should not just rate with one culture in mind. TheDJ (talk) 21:44, 8 May 2010 (UTC)[reply]
This is entirely outside Wikimedia's scope. The projects are not censored. Wikimedia should be using your donated funds to make navigation and accessibility better, not hindering it. It is the parent, or school, who should be investing in censorware if they are concerned. - hahnchen 23:37, 8 May 2010 (UTC)[reply]

See my comments above. The purpose of these changes is terribly unclear to me. I don't think it is to make Commons "child safe," because there will inevitably be (for example) war images (or, as has been pointed out, photographs of lynchings) that are by no means child safe. Again, we would have a much better chance of reaching consensus if Jimbo and/or the Board would be clear about what they are trying to achieve rather than suggesting means toward an unspecified end. -- Jmabel ! talk 00:16, 9 May 2010 (UTC)[reply]

As far as I can see, the goal here was to stop fox news from running a story that would go like "OMG Wikipedia hosts hardcore porn!!!1", see Jimbo's comment here. There doesn't seem to be more to it. So no, it's not about the children, or the law, or what's decent. It's about avoiding bad press. --Conti| 00:32, 9 May 2010 (UTC)[reply]
To the extent it was about avoiding bad press it is certainly proving counterproductive. We will now have bad publicity not only for what we did not do, but also for what we did. DGG (talk) 03:59, 9 May 2010 (UTC)[reply]
So let's hope Fox never finds these. Nemissimo (talk) 01:20, 9 May 2010 (UTC)[reply]
Did it occur to anyone that sadomasochism stuff requires 2257 proof of age, and that is lacking for the Iraqis in those pics? They should be deleted then! 2257 dude (talk) 01:46, 9 May 2010 (UTC)[reply]
Completely off track here: BDSM is voluntary. You wanna tell me those people enjoyed the torture?? Seriously... Seb az86556 (talk) 01:51, 9 May 2010 (UTC)[reply]
The "tops" were doing it for gratification despite the fact that the "bottoms" were prisoners, as the courts have found in this case. en:Sadomasochism is not necessarily voluntary, check the Wikipedia article! 2257 dude (talk) 05:45, 9 May 2010 (UTC)[reply]
  • I think it would be a good idea to simply have a 'child-friendly Wikipedia' (sort of like we have Wikipedia simple...) and also one where articles and images pertaining to adult topics (such as sex) are hosted. In fact, why not simply ban children from editing anything but Simple and tell teachers and schools to only have them read Simple? After all, if someone is of the developmental level where seeing a sexual image is going to warp them, I imagine they would not be able to comprehend the advanced writings present on the base Wikipedia and would benefit more from limiting themselves to Simple to begin with. Ty (talk) 01:54, 9 May 2010 (UTC)[reply]
    • Not sure how smart that is - only let children be exposed to Simple words? Pretty soon you will have young adults who don't understand anything with more than two syllables....and yes, I am a professional educator.216.168.116.93 03:26, 9 May 2010 (UTC)[reply]
  • Personally, speaking as a parent, I would not encourage a child to have access to a source of information that is censored in preference to one that is not; it is poor preparation for the world both morally and intellectually. DGG (talk) 04:02, 9 May 2010 (UTC)[reply]

I want a filtering system in place for myself. Not because I have an issue with nudity or anything else, but because I work in a wide variety of environments, often in public, and the current situation is that it's not safe to do things on Commons if at any time I might have something displayed on my machine that would get raised eyebrows, or worse, get me in trouble with the operators of the facility I was at (Airline lounges, client sites, my employer, coffee shops, and the like). Other image repositories offer safe browsing. We should too. Has nothing to do with children. We should adopt ICRA ratings and software to honor them. I suspect I am not the only admin that is hindered in this way. (when I'm home it's no big deal) ++Lar: t/c 04:52, 9 May 2010 (UTC)[reply]

When filtering systems came up before on the Village Pump, we indicated that Commons cannot take a culture-neutral stance on offensive material; if you want a "safe" browser experience the best thing to do is produce a filtering web proxy that maintains its own private, off-wiki list of offensive images, according to your own cultural standards, that can be distributed and used within your culture. Such a system could adapt to the user, rather than trying to create a one-size-fits-all solution. I'd be happy to participate in the creation of such software. Dcoetzee (talk) 01:19, 10 May 2010 (UTC)[reply]
An appropriate system of tagging/categorization could help support any of a variety of filter policies. But I think we should get this particular guideline sorted out before we plunge into that issue. - Jmabel ! talk 03:14, 10 May 2010 (UTC)[reply]
On Wikipedia I started w:Wikipedia:Sexual content/FAQ. Some people here sound like they might be helpful contributors. Wnt (talk) 18:52, 10 May 2010 (UTC)[reply]

Mishmash discussion?

Jimbo added the comment at the top of the proposal saying "since the particular page that was here before was just a mishmash discussion of previous efforts to clarify policy in this area, I'm just getting rid of it and starting again." In reality, the coauthor Privatemusings and I have spent hundreds of hours researching and discussing the prior proposal, and I find it insulting to have it referred to as "mishmash discussion". Thank you. - Stillwaterising (talk) 12:21, 9 May 2010 (UTC)[reply]

So he managed to insult and offend both sides of the fence. Impressive. Wknight94 talk 13:40, 9 May 2010 (UTC)[reply]

Edits

I've tried to edit this into what there's clear consensus for. I think that it's fair to say that there's no consensus that artworks should be deleted, and that we still need to discuss images of sexual acts, but I think that it's far easier to start with the clear consensus, then add to it. Adam Cuerden (talk) 13:04, 9 May 2010 (UTC)[reply]

Looks good. I do have a question about items 4 to 7 in the "Other sexual content" section. Number 4 is straightforward and one I wholly agree with, and number 5 ("User-created erotic artworks") I can understand (though I think the second half of it might need to be rephrased, as it just doesn't sound right to me). But how does 6 differ from 7, or for that matter how do they differ from 4 and 5? Can we collapse them together? Tabercil (talk) 14:28, 9 May 2010 (UTC)[reply]
I think it's worth separating non-user-created and user-created artworks, since, on the whole, I think that 99.9% of non-user-created artworks [if we take "user" to include Flickr users and such] are going to be in some way, and to some extent notable if you dig a bit - and hence worthy of discussion - but it's at least possible that someone might use us as his webhost for self-made art, in which case speedy deletion might be appropriate. As to the rest. I was trying to work in everything mentioned in the previous proposal that seemed to be widely supported, and got a phone call just as I was finishing, so I didn't clean it up yet. Adam Cuerden (talk) 14:54, 9 May 2010 (UTC)[reply]
I've reduced it to four points and an exception. Adam Cuerden (talk) 15:15, 9 May 2010 (UTC)[reply]
  • Definitely good points for regular DR, but too complex for speedy. I don't see a good reason for speedy deletion of any not photographs (or video) -- we have relatively small amount of illustrations, and there are no emergency reasons for deletion (privacy, childporn). What about simple rules for speedy like: Amateur photographs & video -> speedy, professional & old photos -> DR, illustrations -> DR. Trycatch (talk) 15:52, 9 May 2010 (UTC)[reply]

Right. I removed the exception, cleaned up a bit, and redid the introduction to be more in line with Commons' basic principles. It is, after all, possible to be selective without censoring, and I don't think that there's any need to be anything but selective: If it's educational, we can defend our choice to keep it. If it's not educational, we don't want it in the first place. That selection criteria also fits in with our principles on all other controversial content, like depictions of the prophet Muhammed. Adam Cuerden (talk) 16:16, 9 May 2010 (UTC)[reply]

I'm not very happy here. I think the addition was unnecessary and could be misused (for example here). Almost every picture of penises could be deleted under Commons:Sexual_content#Content_which_is_not_permitted.--Ankara (talk) 17:03, 9 May 2010 (UTC)[reply]
Agreed. Redundant to what's above as it already states that "Pornographic material may generally be speedy deleted if they do not fall into at least one of the following classifications" (emphasis original). Tabercil (talk) 17:34, 9 May 2010 (UTC)[reply]

I think the point #3 is redundant (it is a leftover from the previous version, made redundant by the restructuring of the section). It says that a sufficient condition to keep a picture is that it is an “artwork”. I don’t think this should be the case. We want to keep a picture if it is a well-known artwork (covered by #1), or if it is a valuable artwork (covered by #2), or if it has its educational value (covered by #4). Unimportant sexual images by anonymous people without an educational value should not be kept. Heck, they do not even have to be sexual content – any noneducational artwork by a Commons user would be speedied as out of scope. --Mormegil (talk) 19:47, 9 May 2010 (UTC)[reply]

I think we should talk about artworks, since Jimbo's recent actions have shown that people can be very poor at telling the difference between a notable artwork and a non-notable - and we don't want notable artworks speedied out of ignorance, since their disappearance may not be noticed. I've added a footnote explaining the logic and noting the (hypothetical?) situation you mention. Adam Cuerden (talk) 21:29, 9 May 2010 (UTC)[reply]

If Jimbo keeps his God-King powers, with 72% opposition, then we all need to leave. --Alecmconroy (talk) 13:28, 9 May 2010 (UTC)[reply]

At the moment, it looks like he's given them up. Let's see if that sticks. Adam Cuerden (talk) 13:40, 9 May 2010 (UTC)[reply]
According to John Vandenberg, Jimbo still has all his powers right at his fingertips. --Alecmconroy (talk) 14:02, 9 May 2010 (UTC)[reply]
And guess what ---- he always will. Everyone needs to move past that. It is his site. He is never going to give up ultimate power. If it were my site, neither would I. You should be thankful you at least know who is in charge, and can even voice your opinion to him. Were this Flickr or some other media site, your only recourse would be to send an e-mail to opinions@whateversite.com and the response would be a form letter. Wknight94 talk 15:50, 9 May 2010 (UTC)[reply]
It's not his site anymore-- he gave it up a long time ago. We couldn't get tax-free donations if it was his site. Furthermore, he's been deleting things from projects he did not found, on encyclopedias he does not have any authority over, and even from projects whose languages he himself does not even speak.
They all agreed to host on Wikimedia servers, but none of them ever agreed to give Jimbo any special authority to make decisions for those projects. --Alecmconroy (talk) 15:58, 9 May 2010 (UTC)[reply]
And you believe all that? Okay, I won't ruin the fantasy. Wknight94 talk 16:08, 9 May 2010 (UTC)[reply]
<sigh> well, you do hit the nail on the head there.
We're about to find out whether our values are just PR BS or not. At least if Jimbo's powers do trump consensus, there will be no shortage of other wikis to go to. If Jimbo stays, Wikimedia is going to explode into many pieces as all the other projects relocate to other servers --Alecmconroy (talk) 16:17, 9 May 2010 (UTC)[reply]
This isn't about "beliefs", it's about simple facts. Wikipedia is run by the Wikimedia Foundation. Wikimedia Foundation owns Wikipedia, Wikimedia Commons and the other projects, not the private person Jimmy Wales. He's just an ordinary member of the board. Giving him any special powers really doesn't make sense. He could have kept the site as his private property but didn't. He needs to deal with that. /grillo (talk) 16:18, 9 May 2010 (UTC)[reply]
I wouldn't pay much attention to what is privately owned by whom and all that. Wales is in charge and always will be as long as he wants, regardless of what the paperwork says. If he doesn't technically own the site, then he owns whoever owns the site. Etc... Wknight94 talk 18:19, 9 May 2010 (UTC)[reply]
Is there any reason to accept that? Just as we always accept dictators who declare themselves leaders for life? /grillo (talk) 20:20, 9 May 2010 (UTC)[reply]
Please, this discussion is seriously off topic here. Take it to Meta if you wish (or COM:A for revoking his rights at Commons in particular), but also note that most of his global rights *have* been revoked already. Finn Rindahl (talk) 19:05, 9 May 2010 (UTC)[reply]

Sorry to breathe life back into an off-topic discussion, but this misinformation was just inappropriate coming from an administrator. In the United States, an individual cannot directly or indirectly own a 501(c)(3) tax-exempt non-profit organization. Jimmy Wales is a chairman of the Wikimedia Foundation.   — C M B J   09:32, 14 May 2010 (UTC)[reply]

Staying, yet disgusted

I almost left over the 'pedophilia' debate. Now, I'm almost leaving over this rubbish. Really, I want Jimbo made to be a regular user, just like everyone else. I'm going to STAY here, with the purpose of trying to fight against all this. Hopefully I can do some good in this mess of JACFCOM (Jimbo's Anti-Consensus Fundamentalist Church of Morality). Ecw.technoid.dweeb (talk) 15:28, 9 May 2010 (UTC)[reply]

I think that, now that Jimbo's backed off, actual work shouldn't be too hard, and we can get a reasonable policy to move forward with. There are a lot of points no one disagrees with. Adam Cuerden (talk) 15:33, 9 May 2010 (UTC)[reply]
Indeed, when you look at the points that no one disagrees with, and remove Jimbo Wales' somewhat unhelpful work, it turns out we can make a sensible policy that fits in very well with our philosophy. Adam Cuerden (talk) 16:20, 9 May 2010 (UTC)[reply]

"High"

I struck what I see as a too-subjective use of the word "high" in "high artistic..." What is "high"? What is moderate? What is low, or less, or lesser? "Artistic" itself is enough of a subjective minefield without turning it into an all-out war by calling something "high artistic". That's my opinion. Wjhonson (talk) 21:49, 9 May 2010 (UTC)[reply]

"Unused" Genitalia Images

In the interest of facilitating a less ad hoc cleanup, I have constructed Commons:List of genitalia for review, which lists 411 images of human genitalia that appear on Commons, but do not appear in the main namespace of any Wikimedia project. I'd encourage people to use this list to identify images that may ultimately be suitable for removal from Commons, as well as to think about how the rules being proposed here might affect images like these. Dragons flight (talk) 22:57, 9 May 2010 (UTC)[reply]

An interesting list that clearly might spark some deletion discussions, but some of that material is pretty reasonable. I think that less than 10% of that has clear reasons for deletion. TheDJ (talk) 01:26, 10 May 2010 (UTC)[reply]
Thanks for constructing this list. While some of these images may lack educational potential (relative to other existing images), I think it's best to discuss these in deletion review so we can use them as test cases and figure out where consensus lies (undeletion is not as nice, since users don't have access to the content). Dcoetzee (talk) 02:37, 10 May 2010 (UTC)[reply]

It is not important whether it is used in mainspace but whether it can be used in mainspace in some articles. I agree that Wikimedia should have social responsibility and it should delete some dangerous images but ordinary genitalia and sex photos (or illustrations) can be in categories at Commons without actual usage in Wikipedia articles. Commons is not just for Wikimedia projects but for all people who want to use free images. --Dezidor (talk) 10:19, 10 May 2010 (UTC)[reply]

Wikipedia Review again?

No wonder there is irrational and apoplectic footsoldiering for Fox News. Isn't it time to put a warning template on every drama that Wikipedia Review and Peter Damian start up?

Purging sexually-related content of borderline value

The following passage (which I had added to the project page) was removed by User:Ankara as redundant. Perhaps it is, but I am bringing it over here to see if anyone else thinks some or all of this should be salvaged as part of the page. I don't see anything that strikes me as a clear equivalent of this on the page.

Nonetheless, it has been our longstanding practice to be more aggressive in purging sexually-related content of borderline value than of other content of borderline value. For example, we would not typically delete a picture of a building simply because we had another, higher quality picture of a similar building. However, we would typically delete a sexually-related image in similar circumstances. We're happy to have tens of thousands of well-described photographs of normal buildings. We are not looking to have tens of thousands of well-described photographs of normal penises.

Two points here:

  1. That the removal of sexually-related content of borderline value is nothing new. I think that insofar as we are concerned with this as a PR matter, it is important to say this publicly.
  2. This provides an example of how this guideline is used in practice.

- Jmabel ! talk 00:47, 10 May 2010 (UTC)[reply]

This is, while true, also somewhat misleading. We do not, as a matter of policy, give less leeway to sexual images than to non-sexual images. Rather, sexual images tend to be more closely scrutinized for compliance with scope and copyright policy than other "less interesting" images, simply because more editors are interested in investing the effort in doing so. It's the same reason Wikipedia has more coverage of Pokemon than it does of classical mathematicians. I want to avoid giving the impression that "only the most useful sexual images will be permitted here." Dcoetzee (talk) 01:09, 10 May 2010 (UTC)[reply]
I support Dcoetzee in this. We also want a reasonable range of material in cultural scope for instance, even if that material might not be in use. (this is in the range of "different genitalia might be useful in different language encyclopedias". This is difficult to put into words in the guidelines I think, but still important. TheDJ (talk) 01:24, 10 May 2010 (UTC)[reply]
Actually, I'd say we ought to give less leeway to sexual images than, for example, images of puppies. With sexual images, you have factors like the potential for exploitation (e.g. amateur photos released without a person's permission), and damage to Wikimedia's reputation and the accessibility of the projects from people thinking we collect too much smut. Those factors don't affect images of puppies, and I think it makes sense that we treat sexual content more strictly because of it. Useful images, are still useful images, and if that means we have 300 penis pictures used in the projects, then okay. However, I'd personally prefer a policy that cuts strongly against photos (especially of identifiable persons) sitting around if we can't find any current use for them. To that end, my preference would actually be a policy of automatically deleting sexual photos that aren't used in any project after 30 days (or some similar time period to allow for the possibility that legitimate uses would be found). Dragons flight (talk) 01:55, 10 May 2010 (UTC)[reply]
I'll concede that sexual images should be given somewhat less leeway, in light of the drama they tend to incite. If you can give me two good reasons to delete something, then "it's explicit" is nice icing on the cake. On the other hand, I think we should be aggressive about retaining images that may be useful in some educational context, whether or not they are in use at the present time (this is an area that is still expanding on many projects). Dcoetzee (talk) 02:25, 10 May 2010 (UTC)[reply]
I gave an explanation here Commons talk:Sexual content#Edits. Regards--Ankara (talk) 10:12, 10 May 2010 (UTC)[reply]

model ages

re this bit 'explicit photographs and film where it is not clear that all participants are above 18 years of age should be speedy deleted.' - I think we need to elaborate on how clarity is arrived at. The current system seems to be that we assume the image is OK unless someone can bring some evidence that it's not. I'd support the burden being the other way around, so that we err on the side of caution. Unfortunately that may mean deleting many of our most explicit images featuring sexual activity, because we simply have no idea whether the folk involved are 16 or 24. Thoughts? Privatemusings (talk) 02:09, 10 May 2010 (UTC)[reply]

As any bartender will tell you, identifying age based on appearance is sometimes difficult, but often you can make a good guess. The rule we've been following so far - and I think it's a good one - is that if an image appears to contain sexual depictions of people resembling persons under the age of 18, it should be nominated for deletion - at which time the visual impression of age can be confirmed by other users and further documentation can be sought by the uploader and/or other interested parties. I believe speedy deletion is inappropriate in this case because visual identification of age is a good thing to double-check with other users. Dcoetzee (talk) 02:29, 10 May 2010 (UTC)[reply]
does this leave us with media where we can quite happily say 'we're not sure how old that person is, they might be a minor, but they look kind of grown up too'? - I'm not sure I support that position. Incidentally - down here in Sydney, bars display a little sign which says 'If you look under 25, we're going to ask you for ID' - and goes on to explain that folk shouldn't get upset by this, it's just 'cos they have significant liabilities if a 17-year-old slips in. It's that sort of approach I'm an advocate of - would you agree? Privatemusings (talk) 02:35, 10 May 2010 (UTC)[reply]
I actually think that's a great analogy. We can set some kind of rough "visual" age lower limit that helps to keep the false negative rate appropriately low. 25 seems like an okay number, but I'd like to see what others think. Dcoetzee (talk) 03:03, 10 May 2010 (UTC)[reply]
I also think we need to take into account who the uploader is and whether they have a good track record. - Jmabel ! talk 03:19, 10 May 2010 (UTC)[reply]
I agree. If we know that a user is working on a project and is going through the proper processes, then if they make an error with one upload or two we would normally assume that it was an oversight in filling out the on-site documentation rather than a violation of policy. On the other hand, if a first-time user uploads something dodgy, then we have reason to be concerned that the person does not understand policy. FloNight♥♥♥ 12:53, 10 May 2010 (UTC)[reply]
How do you figure out the age of a penis? Even if the uploader says it's his, how do you know? Wnt (talk) 19:01, 10 May 2010 (UTC)[reply]
  • The responsibility falls on the person uploading the image to give a clear indication that 1) the image is compatible with our license for reuse, 2) people in the image gave consent for the release of the image (if it was taken in private), and 3) the people in the image are not minors. If these are not met then speedy deletion is appropriate because 1) having an on-site discussion among other users will not give us the answer to these questions, 2) delay would allow the image to be copied by people planning to reuse it, and 3) the on-site discussion and reuse would potentially expose readers to illegal material or invade the privacy of people in the image. FloNight♥♥♥ 11:24, 10 May 2010 (UTC)[reply]
    • While I understand the concern for privacy, I think this should be considered on a case-by-case basis - many images do not include identifiable persons (e.g. the face is often hidden), and others are professional works that are clearly intended to be distributed. To require documentation of subjects in any sexual image, even where the model is clearly an older person, would create a high barrier stifling almost all new contributions in this area, roughly akin to requiring references on all new articles on Wikipedia. We already distribute illegal works (copyright violations) for a short period before they are reviewed and deleted; this is no different. Another issue is that, in analogy with license status, the uploader's word cannot be trusted. Independent review is required regardless. Dcoetzee (talk) 11:58, 10 May 2010 (UTC)[reply]
      • If the person is clearly an older person and the image clearly looks to have come from an online porn site, then the issue is different, yes. But we have many gray areas where it is impossible to tell... the image is cropped and it is difficult for us to know from looking where it came from, but the people in the image would know, and so would people they know, if word got around that it was here. Since most of these works are easy to replace (nothing really unique most of the time), we are not losing anything of true educational value by deleting them. I also agree that the reputation of the uploader needs to be considered in the decision making. FloNight♥♥♥ 12:16, 10 May 2010 (UTC)[reply]
        • I agree with all the above, but because there are so many factors to consider (apparent age of the person, apparent origin of the photo, value to the project, identifiability, obscenity, prurient interest, reputation of the uploader) it's clear that case-by-case deletion is the only viable option. I think what we should include here is guidelines on factors to consider. Dcoetzee (talk) 19:44, 10 May 2010 (UTC)[reply]

This conversation sounds like trouble. If you find something you know to be child pornography and you tell other people to come look at it, it seems like you're risking being called a distributor. And if you "delete" it (so that only administrators can see it) you're concealing evidence without changing the fact that Wikimedia possesses it. Wnt (talk) 18:58, 10 May 2010 (UTC)[reply]

  • This is not how the law works. Calling attention to a work being distributed is not distribution in itself; and mere possession of child pornography is a much less serious crime than distributing it. However, in the case of unambiguous child pornography, permanent deletion would be advisable if there were a technical mechanism available for doing so. Dcoetzee (talk) 19:44, 10 May 2010 (UTC)[reply]

Per the above, I added 'Generally speaking, uploaders of images which feature participants who appear to be under the age of 25 will be required to clarify, and possibly evidence, consent and the ages of all involved.' - I think maybe I'd like (in time) to drop the 'generally speaking' and strengthen it up a bit - it's pretty important. Thoughts? Privatemusings (talk) 10:13, 13 May 2010 (UTC)[reply]

Sadomasochistic activities are not sex, pictures of them are not porn

I cite the page discussed:

"By "sexual content" we are generally referring to: ...

   * Images that portray sadomasochism"

This definition is simply nonsense.

My statement is only about images which do not show any genitalia, but show nudity, bondage, whipping etc. These activities are not genuine sexual activities. They may be performed for any kind of reason. Some people experience sexual arousal when tying someone up, being whipped, or looking at pictures of such activities. The sexual meaning of a picture is only in the mind of the viewer, not in the picture - unlike pictures of people or horses fucking. Here is a sample of pictures which might turn on many masochists, but I have not heard any threats of deletion against them: File:Arte_por_Ruth_Woroniecki_de_Jesus_en_la_cruz.jpg, File:Il_Gesù_013.JPG, File:Vihanti_Church_Altarpiece_20090627.JPG, File:The_crusifiction.jpg, File:St._Mary's_Cathedral_-_Sydney_-_Other_-_006.jpg, File:Stedman-hanging.jpg.

Images that portray sadomasochism with no genital activity should be taken out of the definition of sexual content completely. --Lightbearer (talk) 18:19, 10 May 2010 (UTC)[reply]

I don't think the purpose of this policy is to say that all pictures portraying sadomasochism are inherently sexual, the point is that pictures portraying sadomasochism should in some way have an educational value to be on commons, just like all other sexual content. --Conti| 18:45, 10 May 2010 (UTC)[reply]
But who decides what is educationally valuable and what is not? Wales, FOX, WMF or the contributors aka "THE COMMUNITY"? Take a look at the abortion debates in the US or the Nazi-related debates in Germany, for example. Today it's sexual content; maybe tomorrow it's creationism vs. evolution - and *fhooomp*, the next thing is deleted because Mr. Wales (or any other admin as well) believes he has to protect WP from whatever, because somebody like FOX is reporting shit again (as they always do). So, what's next?


The inclusion of sadomasochism on a page called "Sexual content" strongly implies that all sadomasochism is sexual. It may be subject to the same scope concerns as sexual content (and puppies), but if it's on this page, it should be appropriately qualified. Dcoetzee (talk) 19:47, 10 May 2010 (UTC)[reply]
I don't disagree, I'm just not sure how we could properly do that. Renaming this page to "Sexual and sadomasochistic content" seems rather awkward to me. Any suggestions? --Conti| 20:05, 10 May 2010 (UTC)[reply]
Yeah, drop sadomasochism from the standard, full stop. If we're not focusing on other potential fetishes (and there are quite a lot) there's no good reason to include S&M or bondage in particular. JoshuaZ (talk) 21:38, 10 May 2010 (UTC)[reply]
I don't know what the term "sadomasochism is sexual" is supposed to mean. It doesn't make much sense. "A penis is a sex organ" makes sense, as does "a length of rope is not a sex organ". Bondage and inflicting pain are performed by many people for sexual effect; for other people it is a more romantic expression of the relationship altogether, with no actual sexual effects. Other people might get off seeing a guy (or woman) in a captain's or general's uniform. --Lightbearer (talk) 16:10, 14 May 2010 (UTC)[reply]
I believe that's an artefact of a somewhat bizarre U.S. legal opinion on the subject. Adam Cuerden (talk) 20:10, 10 May 2010 (UTC)[reply]
  • See §2256, which defines "sadistic or masochistic abuse" as "sexually explicit conduct". --JN466 22:42, 13 May 2010 (UTC)[reply]
    • Also see this earlier comment, above. Apparently, bondage images are classified as depicting abuse, even if the parties were consensually engaged:
      "One comment requests that the Department define ‘‘sadistic or masochistic abuse’’ because some people believe that safe and consensual bondage is not abuse, and requests that the Department distinguish between actual and simulated sadistic or masochistic abuse. The Department declines to adopt this comment. That term is not a subject of this rulemaking. Moreover, actual sexually explicit conduct depends on the content of what is being displayed, not on whether the content is subjectively considered to be abusive." [10] --JN466 22:47, 13 May 2010 (UTC)[reply]
Sounds very strange to me. "TITLE 18 > PART I > CHAPTER 110" is about "SEXUAL EXPLOITATION AND OTHER ABUSE OF CHILDREN". Well, nobody wants to upload images of kids being naked, tied up and beaten to illustrate articles on BDSM topics, so I don't know why this statute is relevant to the discussion. The statute is also quite strange: what is "masochistic abuse"? However, I don't know why US law is applied to Wikipedia, which is an international project. Deletions on Commons have effects on all wikis. The German law is clear: pornography involving sadism is forbidden, but images are only considered porn if there is fucking, masturbation or exposure of genitals involved. If someone is just tied up and whipped, it is not porn. Maybe one server for Commons should be moved somewhere outside the US to avoid nonsense laws? --Lightbearer (talk) 16:10, 14 May 2010 (UTC)[reply]
As I understand it, tying up a fourteen-year old girl's breasts, say, and selling the pictures would be considered dealing in child pornography in US law, even if there is no intercourse, masturbation or genital exposure involved. To be honest, I don't really have a problem with that. The US definition of pornography (also the British definition) is, "If its purpose is mainly just to turn you on, it is pornography." The German definition is quite similar: "Pornography itself is defined by the German High Court as a presentation of sexuality that is not connected to any kind of psychologically motivated human relationship and which glorifies sexual satisfaction as the only reason for human existence, often accompanied by grossly depicted genitals." See #Pornography, below. --JN466 19:02, 14 May 2010 (UTC)[reply]

Educational purposes

We need a clear definition of clear educational purposes.--Ankara (talk) 19:01, 10 May 2010 (UTC)[reply]

Do we?
In Commons:Project scope it says "realistically useful for an educational purpose". Why not stick with that wording? Is it difficult to argue for deletion of crap porn with that definition? Or do we want to delete "realistically useful" media that are not "clearly useful", i.e. files that may or may not be useful in the projects? The founder wants to, but do we as the community?
Do we want media deleted because some admin doesn't see any clear use, although some other would?
--LPfi (talk) 21:04, 10 May 2010 (UTC)[reply]
I was afraid that clear educational purposes would be used as justification for deleting files (as you say "deleted because some admin doesn't see any clear use"). Maybe we should replace clear educational purposes with "realistically useful for an educational purpose". ?--Ankara (talk) 21:10, 10 May 2010 (UTC)[reply]
An excellent idea. Done. Dcoetzee (talk) 21:21, 10 May 2010 (UTC)[reply]

Suggestion for the policy

I want to suggest that it be made clear that we are not here to decide on editorial policy on other projects, and that if an image is in use then it is fine. -mattbuck (Talk) 21:22, 10 May 2010 (UTC)[reply]

The first part of that statement is a fundamental principle of the operation of the Commons. The second part is way too weak. The Commons is a media bank for projects from which they can select what they deem useful. Projects don't just need images they're using right now. The deletion of these is only the most immediately visible act of destruction and focus of immediate anger. --Simonxag (talk) 21:45, 10 May 2010 (UTC)[reply]
Oh I agree, that shouldn't be the be-all and end-all, and it is a fundamental principle, but it seems some people forget it and so I feel it should be made clear. -mattbuck (Talk) 21:50, 10 May 2010 (UTC)[reply]

Necessity of speedy deletions

According to the current policy proposal anything that is not in scope can be speedy deleted. That makes borderline cases very hard to manage. Are we counting on non-deletionist admins checking all deletions made by anti-pornography admins? That seems to be a lot of duplicate work.

I do not have experience in crap-fighting at commons, but I am seriously worried about the change in policy, that seems to me to be quite drastic (n.b. that the WMF board has told us no policy change is needed).

I understand that we need speedy deletions for privacy and legal reasons and for crap not worth discussing. But do we need it for anything else?

--LPfi (talk) 07:10, 11 May 2010 (UTC)[reply]

Let me remind you of 3 main things about Commons. --TwoWings * to talk or not to talk... 07:16, 11 May 2010 (UTC)[reply]
Frankly, we've always speedied stuff that's out of scope. This is why the scope policy is so narrowly defined. Take a look through the deletion log and every day you'll find random personal photos, personal artwork some kids drew up in MS Paint one day, yet more low-quality penis photos, etc. Speedying is appropriate for these, but many "out of scope" deletions during the sexual content purge have gone blatantly against policy, invoking SCOPE in name only. Images in use are not out of scope. Images that could potentially serve an educational purpose are not out of scope. It's the admins who need to be more restrained, and stop abusing the spirit of the policies. Dcoetzee (talk) 07:21, 11 May 2010 (UTC)[reply]
Yes, crap not worth discussing - I should have added "or clearly out of scope". My concern is that we are writing a policy that could give an excuse for something like what happened (on a smaller scale, one by one, leaving the clearly in-scope pictures). I'll add something about borderline cases. --LPfi (talk) 09:33, 11 May 2010 (UTC)[reply]
I reckon I delete at least one penis a week (more at times) for what it's worth. --Herby talk thyme 09:36, 11 May 2010 (UTC)[reply]
I think the main problem is people who don't realize that pictures depicting sex are certainly in scope and certainly (can be) educational. Sex is not some secret evil desire; it's something many people do almost every day. Also, I don't know how the situation is in the USA, but at least in some countries in the world sex education is introduced as early as age 11. The biggest problem with saying that images of sex are out of scope is that teenagers who want to find out about sex are forced to look to other channels (read: pornography). We all know pornography isn't a good representation of what sex is like, or of how the human body looks. It's better if teenagers find out about sex on Wikipedia than from the porn industry... Why do you think a majority of men think they have penises that are smaller than normal? A side problem is of course that men with large penises are more likely to upload pictures of them here, though...
One final point: even if paintings depicting sex are easier to deal with considering personality rights, I think that actual pictures would be much better. The best thing that could happen would be if one heterosexual couple and two homosexual couples of each sex would agree to create these kinds of pictures depicting all kinds of sexual behaviour. I realize though that that would be very exposing to those couples... /grillo (talk) 11:59, 11 May 2010 (UTC)[reply]
Agreed. I've often argued in favor of couples producing more high-quality photographic depictions of sexual acts for Wikipedia, and I believe that depending on context, they may be a more valuable illustration than a drawing. Any policy that would prevent these from being contributed would be a travesty. Dcoetzee (talk) 12:23, 11 May 2010 (UTC)[reply]

USC 2257

Do we have to mention that law in the policy or not ? --TwoWings * to talk or not to talk... 07:14, 11 May 2010 (UTC)[reply]

2257 is... messy. Does WMF qualify as a "secondary producer"? If so, the resultant record-keeping requirements would have an overwhelming chilling effect on new contributions, and the penalties are very heavy. However, the definition described in the WP article seems to imply that there's only a requirement for commercial distributors, and only for photographic works. Constitutional challenges to the law have failed so far, despite an initial victory in the Sixth Circuit. What's the deal here? Dcoetzee (talk) 07:41, 11 May 2010 (UTC)[reply]
Well, I also thought that WMF couldn't be considered a secondary producer. But given recent developments, I began to think it could. So where are we now? If this law doesn't apply to us, why do we have to change our habits and proceed to massive deletions of photos showing sexuality? --TwoWings * to talk or not to talk... 08:58, 11 May 2010 (UTC)[reply]
Jimbo made references to the "2257" (thanks for the link, it's hard for us non-US people following discussions where codenames are used), but only as a point of reference. Has somebody stated that the law concerns us? I think WMF should raise the issue if necessary, they have the responsibility and (access to) legal expertise. --LPfi (talk) 09:23, 11 May 2010 (UTC)[reply]
AFAIK, nobody has said so, but it's also a bit beside the point. Jimbo made a number of references to 2257, but was actually talking about "images that trigger the 2257 record-keeping requirements". The criteria that determine this triggering, and which he put on this page in his rewrite, are actually defined in 18 USC 2256, not 2257. The 2256 definitions do not apply to drawings, cartoons, sculpture, and the like. (Compare Senate Report 108-002 and House Report 108-066.) For child pornography, there's an additional definition in 18 USC 1466A. It does cover drawings, paintings, and so on, and also covers depictions of imaginary people, but has an exemption for works of "serious literary, artistic, political, or scientific value". Lupo 14:41, 11 May 2010 (UTC)[reply]

Whether or not we are required to do record keeping is still highly disputable, and not something that we as a community can determine. Many people have informed Mike Godwin of the 2257 record-keeping requirements and he has not once come out and stated that we as a project require such record keeping. Thus until the foundation is sued, or Mike says otherwise, we do not require record keeping. And as LPfi stated, Jimbo did not say that we were required to either; he was just using the requirements of that act as a reference point. TheDJ (talk) 15:41, 11 May 2010 (UTC)[reply]

I agree that we should not concern ourselves with 2257 without a specific statement from Mike Godwin saying that we ought to, but I really wish he'd say something about it. Mike always says as little as possible - good trait for a lawyer, frustrating for us. Dcoetzee (talk) 23:43, 11 May 2010 (UTC)[reply]

The Case for Using USC 2257 on Wikimedia Projects

A few of the advantages of voluntarily adopting USC 2257 record keeping are:

  1. It's a proven system of record keeping that verifies information like names of subjects, stage names, dates of birth, name of photographer, consent forms, and the location and date the photos were taken.
  2. The legal responsibility for the accuracy and content of 2257 records remains with the record holder, and personal identifying information of the subjects of the photos (and the legal responsibility) remain off-wiki.
  3. It helps fulfill the licensing requirements of Creative Commons, which say that our images must be made available for commercial use; however, currently our pornographic images CANNOT legally be reused for commercial purposes in the US because they lack USC 2257 records. This falls far short of our "free content" ideals (as well as Commons:Licensing).
  4. All primary producers of pornographic images in the US MUST keep records, even if the images were uploaded to Commons. For this reason, pornography transferred from Flickr without 2257 should not be allowed.
I would appreciate help developing this proposed policy. Thank you, Stillwaterising (talk) 06:22, 12 May 2010 (UTC)[reply]
I strongly agree that 2257 documentation is something we should obtain whenever possible - but requiring it for all sexual photographs would just exclude too much content by raising the barrier to contribution too high. Dcoetzee (talk) 07:28, 12 May 2010 (UTC)[reply]
Adapting that policy would mean deleting a lot of existing content. It would also make it more difficult to get some valuable content. If I contribute sexual content I prefer doing so anonymously and not keeping any documents about the participants (other than what I might put on the description page). The procedure would not benefit at all if the records are not checked by Commons: it is as easy to claim the records are kept somewhere as to lie about the ages and consent without that procedure.
Of course following the official procedure has the benefit of most such content (probably) being legal to host also in the future (if the record keeping requirements are followed in practise), but a recommendation to that end would suffice. And most probably people will move without notice and the records disappear.
If Commons takes on the responsibility of archiving the records, then all the details will be readable by a number of persons here (and by the "Attorney General"). Although I do not mistrust these people in general, there is no guarantee that no one would leak sensitive information (under special circumstances).
--LPfi (talk) 07:45, 12 May 2010 (UTC)[reply]
It's been confidently stated that we are under no legal requirement to implement § 2257. Am I correct in assuming that this is because § 2257 regulates commercial distribution of sexually explicit images, and we are not a commercial site? --JN466 12:48, 13 May 2010 (UTC)[reply]
We distribute images that can be used commercially. Still we are not the end-user here. Esby (talk) 13:05, 13 May 2010 (UTC)[reply]
User:Mike Godwin needs to put some input into this. There's a difference between required by law and voluntary use. The case for the advantages for voluntary use is strong, although it would be inconvenient. Images proven to be produced before July 3, 1995 would not need this information. - Stillwaterising (talk) 14:24, 13 May 2010 (UTC)[reply]
Where are you getting the 1995 date? The statute text provided in the links above gives a date of November 1, 1990. Still, it is an important point to recognize that the "2257" law does not apply to all images, only to photos/videos and only to those produced after a given date. From my (non-lawyer) reading of the law, it would not apply to the WMF, as they are engaged in "transmission, storage, retrieval, hosting, formatting, or translation ... without selection or alteration of the content", an exempted activity. But it would apply to any US-based editors who upload images, as they are engaged in "inserting on a computer site or service", an activity covered under the law. --RL0919 (talk) 14:44, 13 May 2010 (UTC)[reply]
There are many sources for the July 3, 1995 date, one of which is here. All "pornographic" images (and I do not wish to go into what is or isn't pornography here) also violate COM:PS#Required_licensing_terms, which requires "free reuse for any purpose (including commercial)." Since these images cannot be reused commercially in the US without 2257 information, this essential licensing condition is not met. - Stillwaterising (talk) 16:39, 13 May 2010 (UTC)[reply]
The "Required licensing terms" text is about licensing, not about non-copyright restrictions. That some users in some country are prohibited from using some of Commons' content does not mean we should not host it (the USA is in no special position regarding reusers). --LPfi (talk) 19:38, 13 May 2010 (UTC)[reply]
  • I agree with Jehochman here. If the received wisdom that "§ 2257 does not apply to us" turns out to be flawed, note that anyone
    inserting on a computer site or service a digital image of, or otherwise managing the sexually explicit content of a computer site or service that contains a visual depiction of, sexually explicit conduct
  • can be imprisoned for up to five years for a first-time offence if they fail to comply with § 2257 record-keeping requirements. --JN466 21:54, 13 May 2010 (UTC)[reply]
I've sent a mail to foundation-l (the Wikimedia Foundation Mailing List) asking for legal help. --JN466 22:32, 13 May 2010 (UTC)[reply]

First, I have to officially disclaim any notion that I'm acting here as a lawyer for editors -- I represent the Foundation only. That said, my view is that there is no Foundation or project obligation to keep records pursuant to the models in uploaded photographs. The obligation is generally understood to apply to the producers of such images, and we're not the producers. Obviously, those who actually produce images such as those described by Secs. 2257 and 2257A may have recording obligations, but there is no duty for us to ensure that they do keep such records. MGodwin (talk) 04:10, 14 May 2010 (UTC)[reply]

I propose mentioning here in the policy (and perhaps in the upload instructions, too) that those uploading images they have produced have a duty to keep records per §2257. This might help ensure that the images we have here have appropriate documentation. --JN466 09:54, 14 May 2010 (UTC)[reply]

OPPOSE!

Starting this as a clear tally of users opposed to either:

a) Jimbo's choice to impose this fait accompli

b) This proposal

c) all of the above

{{Comment}}: this proposal violates the basic principle that commons is NOT censored, & it makes a mockery of all the fine, pretty words wikimedia is built on (& uses in fundraising!)

the way in which this proposal was created also violates the most basic principles of the wikimedia community.

the actual wording of the proposal is a bunch of mushy mealy-mouthed nonsense, aka WEASEL WORDS

either "commons is not censored" means something, or it does not

if it doesn't then we have reached a parting of the ways, & the time to fork the project is now.

if this goes through, in anything like its present form, my future contributions of time (etc.) to wikimedia are going to be extremely limited; it is a very big universe & i shall seek another home, with regret.

as for Jimbo, i have tried, repeatedly, to find "the best" in his actions, but i am now left wondering if any of the "pretty words" in wikimedia's basic declarations of principles actually mean anything to him, or if everything is "negotiable, subject to change without notice"?

after this latest "action from above", i no longer have confidence in his ability to "lead" this project, even symbolically.

(final note: i am already encountering admins who are treating this proposal as policy; that needs to be stopped. now.)

Lx 121 (talk) 09:01, 11 May 2010 (UTC)[reply]


  • I'd ask that you reread it: I've put a lot of work into trying to remove all censorship not required by law. It basically now works out to "Is it educational?" - which is a reasonable criterion for an educational archive, and has nothing to do with how explicit the content is. I do agree that the way this was implemented by Jimbo, and the language he used, had the problems you state - which is why I rewrote it nearly completely. Adam Cuerden (talk) 09:10, 11 May 2010 (UTC)[reply]
    There's something I've never understood: Wikimedia is not only educational, so why does it have to be educational?! Such a word doesn't seem all that pertinent. --TwoWings * to talk or not to talk... 09:14, 11 May 2010 (UTC)[reply]
I think it is "educational" in the broad sense, which includes artistic education, sexual education, historical education, and so on. It may need a bit more work to explicitly include some types of educational material, to make sure they aren't deleted, but it's probably better to have a firm statement of community values, which we can point to as our reason for including explicit images, than to put us in a situation like with Jimbo, where someone attempts to delete all explicit images, because we don't have a policy explaining why we keep them. Adam Cuerden (talk) 09:10, 11 May 2010 (UTC)[reply]

unindenting

my problem is this: "community values" is where the weasel-words start to creep in.

commons is a media repository

commons purpose is to provide educational materials (broadly defined)

commons is NOT censored

these are basic fundamental principles.

kind of like "freedom of expression", "bill of rights"

or "open-source"

when "community values" & various other weasel-words start to creep in, that gets lost.

whose values? which communities?

once we start down this path, where do we stop?

the legal arguments cited by Wales et cie. are basically rubbish

we already have Commons policy for dealing with this stuff!

we already delete materials that are illegal and/or copyvio

we do not need jimbo wales to come sweeping down off the mountain & rewrite everything, on a whim, because "Larry Sanger said... "!

for that matter, do we even know what's GONE?

has anyone started a tally of missing & deleted files?

Lx 121 (talk) 10:17, 11 May 2010 (UTC)[reply]

I think you are forgetting one thing: Commons' aim is not to collect materials that are borderline to its scope. In other words, it does not exist to ensure that you can publish any problematic material under 'freedom of speech'. I'll support any kind of policy as long as it is discussed and contains some logic, so it can avoid problems while not giving away our freedom of action. Also, no need to make a drama out of Jimbo's actions; you are arriving after the battle - they have since been either undone or checked against the old policy. edit: I might have spoken a bit fast. What I mean is that those issues will be solved in due time as they are reported or fixed... Esby (talk) 10:53, 11 May 2010 (UTC)[reply]
2 problems i have with this
1. "materials that are borderline to its scope" - who gets to decide that!?
I think you know perfectly well what I mean. Let's suppose a given limit is decided; you'll find people who decide to produce something right at that limit. This usually results in changing how the policy is applied, or rewriting it. To answer the question simply: policies are applied by the admin who handles a case when speedy deletion is applicable, and by the people participating in the Deletion Request. Esby (talk) 12:52, 11 May 2010 (UTC)[reply]
2. the actions of jimbo & friends have not been "undone or checked against existing policy"; i personally know of at least one case where a file has been (speedy) deleted citing only jimbo's new "policy"; the deleting admin's actions are not listed in their "contributions", & the only record of any of it is in the watchlist email notices i've gotten. when i have the time, i'll be combing through the rest of my email @ that account to see what else has been "disappeared" Lx 121 (talk) 11:32, 11 May 2010 (UTC)[reply]
So you think the admins of commons have been sitting idle since Jimbo removed the rights from his founder flag? There might be a few 'in use' files that were not restored, but yelling about that without giving a list of such cases does not help much. I am assuming that most of the images were undeleted, but there might be a few cases still being discussed or contested against the old policy (e.g. the educational usage of a vaginal tooth photograph); whether relinking has been done is, as far as I checked, still unknown, especially on the other projects. Esby (talk) 12:52, 11 May 2010 (UTC)[reply]

The Commons does exist to give editors on its sister projects the range of materials they need. The Commons does not determine or limit the scope of these projects. Jimbo knowingly damaged a load of projects, including those even he accepts are legitimate and valuable. Thank you to those admins who have tried to limit this (a great deal of effort!!!), but the damage has not all been undone: useful media are (probably?) still missing, and editors at sister projects want to know if they need to host their own images. Jimbo's "policy" changes have so little thought put into them that they add nothing constructive, just problems to be coped with. In this context "porn" is something just projected by the accuser, like "witch" - and we have to have a policy to block it! If you check User talk:Jimbo Wales he's still on the censorship kick and may be pulling the board along behind him. --Simonxag (talk) 11:55, 11 May 2010 (UTC)[reply]

Notable people

Lx 121 wants to make an exception for notable people: sexually explicit content involving them would be allowed without their consent, when allowed by law. I left that exception out myself because I felt there might be situations where such images are allowed by law, but I could not imagine any case where I would feel a legitimate need to have those pictures.

What about somebody performing in a place where taking photos is allowed, who later becomes notable? Wikipedia may very well mention that he/she made money as an erotic dancer when young, but do we need the pictures to illustrate that?

Or a politician going out drunk and having sex somewhere where he/she should have understood there is no privacy. Do we want to share those pictures, if the law allows?

I have a hard time imagining the need. I think we have a lot of situations where decency tells us not to host the images. See Commons:Photographs of identifiable people#Moral issues.

--LPfi (talk) 13:37, 11 May 2010 (UTC)[reply]

An existing policy arrived at after much thought!!! I think there is a problem on the Commons of genuinely private photos with no hint of consent from the clearly identifiable model. So long as an admin was quite clear about what they were targeting, I would personally have no problem with a deletion blitz on these. --Simonxag (talk) 14:18, 11 May 2010 (UTC)[reply]

Wrong direction

I think the proposal as it has been put forward here is misguided and approached from the wrong direction. It seems to want to say there should be no pornography on Commons, then turn permissive by specifying that all kinds of things are not porn. In my opinion, the first thing to look at is what we want in Commons, not what we do not want in Commons. And as Commons is to me still in the first place, though by no means only, a repository for the other projects, what we do want is in the first place pictures that are used or usable by other projects. That is the first thing I want to judge inclusion or exclusion by: is the file useful for other projects? If a picture has no use at all in a project (rare, but it does happen), or if there is a replacement that is just as good or better for any purpose, then I feel no remorse at letting it go. If not, then in the name of being educational, by all means save it. To this we can add files that Commons would like to have for its own reasons. This would include most things already mentioned: high-quality images and works of art. The important thing is, up to this point terms like "sexual" and "porn" do not enter the issue. Only at this point do they: while there is no use in keeping useless pictures of cats or faces, there's also little reason to actually delete them. If it's sexual, we have more reason to avoid becoming a dumping ground for images, so we more actively delete useless and unwanted material. - Andre Engels (talk) 10:49, 11 May 2010 (UTC)[reply]

I largely agree with this. If I may raise an analogy, I think an ideal sexual content policy is a bit like the "biography of living persons" (BLP) policy on English Wikipedia. That policy does not exactly say any special rules apply to BLPs, but rather that existing policy is enforced more stringently and rapidly than it would be for a typical article. I have no problem with out-of-scope sexual images getting gunned down at 10 times the rate of other types of out-of-scope media - this was already happening before Jimbo came around. Certain legal concerns are unique to sexual content, including child pornography law. Other than that, there's nothing here that doesn't apply to our enormous collection of adorable kittens. Dcoetzee (talk) 11:36, 11 May 2010 (UTC)[reply]
respectfully disagree with your analogy & analysis of WP:BLP. BLP policy has been slowly (& painfully!) worked out, step-by-step, & it is by no means either perfect, or complete & "unchanging". this "new policy" is being jammed down our throats by jimbo, basically because "larry sanger said bad/mean things about us!"
we already have policy (& guidelines) for dealing with sexual content, & they work reasonably well. commons is not "flooded with child pornography" (or etc.). we don't even have a disproportionately large collection of legal sexual materials, as compared to our collections for "non-controversial" subjects! it was our beloved SABDFL who upset the apple-cart & now wants a "re-boot".
i cannot support that Lx 121 (talk) 11:58, 11 May 2010 (UTC)[reply]
I've revised my statement. Jimbo's attempt at policy writing was little more than moral panic and dubious legal speculation. I was thinking more of how this policy should be written, with an emphasis on enforcement of existing policy for this type of image. Dcoetzee (talk) 12:19, 11 May 2010 (UTC)[reply]
I also agree with Dcoetzee here. Esby (talk) 12:54, 11 May 2010 (UTC)[reply]
Well, I do not. From what Mike said, Jimbo's actions were not so much dictated by moral panic as by his wish to out-PR Larry on the Fox News field. I have further speculations on other motives but that's not my business. Of course, that changes nothing about the question as to whether these actions were rash, overkill, overzealous, misguided, or anything of the sort. Rama (talk) 13:48, 11 May 2010 (UTC)[reply]
here is the problem i have with the (original) above argument: if the only (real) purpose of commons is to host files for the other wikimedia projects, then why have commons at all? commons uses up a lot of time, man-hours, & other resources; why bother? if that's all we're doing, the project should just be folded up, & media file-handling could just be done through the various wmf projects. at most, we'd need an automated framework for co-ordinating the file databases. if commons is that unimportant, the wmf & the community would be better off focussing our effort elsewhere...
anyone who contributes significant amounts of time & effort here @ commons, however, would (i think) feel that the commons project is more than just that.
food for thought, anyway
Lx 121 (talk) 11:58, 11 May 2010 (UTC)[reply]
This is being slowly thought out now. There is something good in all bad: Jimbo's cowboy mentality towards this page caused people to start reading it and started fixing what was bad and rewriting it. Certainly not all policy pages are well thought out from the start. It's just good that they are given some attention now and then. /grillo (talk) 12:05, 11 May 2010 (UTC)[reply]
  • Lx 121, where do you get the idea that the "only (real) purpose of commons is to host files for the other wikimedia projects"? Certainly that is not the purpose of most of what I upload. I expect perhaps 10% of what I upload to be useful for other wikimedia projects. For example, if I take 10 photos of a moderately notable building, my guess is that one "identifying" photo will be used in Wikipedia and the rest will be here in a Commons category for someone who wants more visuals. And I've certainly uploaded a lot of images of Seattle (where I live) that I don't expect to be of any near-term use: it is more a matter of recording in some detail what this city and its culture look like now because I imagine that several decades from now that documentation will be of great interest. - Jmabel ! talk 16:18, 11 May 2010 (UTC)[reply]
hi: you're reading me wrong, i was disagreeing with the comment of the person who started the topic. my first paragraph (in the above comment about the original post for this topic) is meant as a logical analysis of the position (i.e.: if that is all that commons is for, then we don't need commons as a standalone project). my second (much shorter) paragraph states my opposition to this viewpoint. i too contribute without expecting it all to be used @ wmf projects. Lx 121 (talk) 06:15, 12 May 2010 (UTC)[reply]

I agree with the contributors who say we've already got a policy. I agree there is a problem of unwanted low quality dick shots - at times it's like the uploaders are flashing at us - these can and should be speedied - but that's existing policy. --Simonxag (talk) 12:20, 11 May 2010 (UTC)[reply]

Perhaps this is getting too toned down

I'd probably generally be counted in the anti-censorship camp, but I think this is getting too toned down. I do think it is important not to let Commons be overwhelmed by sexual imagery and to minimize the chances that people run across this when they are not looking for it. In particular:

  • The following statement was dropped, and I think it should be restored: "salacious and pornographic image names and descriptions should be avoided."
  • If I read the current proposal correctly, there is no longer any explicit warning that accounts that are used mainly to upload inappropriate sexual content are subject to warning and, ultimately, blocking. Yes, it is technically redundant to other policies and guidelines, but I think it is important to reiterate here.
  • There is no longer a statement that the policy against personal snapshots of participants and their friends "is enforced more strictly for sexually related imagery." This is longstanding practice, and I think it should be stated explicitly.

- Jmabel ! talk 16:12, 11 May 2010 (UTC)[reply]

Perhaps put some of that together with "commons is not a porn host", under new section: "Rules for uploaders" (as opposed to rules for content). TheDJ (talk) 16:22, 11 May 2010 (UTC)[reply]
I think there is a possible problem with the warning against accounts for sexual content. If I contribute sexual content I might very well do so from a separate account, so as not to make the connection to my real account (or the identities of participants) too obvious. That account may look like a user only uploading a collection of pornography. I would not like to argue in defence of the account (as I would be recognizable), and somebody could easily stop those contributions the first time an admin accused them of uploading inappropriate content.
The warning is good to have, but it must be very carefully worded so that this kind of account is clearly accepted.
--LPfi (talk) 18:06, 11 May 2010 (UTC)[reply]
I strongly agree that "salacious and pornographic image names and descriptions should be avoided", unless we are quoting an external source. This kind of text both denigrates the subjects of the image and obscures any legitimate educational purpose. I also mostly agree with your other two statements, although I think "personal snapshots of participants and their friends" can sometimes serve a valuable educational purpose, if they're either good quality or illustrate a distinctive type of sexual act. Dcoetzee (talk) 00:54, 12 May 2010 (UTC)[reply]
I strongly support the idea of having a policy which explicitly requires impersonal and academic names for illicit files.   — C M B J   02:10, 13 May 2010 (UTC)[reply]
"illicit"? Adam Cuerden (talk) 14:10, 13 May 2010 (UTC)[reply]
A definition of illicit is entirely beside the point--I am vehemently opposed to censorship. All I meant is that we should discourage people from exploiting our resources to draw attention to themselves or to further a specific point of view. We don't want Joe Schmoe uploading a series of slang-hyped self-titled promotional images (e.g., Joe Schmoe money shot 4.jpg) with the sole intent of leeching Commons popularity to launch a career as an amateur pornstar. Selfless contributions are much more true to what we're trying to do here.   — C M B J   03:59, 14 May 2010 (UTC)[reply]
Well, yes, but one definition of "illicit" is "illegal", "forbidden", and so on, so it's a somewhat confusing word choice. I had to look it up when you continued using it to discover it had a milder meaning which simply meant "breaking social norms". Adam Cuerden (talk) 04:05, 14 May 2010 (UTC)[reply]

Agree with Jehochman. Speedy desysopping threat is a bad idea

For the record, I agree with Jehochman's removal of the "hyperventilating", as he puts it. "Summary desysopping" is not an accurate threat. Wknight94 talk 18:13, 11 May 2010 (UTC)[reply]

I have no problem with that, but he removed more than just that, of course. TheDJ (talk) 18:22, 11 May 2010 (UTC)[reply]
Yes, I only noted here instead of warring. Back to the page itself... Wknight94 talk 18:28, 11 May 2010 (UTC)[reply]
I think what's there now is sufficiently nuanced. Jehochman (talk) 19:03, 11 May 2010 (UTC)[reply]
Well this version has too many "must"s and "only"s. I disagree that there is only one case where speedy is appropriate. If someone uploads blatant hard-core porn, then adds it to one page 60 seconds later, it does not immediately become un-speedy-able. That would cause wars all the time. There are always going to be other exceptions - to every rule really. Wknight94 talk 19:50, 11 May 2010 (UTC)[reply]
http://commons.wikimedia.org/w/index.php?title=Special:Contributions&dir=prev&limit=20&target=Jehochman. Erik Warmelink (talk) 22:01, 13 May 2010 (UTC)[reply]

Sexual content PROD

Example of an unused image that probably would not qualify for speedy deletion.

Has one been proposed (IMO, it should be similar to {{BLP-PROD}} at en-wp)? It would seemingly be a good compromise for speedy deletion opposition. Sincerely, Blurpeace 18:20, 11 May 2010 (UTC)[reply]

I originally proposed the BLP-PROD idea. I think what's needed here is to liberally apply speedy deletion to sexual content (not merely nude content) that fails the tests outlined in Commons:Sexual content. When such speedies are proposed, an administrator will decide what to do (thankfully, I am not an administrator here). If the deletion needs to be appealed, the matter can go to deletion review where the burden should be on the appellant to show a reason for the image to be included. Frequently, images fit for deletion are low quality, and have obviously inappropriate file names like "My beautiful ass :-).jpg". A properly composed image, in focus, with a serious, neutral title like "Buttocks of a Human Female.jpg" has much more potential for educational use. Jehochman (talk) 19:13, 11 May 2010 (UTC)[reply]
I agree, but I don't think the inclusionist community at commons-wiki would agree to something so "radical" (at least not in my experience). I would support if this were proposed. Blurpeace 19:18, 11 May 2010 (UTC)[reply]
(after edit conflict) Deleting an image and then discussing it afterwards is very much the wrong way to do things. In all cases where an image is not subject to speedy deletion, the onus must be on the person wanting to delete the image to explain why it should be. If that reason is that it is redundant to a better image, that better image or images must be linked to so that others can decide if they agree or not. If the issue is with the file name, then the image should not be deleted but proposed for renaming. Because Commons is not censored, images of sex and nudity that are legal to host should be treated no differently to any other images. Thryduulf (talk) 19:21, 11 May 2010 (UTC)[reply]
There are many ramifications to hosting massive amounts of sexual content on Wikimedia Commons. Of course these photos should be treated differently. It's ludicrous to think otherwise, in my view. The law sees them in separate categories; why don't you? Blurpeace 20:58, 11 May 2010 (UTC)[reply]
Probably because Thryduulf knows that there are more laws than "the law" which includes the "The Hague invasion act" to protect war criminals such as the occupation forces in Iraq. Erik Warmelink (talk) 22:11, 13 May 2010 (UTC)[reply]
I'm not sure if MORE process is a solution here though. Commons already has trouble keeping up with deletion requests. Something we could consider is using a special template that somehow places all of these deletion requests into a separate category, so it is easier to see how up to date we are on porn deletions. TheDJ (talk) 19:31, 11 May 2010 (UTC)[reply]
If we are going to keep a porn image, there needs to be a demonstrable educational use, not just a nebulous theoretical one, like somebody might want to use this image some day. It takes less than a minute to upload an image. The usual deletion process takes several orders of magnitude more effort and volunteer time. It is unacceptable to put the burden on the community when porn is uploaded without a valid rationale. When images do not have a proper explanation of copyright status, they can be speedy deleted. Same should apply to porn. Regrettably, there is an unusually strong interest in uploading porn images to Commons. As a matter of public relations, it is very damaging for Commons to be viewed as a free-porn hosting provider. That's not our purpose. For non-porn images, we can probably afford to be more relaxed in our standards because there is less risk, and less of a problem. Jehochman (talk) 20:07, 12 May 2010 (UTC)[reply]
Realistically, if the "prod" works like a Wikipedia prod, then no one is going to see it or know about it until the image is gone. I think this would be even more disruptive than allowing admins to speedy-delete things at will, because anyone can put a prod, criteria may vary widely, and non-admins have less to lose for misconduct. Without some special restriction it could even be used by vandals against perfectly non-sexual content from throwaway accounts or IP addresses. Wnt (talk) 19:54, 14 May 2010 (UTC)[reply]
Commons doesn't really have enough people to do a Wikipedia-style PROD. Just look at how many deletion requests have been up for days or weeks with no responses - and those are considerably more visible. Dcoetzee (talk) 02:40, 21 May 2010 (UTC)[reply]

Pornography

It seems apparent that ideas of what pornography is vary from country to country. For reference, in the UK:


Possession of extreme pornographic images

(1) It is an offence for a person to be in possession of an extreme pornographic image.

(2) An “extreme pornographic image” is an image which is both—

(a) pornographic, and

(b) an extreme image.

(3) An image is “pornographic” if it is of such a nature that it must reasonably be assumed to have been produced solely or principally for the purpose of sexual arousal.

(4) Where (as found in the person’s possession) an image forms part of a series of images, the question whether the image is of such a nature as is mentioned in subsection (3) is to be determined by reference to—

(a) the image itself, and

(b) (if the series of images is such as to be capable of providing a context for the image) the context in which it occurs in the series of images.

(5) So, for example, where—

(a) an image forms an integral part of a narrative constituted by a series of images, and

(b) having regard to those images as a whole, they are not of such a nature that they must reasonably be assumed to have been produced solely or principally for the purpose of sexual arousal, the image may, by virtue of being part of that narrative, be found not to be pornographic, even though it might have been found to be pornographic if taken by itself.

(6) An “extreme image” is an image which—

(a) falls within subsection (7), and

(b) is grossly offensive, disgusting or otherwise of an obscene character.

(7) An image falls within this subsection if it portrays, in an explicit and realistic way, any of the following—

(a) an act which threatens a person’s life,

(b) an act which results, or is likely to result, in serious injury to a person’s anus, breasts or genitals,

(c) an act which involves sexual interference with a human corpse, or

(d) a person performing an act of intercourse or oral sex with an animal (whether dead or alive), and a reasonable person looking at the image would think that any such person or animal was real. [11]


Note that this appears to imply that by British standards an image can be found to be pornographic if taken by itself, i.e. if it is provided without context (as tends to be the case in Commons), even if the original overall context might not have been found pornographic.

In the United States, the 1986 Attorney General's Commission on Pornography defined pornography as, "Material that is predominantly sexually explicit and intended primarily for the purpose of sexual arousal." [12]

"Obscenity" is defined by the Miller test:


  1. Would the average person, applying contemporary community standards, find that the work taken as a whole, appeals to the prurient interest?
  2. Does the work depict or describe, in a patently offensive way, sexual conduct specifically defined by the applicable state law?
  3. Does the work, taken as a whole, lack serious literary, artistic, political, or scientific value?

If all three questions are answered "Yes", this makes the work obscene, i.e. not protected by the First Amendment, and liable to be prohibited. According to this webpage, there is a version of the Miller test for minors, serving to define "material harmful to minors" in the US:


It is illegal to sell, exhibit, or display "harmful" ("soft-core") pornography to minor children, even if the material is not obscene or illegal for adults. See also Com. v Am. Booksellers Ass'n, 372 S.E.2d 618 (Va. 1988), followed, American Booksellers Ass'n v Com. of Va., 882 F.2d 125 (4th Cir. 1989), Crawford v Lungren, 96 F.3d 380 (9th Cir. 1996), cert. denied, 117 S. Ct. 1249 (1997). "Harmful to minors" means any written, visual, or audio matter of any kind that :

the average person, applying contemporary community standards, would find, taken as a whole and with respect to minors, appeals to a prurient interest in nudity, sex, or excretion, and

the average person, applying contemporary community standards, would find depicts, describes, or represents, in a patently offensive way with respect to what is suitable for minors, ultimate sexual acts, normal or perverted, actual or simulated; sadomasochistic sexual acts or abuse; or lewd exhibitions of the genitals, pubic area, buttocks, or post-pubertal female breast, and

a reasonable person would find, taken as a whole, lacks serious literary, artistic, political, or scientific value for minors.


Here is a little bit on German legislation, from [13]:


Media with pornographic content are regularly considered to be obviously and severely harmful to minors. Pornography itself is defined by the German High Court as a presentation of sexuality that is not connected to any kind of psychologically motivated human relationship and which glorifies sexual satisfaction as the only reason for human existence, often accompanied by grossly depicted genitals.

Distributing those objects to minors is illegal (§ 15 I and III-VI JuSchG) and will be punished by law (§ 27 JuSchG). In addition, the German penal code (Strafgesetzbuch - StGB) penalizes the dissemination of pornographic content (§ 184 StGB).

Completely prohibited - even among people of legal age - are the depictions of sexual acts involving children, animals or violence. Similar regulations prohibit media with explicitly violent content.

The spreading of pornographic content and other harmful media via the internet is a criminal offence under German jurisdiction. A pornographic content on the internet is legal only if technical measures prohibit minors from getting access to the object (AVS = Age Verification System or Adult-Check-System).


There is no true definition of pornography in Dutch law; the following is a translation of article 240 of the Dutch penal code [14] by TheDJ (talk). The definition of what is offensive is left open and has had different interpretations over time, usually depending on the current zeitgeist.[15]


With a prison sentence of at most two months or a fine of the third category will be punished any person who knows or has substantial reason to presume that an image or object is offensive to the 'eerbaarheid' [loosely translated as decency] and who

1 presents or offers this image or object in or on a place that is public.

2 sends this image or object to anyone, other than upon that person's own request.


Let's look at these (and equivalent definitions in other countries) and figure out how they factor into what we are doing here. --JN466 20:07, 11 May 2010 (UTC)[reply]

Does it matter, for our purposes, whether an image is pornographic or not? Shock value may matter, when an image is not clearly of educational value, but that does not correspond to the image being pornographic. The value for Wikipedia articles about physiology is smaller if the image is pornographic, but in other contexts, such as in articles on sexuality, this is not clear. --LPfi (talk) 21:28, 11 May 2010 (UTC)[reply]
I made a translation of the most relevant Dutch law on this. If anyone can do better, please do so. TheDJ (talk) 23:11, 11 May 2010 (UTC)[reply]
A reminder: we want pornographic images on Commons. Why? Because pornography is itself one of the topics described by Wikipedia and other educational projects. We have legions of articles on porn stars, porn movies, and so on. We would not want to render these all text-only. Dcoetzee (talk) 23:37, 11 May 2010 (UTC)[reply]
Agreed. I am only worried about the minors issue. --JN466 00:25, 12 May 2010 (UTC)[reply]
Seconded. Those definitions of what constitutes "pornography" are all from Western liberal democracies (US, UK, Germany, Holland). What the definition would be in places that aren't that (Saudi Arabia perhaps?) I don't know. But my gut feeling is that we ought to largely sidestep the issue of trying to settle on a definition of what "pornography" is, as it would necessarily be a somewhat subjective and individual definition. Tabercil (talk) 03:20, 12 May 2010 (UTC)[reply]

I would like to provide three generalized scenarios that I observe here:

  • #1. We base our censorship policies on all applicable national laws and censor illicit content if it is required or requested of us
  • #2. We base our censorship policies on some national laws and disregard those that infringe on popular views of free speech
  • #3. We base our censorship policies on no national laws as an uncensored educational project, except those which we are unequivocally subjected to and disclaim as such

In scenario #1, we would be forced to delete images like Tianasquare.jpg and Flag of Nazi Germany (1933-1945).svg--obviously, we've reached a clear consensus that that simply isn't ever happening. In scenario #2, we go down a slippery slope of picking and choosing which laws we do (France, United Kingdom, ...) and don't (China, Iran, Saudi Arabia, ...) want to obey. This absolutely robs us of our neutrality in the grander scheme of things, thus making us out to be ethnocentric hypocrites. In scenario #3, we place ourselves in the line of fire and accept the risk of occasional retaliation from governments around the world, but we retain our status as an intellectually libre compendium of free knowledge that knows no borders.   — C M B J   11:44, 12 May 2010 (UTC)[reply]

Generally speaking, our policy so far has been to only follow United States law, since Wikipedia servers operate in the United States (i.e., they are the only ones who can come with guns and seize the servers Wikipedia is running on). I think this is an entirely practical strategy. As a courtesy, in matters of copyright, we also follow the law of the source country. Dcoetzee (talk) 12:33, 12 May 2010 (UTC)[reply]
That's true, but CollateralMurder.ogv and Virgin Killer.jpg are both arguably in violation of U.S. law. Even so, our pragmatic approach has historically recognized that we are under First Amendment protection when we operate transparently, educationally, sensibly, and in good faith. If, however, we are going to preemptively and methodically censor "obscenity", then we need to be very explicit about our definition of censorship in all related policies and claims. I failed to convey my original message properly, and it has been edited accordingly.   — C M B J   09:19, 16 May 2010 (UTC)[reply]

Art

We have a text that says that all artwork is in scope, but a footnote saying that the full deletion process is preferred for such artwork. On what grounds would the file be deleted?

We also have a text about media (widely regarded as) being of artistic, literary or historical merit. Is this about media other than artwork?

We probably want some art to be deleted, at least user made mediocre art without educational value. Art that is notable, made by a notable artist or of historic value is clearly to be kept. Likewise art that is regarded as otherwise being of educational value. But less known art by less known artists, without special historical or educational value, on what grounds is that to be kept or deleted?

--LPfi (talk) 22:01, 11 May 2010 (UTC)[reply]

I think that is very much a debate. But clearly we don't recognize something as art, just because a flickr uploader stated something. On the other hand, the example image shown higher up clearly has high artistic properties, even though the author might not be a well recognized artist. I don't think we can ever create fully functioning rules on this one. Deletion debates are the only thing we can do. TheDJ (talk) 22:43, 11 May 2010 (UTC)[reply]
Concur with User:TheDJ. There is no community support for retaining pretty innocent artwork like the art discussed in Commons:Deletion requests/File:Woman Posing in Abandoned Site.png; on the other hand there is community support for retaining photos of nude children by notable artists. Sexual content is a much smaller factor than other concerns when it comes to retaining art, so it's not worth spelling out on this page. Dcoetzee (talk) 23:35, 11 May 2010 (UTC)[reply]
I think that there are things that wouldn't be considered artworks themselves, but do come under "part of a work of artistic merit". For instance, still photographs from the necrophiliac ending, or the "Dance of the Seven Veils", from Oscar Wilde's Salome may not be artworks in themselves, but are part of one. I think there's overlap between the two requirements, but each uniquely includes things that the other doesn't. Plus, a little emphasis on a key point never hurt. Adam Cuerden (talk) 14:14, 13 May 2010 (UTC)[reply]

the desired direction of commons - small question

okey dokey - so my longstanding interest in 'wiki porn' occasionally leads me, for purely research purposes of course, to check out other free porn providers to see what's what. I recently discovered a free porn site called 'xhamster.com' which I think is described as a 'Porn 2.0' site, because it seems to feature thousands of videos uploaded by people who register with the site. My understanding is that such sites have somewhat exploded over the last few months / years. So anywhoo... among the rather likely copyright violations, I saw this video, which is a blowjob and facial cumshot, and this video of an amateur handjob, which I have the feeling would be welcomed on commons in the current climate, if they could be shown to be freely licensed.

For me this neatly illustrates what I feel is a current, and certainly believe will be a future, issue of exactly how commons sees itself. I don't think having a large volume and wide variety of photos and videos of all types of sexual activity would be desirable or sustainable for commons, though I also feel I'm in a small minority. I believe the effect on the reputation, and utility, of wikimedia projects would be extreme, and very very far from the asserted educational benefits. Thoughts? Privatemusings (talk) 01:30, 12 May 2010 (UTC)[reply]

Okay. I am worried about the minors issue and about Commons turning into a mirror of xhamster. --JN466 03:52, 12 May 2010 (UTC)[reply]
I understand your feelings here, but as you expected I do disagree - I think amateur pornography videos like these could have a valuable educational purpose in a variety of contexts, and that hundreds of them demonstrating a diversity of types of pornography, sexual positions, orientations, particular kinks, camera angles, etc. would be quite helpful for illustrating a variety of sex-related articles. On the other hand, I think it'd be more valuable if we could get our hands on more objective educational sexual material, like high-quality illustrative photographs and videos of sex positions taken in professional studios, since porn as we know is rarely a realistic depiction of anything except porn. I've had thoughts about arranging a shoot for this type of media myself but would need access to models. Dcoetzee (talk) 04:56, 12 May 2010 (UTC)[reply]
Under your logic nothing could ever be excluded, because anything has some sort of remote potential for educational use. Since your logic, when generally applied, leads to an anti-social result (a Commons full of low-quality, irrelevant images), we must reject your assertion. Jehochman (talk) 12:55, 12 May 2010 (UTC)[reply]
Commons has a defined project scope, and much material was deleted by Jimbo Wales and others and was not restored because it isn't really useful for educational purposes. The key here is to recognize that although Commons is not a free porn provider with the usual endless dreary variety of unphotogenic sex acts, it is also not censored and needs to illustrate articles for a large number of projects, including articles on many variants of human sexuality.
I should also point out that there is not much difference between the freedom that allows you to link to such photos as you've selected above, and that which allows the archival of photos — in fact, attempts to censor material both in the U.S. and Australia would count these the same.
When you speak of Commons going in a certain direction, it's worth pointing out that the outcome of the deletions and undeletions is that people want it to stay where it is. We want to stick with old policy and keep the existing balance. When you take it in another direction, people start agitating for deletions of Muhammad cartoons, photos of corpses, stories from 1895 that glorify rape and so on. It turns into a censorship gold rush and everybody wants to stake their claim. Wnt (talk) 14:59, 12 May 2010 (UTC)[reply]

Until a few days ago, we had a number of explicit photographs (probably amateur photographs) on penetrative use of dildos. Typically, you saw the rear of a kneeling woman, with a dildo inserted in the vagina or anus. (Although deleted here, some of the images are currently still in the Google cache, so if you switch safe search off, you can see them here: [16]) How will Commons:Sexual_content#Evaluating_sexual_content apply when we next get a dozen uploads of such images, and several editors argue at DR that they are of potential educational value? It feels quite off to build up another collection of these images like the one we had. Same with amateur cumshots, of which we had a few. I propose we incorporate and expand the wording of COM:PORN here. COM:PORN links to this proposal, but the amateur porn issue is not currently addressed in the draft. --JN466 20:11, 12 May 2010 (UTC)[reply]

I've added some of the wording from COM:PORN. [17] --JN466 13:32, 14 May 2010 (UTC)[reply]
Responding to Jehochman: like most admins, I do favor the deletion of low-quality images for which high-quality substitutes are readily available. We don't need webcam shots of everybody's penis, but a sex video that is in some way distinctive or unique in its subject matter or presentation, compared to others, is not so easily supplanted. Dcoetzee (talk) 06:22, 21 May 2010 (UTC)[reply]

This proposed guideline

I think we're getting to the situation where this guideline is ready for a community vote. Thoughts? Adam Cuerden (talk) 13:04, 12 May 2010 (UTC)[reply]

I think it is still evolving and that there might be issues not already dealt with. Let us give the proposal another week, we aren't in any hurry, are we? --LPfi (talk) 09:24, 13 May 2010 (UTC)[reply]

Proposal

I propose adding a sentence saying that editors who have uploaded sexual images of themselves may ask for them to be removed again, and that we will comply with such a wish. --JN466 19:45, 12 May 2010 (UTC)[reply]

People often demand that material be deleted when leaving the project in anger, knowing they're causing damage. This could give them the right to delete important images and derivatives in which they're in no way identifiable. We have quite a good record of behaving morally when an image is causing a problem for someone: I think we got rid of a Suicide Girl image because its subject wanted it gone (not actually Sexual Content under our current definition). --Simonxag (talk) 20:38, 12 May 2010 (UTC)[reply]
There are also other imaginable scenarios – say, where someone was drunk when they uploaded the material, or did it as a dare, and later wishes they hadn't. The decent thing is to delete in these cases, if the uploader is the person depicted in the image and wishes it gone. --JN466 20:49, 12 May 2010 (UTC)[reply]
Agreed, in those cases it should go and that also goes for simple nudes, private situations and anything embarrassing. I'm not sure here is the right place to put such a stipulation or how to word it so there isn't an open right to vandalize what you've given under a free license. Also, unless the request is shortly after the upload (when we'd accept the upload as a mistake anyway) I don't see the need for speedy deletion. --Simonxag (talk) 22:21, 12 May 2010 (UTC)[reply]
What about images which were improved by other users, or incorporated into later derivative works? I agree that we should consider this type of image for deletion, as a courtesy, but it should not be speedyable. Dcoetzee (talk) 00:36, 13 May 2010 (UTC)[reply]
OTRS could handle it. --JN466 12:25, 13 May 2010 (UTC)[reply]

How about including a paragraph like this:

Deletion at the subject's request

If Commons hosts a nude image of you, or an image of you engaged in sexually explicit conduct, and the image was uploaded without your consent, and/or you are unhappy with the image's continued presence on Commons, please send an e-mail to permissions-commons@wikimedia.org. --JN466 12:30, 16 May 2010 (UTC)[reply]

  • Yes, I think something like that would be in order. It should be clearer that we will normally comply with such requests, assuming that there is reason to believe that the person who contacts us is, indeed, the subject of the photo and that we do not have their permission on file (since, if we have that, they presumably cannot withdraw it, any more than others can withdraw licenses). I also think we should link COM:OTRS and clarify that only a small number of trusted participants will see the email in question. - Jmabel ! talk 18:24, 16 May 2010 (UTC)[reply]
Well, that's generally a nice idea. But how will OTRS verify that the person emailing is the person in the picture? Require a scanned ID document? IMHO anybody can send an email saying that a picture has to be removed because it depicts the sender. I do not see how we will protect ourselves against vandalism using this deletion method. Cheers --Saibo (Δ) 23:36, 17 May 2010 (UTC)[reply]
It is on the person uploading to make it clear that everything is in order. If there is a question about permission to upload the image, then the material should be removed and only added back later once it is clear that everything is in order. On Wikimedia projects we do this all the time with requests to remove material due to potential libel or defamation. The people who do this work are pretty savvy about sorting it out. FloNight♥♥♥ 00:07, 21 May 2010 (UTC)[reply]
I suggest that the email go to the oversight volunteers on Commons via their mailing list. They are experienced in handling requests for deletion or suppression. On English Wikipedia, OTRS often contacts Oversight for assistance in removing content, and I'm guessing that it happens on Commons, too. If the person wants the content removed, then it should most likely be suppressed so that it is available to the fewest people possible. FloNight♥♥♥ 00:07, 21 May 2010 (UTC)[reply]

Illegal material

We currently only mention child pornography as a type of illegal material. I believe there are a few others. I propose it would be worthwhile to spell these categories out, as per currently applicable US law, so uploaders checking this page can find out exactly which types of content are illegal to upload before they do so. --JN466 20:24, 12 May 2010 (UTC)[reply]

sounds like a good idea. --Simonxag (talk) 20:44, 12 May 2010 (UTC)[reply]
Any help in researching this further would be appreciated. Note the correction above; the Miller test defines obscenity, not pornography. --JN466 23:53, 12 May 2010 (UTC)[reply]
Actually, Jayen, pornography is pretty well defined in US law, see 18 USC 2256. TheDJ (talk) 00:12, 13 May 2010 (UTC)[reply]
I do not see any definition of pornography on the page linked, only of explicit whatever and of child pornography. --LPfi (talk) 09:29, 13 May 2010 (UTC)[reply]
DJ, I'm familiar with that page. What we need to find out is the types of pornography that are actually illegal in the US. Does this include "pornography featuring violence, bestiality, and incest", for example? --JN466 12:00, 13 May 2010 (UTC)[reply]

According to en:Zoosexuality_and_the_law, it is determined by state law, and Florida has no law against bestiality and definitely not against depictions of bestiality. It might fall under obscenity law however. TheDJ (talk) 19:56, 13 May 2010 (UTC)[reply]

See Commons_talk:Sexual_content#Beastiality_and_other_paraphilias discussion below. - Stillwaterising (talk) 17:33, 24 May 2010 (UTC)[reply]

Evaluating sexual content - More about BDSM and violence

In Commons:Sexual_content#Evaluating_sexual_content, under 1, I propose we mention the Abu Ghraib images as a specific example of images that are within scope because of their notability and historical impact. --JN466 20:53, 12 May 2010 (UTC)[reply]

We seem to have lost the BDSM exclusion completely, which is good as it targets harmless stuff that's just a bit bizarre and is great for illustrating the dress up and play aspect of modern sexuality. I think we should replace this exclusion with something worded like the UK definition of extreme pornography with an exception for images of "notability and historical impact": the Abu Ghraib stuff could stand as a paradigm of what should stay as a historical document but should be ditched otherwise. If there's one thing apart from kiddiporn that's going to land us right in it, it's photos, videos or recordings of real harm taken as erotica: if there's any law we might have broken a jury would cheerfully find against us. --Simonxag (talk) 00:18, 13 May 2010 (UTC)[reply]
I'm not sure how I'd word the exclusion but I think we should ban erotic photos and videos of violent harm. Jimbo wanted to ban sadistic and masochistic abuse: that was literally translated into the deletion of images of fetish clothing, e.g. File:Kaalos g locked-in.jpg. As a result it's not surprising that this whole section got rejected. But I don't think a photo of a cigarette on genitals or anybody's Max Hardcore type video should be on the Commons. Violent imagery has its place in art, and historical documents of crimes and the horrors of war are important educational tools, but to have photos of violence in the form of erotica is enough to sanction that violence. I think there are all sorts of complex issues here and I'm not advocating speedy deletion. But I do think this should be a policy. I think this issue needs talking about before we go to any voting stage. --Simonxag (talk) 03:05, 13 May 2010 (UTC)[reply]
This has been on my mind a bit as well. Some standard for violent abuse is needed. Is there a more general one for violent photos of any kind? SJ+ 05:24, 13 May 2010 (UTC)[reply]
I think the current standard is The Commons is not censored. I want it to stay that way, especially for horrors of war, racial violence and holocaust photos: these tell truths we shy away from but must never be suppressed. I think there are standards we need to stick to in the face of a very pervasive and utterly amoral porn industry. I don't think anybody would allow Max Hardcore type stuff even without this policy, but since we're producing a policy, we need to put it in. --Simonxag (talk) 11:06, 13 May 2010 (UTC)[reply]
What we currently have is Category:Urolagnia. I note we don't have Category:Coprophilia and Category:Coprophagia (and whatever the term would be for vomiting videos) as in 2 Girls 1 Cup. Again, are these legal in the US, and would we want to host them even if they are? --JN466 12:08, 13 May 2010 (UTC)[reply]
I don't know for sure - but my impression is pornography is almost certainly legal in the US (I certainly haven't seen any crackdowns on distributors of kinky pornography, coprophilia websites, etc.). Although many people find this type of media offensive, I think Commons would absolutely want it (for example, as an illustration of coprophilia, etc.) Whether or not any particular local project feels comfortable displaying such media in their articles is their own decision. Dcoetzee (talk) 15:21, 13 May 2010 (UTC)[reply]
It's worth repeating that the scope of the project only includes material "realistically useful for an educational purpose". I don't see why these topics shouldn't be illustrated in some way, but total gross-out material (e.g. heavy taboo-breaking realistic-seeming porn) is surely unusable: we're running educational projects, not shock sites. Also, editors and admins should expect some material not to their taste, but shouldn't need a sick bucket to hand while browsing sexuality images: it's worth noting that illustrator Seedfeeder refused to illustrate coprophilia (while accepting that it should be illustrated), because of personal limits. Some issues of taste create issues of usability. --Simonxag (talk) 20:10, 13 May 2010 (UTC)[reply]
I've checked out some of our medical images and have now to admit that some genuinely disgusting stuff can be useful. Categorization could help. Even so I doubt that much of the heaviest taboo breaking stuff from the sex industry or those that copy it, would be realistically usable. --Simonxag (talk) 11:00, 15 May 2010 (UTC)[reply]
Keep in mind we do have articles about shock sites, which presumably could be illustrated with examples of those sites; likewise certain types of pornography are so shocking that nearly any image portraying them would be shocking to an average person (e.g. illustrations of necrophilia). We do want to avoid gratuitously offensive media, but not where it's necessary to illustrate the subject. Dcoetzee (talk) 05:49, 16 May 2010 (UTC)[reply]
In general, though, a screenshot of a shock site would raise copyright issues & wouldn't belong on Commons. - Jmabel ! talk 18:29, 16 May 2010 (UTC)[reply]

More Protection Please

I'm a member of en:Wikipedia:WikiProject Sexology and sexuality. I have transwikied a fair amount of material (I suppose admins can check). I also persuaded other users that the right and proper place for their work was the Commons. That's right, it's me to blame!!! I was pleased to see the material used in other projects. In fact I was quite proud of what I've done (or should that be unrepentant? :-) ). Only now, I find that I've exposed other users' work to summary covert deletion and Sexuality project pages to damage. On the Wikipedia, users are asking for local upload of images. The Commons has let the projects down and I have personally let my project down. Is it safe to transwiki images? Will it be safe when these guidelines are in place and an admin says "not realistically useful for an educational purpose" to justify a covert deletion? We know there are those who want to.

I need:-

  1. An acceptance in the guidelines that sexuality material is valuable, not just an embarrassment ("impact our reputation"). AIDS is now the no. 1 killer worldwide of women of childbearing age, and expanding; the drugs and vaccines won't work, so knowledge is the only defense. Google any sex topic and you get a load of porn (with its take on factual accuracy and social responsibility) preceded by a Wikipedia page. Please help improve these, not damage them. The position of the Wikipedia sexuality pages on the web (like the medical pages) may be problematic, but it's the way it is.
  2. Protection from speedying for Transwikied material. People who produce stuff for projects should not see it deleted if they (or idiot Simon here) decide to share it. Some rubbish will sneak through this way, but it can be put up for deletion and deliberate vandals can always be blocked.
  3. Protection for user-generated artwork. This stuff is created by editors' hard work. Some is poor, some is brilliant: some of the best was among what vanished from pages at the weekend. A variety of artwork has given editors the option to illustrate articles where photographs were not available or were judged by consensus to be inappropriate. The insensitive judgments on this and its summary deletion are insulting and particularly demoralizing to those who are willing to produce it. People are actually being treated as vandals or criminals because they tried to (and in many cases did) improve articles.

--Simonxag (talk) 22:02, 12 May 2010 (UTC)[reply]

I guess you refer in part to the artwork generated by User:Seedfeeder. While I think his account name is daft, I am in agreement that every one of his images I have seen to date is well executed, has potential educational value (independently of whether it is used in any WP project or not) and is within the scope of this project. I would be happy to include a generally worded statement to that effect. I guess the current applicable wording in the draft is
The material is "realistically useful for an educational purpose" (ref Commons:Project scope), such as diagrams, illustrations, photographs, medical photographs of diseases, high quality images of body parts,[4] and illustrations of the various styles of erotic art. The material can have an educational value even if it was not created for that purpose (for example, File:Masturbating hand.jpg illustrating female masturbation).
How would you want to strengthen it? --JN466 22:24, 12 May 2010 (UTC)[reply]
At the other end of the spectrum, for me, is an amateur shot like this. Would you agree that the educational value to the project is nil here? And if so, how can we make clearer to the reader that this is out of scope? --JN466 22:30, 12 May 2010 (UTC)[reply]
That image seems a perfect candidate for speedy deletion. Just another flashing shot, not transwikied or created to help any project, just an "oh no not another one image". --Simonxag (talk) 23:27, 12 May 2010 (UTC)[reply]

Simon: I think the current text is good - not about embarrassment. I agree that it could say a bit more about the tremendous value of good sexual educational material for people of all ages. You are right that transwikiing deserves special note - since cross-project communication is hard. And yes, there should be a stronger clause in support of user-generated artwork that was made to illustrate an article; there is no point in driving our best illustrators away. (It would be nice to track down the illustrators affected by recent deletions and thank them.) SJ+ 00:58, 13 May 2010 (UTC)[reply]

Yes please review our new proposal of Commons:Sexual content. I hope that you will find enough protection in that. Please note that what Jimbo did was deeply tragic, but that many people at Commons share your concern. If you have any material that is still missing, please report it, so that the action can be looked into. TheDJ (talk) 23:34, 12 May 2010 (UTC)[reply]
Nice wording update, Dcoetzee. SJ+ 00:58, 13 May 2010 (UTC)[reply]

Possible use of Commons as a "Porn Gallery"

Restrict the use of Commons as a possible Porn Gallery of erotic images that are not used elsewhere on other projects and do not provide any "educational" value. See User:Max Rebo Band for an example —Preceding unsigned comment added by Tyw7 (talk • contribs) 02:31, 13 May 2010 (UTC)

The current proposal states that excessive sexual content isn't appropriate for user pages. But Max's comment below is relevant. SJ+
hmmm.... I haven't checked when / who added that, but if the rationale / feeling is that the linked user page guideline prohibits the use of sexual content on en wiki user pages then I wouldn't say that's really the status quo. See this section of the rejected en discussions, or this rejected proposal here. I fully support this measure, but had previously removed it as a suggestion, because it was so unpopular, and I didn't really feel it was all that important. Glad to see it back, mind! cheers, Privatemusings (talk) 10:06, 13 May 2010 (UTC)[reply]
The user in question created all the images he links. I think that, if the images are Commons-worthy, which they probably are, he has every right to say "Hey, I made this!" Adam Cuerden (talk) 22:21, 13 May 2010 (UTC)[reply]
Agree with Adam here. --JN466 09:56, 14 May 2010 (UTC)[reply]
just to reiterate (per sj above) - the proposal in its current form would prohibit such userpages. Privatemusings (talk) 00:22, 14 May 2010 (UTC)[reply]
So the proposal is that everybody is allowed to have a gallery of their uploads on their userpage, except people who have uploaded a picture containing swimwear? Excellent idea, I don't see how that could possibly be a slippery slope. Here's a better proposal, how about the 15 "problem users" in this whole Jihad Against Things I Personally Dislike just agree that if they don't want to see an image illustrating w:Suspension bondage, they not visit Category:Suspension bondage...that seems fair. Max Rebo Band"almost suspiciously excellent" 00:28, 14 May 2010 (UTC)[reply]
I've added an exception for users like yourself who wish to document their own Commons contributions. [18] --JN466 13:31, 14 May 2010 (UTC)[reply]
Documenting self-made photos on user pages should never be a problem, and should generally be allowed. But the existence of a personal gallery of this kind should not be the only reason to exempt questionable images from deletion. -- smial (talk) 17:27, 14 May 2010 (UTC)[reply]
The proposed restriction on user galleries makes no sense to me. The User page contains no pictures, it is only a list of file names. If you are worried that users like Max Rebo Band will become the poster children for Wikipedia pornography, bear in mind that if you do ban them, people will simply use various categories about sexual content for the same purpose. This may well be worse for Wikimedia's "image", because real humans have some aesthetic preferences when setting up a gallery. There's also the question of whether you'll then ban user pages that link to just a few favorite categories. Also whether users will start fiddling with the categories to make them more usable as porn galleries. (Would that even be wrong? Should we have a "Category:Tasteful Nudes (female)"?)
Bottom line — with the user page restriction you've broadened your focus from deleting pictures to banning text, and that's a huge mistake. Wnt (talk) 20:21, 14 May 2010 (UTC)[reply]

Consent clarification

The line about requiring consent from the parties involved in media isn't clear about what constitutes sufficient confirmation of consent. It's worth being explicit - an OTRS release by the subjects, perhaps. SJ+

There is the problem that to verify consent you usually need to identify the person. Is it better to leave them anonymous if they want, when there is no reason to doubt that they have given their consent? Maybe it is good to have OTRS handle the thing, but I think the word of a serious contributor to any of the projects might be enough in most cases. Then we are of course left with the problem of judging who is "serious" enough. --LPfi (talk) 10:01, 13 May 2010 (UTC)[reply]
I think requiring consent in all cases is too strong - particularly if the picture was either taken in a public place, or obviously taken in a photo studio for distribution. But it's a good idea generally (and yes, should go through OTRS). Dcoetzee (talk) 08:03, 14 May 2010 (UTC)[reply]
NB: a few months ago, a file was kept after OTRS received a mail from the author saying "this is me in the picture". But that is not sufficient proof, as was shown afterwards when another picture from the same "author" featured another woman! (I'll try to find the DR about it.) So how can OTRS deal with such mails? --TwoWings * to talk or not to talk... 08:12, 14 May 2010 (UTC)[reply]
I agree that the present wording, "Sexual content uploaded without the consent of the participants should be speedy deleted, when these are living or recently deceased people. Such pictures would usually unreasonably intrude into the subject's private or family life. See Photographs of identifiable people." does not make sense as it stands. There is no way for an admin to know whether or not consent has been given, unless we require an OTRS release. I am in favour of requiring an OTRS mail with the person identifying themselves -- it would probably reduce the "here is a nude picture of my (ex-)girlfriend" uploads, in favour of material from professionals who are used to dealing with record keeping requirements. Even so, as TwoWings points out, the method is not foolproof. --JN466 13:43, 14 May 2010 (UTC)[reply]

This is quite a difficult matter to get right. Some thoughts on this:

  • It seems appropriate that any image showing an actual living person engaged in sexually explicit conduct should not be uploaded without that person's consent. Some people will definitely not be fine with a former partner or someone they only briefly met uploading images of a sexual encounter. If done without consent, this is analogous to a BLP violation ("Do no harm"). There is a very eloquent post from DGG on the Foundation list about this here which I would invite you to read.
  • Some sexually explicit images show faces, others do not. It is tempting to say that we should only need OTRS consent for images where a person is identifiable, and that consent does not matter for images where people's genitals are shown from behind, in close-up, or their face is out of shot. However, there are obvious risks in yielding to that temptation. A person whose image was uploaded without their consent will still be harmed if the uploader tells everyone, "Look, I uploaded an image of so-and-so's genitals, and this is the URL".
  • Some people may be fine with, or enjoy, having a picture of their genitals uploaded, or a picture of them engaged in sexually explicit conduct, but they may at the same time be reluctant to mail OTRS with their name and e-mail address saying, "The person demonstrating oral sex in that picture is me, and I am fine with that picture on Commons". By requiring people to mail their consent to OTRS, we may be reducing the number of valuable sexually explicit images we get.
  • It is impossible to verify that the person sending the OTRS mail is actually the person shown in the picture.
  • All of this is much less of a problem with professional porn actors. We have some professionals who contribute their images to Commons, and I am beginning to think this is something to be encouraged. Obviously, such material needs vetting for educational value (to avoid Commons being used for self-promotion), but many of the above problems disappear with professionals -- professionals in this field are by definition fine with images of themselves engaged in sexually explicit conduct being publicly available, and are also used to complying with all the relevant record keeping requirements. I even wonder if we should do an outreach to the industry.
  • Actually, this does not just apply to sexually explicit images. It also applies to nudity. --JN466 06:40, 15 May 2010 (UTC)[reply]
    • I disagree with that last statement. At least in the U.S., if an adult is nude in public (e.g. a parade, a stage performance, or even in the audience at a music festival) there is absolutely no need to get the subject's consent for a photo. Consider Category:Solstice Cyclists. Most of those people are not wearing clothing (although they are mostly wearing body paint). Under U.S. law, there is no more need to get individual consent for these photos than any other photos of the parade in which they were participating. (Of course, personality rights still apply.) - Jmabel ! talk 16:20, 15 May 2010 (UTC)[reply]
      • I'd agree with you where it concerns a public event like that. It is different with photos from a nudist beach though, or a photo taken in someone's home. --JN466 20:00, 15 May 2010 (UTC)[reply]
  • In the past I have suggested that such people should simply have a verified and enabled email account. Easy to check for all users, and traceable for law enforcement agencies. TheDJ (talk) 17:56, 15 May 2010 (UTC)[reply]

There was a change in the wording so that consent is needed not only for living and recently deceased people. Models who died before the Internet era cannot have given their consent to uploading, to give a clear example of why this is a bit too strict. But I do not know of any working criterion other than "recently deceased". Any ideas on who should be protected? --LPfi (talk) 12:02, 21 May 2010 (UTC)[reply]

Filenames

Another example brought up recently: file:Taric Alani Cum shot.jpg -- using porn slang to describe sexual acts.

I fully agree with this — making sure the filenames stress any potential educational use would greatly help our cause with no loss of content or quality. I recently proposed a rename for File:Cumshot in Super-Slow-Motion (270 fps).gif, which sounds purely pornographic but which could be seen as an educational resource. Wnt (talk) 20:26, 14 May 2010 (UTC)[reply]
I've added a note about files needing to be renamed and descriptions rewritten as necessary. [19] --JN466 11:33, 16 May 2010 (UTC)[reply]

Edit summary screwup

I accidentally hit "enter" and messed up this edit summary. I meant to say "pedantic ==> academic: pedantic is mainly a pejorative." - Jmabel ! talk 00:14, 14 May 2010 (UTC)[reply]

"Other laws" section revised

I've incorporated Mike's input on record keeping requirements above. --JN466 10:35, 14 May 2010 (UTC)[reply]

Good work on that. TheDJ (talk) 20:39, 14 May 2010 (UTC)[reply]
I've added a little more specificity. Adam Cuerden (talk) 20:51, 14 May 2010 (UTC)[reply]
Adam raised an important concern. § 2257 does not apply to classic art, and we need to fix this, as Adam tried to do here. However, I am uncomfortable saying that "most such laws apply solely to photographs and film". The actual wording in § 2257 is "any book, magazine, periodical, film, videotape, digital image, digitally- or computer-manipulated image of an actual human being, picture or other matter". It also states that the requirement only applies to material produced after November 1, 1990. To get our wording more in line with § 2257, I've incorporated that date (which automatically makes clear that this does not apply to models who sat for a painter in 1890), and paraphrased some of the language from § 2257. [20]. --JN466 06:06, 15 May 2010 (UTC)[reply]
Looks good to me. Dcoetzee (talk) 01:27, 15 May 2010 (UTC)[reply]

Scope of guidelines

What is the relationship of this guideline to Commons:Nudity? It currently includes "media that prominently include genitalia", but aren't those covered by the other guideline?

Perhaps it would be better to merge the two guidelines, to avoid redundancy, arguments about which one to use, and having parallel ongoing disputes in two different places.

Also, you seem otherwise to be going by various unfortunate "exceptions" to First Amendment rights, but w:Miller v. California also speaks of excretory functions. In theory, a photo of someone defecating might not even be "nudity" let alone sexual content as defined so far. We might as well have that debate now, because few of us have any desire to have to go through a bunch of "scat" pictures making deletion and undeletion arguments! Wnt (talk) 20:14, 14 May 2010 (UTC)[reply]

Merge the bits about use of sexual images for "shock" vandalism (including, perhaps, a mention of the "restricted use" list). The rest of Commons:Nudity is largely redundant to the material here and can be eliminated. As for Miller, my general impression is that we are ignoring obscenity law - as far as I know there is no case law on what the "community standards" for an international Internet community would be. Dcoetzee (talk) 01:37, 15 May 2010 (UTC)[reply]
Comment: is there a general Commons policy touching on shock images and the 'principle of least surprise'? --SJ+ 03:44, 17 May 2010 (UTC)[reply]
I thought we abided by Florida law and community standards. --Simonxag (talk) 10:29, 15 May 2010 (UTC)[reply]
We do abide by Florida law, but it's unclear how obscenity law applies to Commons, since it has traditionally been applied with respect to community standards of the target audience (and obviously Florida is not the target audience of our work). Should this page include the Miller test? I don't really know. Dcoetzee (talk) 19:08, 15 May 2010 (UTC)[reply]
One of the documents I've read over the past few days (can't remember which, sorry) stated that in the case of Internet materials intended for an audience beyond the population of the state, the community standards to be applied in the Miller test were those of the entire United States. There is something on this in the Internet Law Treatise wiki, but this needs more research. It looks like there have been conflicting decisions in recent months, so it may be safest to assume Florida community standards will apply. --JN466 11:17, 16 May 2010 (UTC)[reply]
I guess we should add a section on the Miller test in the "Other considerations" section. As for shock images, note that Paul Little aka en:Max Hardcore is currently in jail on obscenity charges. He was convicted in Florida. According to Gawker, "The jury ruled the films, which include scenes of vomiting, violence and urination, were criminally obscene... Little apologized to the court and said the videos and DVDs in question were labeled and intended for the more permissive European market, not for sale in the United States." Tampa Bay article. Scholarly article on the case. --JN466 11:50, 16 May 2010 (UTC)[reply]
I'm concerned about us limiting our content based on obscenity law when there's no case law to indicate how it should be addressed for a website like Commons. I admit though there's reason to be concerned when physical media and paper versions of Wikipedia are produced how obscenity law will come into play. This is a difficult issue - maybe Mike will say something about it. Dcoetzee (talk) 05:13, 17 May 2010 (UTC)[reply]
Why don't you drop him a note? Wikimedia may be a less likely target for law enforcement than Max Hardcore, given that the problematic content represents only a tiny proportion of what we have, but we should endeavour to abide by the law; firstly because it's just ordinary citizenship to do so, and secondly because it would make very poor PR if it were found that we had knowingly ignored obscenity laws.
Note that depictions of rape are apparently also not protected by the first amendment, regardless of the consensual nature of any staged scene. [21] --JN466 18:15, 17 May 2010 (UTC)[reply]
Okay, I've dropped him a mail - I'll post a new thread when I get a response. :-) Dcoetzee (talk) 22:31, 18 May 2010 (UTC)[reply]
I agree with the proposed merger of COM:NUDITY and this policy. We should add a consent requirement for images showing non-public nudity. I propose public nudity should include parades and stage performances. I am in two minds about audiences at a pop concert. If I get naked at a concert, it does not necessarily mean that I want a photograph of myself naked released via Commons for the world to reuse under a free licence. Non-public nudity would include media recorded in someone's home, at a private event, in nature, or on a nudist beach. --JN466 12:18, 16 May 2010 (UTC)[reply]

Active discussion about use

The guidelines currently include "in use for a substantial time on some Wikimedia project for educational purposes" as a reason why an image is likely to be acceptable. However, I think we should also say that media that is the subject of an active discussion on a project should not be deleted (legal issues aside) while that discussion is ongoing.

I'm thinking about cases where for example there is a talk page discussion about which of 2-3 images should be used in the article (w:talk:Naturism has seen a few such discussions for example). Sometimes the images under discussion are shown as thumbnails (in which case they should show up on global usage, and we should just clarify that this sort of talk page usage counts as "in use for educational purposes"), but sometimes they are only linked to. In the latter situation, I don't know whether it's easy to find out about the existence of such discussions or not? Thryduulf (talk) 08:23, 15 May 2010 (UTC)[reply]

I strongly sympathize, but I don't know how that could be implemented. Anger at the damage to existing pages stopped the deletion process, but I fear useful images being quietly deleted when they're not currently in use. --Simonxag (talk) 10:47, 15 May 2010 (UTC)[reply]
It's impossible, at present, to identify all images being considered in active discussion - there is no "whatlinkshere" across all projects. If they actually display the image in the discussion, we will find it when reviewing uses, and in this case it may be possible by looking at the timestamps to tell that an active discussion is going on, even if you don't speak the language. This is a bit of an edge case though, and even if it's deleted, it would be undeleted if a protesting user came over from another project. Dcoetzee (talk) 19:13, 15 May 2010 (UTC)[reply]

"Non-promotional" tweak

I'm a lil' concerned about how the last clause in the "Evaluating sexual content" section currently reads:

The material is non-promotional and the specific content or the content's creator is notable.

I don't have any problems with the text from the word "and" on, but it's the first part "material is non-promotional" that concerns me. Does "non-promotional" mean something that is not an advertisement? That can't be the case as we have now-PD Coca Cola ads (e.g., File:Cocacola-5cents-1900 edit1.jpg) on Commons, and for a time we had PETA advertisements on Commons (at least until Commons:Deletion requests/Images in Category:People for the Ethical Treatment of Animals cleared most of it out). In fact, the ability for a given image to be commercially re-used is a prerequisite for uploading to Commons. So I'm going to be bold and strike that part. Tabercil (talk) 17:22, 16 May 2010 (UTC)[reply]

I undid your edit because the phrasing you chose — "The specific content (e.g., the people depicted) or the content's creator is notable" — would have implied that any freely-licensed photo of notable porn stars having sex would be acceptable. Notability should refer to the work itself (the painting or the photograph), not to its subject matter (the people depicted). — Tetromino (talk) 17:52, 16 May 2010 (UTC)[reply]
Probably not acceptable for pornographic reasons, but a picture of a notable porn star having sex would otherwise certainly be in scope. Picturing someone doing his job is usually a good way of illustrating an article. Just looking at the images, Category:Porn actresses looks quite different to Category:Politicians, so a good image often somehow depicts the identity of the person and his (desired) role. A politician wants to look important and upright, a porn actress sexy. Remember what happens if pictures of politicians in "certain" poses are leaked... --PaterMcFly (talk) 19:40, 16 May 2010 (UTC)[reply]
Sigh... let me try to kill two birds with one stone. Okay, the "non-promotional" bit I still think should be gone for reasons stated above, so I'll pull that out. As for your statement "Notability should refer to the work itself (the painting or the photograph), not to its subject matter (the people depicted)" - I do think that you're wrong, otherwise we'd probably have to pull out 2/3rds of the photos we have of well-known people, as they simply serve to illustrate said celeb and don't make a point in and of themselves. As an example, File:Dee Roscioli.jpg... what is she doing in this photograph that would make it notable? Tabercil (talk) 21:30, 16 May 2010 (UTC)[reply]
She is not having sex on camera. I have no problem with non-sexual photos of porn stars, but I do have a problem with sexual photographs that are not useful for illustrating encyclopedic material. The whole point of COM:SEX is to ensure that useless photographic porn gets deleted; allowing sexual images of anyone notable would mean allowing any porn that has at least one notable performer, and that would be such an enormous loophole that we may as well delete COM:SEX entirely. If we explicitly permit pornographic photos, we are basically turning Jimbo's proposed guideline on its head, completely subverting it, and I don't think that's a direction we should be taking. — Tetromino (talk) 22:27, 16 May 2010 (UTC)[reply]

I'm the one that added the non-promotional clause, because we need something in place to prevent people from exploiting the project's popularity to launch an amateur career as a pornstar. Coca-Cola and PETA were not what I had in mind.   — C M B J   23:18, 16 May 2010 (UTC)[reply]

Remember that things not covered by the list may be speedy deleted. Speedy deletions leave little record of their happening, and should thus only be used for clear cases. If we want to discuss sample reasons why something might be considered for a nomination for deletion, I think that's best done in a linked essay. Adam Cuerden (talk) 00:47, 17 May 2010 (UTC)[reply]
An essay is a good suggestion, but those files already are nominated for deletion, and consensus suggests that they will be kept (and rightfully so) because they were nominated in bad faith. Similar outcomes can be expected to occur in the future if policy does not discuss the use of material that is overtly promotional in nature.   — C M B J   01:19, 17 May 2010 (UTC)[reply]
Well, even so, speedy deletion isn't the right way forwards, really, since, well, we don't want a creator of good-quality material to get driven off because of speculation. For instance, Max Rebo Band's work, discussed briefly above, really is in quite good taste. One wouldn't want him being driven off because of speculation that he might be getting some benefit from his work - we could say the same about any of our photographers.
Let me try something. Adam Cuerden (talk) 01:35, 17 May 2010 (UTC)[reply]
[22] - see what you think of that? Adam Cuerden (talk) 01:38, 17 May 2010 (UTC)[reply]
That works for me. The occasional explicit photo of a notable porn star is fine, but wholesale uploads of porn would likely be caught by the two clauses Adam's pointed out in COM:NOT. Tabercil (talk) 12:32, 17 May 2010 (UTC)[reply]
I'm concerned that a non-promotional clause is simply misplaced here. We don't want people using Commons to launch their career as a garage band, or a street performer, or a professional photographer, or a clown. I don't think this is a substantially greater problem in practice in this area. We need a general policy to address this issue, not just a clause in Sexual content. Effectively, you're asking for Commons to have its own version of Notability, which so far has been quite ad hoc (e.g. deleting "personal" photos of non-notable people, etc). Dcoetzee (talk) 05:09, 17 May 2010 (UTC)[reply]
Woof. How many different projects do we feed, and each with their own slightly different answer to the question of "what does 'Notable' mean?" Good luck trying to distill an answer to one common definition that does not contain some variant of the phrase "see other Wiki". Tabercil (talk) 12:32, 17 May 2010 (UTC)[reply]

Calling the Cops—how is this addressed?

Because the guideline speaks of child pornography, it should speak of what the reader is advised to do if he is unlucky enough to find some honest to God kiddie porn uploaded here.

From the abundance of horror stories about bizarrely interpreted laws (e.g. AA BBS[23]), I don't think a viewer should count on a friendly reception. My uninformed guess is that if you dial 911 to your Podunk Police Department and tell them you just saw some horrible child pornography on Wikipedia, what they're going to do is come down, arrest you, search your house, and ransack your computer for the image so they can charge you with it. And if your browser cache was purged, etc., then they'll undelete it and they'll charge you for possessing it and for tampering with the evidence afterward. And if you still were minded to ask them about the image on Wikipedia they'd tell you "they'll get to it..." I hope I'm wrong, but I wouldn't count on it.

Now of course if the user doesn't call the police then he's just one of the sickos who went to our pedophile site to view the image, and he's liable to be caught in a roundup also.

So what I'm wondering is whether the user in that situation should be encouraged to contact some sort of national hotline in collaboration with people from the Wikimedia office, where the cops at least have the jurisdiction to prosecute someone other than the person making the report. In theory, if there were actually some particularly sadistic kidnapper uploading images here to taunt a family, you might even make a timely response to save a child that way.

But as I said, I don't really know. What would you say? Wnt (talk) 19:52, 17 May 2010 (UTC)[reply]

P.S. Before the whole thing with Jimbo I had this weird discussion at Wikipedia where some people were trying to interpret their policy to actually ban users if they called the police.[24] I think that was an absurd reading of it but it may be of interest to people here. Wnt (talk) 19:55, 17 May 2010 (UTC)[reply]
I'm not an admin, but I understand from comments I've seen in the past on the village pump, that when child pornography or similar is uploaded it's immediately deleted (which does leave the material still available to investigating authorities) and the FBI is called. Maybe I'm misremembering the organization contacted, but admins do not sit on their hands in such circumstances. There's been a problem, in the past of pro-pedophile activism on Wikipedia, which has been thoroughly stamped on by action from below and above: this remains controversial but is accepted by a general consensus, because pedophiles practice what they preach and children use the Wikipedia. Editors and admins have had to deal with the real thing long before Fox started their imaginings. --Simonxag (talk) 21:25, 17 May 2010 (UTC)[reply]
Reports are to be made here, from what I understand. - Stillwaterising (talk) 11:16, 18 May 2010 (UTC)[reply]

Are Wikipedia policies Commons policies?

This policy suggests that w:WP:User pages be "transwikied". That is a very long and in my view unduly harsh policy. Does a Wikipedia policy just automatically carry over here, or do people have a chance to consider whether they really want something like that?

  • If Wikipedia policies do just automatically apply, then shouldn't this page be added to Wikipedia and merely referred to from Commons?
  • If Wikipedia policies don't just automatically apply, people deserve a chance to have a vote on a version made for Commons.

Wnt (talk) 20:15, 17 May 2010 (UTC)[reply]

No the suggestion is that we should probably create something like that for Commons as well. Basically, it's a comment in the policy atm. TheDJ (talk) 20:26, 17 May 2010 (UTC)[reply]
You could put a proposal to Village pump, maybe a simpler version would be more appropriate here. - Stillwaterising (talk) 23:49, 19 May 2010 (UTC)[reply]

2257 aka picture prevention mechanism

Regarding this edit: [25]: "However, editors who have produced and wish to upload such media may have record keeping obligations and are encouraged to submit 18 U.S.C. § 2257 record keeping information in photo description."

Why the word "encouraged"? Is it needed or not? What's the consequence if one does not do so? Will the image be deleted if the 2257 info is not submitted? Does this only apply to US citizens? Please do not let us use WP:FUD tactics here to apply censorship to Commons. Cheers --Saibo (Δ) 18:49, 18 May 2010 (UTC) fixed wrong "D" in my comment --Saibo (Δ) 21:58, 18 May 2010 (UTC)[reply]

Well, I think the statement was formulated this way, because (see reference 10) Mike Godwin, the wikimedia lawyer, argues that 2257 wouldn't really apply to commons at all. --PaterMcFly (talk) 20:08, 18 May 2010 (UTC)[reply]
Thank you for this hint. I missed the ref in this paragraph. He writes in this ref: "those who actually produce images such as those described by Secs. 2257 and 2257A may have recording obligations". That is not a satisfying statement; we should not leave our contributors in uncertainty (by saying that they "may" have to keep records). As it stands, we are scaring contributors by mentioning that they may have to. So we expect our contributors to either be sure that they do not have to keep records, contact a lawyer themselves, keep records, or knowingly take the risk and not keep records. I think we should move our servers to a more free country. ;-) --Saibo (Δ) 21:58, 18 May 2010 (UTC)[reply]
2257 is a stupid, stupid law, but unfortunately migrating the data and staffing the office in Canada or Denmark would be quite the undertaking, and they have their own sets of unsavory legislation. Nobody's going to take this suggestion seriously. Maybe Wikipedia should operate in international waters like a pirate radio station. ;-) More seriously, we can't specifically recommend whether or not contributors should keep records, as we should not be providing that sort of legal advice - it's better to say they should familiarize themselves with the laws and talk to their lawyer. Dcoetzee (talk) 22:20, 18 May 2010 (UTC)[reply]
As you can see (and may have seen) from the ";-)", this wasn't meant seriously by me either. At least this notice of 2257 isn't mentioned in the upload form, so only a few people will see it. But this is neither fish nor fowl. If we linked this notice in the upload form, uploads would probably only be made by anonymous socks, I guess. I would appreciate it if we could be a bit clearer (for example: does it apply to non-US citizens? Will they be imprisoned on their next vacation to the US?). Or is this sort of advice forbidden in the US? I do not know a solution to this (admittedly small) problem currently. Cheers and have a good night. --Saibo (Δ) 23:12, 18 May 2010 (UTC)[reply]
I am all in favour of making contributors think twice about what they are doing before uploading sexually explicit content. --JN466 01:18, 19 May 2010 (UTC)[reply]
  • I am in favour of including something about nude and sexually explicit images in the Upload template. Not just in terms of record-keeping, but also (and perhaps more importantly) in terms of personality rights. E.g., if you upload a private (non-public) nude or sexually explicit picture, have you sought the consent of the person depicted before uploading? This is really a discussion for another day, and another page, but the case described here for example by George William Herbert indicates it may be advisable. --JN466 01:14, 19 May 2010 (UTC)[reply]
  • I think that "sexual content" needs to be an option when choosing an upload. Pictures uploaded to this category should remain hidden from the general public until reviewed first for legality and scope. If the picture is not legal, then it should be reported to the proper authorities (procedure needs to be worked out for this). If it is "out of scope" it should be speedy deleted and the uploader notified. If the picture is "in scope", then the uploader should be asked to provide identifying information. See this posting to foundation-l for a suggestion on how this can be done. - Stillwaterising (talk) 23:33, 19 May 2010 (UTC)[reply]
Hide first, until some admins have decided it is "okay". Sounds like a great mechanism for censorship. :-( --Saibo (Δ) 00:26, 20 May 2010 (UTC)[reply]
Sort of like an image equivalent of flagged revisions? I dunno, maybe... but we'd need some dedicated admins to keep the queue time down, and looking at the average time images sit in deletion requests, I think that could become a serious problem, depending on the volume of images. Dcoetzee (talk) 00:34, 20 May 2010 (UTC)[reply]
Yes, basically flagged revisions. We could start a new class of user called "sighted user" that gives access to these images. Perhaps 200 revisions minimum and administrator approval. Would also be nice if user confirmed by email that he/she were over the age of 18 (as required by law in several countries). OTRS should have access too of course. - Stillwaterising (talk) 01:28, 20 May 2010 (UTC)[reply]

My proposal

  1. Porn files will be flagged, so users will be prompted: "This image contains sexually explicit...". People who don't want to see this kind of pics won't see them.
  2. Sexually explicit media (excluding illustrations) made from 1990/1995 onwards must have 2257 records. (Since the Wikimedia servers are in the USA and this is the law there.) Nudity and simulated sex ("softcore") are not considered "sexually explicit".
  3. Wikimedia needs porn; users may ask porn companies to release some of their stuff under a free license. Since they have the 2257 records, we can add this info to the pic or clip and there will be no problem.

Sorry for my bad English. — Preceding unsigned comment added by R1Y4C2 (talk • contribs) 23:53, 18 May 2010 (UTC)

Just an info: Wikimedia does not have to keep 2257 records. Please see Statement by Mike Godwin, 14 May 2010 and maybe the previous section. Cheers --Saibo (Δ) 00:16, 19 May 2010 (UTC)[reply]
Actively inviting contributions from porn producers has the advantage that their records will be in order. If we want two or three good shots of cream pies, e.g. (currently we have a bad shot of one that editors didn't want to delete, because we only have one other one and editors wondered what we will do if that one turns out to have licence problems [26]), then getting those two or three high-quality shots from a porn producer, with proper documentation, seems like a reasonable idea. On the other hand, in terms of PR, it makes Commons vulnerable ("Commons gets its educational material from porn film makers"). I still think there is merit in the idea though, as long as we make sure we are selective, only keep the photos that have the highest quality, and don't end up hosting dozens and dozens and dozens of these shots. --JN466 01:51, 19 May 2010 (UTC)[reply]
I think there are some confusions here. As discussed elsewhere on this page, WMF's lawyers have advised that it is not clear that we ever need 2257 records, because we are not a commercial site; also, because we (Commons) are basically a host/carrier the issue would presumably be the uploader's not ours. Also, while it is easy to say "Porn files [should] be flagged", and in principle I agree, it is not easy to say what is a porn file. That is why most of us who are in favor of tagging wish to see, at the very least, some way to indicate the nature of the concern on the level of individual files or categories. For example, someone in a workplace might wish to avoid all depictions of a female breast, even classical paintings, but clearly (for example) Botticelli's Birth of Venus is not "porn". - Jmabel ! talk 03:59, 19 May 2010 (UTC)[reply]
I know that Commons is not required to keep records. But we are still saying it is best practice if uploaders provide age, identity and consent information to OTRS. I am thinking of personality rights here, Jmabel. A porn professional will have signed a consent form. Joe Bloggs's (ex-)girlfriend giving head may or may not be fine with her picture being on Commons. [27] --JN466 14:02, 19 May 2010 (UTC)[reply]
Getting our material from porn producers is not only bad PR. A bigger problem is that the pictures the porn producers want to provide are possibly – porn. We all know that you don't get a good idea about sex or about normal anatomy from porn. I am reluctant to trust that there would not be a problem with a bias in the pictures, were we to rely on this model. For some uses the material will probably be very good, but I think we will also need other sources. --LPfi (talk) 20:55, 19 May 2010 (UTC)[reply]
Pornography is a poor depiction of anatomy and sexual acts, but is a great depiction of pornography, which we have plenty of articles about, seeing as it is one of the world's largest industries. Dcoetzee (talk) 00:18, 20 May 2010 (UTC)[reply]
The problem here is that few non-professionals will have a consent form to sign or any place where it will stay available. The procedure must be such that we can get pictures from regular Commons contributors. As long as we do not have the resources to check signatures on (possibly photoshopped) consent forms against passport copies, there has to be some amount of trust involved. Pretending this is not so will turn away contributors for no good reason. This is a real issue if the present "assume consent if (stated and) no reason to doubt" is to be changed and we want contributions from non-professionals.
I do not know whether the ex-girlfriend-without-consent photos come along every couple of years or every couple of days. Are the uploaders regular Commons contributors or one-off accounts? Somebody closer to the matter might wish to comment.
--LPfi (talk) 11:29, 20 May 2010 (UTC)[reply]

Mike on obscenity law

Hi all, I consulted briefly with Mike regarding obscenity law and our obligations - as usual his response provides frustratingly little direction, but nevertheless I hope it will help inform development of the policy.

My questions to him: "Is there any content that should be excluded from Commons by administrators on the basis of obscenity law? If so, what test should be applied to determine which content is likely to be in violation and which is likely to be okay? Should Commons administrators familiarize themselves with and apply the Miller test? If not, might it be excluded by office action? What is the effect of this type of law on physical media and print distributions of content? (Should potentially criminally obscene content have suitable warnings affixed?)"

Mike Godwin's response: "I think you're looking for a general answer, and the problem with obscenity law is that it doesn't lend itself to general answers. The Miller test is the standard for obscenity determinations in the USA, and it's community-based, and there's no generally accepted rule about how to apply community standards to content that is online and available everywhere."

Thoughts? Dcoetzee (talk) 00:37, 19 May 2010 (UTC)[reply]

We knew this really. Some cases have gone for local community standards, others have gone for nationwide community standards. I am in favour of including a short section of the Miller test -- just what the three questions are, and perhaps one or two examples of obscenity convictions in a footnote (e.g. Max Hardcore). --JN466 01:10, 19 May 2010 (UTC)[reply]
I agree with that, although leaving the question of community open makes the Miller test very difficult to apply. I think it would also be useful to have a template called e.g. {{Obscenity}} - it would say something like this: "Commons believes that this material is suitable for the general world community. However, in some communities, publishing this media may violate obscenity law. Content reusers are advised to seek legal consultation before republishing potentially obscene content in local communities." Similar tags (e.g. {{Personality}}) have been valuable when seeking compromises in deletion discussions. Dcoetzee (talk) 01:29, 19 May 2010 (UTC)[reply]
A couple of thoughts: (1) On the one hand, we should be going by US community standards. (As we have seen, some courts might interpret this to mean Florida standards, others might take it to be nationwide US standards.) At any rate, if Commons were to be prosecuted, this would always be under US law. (2) If we were using the general worldwide community as a benchmark, then we would end up with more restrictive standards than in the US, bearing in mind the 1.x billion Indians, 1.x billion Chinese, 1 billion muslims, etc. I think an obscenity template is a good idea, but we should not be referring to world community standards. Perhaps "is suitable for some parts of the general world community." --JN466 01:39, 19 May 2010 (UTC)[reply]
Not applying any standards because we don't know which ones to choose is pretty risky. Imagine standing up in court and trying to justify such a position. In other cases (copyright) we use a "precautionary principle" (ie. play it safe). Florida and US standards are pretty much the same (certainly in terms of online behavior). A Florida court is not going to apply Iranian standards (or, I suspect, allow more liberal Dutch ones). Outside the courts we've had a fairly sympathetic hearing from the press, because our material is in fact either art or sex educational. Max Hardcore realistic rape porn might conceivably have an educational use, but it definitely would be of great use to anyone attacking Wikimedia (and we know there are those who want to). Apply the Miller Test with Florida standards as a basic safety precaution. --Simonxag (talk) 11:13, 19 May 2010 (UTC)[reply]
One problem I see with trying to define "obscenity" here is that we could be making the ammunition that a local prosecutor could use against editors. Right now this Miller Test is something that everyone should recognize as a bad law (i.e. not even a lawyer can give you any idea what images are legal or not, which pretty much guarantees a huge "chilling effect"). The only reason it was ever promulgated is because the courts were ignoring the First Amendment entirely before that, and they didn't want to seem too radical. But if we here set up some kind of process to tag certain files as "obscene" and/or delete them because they may be "obscene", then that can be used as evidence that the files actually did violate some sort of "community standard" and then we've just created the case for action against editors that didn't exist before. And if we only "delete" those files (make them off-limits to those without the admin tag) then Wikimedia still possesses them and they are still viewable by thousands of people. That means that the action of deleting files, far from precautionary, could actually make them criminal and put Wikimedia at risk. Though I don't claim legal expertise, does anyone see an error in this logic? Wnt (talk) 17:15, 19 May 2010 (UTC)[reply]
Are we sure law should be the only guideline here? If Wikimedia is mostly supposed to have an educational purpose, having 400 blow job pictures may be legal, but is it really furthering that educational goal? Fully naked strip clubs are legal in various places, but if the local news photographs a kindergarten teacher walking out of one, it's going to look pretty bad. Wknight94 talk 17:49, 19 May 2010 (UTC)[reply]
The only obscenity laws that I know of that matter apply to bestiality, fisting, and other "extreme acts". - Stillwaterising (talk) 23:47, 19 May 2010 (UTC)[reply]
I'm willing to run with a statement that we should apply the community standards of the United States (and possibly also Florida?) when making judgments of obscenity law. Dcoetzee (talk) 00:20, 20 May 2010 (UTC)[reply]
I'm thinking that it's probably better not to tag potentially obscene images for the reasons listed above. If someone were to download the image, then we would be increasing the chance of them being convicted by basically 'proving' that the image was obscene according to community standards, even if it were something relatively tame or artistic. Our wiki probably shouldn't be involved in making that judgment. Chri$topher (talk) 02:54, 27 May 2010 (UTC)[reply]

Comment

Great. I want good sexual material useful for educational projects. I'm also aware that Wikiland is not a little sovereign state floating off in the ocean somewhere. I think Jimbo panicked because he thought the whole project was vulnerable: we should make sure we don't actually make it vulnerable. In the west in the 21st century, sex education and coverage of sex in an encyclopedia is almost uncontroversial. On the other hand, the belief that kiddie porn (and some other material) is evil and criminal is totally uncontroversial. Users may not have formally voted to fight Fox on the issue of free speech, but that is effectively what they chose to do: we need to make sure our position is defensible. --Simonxag (talk) 01:33, 20 May 2010 (UTC)[reply]

"In photo description"

I don't quite understand this wording:

"However, editors who have produced and wish to upload such media may have record keeping obligations and are encouraged to submit 18 U.S.C. § 2257 record keeping information in photo description.[9] If 18 U.S.C. § 2257 is unavailable, it is good practice to forward age and identity documentation for sexual content, along with statements of consent by the persons depicted, to OTRS to help avoid any later problems."

What does "submit record keeping information in photo description" mean? In the description of the media file? This might be a bad idea if it includes the model's legal name -- OTRS provides some anonymity, whereas the file description is public. And why would we say "If 18 U.S.C. § 2257 is unavailable"? It is linked on the page. --JN466 01:32, 19 May 2010 (UTC)[reply]

I agree, any 2257 documentation the uploader wishes to supply should be submitted through OTRS, not the image description. I think they mean "if official 2257 documentation is unavailable", i.e. if the producer doesn't keep 2257 records. Dcoetzee (talk) 01:36, 19 May 2010 (UTC)[reply]
Doesn't 2257 (if applicable) require an actual copy of the driver's license or such document? Complete with a license number and (sometimes) a Social Security number? Wnt (talk) 04:45, 19 May 2010 (UTC)[reply]
It requires an affidavit (here) and a photocopy of a state issued ID card for each model. Consent forms are not required by US federal law, but may be needed in certain local jurisdictions. (Source: phone conversation with Drew Sabol, Esq on 5-19-10). - Stillwaterising (talk) 23:45, 19 May 2010 (UTC)[reply]

Community standard

Based on some of the conversations above, I added a section "Community standard", in which I say that our community standard is not to prosecute our editors. I believe this is true based on such principles as COM:NOTCENSORED; however, as written those policies are circular, because they don't (can't) defend the right of editors to upload illegal material. So I think it's important to say here that our community standard is to avoid having obscenity prosecutions. Wnt (talk) 17:48, 19 May 2010 (UTC)[reply]

I think the whole section on Community Standard needs to go (or be revised). It says almost nothing useful. The Miller test, in my opinion, is so subjective as to be beyond useless. I prefer the Dost test, as do many courts. - Stillwaterising (talk) 23:26, 19 May 2010 (UTC)[reply]
It seems that's a test of what's lascivious (and even a clothed person being flirtatious can be). I don't see its relevance except for determining whether a picture of a child is porn or not (which is not what was being discussed). --Simonxag (talk) 01:38, 20 May 2010 (UTC)[reply]
I'd hoped I'd been clearer. The objective of the section is not to explain to the reader what the Miller test is. The objective of the section is to say that if local community standards affect whether one of our editors is thrown in prison, then we want our local community standards not to prohibit anything - not in a criminal sense, that is. We want the prosecutors to leave our editors alone. If there's really a great problem with so-called "obscene" pornography, we can settle the issue amongst ourselves perfectly well by our own mild processes of evaluation and deletion. And we do not want our internal processes, used to set the boundaries of an encyclopedia, to be misinterpreted by anyone to be the same as community standards for prosecution. Even if someone spams us with the crudest low-grade porn imaginable and we have to block their account, we still don't want them being jailed for obscenity. It's our problem, a content dispute, not a police matter — or at least, that's how we want it to be.
I understand that in general, since Miller, few if any communities have actually tried to lay down a community standard for use. Nor is a "Wikipedia community" very likely to be recognized by the courts. But if we have the remotest chance to keep a bunch of goons from locking up one of our editors because some prosecutor singles out his picture from all the other explicit images, or offer him a legal argument that can be used to bargain for a lesser sentence — then by God we should do what we can. Wnt (talk) 02:53, 20 May 2010 (UTC)[reply]
Ugh. I understand what you're trying to do, but I think the section needs to go. The policy we're discussing should be focused specifically on sexual content. Besides, it wouldn't matter either way what our stated policy is "community standards" - if a prosecutor wants to go after us, they will do. In fact, stating that we have a different "community standard" than the norm could be seen as waving a red flag in front of them. Tabercil (talk) 03:31, 20 May 2010 (UTC)[reply]
I have to agree. This section is verbose, opinionated, and aimed at an audience who will never read it. This policy is for Commons users and administrators, not for law enforcement. What we need to emphasize here is the facts: what obscenity law is, why the situation on Commons is so murky, and what users should pay attention to when evaluating images for obscenity. I'll take a stab at this. Dcoetzee (talk) 04:16, 20 May 2010 (UTC)[reply]
I think Wikipedia users deserve better than this. I have at least amended the text you submitted to remove what could be interpreted as voluntary approval of such obscenity standards. I still believe we should make an explicit statement of a community standard favoring freedom. Opposition to censorship has been a fundamental principle of Wikipedia from the day it began. The most censored communities don't hesitate to stand up for their standards — why should we? Wnt (talk) 08:50, 20 May 2010 (UTC)[reply]
Nobody disagrees with obscenity law more strongly than me - I think it unjustly infringes on free speech, should be protected in all cases by the First Amendment, and that Max Hardcore's imprisonment was a grave miscarriage of justice. I likewise disagree with a lot of copyright laws, but we still have to follow them or risk exposing the WMF and content reusers to legal risk. I'm fine with the present wording, as I felt my previous wording was weak anyway. Dcoetzee (talk) 19:02, 20 May 2010 (UTC)[reply]
We should also be mindful that, as pioneers of a new frontier, we are going to get sued by someone for some reason at some point. And when that inevitably happens over our sexual content policy, there's a fair possibility that it's going to make its way to the Supreme Court and become a landmark case with precedent capable of altering the Internet as we know it. Only one thing is for sure: we've got one hell of an important juggling act in our hands right now.   — C M B J   10:51, 21 May 2010 (UTC)[reply]
Let's be clear here: as I understand it, none of us wants to see Wikipedia editors prosecuted for "obscene" material. We'd like to have a highly permissive community standard and establish any potential limits of taste by AfD discussions rather than prison sentences. But you're afraid that if we say we'd like a community standard looser than anyone else's, that we'd only increase the risk of actually getting editors prosecuted. Now there's no ambiguity in the other direction - any small-town official can stand up and say he doesn't want pornography in his town and that's a "community standard" that may or may not count in a future court decision. So community standards run only in one direction, toward whatever is most prohibitive, and Wikipedia dare not try to set its own. Is that a valid conclusion to take here? Wnt (talk) 15:57, 23 May 2010 (UTC)[reply]

Auto-archiving this page

I think we should set up Miszabot to archive any thread that hasn't been active in 7 days. - Stillwaterising (talk) 23:39, 19 May 2010 (UTC)[reply]

I'd say that 7 days is a bit short - it is annoying if you want to check back on a thread you recently read and already have to search the archive. My proposal is 14 days (and maybe 28 days if things have settled down more) with no hard argument for it. Cheers --Saibo (Δ) 00:16, 20 May 2010 (UTC)[reply]
14 is ok with me. - Stillwaterising (talk) 01:18, 20 May 2010 (UTC)[reply]
I'd prefer 28. And an indexing bot taking a pass as well, if one can be configured to run here. ++Lar: t/c 05:25, 21 May 2010 (UTC)[reply]
28 days sounds like a fair proposal to me. Seconded.   — C M B J   10:56, 21 May 2010 (UTC)[reply]
I added the code for Miszabot with 28d counter. Could somebody please review and tweak if needed. Thx. - Stillwaterising (talk) 15:20, 21 May 2010 (UTC)[reply]
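For anyone reviewing that setup: the standard User:MiszaBot/config template is the usual way this is done, and a 28-day configuration would be a sketch along these lines (the archive path and counter value here are illustrative for this page, not necessarily what was actually added):

```wikitext
{{User:MiszaBot/config
 |archive        = Commons talk:Sexual content/Archive %(counter)d
 |algo           = old(28d)
 |counter        = 4
 |maxarchivesize = 200K
 |minthreadsleft = 5
 |archiveheader  = {{talk archive}}
}}
```

The `algo = old(28d)` parameter is what controls the 28-day threshold; `%(counter)d` lets the bot roll over to a new archive page when `maxarchivesize` is reached.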

Bestiality and other paraphilias

See also: Commons Talk:Sexual content#Illegal material.

User:Stillwaterising added this line to "Prohibited content":

  • Actual or simulated sexual activity between humans and animals (zoophilia).

He claims this is "illegal in all states." While I can't argue that many (all?) states in the US have laws against the act of bestiality, I don't know of any laws against the depiction of bestiality, whether actual or simulated, and we have an entire category dedicated to illustrations of it, consisting mostly of classical artwork like "Leda and the Swan" (Category:Zoophilia). To say nothing of the many depictions of human-animal sex on television, including among other things a South Park episode that showed a handicapped child being raped by a shark (no seriously). Dcoetzee (talk) 02:56, 23 May 2010 (UTC)[reply]

Stillwaterising also reverted several small changes with this edit. I object to both the revert itself and the accusative edit summary. "Paraphilia" is an abstract term which, by its very definition, has the same meaning as the three concrete terms that it replaced. The rationale for substitution cited empirical evidence; because two of the specific examples undermine an illustration and an old artistic depiction already used by Wikipedia articles, a more general term should be used. Moreover, to somehow suggest that such a superfluous literary substitution (e.g., "Dogs" in place of "Border Collie", "German Shepherd", and "Pitbull") could even hypothetically be performed in bad faith is unreasonable.   — C M B J   05:09, 23 May 2010 (UTC)[reply]
These sound like questions for Wikimedia's general counsel, Mike Godwin. - Stillwaterising (talk) 05:32, 23 May 2010 (UTC)[reply]
If you're trying to argue artworks are illegal, I'm pretty sure you're wrong. If you're trying to argue about film and photographs, I suspect that, generally speaking, you're right, though there's probably a few exceptions where simulated could slip through as part of a dramatic work, or one of those gross-out comedies. Adam Cuerden (talk) 07:14, 23 May 2010 (UTC)[reply]
Ok, let's separate the act of having sex with animals, from photographs of same, to graphic representations such as artwork/sculpture/computer animations. From what I understand, anime of children with sexual content became *restricted* with the passage of the Walsh Act in 2005. First person convicted was here, AND this person was also convicted of bestiality, but this individual (understandably) made a plea bargain agreement with the court. Now, zoophilia (bestiality) pornographic images from what I understand, are considered obscene and often prosecuted under Obscenity laws, HOWEVER I can not yet find any hard proof of that, just unreferenced mentions of such. I would challenge somebody to find a zoophilia pornography commercial website that is based in the United States. However, drawings/written depictions/etc. are covered under the First Amendment. And lastly, the actual act of zoophilia is illegal in some states but not in others. In Florida, where some of WMF servers are located, there's no specific law. In California, where WMF is headquartered and registered, there is a law which makes the act a crime (misdemeanor). Also, bestiality pornography has been made illegal in the Netherlands where most of the sites have been hosted, and by this law in the UK which makes it a crime for a citizen of the UK to possess extreme pornographic images, even if they are living out of the country. Given all of the potential trouble with bestiality images, I recommend we limit to historical images and illustrations. - Stillwaterising (talk) 11:21, 23 May 2010 (UTC)[reply]
I don't know much about the Walsh Act or its implications, but obscenity is already addressed quite thoroughly by the proposed policy, regardless of whether it depicts bestiality or not. In particular, while I rarely encounter actual photographs or film of bestiality, I remain skeptical that publication of such photographs (as opposed to production of them) would be in violation of any law except possibly obscenity law (and then only if it manages to fail the Miller test). Law outside of the US is not a reason for excluding useful content - for example we don't comply with the French Article R645-1 banning the display of Nazi symbols - but may be a reason for adding tags to caution content reusers. Also, I strongly prefer to avoid the term zoophilia, which depending on the speaker may refer only to a passive (unfulfilled) sexual desire for animals. Dcoetzee (talk) 11:33, 23 May 2010 (UTC)[reply]
I agree that hosting of these photographs would not violate any law other than obscenity. My mention of the Adam Walsh Child Protection and Safety Act (which deals with strengthened laws for sex offenders) was inaccurate, as was the link for case (now changed to this url). The law in question is PROTECT Act of 2003. BTW, I saw that episode of South Park (Crippled Summer) and I thought it was one of South Park's best episodes. - Stillwaterising (talk) 13:27, 23 May 2010 (UTC)[reply]
Just some information: the distribution (not possession) of porn(! - probably not art) containing zoophilia is prohibited also in Germany (see de:Zoophilie#Rechtliches; according to the law § 184a Strafgesetzbuch) - relics of the last millennium. Cheers --Saibo (Δ) 16:41, 23 May 2010 (UTC)[reply]
Possession of bestiality (zoophilia) images is not legal in the UK either (see Commons_talk:Sexual_content#Pornography above and w:Section 63 of the Criminal Justice and Immigration Act 2008). I can't see any reason to allow these images when illustrations are sufficient. - Stillwaterising (talk) 17:37, 24 May 2010 (UTC)[reply]
Policies based on such laws pave way for a slippery slope. The same rationale could then be used against Tianasquare.jpg, Flag of Nazi Germany (1933-1945).svg, and Jyllands-Posten-pg3-article-in-Sept-30-2005-edition-of-KulturWeekend-entitled-Muhammeds-ansigt.png which are not legal in China, Germany, and Iran, respectively.   — C M B J   21:08, 24 May 2010 (UTC)[reply]
Hmm, must be Godwin's law, but this has nothing to do with political censorship. Looking through Category:Zoophilia I do not see any images that could be considered off-limits, and while I think this collection is sufficient, it could be added to in the future. What I do not think is appropriate is allowing actual images of human-animal intercourse. - Stillwaterising (talk) 21:23, 24 May 2010 (UTC)[reply]
I hardly see how reductio ad Hitlerum relates to contemporary laws in Austria, Brazil, the Czech Republic, France, Germany, Hungary, Poland, and Russia. But even so, the rationale of "X content must be prohibited because it is illegal in X extrajurisdictional place" does have something to do with political censorship, and will be cited as either supportive precedent or hypocritical behavior in such future discussions. Simply put, this is the wrong way to go about prohibiting content, even if it is unequivocally abhorrent to us.   — C M B J   23:04, 24 May 2010 (UTC)[reply]
For lack of a US law regarding actual depictions of bestiality, I think the best policy is to do what we feel is most moral. In this case, the question would be whether distributing depictions of actual bestiality promotes or encourages animal abuse (which I think is a difficult question that I don't know the answer to). If we do accept them, we should have a tag indicating the legal danger to people in other countries. It's clear that illustrations are okay. Dcoetzee (talk) 23:22, 24 May 2010 (UTC)[reply]
I find it hard to see very many ways that bestiality photographs and film would improve the encyclopedia in a way that artworks would not do far better. There might be very rare cases where it might be justified - to document a crime or scandal for Wikinews, say, or (theoretically) as part of a collection by a major photographer, but even then, I find it unlikely. Perhaps we should just say something along those lines?
Our problem here is that Commons needs to be consistent, because that's what protects us against things like the Muhammad controversy. If we will not allow Muslims to censor images of Muhammad, we can't censor things simply for being offensive or shocking to us: We need to try and do it from Commons' core principles, such as scope.
That said... God, if we ever did host a photo or film of bestiality, we'd better have damn good educational reasons.
I'm going to wax a bit long here: Fox's attempt to slime us for hosting artworks seems to have mostly blown over, which was pretty predictable, and would likely have happened far faster if it weren't for Jimbo's attempted coverup. I think that, so long as we can point to a compelling educational value in our controversial images, it would be hard to really attack us and make it stick. Indeed, had Jimbo acted sensibly, he'd have made a press release pointing out the educational value of hosting notable works by French Decadent artists, and this would've been more-or-less over with.
But our defense depends on educational value. If the images don't have a legitimate educational value, if they're only intended to provoke, shock, and push the boundaries, with no artistic, historical, or educational value, then we're in a very bad situation. Competent behaviour by Jimbo would've resulted in a review of the educational value of our images, and I'm sure that we'd have found some which should've been deleted. I haven't really reviewed everything affected: My "dog in this fight" is having seen notable artworks attacked and deleted, including some from my own field of expertise, engravings, which... well, let's put it this way. Gustave Doré's engravings to Dante's Inferno. I only own a very beaten-up copy of an American edition of the work, but the engravings are all in good condition. They're considered some of the best illustrations produced for that work.
But the Inferno contains nudity and a lot of extreme violence. Doré depicts that.
Now, imagine my reaction to seeing Jimbo deleting artworks as pornographic, at the same time as advocating for sadomasochistic works to be included in the list, and refusing to talk about where the line's going to be drawn until he's deleted everything - and that immediately after he had seemingly agreed artworks should be protected, and is now deleting artworks.
That's pretty much how I got involved with this. Artworks I love, which are considered some of the masterworks of engraving, contain some incidental nudity, and some fairly extreme violence, because that's what the book they're illustrating is like.
Who knew where it was going to end?
I think that that's why we need to be careful. It's very hard to rebuild a destroyed collection. If we delete, as Jimbo did, Felicien Rops and Franz von Bayros - notable artists with their own articles on English Wikipedia - can we ever get them back? My experience has shown that we don't have a lot of things you'd think we would in engravings, and have very few contributors of new scans of them. Once we decide artworks are pornography based on Jimbo deciding so unilaterally, even if notable, and even if art critics disagree... well! where would it end? Adam Cuerden (talk) 06:51, 25 May 2010 (UTC)[reply]

A velvet divorce?

My goodness, there's a lot of words here! If this has already been suggested, please disregard, but this is something that has been in my mind ever since this debate in 2007. This was where Commons hosted (and declined to remove) an unauthorized picture of the Wikipedia mascot Wikipe-tan in a soft-porn kiddie cheesecake pose. This was in my view a hostile act against Wikipedia by Commons. In the course of the debate, the attitude "we are Commons, we don't give a rat's ass about Wikipedia or what happens to it" was generally expressed.

Which, you know, makes sense, I guess. Wikipedia's mission is to make an encyclopedia. Commons's mission is, I gather, to host basically any image or media that it is physically possible to host. These are two very different missions. The Wikipedias have indeed made use of Commons media, and this relationship has been fun and useful, but isn't really critical to the Wikipedias. The various Wikipedias could host their own media, for instance, and while this would perhaps not be as efficient as the current system, it would not be crippling to the Wikipedias. So I put this out as a tentative proposal:

Would it perhaps be best if Wikimedia Commons was (gradually) spun off into an entity entirely separate from the Wikipedias and other entities of the WMF?

If this occurred, Commons would be entirely free to host anything and everything they like without interference from the WMF. I think this would make the core of Commons, its volunteers, quite happy, which is a good thing. At the same time the WMF would no longer have to worry about being responsible what Commons does. And I'll bet this would inspire Commons to advertise its services more broadly to the many non-WMF entities that could use them.

Obviously Commons would have to create its own organizational structure, find its own sources of funding (after of course a transition period), and probably change its name to something like "World Commons" or "Net Commons" or whatever. None of this should be particularly difficult, and I think that if anything Commons would find its (reduced) funding needs easier to fill than the WMF does.

It's extremely probable that the two entities could continue to work well and closely together, to the extent that the current easy protocol for including Commons media in the Wikipedias could be continued. But if not, so be it.

Anyway, just a thought. Herostratus (talk) 17:54, 23 May 2010 (UTC)[reply]

See COM:SCOPE. Commons really isn't a free photo archive to the world. The situation is that it is so difficult to decide what is definitely out of the project's scope, that it often is not done, except for things people tend to complain about. (Which probably is the most workable outcome anyway). In any case, it would be an unfortunate amount of trouble and expense to separate the two projects, when the issues of freedom and censorship are the same for both. Wnt (talk) 01:11, 24 May 2010 (UTC)[reply]
Agreed. At this point, the notion of the WMF hosting Commons is okay as Commons still has heavy overlap with what the various Wikipedia projects mean. If that ever changes, I expect the WMF would be taking action before the bulk of the people involved with Commons would realize there was an issue. Tabercil (talk) 01:25, 24 May 2010 (UTC)[reply]
I think what ultimately ties Commons to Wikipedia is the technical feature that allows Wikipedians to directly transclude Commons images in articles (there is nothing similar for, say, Flickr images). Hosting everything together on the same set of servers is helpful for ensuring, for example, that downtime of one project does not affect the other adversely. Additionally, Commons is in a number of ways tailored to Wikipedia: for example, we're much stricter about copyright issues than almost any other free media repository on the web. Finally, Jimbo's surrendering of privileges makes me hopeful that there won't be a repeat of the image purge, so I don't see a strong reason for disassociation. Dcoetzee (talk) 01:37, 24 May 2010 (UTC)[reply]

Moving forward on adoption

Hi all, I think the proposal as it stands today has stabilized. I think it's about time to solicit wider feedback and possibly to begin a straw poll on whether to adopt this as policy. Would anyone like to voice any remaining concerns before we do so? Dcoetzee (talk) 19:06, 25 May 2010 (UTC)[reply]

The "and we may be compelled to remove some works on this basis." line should probably be rephrased, so as to prevent a policy-justified repeat of what just happened.   — C M B J   21:01, 25 May 2010 (UTC)[reply]
I rephrased to emphasize that speedy deletion for obscenity is not permitted - I think the Miller test is too complex for a single administrator to evaluate, and unlike issues of privacy or child pornography, it poses no immediate threat to anyone. Dcoetzee (talk) 21:08, 25 May 2010 (UTC)[reply]
I'm happy to put a notice in the sitenotice, when it's agreed we're ready. Adam Cuerden (talk) 22:52, 25 May 2010 (UTC)[reply]
I think there are still two major issues that need to be resolved first:
  • The text describes a proposed policy. Before people can vote on it, they need to read it as it would actually be written, i.e. not talking about proposals. Also there's considerable question in my mind whether this is a new policy, or a new guideline, or (in my view) a "supplement" of the type described in w:Template:Supplement.
  • I still have considerable objections to how my section on our community standard has been rewritten by Dcoetzee. My original intent was to make clear that anyone is free to join the Wikipedia community and comment on the content we keep; that whatever boundaries we choose to draw or not to draw can be done by us by civilized discussion; that none of us want editors prosecuted for what they contribute; and thus that our deletion processes do not represent a "community standard" that someone should be imprisoned for breaking. Every time Dcoetzee messes with the text it sounds like it's saying that Wikipedia will cheerfully accept whatever "community standard" of restrictions the courts throw at it; that we will use our deletion discussions as an opportunity to mock-convict editors of uploading obscene material and render evidence that a community finds it offensive and without redeeming features; and that we have nothing to say about the thought of someone prosecuting our people. I don't think that is the way people here feel at all. I think even those who would approve of censorship by us to make Wikipedia somehow "family friendly" still would recognize that we don't need external prosecutors singling people out for prison time. Local discussion and ordinary deletion is the only effective way to enforce any line in the sand on content, and it is a way that doesn't have anywhere near such a drastic harmful effect on freedom of expression or the willingness of contributors to participate. To give the opposite impression solely out of cowardice - out of the fear that if we say we like freedom someone is going to stomp our people down - that is not the philosophy that Martin Luther King, Jr. taught us. Wnt (talk) 23:48, 25 May 2010 (UTC)[reply]
I think it is best to state that this is a policy and not a supplement - we need this to have some teeth to it in order to help prevent a recurrence of the mess in the first place. As for the Wikipedia community standard versus a court-imposed standard, I'm sorry but the courts will win that battle every time. And do you know what happens to the losers? It's called "jail time". And I'm in agreement with Adam Cuerden in that I'm satisfied with this; the main points which I wanted to see in this policy got put in place a while back so my concerns are met. Tabercil (talk) 03:24, 26 May 2010 (UTC)[reply]
I tried to make it clear that any kind of deletion due to obscenity should be a rare event that is conducted only for works that are actually illegal to distribute, in the jurisdiction of the servers. Nor is it intended to "single out contributors for prison time", but rather to isolate media that may be illegal for WMF to distribute, so we can, you know, cease distributing it. Lots of people upload copyvios but we're not trying to report them and get them arrested either. It's the same deal here. I'll try to reword this to make this clearer. Dcoetzee (talk) 06:13, 26 May 2010 (UTC)[reply]
I'm glad this topic has come up because I do believe this proposed policy/guideline is nearly ready to go. My understanding was that this proposal is to become policy. I support an announcement at MediaWiki:Sitenotice and as well as Village Pump.
From what I understand, our responsibility for reporting illegal sexually explicit content is outlined in 18 USC 2258A "Reporting requirements of electronic communication service providers and remote computing service providers" and is limited to apparent violations of child pornography laws. This is not limited to actual images, but under 18 USC 1466A includes "a visual depiction of any kind, including a drawing, cartoon, sculpture, or painting" that depicts a minor engaged in sexually explicit conduct and is obscene. There is an exception for images that aren't "obscene" and have "serious literary, artistic, political, or scientific value." All of this involves legal nuances that are way over my head and are best interpreted by Wikimedia's legal team. I have asked Mike Godwin to review this proposal and he has indicated that he would prefer that such a request be made through consensus decision rather than from a lone editor. - Stillwaterising (talk) 14:33, 26 May 2010 (UTC)[reply]
Who is this "our" you are talking about? I'm neither an "electronic communication service provider" nor a "remote computing service provider", and I doubt anyone else here is, either. As such, my legal responsibility is exactly nil. --Carnildo (talk) 21:44, 26 May 2010 (UTC)[reply]
Legal responsibility for reporting lies with WMF not the user. - Stillwaterising (talk) 03:03, 27 May 2010 (UTC)[reply]

I would like to propose that a subpage (/Legal considerations) be created and the text of this section moved there. I see the current contents of this section to be more along the lines of suggestions, and they should be called guidelines rather than policy. A link to the subpage can be included on the main page as well as a brief summary of its contents. - Stillwaterising (talk) 15:27, 26 May 2010 (UTC)[reply]

Done.   — C M B J   20:56, 26 May 2010 (UTC)[reply]
I think that any formal guideline would deserve to have at least its own page, rather than a subpage of a policy (which would be confusing). I'm also not sure that anything stated on this page is new policy, rather than interpretation of existing policy. (the original Commons:Sexual content containing the Legal Considerations subsection, that is) Can you point to one thing that requires a "policy" status for this document? Wnt (talk) 21:00, 26 May 2010 (UTC)[reply]
Can somebody please cite some kind of documentation on what makes a policy/guideline and how to proceed please? - Stillwaterising (talk) 21:40, 26 May 2010 (UTC)[reply]
The advice to the uploader surely does not need status as a policy. What concerns the community and the admins is media that might be illegal to provide, and that is dealt with in Prohibited content.
One thing that is useful in the policy or in an official guideline is the recommendation to give information to, avoid problems in the future or for certain reusers. I am not sure I am content with how that is dealt with on the page. Anyway it should be clearly separated from info about the uploaders' legal obligations, which may vary depending on local laws and where we do not want to give any promises.
--LPfi (talk) 08:23, 27 May 2010 (UTC)[reply]
I object to this change. This section contained specific requirements about how obscene material should be handled (e.g. that it cannot be speedily deleted, and should be evaluated using the Miller test) which have now been hidden away on a guideline page. This is part of the policy. Dcoetzee (talk) 08:34, 27 May 2010 (UTC)[reply]
Oddly enough, I'm actually not very fond of this idea of saying that obscene material can't be speedy deleted. If you could know that a picture was the one-in-a-billion that would be prosecuted, then speedy deletion might make sense as a protection, and making a policy that it can't be might be seen as one of those things to get Wikimedia in trouble. For some reason you and I were editing that section at about 90 degrees to each other, neither as allies nor true opponents, and we keep missing each others' points somehow. I've made some substantial changes recently, including to the speedy delete section - let's see if we can reach agreement about those points and then we can get back to the disposition of this other section. Wnt (talk) 18:43, 27 May 2010 (UTC)[reply]
We apparently have time for deletion discussions about works that are obvious copyright violations - the penalties for distributing obscenity are (while criminal instead of civil) not really sufficient to justify rapid unilateral action, in my opinion. Dcoetzee (talk) 19:15, 27 May 2010 (UTC)[reply]


"Renamed on sight"

This policy says that files with non-encyclopedic names should be "renamed on sight" without further explanation. I commented that a file needed to be renamed during an undeletion discussion on May 10,[28] proposed a rename on May 12,[29] and it still has the same name, because the rename requests have a huge backlog. Should we direct users to the existing rename request process, and warn them that there's a wait involved, or should we invent a new process (and who will do it?). Wnt (talk) 21:12, 26 May 2010 (UTC)[reply]

An admin can do it very quickly; users, not so much. I'd say encourage admins to rename on sight (reminding them to use CommonsDelinker to move any usages to the new filename); everyone else should use the slow rename process. Adam Cuerden (talk) 22:43, 26 May 2010 (UTC)[reply]
Perhaps the renaming template could be modified to include an optional priority parameter?   — C M B J   00:14, 27 May 2010 (UTC)[reply]
One suspects that'd be abused more often than it's used appropriately. Adam Cuerden (talk) 09:01, 27 May 2010 (UTC)[reply]
Possibly yes, but is that a real problem? If some files get renamed unfairly quickly, that is no big deal. The users can be warned and dealt with as in the case of any abuse. On the other hand, as long as the parameter is not abused very much it will get the appropriately prioritised files dealt with much quicker. For some files staying in the backlog for a few years is ok. If priority=low is used as much as priority=high is abused, then there even is no difference for the "normal" renames. A good documentation will help. --LPfi (talk) 09:31, 27 May 2010 (UTC)[reply]
True. If someone wants to write some basic guidelines, I'll gladly add the functionality and announce it. Although perhaps a better plan might be to make a header for Commons:AN that automatically listed information about various admin tasks. Want to try that first? Adam Cuerden (talk) 11:01, 27 May 2010 (UTC)[reply]
I've started a thread at Commons talk:File renaming#Sexual content, which is an existing guideline that currently lists six accepted reasons for a rename. Sexual content would make the seventh. I've also invited comment on certain aspects of the wording of the guideline there (i.e. is it only new files to be renamed?), and mentioned the priority proposal. Wnt (talk) 15:55, 27 May 2010 (UTC)[reply]
I think this advice should ideally appear in both policies/guidelines with links between the two. This is not just a policy but also a reference of relevant related policy. Dcoetzee (talk) 18:08, 27 May 2010 (UTC)[reply]

Media uploaded without consent

The section about content uploaded without the subjects' consent needs improving. One problem is the phrase "will be speedily deleted", which is ambiguous: many people will read it as referring to the formal speedy deletion process, while others will not.

I think such content should be speedily deleted on request if there aren't any special considerations. If the file is important, we might hope that somebody notices the speedy deletion and asks for undeletion. Vandals asking for deletion of important files can be dealt with by the "special considerations" clause.

On the other hand files that have been on the net for a long time before an editor starts to think that they might have been uploaded without consent should go through the full deletion request process. Otherwise we will have some admin thinking that every anonymous image must go and making a mess.

The other problem with the section is that we have old pictures for which we will never get any statement of consent. Many of those are harmless: either the subject died long ago or the image does not "unreasonably intrude into the subject's private or family life". I rewrote that part but think it still should be improved.

--LPfi (talk) 09:19, 27 May 2010 (UTC)[reply]

"Old" works

This part: "Old works and media of historic significance might be kept without the consent of the subjects, in special cases even when the subjects ask for deletion." gives me a bit of the heebie-jeebies. While it's clear from reading Commons:Photographs of identifiable persons that there is a snake's nest of local censorship laws being applied, I think it would be best to leave all of that guideline to itself, and if we were going to pick any particular part of it to repeat on our own, I wouldn't make it the part about keeping sex pictures of an old lady online when she is furiously arguing against it. Wnt (talk) 17:30, 27 May 2010 (UTC)[reply]

I think the main concern here is that if we publish (say) a famous public domain artistic photograph of a nude woman, we shouldn't delete it just because she doesn't want it spread around. Realistically the chances of this occurring are very low and can be dealt with on a case-by-case basis. Dcoetzee (talk) 18:38, 27 May 2010 (UTC)[reply]
I understand this concern, and I agree that such cases exist. After just recently seeing it, I also believe that the current Commons:Photographs of identifiable persons is alarmingly over-restrictive, imposing the laws of three countries and a succession of "moral rights" against the entire database, and if vigorously applied likely could impose far worse censorship (against a much wider variety of things) than any version of this proposal made so far all the way back to the Jimbo ad hoc deletions. But I also don't want to get into a looking-glass dispute where inconsistent guidelines make it easier to justify keeping sexual content than non-sexual content. I want to keep policy and guidelines absolutely modular, with all the rules about a topic in one place, and no running back and forth between policies. I've seen a case of this on Wikipedia between w:WP:BLP and w:WP:Attack page, and it's just infuriating to argue according to one policy that something belongs only to get pointed to another policy that says it doesn't. Wnt (talk) 19:08, 27 May 2010 (UTC)[reply]
I agree that consistency is important - however I think policy should also generally act as a summary of other related policy as it applies to this policy, with links. This is particularly helpful for visitors from local wikis who are not familiar with all our policies. Dcoetzee (talk) 19:12, 27 May 2010 (UTC)[reply]
That's true. But we should always make clear which policy is in the driver's seat. In this case I provided a sort of summary just by listing all the different things they came up with in that other policy, but summarizing it properly would take a tremendous amount of space. And as I said, I wouldn't pick that one bit to be the thing we repeat here. Wnt (talk) 19:19, 27 May 2010 (UTC)[reply]

"Pornographic depictions of X"

I've already tinkered recently with the section about categories, removing a hypothetical about "pornographic depictions of Saint Theresa" and sticking with the real example and "Caricatures of Saint Theresa". But I'd like to check whether people would agree to dropping altogether the suggestion to use "Pornographic depictions of X" as a category name.

My concern is that this is a highly contentious and probably personal judgment by an editor regarding the nature of the material, serving also as a direct confession to some that Wikipedia is hosting "pornography" per se.

There may also be some need to figure out what the philosophical goal is here. I think the idea of the "pornographic depictions of X" may have been to warn users of indecent content, whereas the idea of the example given is to avoid confusion with general purpose photos of the subject.

There is a Commons:Categories page. Though it is labeled as a help file rather than a guideline, it seems to be where issues of what categories should be made have been handled. I think it may be desirable to get input there about how to handle these issues, since categorization is an art that takes a certain knack to get right. Wnt (talk) 16:13, 27 May 2010 (UTC)[reply]

It's a fair point. I rather quickly rewrote that section basing it on a revised version of a section of Jimbo's, which wasn't that great to start with. Delete anything in there you feel lacks consensus. Adam Cuerden (talk) 17:53, 27 May 2010 (UTC)[reply]
As stated on the Village Pump, I agree that we should not be categorizing anything on the basis of "pornographic depictions of," simply because "pornographic" is a subjective term - we should categorize according to content, not offensiveness. Re your statement "a direct confession that Wikipedia is hosting pornography": we should be (and are) hosting what is generally understood to be pornography, since we have many articles on pornographic topics. Dcoetzee (talk) 18:05, 27 May 2010 (UTC)[reply]
Well, if they're written according to NPOV, they're pornographic according to a source, not according to us. But that's a quibble. ;) In any case I've rewritten and substantially reduced this section. I've also removed the mention of the specific image in favor of the general category, just in case it gets deleted as out of scope and because it may not be technically sexual content as currently defined here. Wnt (talk) 18:50, 27 May 2010 (UTC)[reply]

Non-human animals

I just deleted the stuff in the definition about non-human animals. (I'm not speaking of bestiality, which is another issue, but just animal matings) The reason for this is that animals are not covered under any of the three categories of prohibited content. The only point that would apply to them is that you should name the file "Monkeys mating" and not "monkeys screwing", but that is an issue I'd like to see settled under Commons:File renaming, a different guideline, as I described above; and there the proposed seventh category of renamings could have a different scope. Wnt (talk) 17:59, 27 May 2010 (UTC)[reply]

Agreed. In the absence of legal prohibitions, there is no need to exclude content in this area. Issues of description and scope may still apply, as they always do. Dcoetzee (talk) 18:07, 27 May 2010 (UTC)[reply]
Who added non-human animals?
User:Jmabel at 17:02, 9 May 2010. Dcoetzee (talk) 18:36, 27 May 2010 (UTC)[reply]
Probably a Jimboism. I recall something like that in his draft. Oh, well! =) Adam Cuerden (talk) 18:48, 27 May 2010 (UTC)[reply]

Userpage policy revisited

Despite repeated proposals, Wikipedia consensus has, to the best of my knowledge, thoroughly rejected the idea of prohibiting sexual content on userpages. This subsection should be reconsidered before we put the policy up for a vote.   — C M B J   21:03, 26 May 2010 (UTC)[reply]

I agree that this section should stay out. The underlying issue is that Wikipedia has a very strict policy on userpages which (according to previous discussion) is apparently not policy on Commons; in fact I think there is no policy on userpages on Commons. If such a policy exists or is proposed, then this would be a point for that. But the userpage per se contains no images; it contains at most a list of images, and so it is not "Sexual content" as defined in the first section! (See also Commons:Deletion requests/User:Max Rebo Band). Wnt (talk) 21:20, 26 May 2010 (UTC)[reply]
I think what you are arguing is that the code that calls for images to be displayed (internal links) is, in some way, protected speech. What the code on the userpage does is call the image to be displayed, which is inserting/displaying images on a userpage, not "text", not "a list" (unless it really is a list instead of a gallery), and not "protected speech". - Stillwaterising (talk) 21:39, 26 May 2010 (UTC)[reply]
I don't understand the relevance of "protected speech" here. Sexual content appearing in a user page gallery is presumably content people have chosen not to delete, so none of it should be anywhere near being illegal. And if it were the point to address would be the image, not one page in which it is referenced.
I also don't see your distinction between the list and the gallery. With something like w:WP:POPUPS a list is something like a gallery, as the links expand on mouseover. It's all a matter of software interpretation. But the pictures shown are only the pictures that Wikipedia serves no matter what document they're in.
Suppose user page galleries are banned. Then are editors allowed to have one sentence, "See my Wikiporn gallery at my homepage on Encyclopedia Dramatica [link]" - with links back from thumbnails on ED to Wikimedia Commons images? If yes, you've gained nothing but embarrassment; if no, then the policy you're describing is a total ban on links to any "pornographic" site from the userpage. Wnt (talk) 03:36, 27 May 2010 (UTC)[reply]
I think the problem is getting to disturbing images without warning, like in the case of categories ("People eating"). I think a user should be polite by not upsetting people visiting the user page without a good reason. Sexual content does surely upset a lot of people and I see no good reason to have such a gallery directly on the user page. I see no problem in having the gallery on a subpage, with a link from the user page properly stating its nature.
There are other categories of upsetting content that might be put on user pages, so a policy about user pages would be good to have, but I think that sexual content on user pages can be handled here until there is such a general policy.
--LPfi (talk) 08:42, 27 May 2010 (UTC)[reply]
If someone's going to be upset, won't they be upset on the first image? If someone goes through the whole gallery before getting upset, I find myself skeptical of his aversions. Also, I'm not sure why a subpage is different; nor does the proposed wording say that subpages are OK. Wnt (talk) 16:17, 27 May 2010 (UTC)[reply]
Yes. I do not think it is polite to have even one sexually explicit image directly on the user page. User:Max Rebo Band has only one "tasteful nude" visible (with my settings), which I find borderline. Nudity is not covered by this policy, but you might get my point.
I think you should be able to click on the signature of any user or otherwise visit their page without having to be upset by shocking pictures. Even if you are a child! (Or from a culture where sex is taboo.) Subpages with suitable names, and suitable links to them, are not a problem, as you can choose to avoid them without harm to your ability to communicate and act on Commons on questions not related to such content.
The userpage wording should be changed. Allowing excessive sexual content in some cases does not feel right.
--LPfi (talk) 17:43, 27 May 2010 (UTC)[reply]

Commons:Sexual content#Sexual content on userpages

I've added a little bit of the discussions above to this. Feel free to delete it (but, if you revert, note that the possessive of "Commons" is "Commons'", not "Common's": We are not Wikimedia Common; Commons is used here in the sense of communal property, which anyone may use to their ends.) I think it's reasonable to suggest good practices, so long as we don't force them on people. Adam Cuerden (talk) 18:44, 27 May 2010 (UTC)[reply]

Archiving

I'm going to archive from "In photo description" up. We don't need all the Jimbo incident stuff anymore, and nothing above that's active, as far as I can tell. Adam Cuerden (talk) 20:47, 27 May 2010 (UTC)[reply]