Talk:2010 Wikimedia Study of Controversial Content

Questions for Discussion -- General Comments (see Main Page for Complete List)

I've kicked off with some brief answers below, and will likely reply to further comments - incidentally, I think it may be a good idea to copy the entire question to this page? - I think it'd be clear enough - and would probably make it easier to follow. If your idea was for answers to be posted to the 'main page' and discussion to occur here, then I can happily shuffle things around - I just wasn't sure if that was the case :-) Privatemusings 06:55, 22 July 2010 (UTC)Reply

I've taken your suggestion and am posting the complete set of questions here. I'm hoping all of the discussion will take place on this page. Robertmharris 11:30, 22 July 2010 (UTC)Reply

Question 1 -- Wikipedia has put certain policies and procedures in place to deal with special contentious categories of articles (e.g. controversial articles, biographies of living people) (see Wikipedia: Controversial articles http://en.wikipedia.org/wiki/Wikipedia%3AGFCA). Do you think Commons could or should institute similar procedures with some classes of images?

I have no problem with policy emerging for images based on classes (like violence, sexual content, religious imagery etc.) - in fact I think it's probably a good idea. Privatemusings 06:55, 22 July 2010 (UTC)Reply

Reasonable enough, but we must remember that everyone sees different things as controversial, or would like to stay away from them. For instance, the Flat Earth Society might want to keep away from globes, arachnophobes from spiders and the Amish from buttons. This would essentially boil down to everything being categorised as controversial, which undermines the whole point. -mattbuck (Talk) 11:24, 22 July 2010 (UTC)Reply
Sounds reasonable, but we need to make sure these are very limited (like those that would cause massive resistance from multiple communities, e.g. extreme sexual content, violent material that has no historical importance, etc.) Sadads 12:34, 22 July 2010 (UTC)Reply
Reasonable in theory as long as things are narrowly and objectively defined. As soon as you get anything subjective, such as categories like "sexual content", "violence" or "religious imagery", you'll just end up with acrimony and edit wars about what should be in them (is a photograph of two men kissing "sexual content"? What about a labelled line drawing of female genitalia? Should a Tom and Jerry-style cartoon, a video of a professional boxing match and a photograph of the aftermath of a suicide bomber all be included in a "violence" category? Is Scientology a religion? Are photographs of a vicar at a church fête religious images?). Should you tag any image or category as "likely to offend", then you'll very quickly end up with every image/category so tagged, defeating the point. As for Sadads' comment, that sounds like it's okay to cause massive offence as long as it's only to one community - how does that square with NPOV? Thryduulf (en.wikt,en.wp,commons) 13:22, 22 July 2010 (UTC)Reply

"Some classes", such as depictions of living people in the sense of en:WP:BLP, sure. But certainly not special procedures for violence, sexual topics, religion, and the like that seems to be the underlying point of this question. Note that Wikipedia doesn't have special procedures for that sort of issue either. It does have special procedures for biographies of living people, because those people might sue for libel or slander or invasion of privacy if we publish untrue rumors. As far as I know, Wikipedia doesn't actually have any "special" procedures for especially-controversial articles besides the standard policies/guidelines (e.g. neutral point of view and verifiability) and normal dispute resolution; controversial articles just tend to come to the attention of more people who try to enforce the standard policies/guidelines more strictly to keep disruption and edit warring to a minimum. IMO, Commons should generally just enforce their scope, without specific consideration of whether any particular group would be offended. Anomie 16:14, 22 July 2010 (UTC)Reply

I think it is worth addressing these topics individually, even if in most cases the key statement is that "we don't censor this." In particular, though, there may be a lot of subject matter areas where images ought to get some sort of template tag or a special "hot issue" category. - Jmabel 17:04, 22 July 2010 (UTC)Reply

What is appropriate / controversial / objectionable to someone will not be completely acceptable to someone else. Therefore I think Commons needs two things. The first thing is rules based on objective criteria. Examples are "compliant with law in USA state xxx", copyright, and so on. We largely have them in place, though we might consider adding rules against media which stimulate people to violence or discrimination, such as calls for Jihad. NPOV? I feel some doubt when applying this to images. We should be able to host images of, for example, historic WWII Nazi and communist army posters, even though these are obviously not NPOV. The second thing we need I hope to detail further down. TeunSpaans 19:06, 22 July 2010 (UTC)Reply

Obviously not. As long as its purpose is to illustrate encyclopaedically relevant concepts, there should be no censorship. Wikisilki 10:52, 23 July 2010 (UTC)Reply

Nobody is forced to view images on Commons, and no one is even required to be at a computer at all. More explicit images are all over the net. Any censorship is ethically counterproductive in the field of free information.

Sorry for the bad English; this is a Google translation. Regards. --Ensada 16:13, 23 July 2010 (UTC)Reply

  • I don't have any problem with this general idea, which I think we already do. But I don't think this idea can help the controversial content debate much, since 'offensiveness' or 'pornographicness' aren't observer-independent. --Alecmconroy 15:55, 24 July 2010 (UTC)Reply
  • Please note that the linked item on Wikipedia is an essay, which is simply a personal opinion. It is not a guideline and it is not being enacted. If it were to be enacted, it would encounter serious practical problems. Even so, the controversy of Wikipedia articles can be measured objectively — to some extent — by looking for "edit warring", RfC's, ANI reports, and so on that spill out from them rather consistently. Commons images only become controversial if you create some category to designate them as such, allowing debates over their censorship - so such a process would define what is controversial, not follow it. Wnt 02:28, 27 July 2010 (UTC)Reply
  • Yes; policies covering general principles and recurring situations can help by reducing the load of case-by-case frictional disputes over images. They also often provide a calmer and more thought-out dialogue, help ensure that future cases match general expectations, and reduce anomalies.

    However note that policies on enwiki are very much creatures of spirit and used with common sense; it's recognized that judgment and exceptions have their place and situations do not always fall into neat categories. The same would apply here - for example whether a file falls into some category or has some impact or value. Overall and with thoughtful drafting, yes, and I'm surprised Commons doesn't have communally developed guidance on certain areas already given its size. FT2 (Talk | email) 03:10, 27 July 2010 (UTC)Reply

Question 2 -- If yes, how would we define which classes and kinds of images were controversial?

No need to restrict to 'controversial' - I believe it's perfectly achievable to construct rational, useful policy based on classes of images. I do however believe there may be systemic / structural hurdles in doing so through a wiki 'community'. Privatemusings 06:55, 22 July 2010 (UTC)Reply

Controversial is subjective, and as I noted on another question, if we define anything which could be controversial as "controversial", then we'd end up with the whole archive so defined. -mattbuck (Talk) 11:43, 22 July 2010 (UTC)Reply
Need a checklist with logical factual statements describing features of objectionable content, and a template where you could choose yes or no. That could be verified by an admin. Sadads 12:36, 22 July 2010 (UTC)Reply
See also my above comment. Sadads - such would have to be worded very carefully to avoid all cultural bias and interpretation. I also don't understand what you would use the checklist for - e.g. an image has checkmarks for "photograph", "automatic machine guns", "clothed male" and "posed", is this a controversial image? It could apply equally to a child member of a rebel militia proud to be showing off the rifles he used to kill people, and to the Chief Constable of a police force at a press conference following the end of a gun amnesty. How do you distinguish between them objectively? Thryduulf (en.wikt,en.wp,commons) 13:30, 22 July 2010 (UTC)Reply
That would require extensive research methinks. Sadads 15:45, 22 July 2010 (UTC)Reply

That is exactly the problem with trying to treat controversial content specially: every group has a different definition of what is controversial, and favoring one viewpoint over the rest will result in a large group of dissatisfied people. Anomie 16:16, 22 July 2010 (UTC)Reply

I would enumerate broadly, but in most cases the policy would be "we don't consider this a special case". - Jmabel 17:05, 22 July 2010 (UTC)Reply
As Matt says, controversial is subjective. Therefore any classes should be defined by the public / visitors, if we enable visitors to define their own categories of "objectionable". Some might see a pic of Muhammed as objectionable, some might see a topless woman as objectionable, others might see detailed violence as inappropriate. TeunSpaans 19:07, 22 July 2010 (UTC)Reply
  • Rather than making up our own rules, we could do a survey to identify what types of media are subject to age restrictions, or illegal, in parts of the world. --JN466 01:25, 23 July 2010 (UTC)Reply
  • Err, you don't - none of Wikipedia's special cases for general categories of articles are to do with controversy. Geni 01:35, 23 July 2010 (UTC)Reply
  • Can't be done. We have 12 million editors, 375 million readers. An English-speaking minority can never fairly decide what is and is not controversial to that many people spread across the globe. "Controversial" will, in practice, mean "controversial to the people doing the labeling", i.e. controversial to English-speaking admins. The US/UK POV of controversial would, of course, be adhered to, while the Indonesian or Swedish concepts of controversial won't be. --Alecmconroy 16:06, 24 July 2010 (UTC)Reply
  • I agree: it's a culture war. You have to decide that some cultures are worth "protecting" and others aren't. Bear in mind that because this would stir up a slow, cumulative hatred against the "protected" cultures, you're not really doing them all that much of a favor. Wnt 02:30, 27 July 2010 (UTC)Reply
  • Look at the enwiki policy Criteria for speedy deletion. The policy defines a set of the most clearcut cases which all (or almost all) can agree on. It's not long, and from time to time talk page discussion develops or improves it. Words like "blatant, clear, unambiguous" are used to avoid encroachment into legitimate material by people trying to push any envelope.

    For this, set up a poll to get ideas and a sense of views of the basic areas which might be involved (sexual imagery? Shock images? Images of poor quality and easy reproducibility? Images of contentious subjects which are already as well or better represented?). Then a second stage poll to decide specific wording and/or ratification of any with strong support. Something like that maybe. FT2 (Talk | email) 03:16, 27 July 2010 (UTC)Reply

Question 3 -- What policies and procedures might be put in place for those images? How would we treat them differently than other images?

Essentially 'as required' - I support raising the bar on criteria for inclusion, for example to respect personal privacy to a greater degree than the law may require - I feel that we should require the consent of a victim of violence, or of a participant in sexual content, to publish media in which they feature, rather than just a narrow consideration of copyright. Basically I support 'descriptive image tagging' - there are tons of ways of doing this intelligently, and without the sky falling in ;-) Privatemusings 06:55, 22 July 2010 (UTC)Reply

How does NPOV allow us to treat subjects that we personally like or don't like, but which are equally legal, differently? How do we objectively define which images need the consent of the subject and which don't? You suggest that the consent of a victim of violence should be required, which sounds fair enough, but what about when they are dead? What about where it is not certain whether the victim is alive or dead? If the photograph was taken last week it's almost certain they are still alive, if it was taken 100 years ago it's almost certain they're dead, but what about if it was taken 70 years ago? What about recent images where it is not known who the subject or subjects are, e.g. a photograph of people seriously wounded in a market place bombing?
I think the only policy we should have is "Wikimedia Commons is not censored and does contain images and other media that may offend you." Thryduulf (en.wikt,en.wp,commons) 13:44, 22 July 2010 (UTC)Reply
Hear, hear! Trying to treat some images differently from others will lead to endless wars over whether the "special" treatment should be applied. We already seem to have this in regard to images in the grey areas of copyright law, no need to drastically increase that. Anomie 16:20, 22 July 2010 (UTC)Reply
As remarked above, I think the right solution is a template or category indicating the possible issue. This would readily allow someone else to filter on these criteria, either to seek these images or to avoid them. It is also possible that in certain areas we might want to keep images out of category thumbnail galleries except for specially tagged categories, and might want to set a policy of requiring an extra click-through to see certain images in certain contexts. - Jmabel 17:09, 22 July 2010 (UTC)Reply
And then you have people fighting over whether various images should or should not be tagged with the special template or category. 65.188.240.254 03:10, 23 July 2010 (UTC)Reply

I think it would be worth considering a policy on limiting the use of controversial images on the Main Pages of Wikimedia projects, perhaps at least excluding images that would be considered "porn" (as opposed to say, anatomical illustrations, etc.). Our Main Pages are our public face and should be welcoming to as wide a variety of readers as possible. This isn't about censorship, but about being tactful and respectful to our potential audience (which is extremely diverse). If someone wants to look up fisting pictures, more power to them, but we shouldn't showcase them on our Main Page, IMO. Kaldari 17:51, 22 July 2010 (UTC)Reply

That's somewhat outside Commons' jurisdiction. Geni 01:37, 23 July 2010 (UTC)Reply
  • I think the most we can do, while preserving NPOV, is implement a highly-verifiable labeling scheme (i.e. "descriptive labeling") combined with filter suggestions, so long as independent editors can agree on which label goes with which images. "Controversial" isn't a valid label, but "full frontal nudity" might be. --Alecmconroy 16:14, 24 July 2010 (UTC)Reply
  • The only effective way to censor various images is by a third-party site, such as that already mandated under w:CIPA in the United States. Otherwise you're trusting a group of raters with widely varying opinions to do the tagging, and you'll end up with endless judicial processes to punish those who don't use the right ratings. It won't satisfy anyone. Wnt 02:35, 27 July 2010 (UTC)Reply

Question 4 -- Could uploading practices differ for these images than other Commons images?

Maybe - in truth the uploading experience is rather painful anyways, so I see the issue more as relating to what 'we' decide to do once the media is on wmf servers. Privatemusings 06:55, 22 July 2010 (UTC)Reply

I don't see how this would work - it would rely on honour and people reading the upload form, which clearly never happens, so people would still need to troll normal uploads for these images, thus removing the whole point of the segregated system. -mattbuck (Talk) 11:22, 22 July 2010 (UTC)Reply
No. Or at least not until you develop a 100% effective and 100% reliable computer program that can accurately describe all the content of every image prior to upload (I don't foresee this being possible in my lifetime). Otherwise you have to rely on a self-selecting system which requires everybody to read and understand objective and culturally neutral descriptions of what exactly constitutes each category of images (I've never seen any evidence that suggests this is possible). Furthermore it then requires everybody to abide by this every time. Given that well-meaning people do make mistakes (click the wrong image to upload, for example) and there are a lot of people out there who are not well-meaning (the existence of the bad image list is more than enough evidence of this), you will still need humans to view every upload, as Mattbuck says. Thryduulf (en.wikt,en.wp,commons) 13:52, 22 July 2010 (UTC)Reply
Probably not workable. I'd rely more on getting rid of "bad" content quickly, as we do for everything else. - Jmabel 17:10, 22 July 2010 (UTC)Reply

I'm not sure wikimedia should take any extra measures when media are uploaded. Respect for privacy, in some cases, as mentioned by privatemusings, I would support. TeunSpaans 19:20, 22 July 2010 (UTC)Reply

  • While we can build custom uploads, people tend to respond to greater complexity by lying. Geni 01:38, 23 July 2010 (UTC)Reply
  • In theory, yes. Controversial types of images could be passed through a queue where they are vetted and approved by an OTRS volunteer before they appear on Commons. --JN466 01:51, 23 July 2010 (UTC)Reply
  • "Controversial" doesn't exist except in the eye of the beholder. So we can't discriminate based on that. However, I'm okay with the idea that we can afford to be a little 'extra careful' about issues of sourcing and copyright in any image we think might pose legal problems. But only within reason-- it can't be a blanket excuse to deter content that causes offense. --Alecmconroy 16:19, 24 July 2010 (UTC)Reply
  • This would unreasonably burden the Commons upload process, and what would the benefit be? Wnt 02:38, 27 July 2010 (UTC)Reply

Question 5 -- If we assume that sexual images might be one of the categories included, or even if we don’t, do you think we have adequately defined sexual content historically in Wikimedia policies? (see Commons: Sexual content http://commons.wikimedia.org/wiki/Commons:Sexual_content)

Well we haven't really tried, have we? - my feeling is that the linked proposal page, whilst representing some attention on these issues, hasn't really (yet?) delivered any outcomes - again, I believe this may only really be a challenge because we've previously tried to discuss definitions, policies, practices, and philosophies all in one go - that's perhaps part of the wiki discussion weakness? Privatemusings 06:55, 22 July 2010 (UTC)Reply

Not even close! What the linked page details is a general class of images that a self-selecting group of people believe include a subset of the content that falls into one culture's definition of "sexual content" in some contexts on more occasions than they don't. You'd have to try hard to get woolier than that. Thryduulf (en.wikt,en.wp,commons) 13:57, 22 July 2010 (UTC)Reply

Not really. "Certain types of paraphilias" is quite vague, as is whether a depiction is sadistic or masochistic abuse, as is whether an image "prominently" features the pubic area or genitalia. But as of the current version, the instructions under Prohibited content don't actually care whether something actually is sexual content or not: Illegal is illegal regardless, and both commons:Commons:Photographs of identifiable people and commons:Commons:Project scope apply to all media. Anomie 16:31, 22 July 2010 (UTC)Reply

  • Undoubtedly we are not there, but I think we are close. If you think of consensus not as unanimity, but as 90% agreement, lopping off the most extreme 5% on either side, I think we have something like consensus. - Jmabel 17:12, 22 July 2010 (UTC)Reply

Not really. People were rather conservative on the linked page. TeunSpaans 19:27, 22 July 2010 (UTC)Reply

Is the definition accurate? Sadly no, although it's a laudable effort. But it reflects the inescapable cultural biases of its editors, which are, ultimately, arbitrary. Two individuals kissing ISN'T sexual, unless they appear to be sadomasochists, in which case it IS sexual. Similarly, urination is, to most of us, a completely nonsexual phenomenon. But if someone appears to be sexually enjoying the sight of said urine, suddenly it's a controversial sexual image? How long will it take someone to accuse w:Piss Christ of promoting urine-fetishism? And god only knows how many other weird taboos and fetishes are out there that we haven't even considered yet.
Meanwhile, homosexuality, controversial in most of the world, illegal in the rest, isn't even singled out for special attention, even though it's the most controversial taboo there is. This too reflects (admirably I'd say) the cultural biases of our Wikipedia users, who have concluded homosexuality shouldn't be disparaged on Wikipedia. But if we're really going to start using cultural taboos to decide content, surely NPOV requires us to consider homosexuality as one of the most taboo and thus, most deletable. In short, while I personally might agree with most of these definitions, they're just NOT consistent with a truly globally-neutral POV. --Alecmconroy 16:34, 24 July 2010 (UTC)Reply
  • I've tried to improve Commons:Sexual content, but I should point out that it makes no attempt to be culturally unbiased. The definition is based on U.S. legal distinctions. Fortunately, the proposed policy doesn't do very much more than simply state existing Commons policies - most notably, that content illegal in the U.S. can't be kept here - and give users references to existing tools and policies to deal with such content. It seems to me that Commons already has sufficient policies in place to prohibit the things that it was alleged to support (e.g. child pornography, or pornography collections outside of Wikimedia's educational mission), and this proposal documents this. Wnt 02:43, 27 July 2010 (UTC)Reply

Question 6 -- If not, how might the definitions be extended, or otherwise changed?

Ask someone sensible, like yourself, to provide a workable definition, then progress discussions to a point where something is resolved. This requires leadership and / or authority :-) Privatemusings 06:55, 22 July 2010 (UTC)Reply

As far as I can see there are two possible options

  • Option A - 1. Abandon the principle of NPOV. 2. Pick somebody at random. 3. Ask them to write their definition. 4. Get them to delete all media that doesn't fit their interpretation of their definition. 5. Get a lawyer to look at every other image and delete any images that in their opinion are illegal for Commons to host. 6. Get a different lawyer and repeat step 5. 7. Repeat steps 5 and 6. Block everyone who doesn't agree with any decision made.
  • Option B - 1. Declare that Commons is not censored. 2. Tell people who complain that Commons is not censored, and why. 3. Explain to the newspapers that Commons is not censored and why. 4. Repeat as required. Thryduulf (en.wikt,en.wp,commons) 14:07, 22 July 2010 (UTC)Reply

Is there any actual need for a definition? Or can we just impartially apply the tests of legality, commons:Commons:Photographs of identifiable people, and scope that apply to all media? Anomie 16:33, 22 July 2010 (UTC)Reply

  • Well, they could be flat out deleted. Otherwise the different projects have varying processes in this area. They do tend to be documented if you really need to know. Geni 01:43, 23 July 2010 (UTC)Reply
  • Don't define sexual content, don't define controversial content. Just label the verifiable aspects, and then let individual groups of users do their own flagging of 'controversial images' via third-party filtration. Or, create a new set of projects that don't have to be NPOV/NotCensored, and let them figure out their own standards. Or, don't change anything and watch it continue to succeed. --Alecmconroy 16:46, 24 July 2010 (UTC)Reply

Question 7 -- Commons identifies two goals for itself: supporting other Wikimedia projects, and creating an archive of visual images for use by all. Are these goals compatible with each other? Do they contradict each other?

Dunno - probably not, because the wmf project goals are generally broad and inclusive too - folk will argue that the latter behaves synonymously with the former, and I'm not sure they're wrong (or if it matters) Privatemusings 06:55, 22 July 2010 (UTC)Reply

They are contradictory - goal 1 says we should host unfree images (such as the wikimedia logo) which contradicts goal 2. -mattbuck (Talk) 11:45, 22 July 2010 (UTC)Reply
That is a good example. Commons should further relax its extreme free-content rules for hosting images. For example, there should not be a problem with images that are free for educational use or that were released for use on wikipedia. /Pieter Kuiper 11:56, 22 July 2010 (UTC)Reply
I agree with Pieter here. We have the technical means to tag content according to several copyright statuses. There is no valid reason, apart from political correctness, why we should not use this possibility to include more content. Yann 12:31, 22 July 2010 (UTC)Reply
I agree with Pieter: the larger the repository, the better it will be used. What communities are we reaching out to right now? GLAM and university communities. Who can use the educational-use material? GLAM and university communities. This might be a much better way to get cooperation. Sadads 12:39, 22 July 2010 (UTC)Reply
I disagree with Pieter - to me, the main goal is a free repository, helping other projects is just a side-effect (god knows they seem to do their best to screw us most of the time). I don't think we should host ANY unfree images, including Wikimedia logos. -mattbuck (Talk) 00:24, 24 July 2010 (UTC)Reply
agree strongly with matt. Privatemusings 01:27, 24 July 2010 (UTC)Reply

As long as Commons is censored in any way (by content and/or by copyright status) neither goal is possible. Without such censorship I cannot see any conflict. Thryduulf (en.wikt,en.wp,commons) 14:10, 22 July 2010 (UTC)Reply

Isn't the second goal "creating an archive of media freely usable by all"? There is a conflict in that some Wikimedia projects allow media that is not freely usable by all (e.g. fair use images), but by and large they are compatible. Anomie 16:36, 22 July 2010 (UTC)Reply

I see them as compatible but would rephrase:
  • Provide a well-indexed common repository of media freely reusable for other Wikimedia projects.
  • Provide a well-indexed repository of media conceivably usable for other educational purposes, or educational in its own right.
- Jmabel 17:20, 22 July 2010 (UTC)Reply

I see two conflicts and some practical problems:

  • Some wikipedias want images that are not free (copyright)
  • Commons deletion processes are separate from those of the other wiki-projects. Only after deletion on Commons are references to these images removed from other projects; they are not informed beforehand.
  • Commons tends to delete images, even free ones, which they deem non-educational. For other projects, such as schoolbooks, one needs funny pix which do not have any educational value in themselves but which help to create a schoolbook that is fun to read. Such images got deleted in the past, not sure about them now. TeunSpaans 19:46, 22 July 2010 (UTC)Reply

Commons is a repository of Free media, just like the Wikipedias are repositories of Free articles. From its very existence, Commons supports the other Wikimedia projects, just like the other Wikimedia projects support Commons (you can see Wikipedia as a project that provides encyclopedic background to the Free media hosted on Commons, it works both ways). There is no contradiction. Commons is a "service project" exactly as much as all the other projects are.
The fact that Commons has a strict copyright policy is an excellent thing which enables it, rather than hinders it. If Commons started to host unfree images, it would become difficult to manage for its contributors, and hopelessly confusing for its users (nobody suggests that Wikipedia should start "quoting in extenso" work in murky copyright areas to better fulfill its role). Commons is a repository of Free media rather than a mess of "get-away-with-it" media, and it should stay that way if we want it to work optimally. Which we want. Rama 08:10, 23 July 2010 (UTC)Reply

The two are contradictory only insofar as deletion is considered essential to forming a "high quality image archive". If Commons is to continue to exist as a shared host, it's essential that the only things that get deleted are stuff NOBODY is using. If we can't promise that to our chapters, we should just let them host their own images and let Commons be the image-archive-project. --Alecmconroy 16:50, 24 July 2010 (UTC)Reply

  • I think that Commons has very clearly drawn back from the idea of simply housing an unlimited set of free images, with Commons:COM:SCOPE. It may be that this is simply too large a mission for Wikimedia — traditionally, an encyclopedia is a map, not the whole territory. I feel that certain aspects of Commons policy, such as "moral" restrictions in photographs of identifiable people, and "courtesy" prohibition of material that has not entered the public domain in its home country, are very serious flaws that need to be remedied, perhaps even greater in scope than some of the proposals for censorship of sexual content. Wnt 02:48, 27 July 2010 (UTC)Reply

Question 8 -- One of the participants in the recent discussion on a Commons sexual policy noted that as Commons more fully fulfills its mission to be an archive of all freely-usable visual images, the question of relevant educational scope becomes increasingly difficult to apply, or at least begins to change. Do you agree? Is this a problem?

Educational scope has not been competently considered by wiki 'communities' (particularly commons) in my view - it's actually currently synonymous with 'everything' at the mo. - again, a sensible, workable definition would be required if the 'scope' is to be a reality, and not an arbitrary notion. Try and give an example of something which could not be 'educational' if you like :-) Privatemusings 06:55, 22 July 2010 (UTC)Reply

About the one thing which can be out of scope is user-created artwork, and even then that's not hard and fast. -mattbuck (Talk) 11:42, 22 July 2010 (UTC)Reply
"All freely-usable visual images" and "Only freely-usable visual images with an educational scope" are clearly mutually incompatible except where "educational scope" is defined as "any image that educates, or could be used to educate, one or more people" at which point the definition is identical to "all freely-usable visual images". As soon as you try to limit the scope of "educational" then you must sacrifice NPOV. Even user-created artwork is educational, for it has a subject and could be used to illustrate an encyclopaedia article, a dictionary entry, a phrasebook, a novel, a textbook, a quotation, an academic study, etc. about that subject. Say for example User:Example (a 17 year old from Corsica) uploads a watercolour painting of a dragon. This could be used in any of the above contexts, it could also be used as part of a studies/nooks/articles/etc about paintings of dragons, dragons in contemporary Corsican culture, watercolour painting in Corsica, watercolour painting in the early 21st Century, etc, etc. It is possible that User:Example could in later life become famous or significant in any number of ways that would make them the subject of an encyclopaedia article and/or academic study, for which an early example of their art might be significant. Thryduulf (en.wikt,en.wp,commons) 14:29, 22 July 2010 (UTC)Reply
I take a broad view of "educational" and don't see this as a problem. In the area most under discussion, I see human sexuality as being as valid a subject of education as any other. I'm going to again mention an institution I've mentioned before: the Foundation for Sex Positive Culture, which is a 501(c)(3) here in Seattle, and which is one more group you might want to talk to. They have a large library on human sexuality and BDSM, which provides a counterexample to a lot of the more dismissive remarks that have been made here about certain material being inherently "without educational potential". - Jmabel 17:25, 22 July 2010 (UTC)Reply
  • Strong yes. To me, the ONLY valid reason to delete otherwise-acceptable information is because of our shared resources. As our resources expand, the scope of what is reasonable to host also expands. In ten years, when bytes are cheaper, we'll be able to host an image of every penis on the planet without it denting our servers. There is no true "non-educational information", it's a contradiction in terms. There is, however, "information that is not sufficiently educational to justify hosting it at this time"-- i.e. "we have enough penis pics right now, thanks". --Alecmconroy 16:58, 24 July 2010 (UTC)Reply
  • I very largely agree with Alecmconroy, though I should note that finding the desired image is itself a technical challenge, and this may still be a limitation when the sheer number of bytes is no longer an issue. Wnt 02:52, 27 July 2010 (UTC)Reply

Question 9 -- Should the number of current images in any category be a factor in the decision whether to keep or delete any recently-uploaded image?

Nope Privatemusings 06:55, 22 July 2010 (UTC)Reply

I generally say no, as otherwise no one would let me upload my thousands of images of trains. However, I do feel that this has somewhat of a place where the image being uploaded is of low quality, and Commons has a {{nopenis}} template for a reason - lots of people upload bad quality photos of genitalia. That being said I'm generally in favour of pruning bad quality images of any sort. -mattbuck (Talk) 11:20, 22 July 2010 (UTC)Reply

Use and quality are a more important factor than the number of available images. Sadads 12:43, 22 July 2010 (UTC)Reply

No. If we have two images of an identical subject (not just "penises" but "close-up left-facing profile photographs of clean, disease-free, average-length flaccid penises of post-pubescent young adult white males with untrimmed, dark pubic hair and no visible piercings, tattoos or other decoration or modifications"), and one is of significantly poorer quality than the other, then it should be nominated for deletion (not speedy deleted) unless it is under a freer or qualitatively different license than the better quality one (e.g. if the better quality image is gfdl and the poorer one is public domain or cc-by-sa then neither should be deleted). This should happen whether we have 3 or 3000 other images in the category. Thryduulf (en.wikt,en.wp,commons) 14:41, 22 July 2010 (UTC)Reply
  • Where we have many well-made images, we should be more willing to delete poorly-made images. For example (in a presumably uncontroversial area) we have no use for a badly focused snapshot of a person or building where we have good, similar images. I think similar criteria may apply (for example) to human genitalia. If we had three good, similar images of human genitalia, it's hard to imagine the use of an additional, less well-made image. But other first-rate images should continue to be welcome, as should images that differ along lines of race, camera angle, lighting, etc. And I have no problem at all with us enforcing these rules more strictly in sexually-related areas than elsewhere. An indiscriminate set of pictures of the White House is less likely to create any problems than an indiscriminate set of pictures of penises. - Jmabel 17:31, 22 July 2010 (UTC)Reply
  • Maybe for now, but not forever, and probably not now either. Usefulness is a far better metric than images-in-category. In general, though, contested deletions, of small files at least, probably aren't a good use of our time. We also shouldn't impose a 'double standard' on controversial images vs noncontroversial images. --Alecmconroy 17:05, 24 July 2010 (UTC)Reply
  • I think we do have to recognize that users only see so many images at once in a search, and if there are very many of a given type, the bad can drive out the good. We should be very cautious about deleting potentially useful resources, but in a few cases it isn't altogether unreasonable. Though no doubt there are some academic studies that could benefit from a collection of thousands of penises from all over the world... Wnt 02:59, 27 July 2010 (UTC)Reply

Question 10 -- Images on Commons are presented, by and large, without context, often with just a brief descriptor. Should a note on context and reason for inclusion be part of a regime around controversial images (as it is, for example, on the image of the Jyllands-Posten cartoons depicting images of Muhammad, to explain deviation from the normal licensing regime)?

Don't really care - leaning towards it being irrelevant if media is responsibly curated. Privatemusings 06:55, 22 July 2010 (UTC)Reply

That's a little silly; curation is more important than contextual descriptors. Imagine how much time it would take for volunteers to write contextual rationales for every mildly controversial image. Also, new users: image upload is already complicated, do we want to make it more so? Besides, at the bottom of every page there is a list of pages which use the image on Foundation projects; isn't that good enough for context? Sadads 12:42, 22 July 2010 (UTC)Reply

I don't really understand the question, but if you adopt the (imho) only workable policy of "Commons is not censored." then context or lack thereof does not matter (excluding legal issues, but then if an image requires context then AIUI we couldn't legally host it anyway). Thryduulf (en.wikt,en.wp,commons) 14:44, 22 July 2010 (UTC)Reply

The description pages of all images should provide sufficient context for the image itself that someone unfamiliar can figure out what it is supposed to be. I have no answer to the specific question here, as I strongly disagree with the underlying assumption that there should be a "normal licensing regime" such that any image hosted (besides the non-free Wikimedia logos) would be a deviation. Anomie 16:47, 22 July 2010 (UTC)Reply

  • No need for anything beyond the usual description, templates, and categorization approaches. I do like the idea of subcategories for sexual (or even nude) images within a larger category, e.g. that images of nude people cooking don't belong directly in the cooking category. - Jmabel 17:33, 22 July 2010 (UTC)Reply

The upload procedure is already too complicated, we should not make it more difficult and time consuming. But I would favour a possibility for visitors to tag images as objectionable with one or more tags (bestiality, Islamic, pornographic, etc.) and then have users select a profile (in their profile for registered users, in a cookie for non-registered users) for things they don't want to see. That system works reasonably well on YouTube and other sites for sexual content, and I see no reason why it would not work for other types of content. TeunSpaans 19:55, 22 July 2010 (UTC)Reply
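A rough, hypothetical sketch of the profile idea described above: the list of tags a visitor wants hidden is kept with the account for logged-in users and in a browser cookie for anonymous visitors. None of these function or field names exist in MediaWiki; this is illustration only, not an existing feature.

  # Hypothetical sketch only - not real MediaWiki/Commons code.
  import json

  def load_hidden_tags(account_prefs, cookies):
      """account_prefs: preferences dict for a logged-in user, or None for an anonymous visitor.
      cookies: dict of cookie name -> string value."""
      if account_prefs is not None:
          return set(account_prefs.get("hidden_tags", []))          # account-backed profile
      try:
          return set(json.loads(cookies.get("hidden_tags", "[]")))  # cookie-backed profile
      except ValueError:
          return set()

  def is_visible(image_tags, hidden):
      # Show the image only if none of its tags are in the visitor's hidden set.
      return not (set(image_tags) & hidden)

  # Logged-in user with a saved profile:
  print(is_visible({"Topless"}, load_hidden_tags({"hidden_tags": ["Nudity", "Topless"]}, {})))       # False
  # Anonymous visitor whose browser carries a cookie:
  print(is_visible({"Spiders"}, load_hidden_tags(None, {"hidden_tags": '["Islamic imagery"]'})))     # True

The design choice here mirrors the comment above: the filter is purely opt-in and per-visitor, so readers who never set a profile see everything unchanged.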

  • I think context is vital to this issue (and possibly the only thing that we may be able to build consensus around). For example, although it may be perfectly fine to include an image of nude bondage in the BDSM article, you probably wouldn't want it included in the Rope article. Similarly you wouldn't want to feature an image of Osama Bin Laden on the en.wiki Main Page on 9/11. Right now, we rely solely on the discretion of our editors to make sure images are used in proper context. It would be useful if we actually had a policy we could point to if an editor happened to have a catastrophic loss of discretion. Kaldari 22:31, 22 July 2010 (UTC)Reply
I am afraid that the example is inadequate: the Jyllands-Posten cartoons do not seem to be hosted on Commons, and as far as I know, they are unlikely to be in the short term since they are protected by copyright. There might be some confusion between controversies that occur on Wikipedia and are widely reported in the mainstream media, and the actual problems that we have on Commons.
I think that the "by and large, without context, often with just a brief descriptor" should be qualified more precisely -- or rather, that the implicit adequate media description be qualified. For instance, what is lacking with the description of this controversial image of Mohammed (Mohameddemons2)? Note that in practice, descriptions often contain links to Wikipedia, where the subject or author is discussed at length. Rama 08:30, 23 July 2010 (UTC)Reply
  • No descriptions or context necessary whatsoever, nor should it be. Too much bureaucracy discourages contributions. (That said, if anyone wants to go through and write up contexts, why not?). --Alecmconroy 17:10, 24 July 2010 (UTC)Reply
  • I think we should strongly encourage descriptions; and if there are a large number of images of the exact same thing, then we can prefer those with better descriptions over worse; or we can use detailed descriptions to say that they aren't the same thing. In other words, a hundred pictures of (generic) penises may be much more than we need; but a picture of one penis from each nation in the world might document more seriously educational variations of human morphology. Likewise, random images of Muhammad copied off the Facebook site may have no obvious educational point to them, but when documented to an artist or a source they become much more desirable. So as a rule, the more information available, the easier it is to justify things under an educational mission. But this should never be necessary with the first image of a given type. Wnt 03:04, 27 July 2010 (UTC)Reply

Strange set of questions

This is a strange set of questions. I wonder how much the people you talked with know about Commons. For example, the last question refers to the cartoons published by Jyllandsposten, but en:File:Jyllands-Posten-pg3-article-in-Sept-30-2005-edition-of-KulturWeekend-entitled-Muhammeds-ansigt.png is not on Commons, because it is not a free image. The note is a "non-free use rationale", which enwp requires also for other newspaper pages, for non-free science visualizations or for portraits.
And yes, the copyright restrictions on Commons make it difficult to cater to the needs of the wikipedias. It should be no problem to include publicity images that are free for editorial use in wikipedia articles. Yet Commons regards anything as not sufficiently free when images do not explicitly allow derivatives or commercial use. Is your question 7 really a part of your mandate? /Pieter Kuiper 10:59, 22 July 2010 (UTC)Reply

Puzzled

I have followed the public discussions on this subject, starting from the unfounded allegations around what is on Wikimedia Commons and ending here, and a few things puzzle me:

1) Why is the image more threatening than the word? So far as I can see, not many object to a set of words that describe an activity or thing that could be potentially obscene, but a lot more people are concerned about the image that is associated with those words. I'm just curious as to how - or why - there is a separation between these two things, and how rating systems typically deal with the problem.

2) I've tried to understand how internet material filters (or labelling systems) such as the one Craigslist uses (http://www.w3.org/PICS/) connect to censorship. Are rating systems inherently incompatible with free speech (i.e. are there none considered compatible with strong anti-censorship principles)? What is the view on Craigslist and PICS for instance?

Just trying to understand. --Aprabhala 13:55, 22 July 2010 (UTC)Reply

In response to your question 2, filters and censors (both either automatic or human) must make a selection of which images to show and which to block - this is their entire raison d'être. The selection process is inextricably linked to personal opinions, cultural norms and other societal factors of what is and isn't suitable for someone else to see, and a value judgement must be made about each image to determine where it fits. For example if you wanted to block images of say "nudity", you or someone else must determine what constitutes nudity, and people from different eras, cultures and religions, and even people with different opinions within the same time, religion and culture, have different opinions about what constitutes nudity. For example, I doubt whether we could get 100% agreement, even just among people viewing this page, about which of the following images contain nudity:

If the filter decides that all the images should be filtered, but you believe that at least some should not be, then you would complain the images have been unnecessarily censored. Thryduulf (en.wikt,en.wp,commons) 15:23, 22 July 2010 (UTC)Reply

Well this is part of the reason why ICRA/PICS uses more descriptive labelling than just 'nudity' of course. See their vocabulary, which in addition to that has a 'context label'. TheDJ 16:39, 22 July 2010 (UTC)Reply
TheDJ - I've had a look and those descriptions are so vague and culturally dependent as to be laughably useless! For example the last image above would have to be tagged as "no nudity" (her breasts are not exposed, her buttocks are not bare and her genitals are not visible) and yet that's one of the images in that group most likely to offend! 17:14, 22 July 2010 (UTC)
Thryduulf, this is very useful. But, I imagine that for a place like Craigslist, the metadata is generated (or at least monitored) by site admin and not users. On Wikimedia sites, if the metadata is generated by the users (i.e. discussed in the same manner as everything else), at least for that set of users, wouldn't that create some 'answers' as to which of the images above are to be rated - surely not perfectly, but just as well as anything else that is discussed on Wikipedia? I also don't fully agree that different interpretations of nudity are a problem that cannot be handled; at least in as much as different interpretations of Israel and Palestine are not a problem on Wikipedia. This kind of thing happens all the time and there's a working balance that's achieved. In some cases, like with 'images' of the prophet Muhammad, different language Wikipedias seem to have their own policies (debated like crazy, but they do). To get back to the original questions though, I don't understand why the image is more offensive than the text, and I wonder if the problem of arriving at ratings is not something that's already being practised on every single page across Wikimedia - to the extent that you can substitute ratings for quality, notability, neutrality, etc.--Aprabhala 17:04, 22 July 2010 (UTC)Reply
While you can word text to make it clear that some people believe X about Israel/Palestine but other believe Y, you can't do that with an image - it either is something or it isn't something, a map either shows Palestine existing or it doesn't. Thryduulf (en.wikt,en.wp,commons) 17:14, 22 July 2010 (UTC)Reply
As a follow-up, and on reading David Gerard's postings on foundation-l, is there a fundamental problem with aiding filtering/monitoring software to do its job well? What I mean is this: existing images on Wikimedia have tags on them. These tags may or may not be helpful to monitoring software (which I don't know much about, but NetNanny is what one hears). If new user-generated metadata is created around those images merely to enable monitoring software to do its job (and leave the experience of those not using it unaffected) then do the cultural nuances matter as much? Somehow I think that the user who is offended by the sex & violence on Wikimedia won't mind if their filtering software errs on the side of caution (i.e. perhaps classifying partially and fully exposed torsos, male and female, as one thing, etc.) And for everyone else (like me, e.g., who thinks that most of the articles related to matters sexual are usually well written and evenly phrased) well, there's no need to use filtering software, no? So I get the point about how ratings as they exist now will always result in some kind of goof-up (aka censorship), but if the people using the ratings are (not us but) those who use NetNanny anyway, then: is it censoring Wikimedia to allow a filtering/monitoring programme to do its thing on Wikimedia? [And... this is the problem with not knowing much... is the problem at hand now that the filtering software doesn't work on Wikimedia because of the way we classify things?] --Aprabhala 19:26, 22 July 2010 (UTC)Reply

Per-user content filtering

I know that we are currently still in an information-gathering stage, but after reading the questions, the mailing list and the responses so far (along with some divine inspiration of my own) I came up with a crude solution which may or may not be a good idea (just one way to know - writing it down!).

The problem

Before I get into the actual idea, I compiled a short overview of the conclusions I based the idea on. As they say - if the schematics are wrong, the result will inevitably be wrong as well. (And note that, as an en.wiki Wikipedian, my entire basis will be built upon experience in regard to that wiki.) But on to the actual text.

The problem: we have content that may, for various reasons, be offensive to single or entire groups of people. Such offensive content is diverse and covers the entire spectrum from nudity to propaganda, and from religion to personal feelings. The difficulty is that there is no set pattern - every user is an individual with his own biases and opinions. As a result we cannot set a top-level filter without hurting some other users, and equally we cannot allow certain content without insulting others. Either way, someone isn't happy.

The second issue is that there is no middle ground. We either have the images in an article, or we delete the images from an article. If they are included everyone can see them, but this includes people who do not wish to see them. If we delete them no one can see them, but that would equally include users who may not be offended by the content. Thus: we cannot exclude certain categories without raising a censorship or bias issue - even if we did exclude images we would still be dealing with the fact that offensiveness is personal. Who could determine what others can or cannot see? And who could do it in such a way that we won't end up with a bias towards the wiki's geographical location? I would say that no-one can be granted such control on a wiki - not admins, not stewards, not users. None.

The idea - individual content filtering

In my eyes the only way to deal with this issue is through giving every individual user a choice as to what they want to see. By giving each user a choice of their own we could likely create some middle ground where most people are happy. My idea is a content filtering system that each user may adapt to his or her own wishes. To do that we add a tag-tree-like structure that details the content of the images, and give users an interface which allows them to block trees, sub-trees or nodes. For example we might have a top-level tree called "Nudity", with two sub-trees called "Sexual nudity" and "Nonsexual nudity", with each having a node (for example "Sexual intercourse" in sexual and "Topless" in nonsexual). If a user would select to block "Nudity" this would block that tree and every subtree and node. If they select to block "Sexual nudity" it would block that subtree and the "Sexual intercourse" node, but leave images containing nonsexual nudity alone. Of course selecting "Sexual intercourse" alone would block only images containing that content.

One positive aspect of this structure is that the current categories on Commons might be a great starting point, or perhaps even sufficient, to implement such a system. In that case we only need a front-end to surround an already-present system, and some modifications to the code that loads images (saving a great deal of time over starting anew).
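A minimal sketch of the blocking logic described above, assuming a hypothetical tag tree and a per-user block list (the example tags, TAG_TREE and should_hide are invented for illustration and are not part of any existing MediaWiki or Commons code):

  # Hypothetical sketch only - none of these names exist in MediaWiki/Commons.
  # Each tag points to its parent in the tag tree (None = top level).
  TAG_TREE = {
      "Nudity": None,
      "Sexual nudity": "Nudity",
      "Nonsexual nudity": "Nudity",
      "Sexual intercourse": "Sexual nudity",
      "Topless": "Nonsexual nudity",
  }

  def ancestors(tag):
      """Yield the tag itself and every ancestor up to the top of the tree."""
      while tag is not None:
          yield tag
          tag = TAG_TREE.get(tag)

  def should_hide(image_tags, blocked):
      """Hide an image if any of its tags, or any ancestor of those tags,
      is in the user's personal block list."""
      return any(a in blocked for t in image_tags for a in ancestors(t))

  # Blocking the whole "Nudity" tree hides everything beneath it ...
  print(should_hide({"Topless"}, blocked={"Nudity"}))                     # True
  # ... while blocking only the "Sexual nudity" subtree leaves the other branch visible.
  print(should_hide({"Topless"}, blocked={"Sexual nudity"}))              # False
  print(should_hide({"Sexual intercourse"}, blocked={"Sexual nudity"}))   # True

Whether the existing Commons category structure could serve as such a tag tree directly is exactly the question raised under "Challenges" below.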

Advantages

  • The fine control would allow per-user settings, thus giving direct control to each user as to what they want to see.
  • Similar systems already exist, so we won't have to invent the entire wheel - we just tweak it to our needs.
  • The tree-like structure allows users to quickly block entire content areas, with the subtrees and nodes allowing more precise control if desired (thus speeding up the settings)
  • Settings can be saved for logged-in accounts, which would make this a one-time action
  • Blocking certain unwanted images means less bandwidth, and less bandwidth equals lower costs :P.

Disadvantages

  • While per-account setting saving is possible, per-IP would be hard due to dynamic IPs. At most we could use a persistent cookie or something similar, but even then it would limit itself to a computer - and not a user.
  • Users must know about the system in order to use it. Not everyone is computer-savvy so the system has to be fool-proof, which is difficult.
  • A lot of editors land on an article page and not on the front page (for example, through a Google search), which means that unwanted images may be displayed before an editor can change the settings.

Challenges

  • Would such a system allow unwanted censorship, where users can force settings upon others?
  • Will users understand the system, and will it actually be used?
  • Is Commons' current category system usable for such a system? If not, can it be easily adapted? (Or would we have a backlog of 7 million pictures to tag?)
  • Would commons users and admins spend their time "stub-sorting" images for the system?
  • Could this system be easily integrated with local uploads? (Only free media is on Commons - non-free media is local on every wiki.)

I made a different proposal ([1]) because I think that the Achilles' heel to your approach is that it relies on random Wikipedia contributors to agree on a set of content tags. But who decides who gets to apply a given class of tags, or enforces sanctions when someone misapplies them? The arguments all land in the hands of a small number of admins with a lot of other things to do. This is not reasonable, especially when some religious and cultural viewpoints will simply have so many internal contradictions and inconsistencies that it really isn't possible for them to get together and rate content without them falling apart in discord. They have to have their own servers, their own power structure, their own rules and regulations, all under their own control, in order to do their job (or fail to do their job) as they want it done. Wnt 03:13, 27 July 2010 (UTC)Reply

Comments

Good idea? Bad idea? Fantastic idea? Ridiculous idea? Easy to implement? Impossible to implement? Usable, Unusable? Whatever your opinion, please voice it. Excirial (Contact me,Contribs) 21:36, 22 July 2010 (UTC)Reply

all very sensible :-) - and a bit similar in flavour to ideas previously raised. Is this the sort of proposal you (Robert) find useful to raise and discuss? Privatemusings 00:12, 23 July 2010 (UTC)Reply
Sounds interesting. You could offer users a front end with a few preconfigured selections to make it easier for them (no Mohammed, suitable for under-12, etc.). --JN466 02:05, 23 July 2010 (UTC)Reply

I think it's a good idea, but I'm certain creating a parallel category structure for objectionable content will have vast unintended consequences with the usual mass media suspects. I predict headlines such as, "Wikipedia Organizes Porn in Easy-to-Find Categories," and "All Pictures of the Prophet Muhammed Available on Wikipedia with Just One Click," etc. If we really want to do this the right way it should probably involve an entirely new database table and not the existing category schema, and not depend on extra Javascript and CSS that will slow down page downloads and only work on some browser configurations.

Also, I'm concerned that the stated goal of "doing no harm" has morphed into what is evidently nothing more than a public-relations CYA exercise, and the discussion questions clearly imply that "Controversial Content" actually means "Controversial Images", with no regard to text at all, don't they? 71.198.176.22 03:28, 23 July 2010 (UTC)Reply
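For what it's worth, here is a very rough sketch (using an in-memory SQLite database in Python) of what a filter-label table kept separate from the category schema might look like, as the comment above suggests. The table name, column names and labels are invented for illustration; this says nothing about MediaWiki's actual schema or how such a table would be populated and queried in production.

  import sqlite3

  conn = sqlite3.connect(":memory:")
  conn.execute("""
      CREATE TABLE image_filter_label (
          img_name TEXT NOT NULL,   -- file name on Commons
          label    TEXT NOT NULL,   -- e.g. 'nudity', 'graphic-violence'
          PRIMARY KEY (img_name, label)
      )
  """)
  conn.execute("INSERT INTO image_filter_label VALUES (?, ?)",
               ("Example.jpg", "nudity"))

  # A reader's filter consults only this table, leaving the category system untouched.
  hidden = conn.execute(
      "SELECT 1 FROM image_filter_label WHERE img_name = ? AND label = ?",
      ("Example.jpg", "nudity"),
  ).fetchone() is not None
  print(hidden)  # True

Because the labels live in their own table, they could be kept out of casual category browsing, which might lessen the "easy-to-find index" problem the comment worries about.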

Any change will inevitably invite negative feedback, as anything can be bent to suit one's needs if enough pressure is applied. A kitten rescued from a drain pipe can be described as "Wasted money: how a single kitten cost the taxpayer over 150,000 in rescue efforts". We can also be sure that some people will still complain that we show certain content, on the basis that they feel no-one should see it, while the other side will complain that we included the means for censorship. In other words, it is nigh impossible to please the extremes of either side, and neither should we try.
As for the image-only focus, it is understandable, as the impact of images is much larger: they are either present or not present, with no in-between where users can choose. Text, on the other hand, is easily adaptable, and thus it is easier to create an alternative that is acceptable (neutral) to most people. Excirial (Contact me,Contribs) 08:11, 23 July 2010 (UTC)Reply

In theory this is a good idea. However, in order to be anything other than pointless by filtering too much (in which case people won't use it) or too little (in which case people won't use it), it has to offer very fine control over what people want to see. Even "naked male buttocks" isn't fine enough control: people may be fine with it in an ethnographic context but not in the context of a streaker at a football match (both equally non-sexual), and it doesn't define whether buttocks covered in body paint are naked or not. Similarly, at what stage of development do girls' breasts have to be before they are classed as topless (given that toddlers generally aren't so classed, even when they have no tops on), and in some cultures it is based on age rather than state of breast development? Does a woman wearing a bikini top that covers her nipples but not the majority of her breasts have "exposed breasts"? In Saudi Arabia she probably would, but in America she probably wouldn't. Then you have to ensure that every image is tagged for all content (e.g. some people will not want to see spiders, so you need to make sure that every image containing spiders, even in the background, is so tagged). You can't rely solely on categories either: for example, there is part of the hierarchy that goes Sex - Chastity - Virgin Mary [...] Churches in the Czech Republic dedicated to the Virgin Mary. So if you block all images in the "Sex" category and its subcategories, you also block the pictures of churches. If you also whitelist the category "Churches", then you create a conflict, as "Churches in the Czech Republic dedicated to the Virgin Mary" is both blocked and allowed, and it obviously cannot be both (a small sketch of this conflict follows this comment).
To summarise, descriptive image tagging is a Good Idea and should be encouraged, but until every image is so tagged, using it for filtering is just not practical. To make a filtering system work for our purposes it needs to be as finely controllable as possible, but this unavoidably means it is complicated and time-consuming to set up, which will put off the people most likely to want to use it. Thryduulf (en.wikt,en.wp,commons) 12:59, 23 July 2010 (UTC)Reply
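To make the blacklist/whitelist conflict above concrete, here is a small Python sketch. The chain of categories is simplified from the example in the comment, and the tie-break rule used ("the nearest listed ancestor wins") is just one possible convention, not a recommendation; any real system would have to pick and document such a rule explicitly.

  # Hypothetical child -> parents links (a category can have several parents).
  PARENTS = {
      "Chastity": ["Sex"],
      "Virgin Mary": ["Chastity"],
      "Churches in the Czech Republic dedicated to the Virgin Mary": ["Virgin Mary", "Churches"],
  }

  def nearest_listed(category, listed):
      """Breadth-first search upwards; return the distance to the nearest ancestor
      (or the category itself) that is in `listed`, or None if there is none."""
      frontier, depth, seen = [category], 0, set()
      while frontier:
          if any(c in listed for c in frontier):
              return depth
          seen.update(frontier)
          frontier = [p for c in frontier for p in PARENTS.get(c, []) if p not in seen]
          depth += 1
      return None

  def visible(category, blocked, allowed):
      """Resolve a block/allow conflict by letting the nearest rule win (ties show)."""
      b, a = nearest_listed(category, blocked), nearest_listed(category, allowed)
      if b is None:
          return True
      if a is None:
          return False
      return a <= b

  cat = "Churches in the Czech Republic dedicated to the Virgin Mary"
  # Blocking "Sex" and whitelisting "Churches": the whitelist is nearer, so it wins here.
  print(visible(cat, blocked={"Sex"}, allowed={"Churches"}))  # True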

Controversial content inside Wikipedia

I'm not sure I understand why you will focus on images - or maybe I should read "controversial" as "problematic for Western mainstream media"?

If you want a first look at what the controversial content inside Wikipedia actually is, you should have a look at these scientific studies on the subject: What's in Wikipedia?: Mapping Topics and Conflict using Socially Annotated Category Structure & He Says, She Says: Conflict and Coordination in Wikipedia

Thus, watch out for content about religion, philosophy, people, science, society and social sciences, and history.

References:

  • Aniket Kittur, Ed H. Chi, and Bongwon Suh. What's in Wikipedia?: Mapping Topics and Conflict using Socially Annotated Category Structure. In Proceedings of the 27th International Conference on Human Factors in Computing Systems (Boston, MA, USA, April 4-9, 2009). CHI '09. ACM, New York, NY.
  • Aniket Kittur, Bongwon Suh, Bryan A. Pendleton, and Ed H. Chi. He Says, She Says: Conflict and Coordination in Wikipedia. In Proceedings of the 25th International Conference on Human Factors in Computing Systems (San Jose, CA, USA, April 28 - May 3, 2007). CHI '07. ACM, New York, NY.

Good luck. nojhan 13:31, 23 July 2010 (UTC)Reply


Ways to solve this while preserving NPOV/NOTCENSORED

  1. Do nothing, but leave it up to the regular internet filter people. There's a whole world of people trying to protect other people from potentially offensive content. We don't need to re-invent the wheel, we don't need to build a wheel, we don't even need to provide a wheel. All we need to do is say "Wheels exist, if you want one, go google it".
  2. Allow voluntary, verifiable/descriptive labeling combined with a Wikimedia-provided voluntary filter.
  3. Let like-minded individuals collaborate to produce different subjective ratings-- but allow for a diversity of such ratings, none of them 'official'.
  4. Start allowing a greater diversity among the projects. The reason Wikipedia CAN'T be censored is that it's the only Wikipedia we have. But nothing is stopping us from making new projects that have their own policies. I think the future of the Wikimedia movement will see us hosting THOUSANDS of such projects, each with their own unique topics, readerships, and editorial policies. Muslim readers, for example, have expressed displeasure that they have no "project of their own" right now, not even in their own languages - well, why not let them try to build one? The beginnings of a kid-safe project? Why not try it? And then, when there is a diversity of projects, any GIVEN project becomes less mission-critical.
  5. Lastly, if somehow we absolutely had to have a censored project, then they could always rename the current Wikipedia to "Uncensored Wikipedia" or something, to underscore this fundamental aspect of its policy. The new non-offensive Wikipedia can argue over offensiveness, while the rest of us continue editing unfettered by such concerns. (Though I feel "Uncensored Wikipedia" should have the claim to the Wikipedia name, but you get the idea). --Alecmconroy 17:35, 24 July 2010 (UTC)Reply
Well, there are a lot of ways to tackle this issue, so let's see...
1) Quite a valid option. Internet filters already exist, so it should be possible to modify them so that they block certain images. One concern is that certain images may not be blocked by default, which may lead to lots of manual configuration (and some users aren't exactly computer-savvy). Another problem is that we constantly update Wikipedia, which may lead to unblocked content every now and then. Still, it's a fine idea.
2) Well, I can be brief about this one - see my own section up above, which deals with this issue.
3) I presume such ratings would be used in combination with a normal filter? Even so, I would point out that subjectivity is generally a bad thing - it makes collaboration between editors difficult, which matters since we would have 7 million images to rate. The amount of effort and the cost of maintenance would likely defeat the purpose; I would be surprised if we had even one half-finished rating system some years from now. If we used an objective measure, people could at least cooperate on such a venture.
4) Bad, BAD idea. We already have an example of this (Conservapedia), and we all (well, I can't really speak for anyone but myself, but you get the point) know how neutral and encyclopedic that project is. People are free to fork Wikipedia for their own initiatives, but let them do so without Wikimedia support. One of the strengths of Wikipedia is that we have editors from all around the globe working on the same content, which enforces neutrality through constant pushes from both sides. If we start splitting on the basis of religious, cultural or political beliefs, we would get a fine mess without neutrality. Besides, what's the point? We would get a ton of encyclopedias detailing the exact same content in the exact same language. And who would decide what's worthy of a wiki?
5) A bad idea as well. Encyclopedias are valuable because they provide reliable, sourced and neutral information; NPOV, and in part NOTCENSORED, is therefore a critical aspect of any encyclopedia. We would just end up with 1001 projects detailing the exact same thing, with slight-to-major differences in bias. We are here to build an encyclopedia, not to create 1001 projects where a handful of people can spout their ideas as if they were blogs. No one would be interested in reading a ton of small projects, and no one would be writing for them other than a handful of people. And people would still be complaining about the other wikis. Excirial (Contact me,Contribs) 09:04, 25 July 2010 (UTC)Reply