Talk:Image filter referendum/en/Other

::::::::LOL, that is what ''you'' are clearly assuming, not me or anyone else. Your projection is quite telling. Unfortunately for you, the one person it tells nothing is you. --[[Special:Contributions/213.196.209.190|213.196.209.190]] 13:08, 23 August 2011 (UTC)
:::::::::You just switched to yet another IP, which verifies my statement that a small handful of people are using sock puppets and changing IPs to make it seem far more than there actually exists. And I find it interesting how you say "projection" as if I would be logged out editing or socking, which is laughable. [[User:Ottava Rima|Ottava Rima]] ([[user talk:Ottava Rima|talk]]) 14:07, 23 August 2011 (UTC)
::::::::::Not happening, and not that important if it does. IP users are first class citizens here, let them be or try to change their minds-- don't overly fixate on the signatures. Each comment comes from a human mind-- discuss the issues raised or discuss the concerns in the abstract, but singling out specific comments and using their login status to dismiss their arguments is... counterproductive. --15:25, 23 August 2011 (UTC)


:there have now been over 20,000 votes registered in this plebiscite, which makes it the highest degree of participation in any Wikimedia poll/vote/referendum ever.

Revision as of 15:25, 23 August 2011

Concerns

Slippery slope danger

This is how all censorship starts... slowly. First it's opt-in, then opt-out, and then it will be mandatory on request of foreign governments / organizations, since the mechanism is already in place. Very dangerous. If people want to filter, fine, do it on your own PC. Don't give organizations / governments the opportunity to easily implement a blacklist for their people. Koektrommel 07:58, 3 July 2011 (UTC)Reply

That's a huge extrapolation based on a very preliminary step - and not one that I think is justified. Given that it would already be technically feasible (and easy) for foreign governments to apply such censorship if they wanted (e.g. see the Great Firewall of China or the Internet Watch Foundation), as well as the finances for technical development available to governments / organisations in favour of censorship (which are several orders of magnitude larger than those available to Wikimedia), I seriously doubt that this would make it significantly easier for them to implement... Mike Peel 20:22, 4 July 2011 (UTC)Reply
I don't think that it is within Wikipedia's goals to provide a tool that makes the job of censors easier. Ipsign 06:57, 16 August 2011 (UTC)Reply

Esto no es una herramienta que aumente la libertad, sino que la recorta. De aprobarse será la propia wikipedia, que no censura contenidos ni temáticas, la que tristemente proporcione el instrumento de censura a los censuradores. Lamentable. Wikisilki 19:01, 5 July 2011 (UTC)Reply

Translation: "This is not a tool to increase freedom, but to limit it. If approved, it will be Wikipedia, which does not censor content or themes, which sadly provides a tool for censoring to the censors. Lamentable." —translated by Sj.
Again, I must very forcefully say that it should not be the goal of Wikipedia to champion "libertad", but to educate. To deny education to the people who need it the most (i.e. children in developing nations that often have highly moralizing, oppressive cultures) because you find it distasteful to implement a voluntary, opt-in method for hiding images that might preclude large swaths of people from using Wikipedia because their cultures don't allow it - this is absolutely vile and idiotic. Unigolyn 08:59, 18 August 2011 (UTC)Reply

I concur. Implementing this feature would clearly be the first step towards censorship on Wikipedia. The next step along this way will be to block whole pages (for example, by adding a category 'sexuality'), which will allow schools to inject a cookie with 'Display-Settings: sexuality=denied' and block such content for all their students. While that might be the intention, it won't stop there: the next step will be to create a category 'Display-Settings: opposed-by-chinese-government=denied' and mark Tiananmen Square protests of 1989 with it, which will effectively allow the Chinese government to simply inject the appropriate cookie at the firewall and block content which they don't want Chinese people to see. It is also *very important* to understand that limiting the feature only to 'logged-in' users will *not* fix this problem (regardless of implementation, it will be *very easy* to enforce at the firewall level that all users from China are always logged in as the very same user). Ipsign 06:57, 16 August 2011 (UTC)Reply

It sets a precedent that I really don't want to see on Wikipedia. --Dia^ 07:54, 16 August 2011 (UTC)Reply

I don't want that tool, it's the key to introducing censorship. If a certain image cannot be deleted, it will be added to all possible categories to prevent its presentation as far as possible. If someone has problems with uncensored images, he/she should read only their own publications.— The preceding unsigned comment was added by an unspecified user

Thanks, Ipsign and others, for illustrating the slippery slope fallacy. There is no logical reason why this would be extrapolated to tagging subjects blocked by China. In addition, China has already blocked that article and others in the past, so the tagging would be years too late to serve any purpose whatsoever. There are specific reasons given by many people why we should allow users to opt-out of seeing specific types of images. There is no reason why we would then apply this setting to aid censorship by governments. It's illogical FUD, not rational opposition. Sxeptomaniac 22:30, 17 August 2011 (UTC)Reply
Just because one uses the phrase "slippery slope" doesn't automatically make it a fallacy. Like in all political issues, setting a precedent - that Wikimedia is willing to accommodate offensivists - does make subsequent steps toward the bottom of the slope easier. Why allow personal sensitivities to influence what pics get displayed in Naturism, but not allow "national sensitivities" to do the same? 41.185.167.199 12:44, 20 August 2011 (UTC) (a.k.a Bernd Jendrissek)Reply
It's a slippery slope fallacy if you can't give good reasons why the supposed chain of events will happen. There are some gray areas, but the very fact that you can use "personal sensitivities" vs. "national sensitivities" works against the fearmongering. Sxeptomaniac 22:42, 22 August 2011 (UTC)Reply

Can't complete form: "not eligible", and I've tried en, de, nl and give up.

I am utterly AGAINST! the proposal. - It reeks of censorship! Unless it's a filter, to be built in/activated by adults (on their own computer - and ONLY there) for the protection of minors.

Lots of zeros and one question mark as my reply.

Thus "?" I answer question # 6; for "culturally neutral" is an oxymoron. Culture is diverse by nature & definition. To conjure oxygen into carbon dioxide (by magic) while breathing is about all we humans have in common.

Hence my unease at the VERY suggestive example in question # 5 (condone the "ferocity of brute force" versus proscribe a "state of undress") redolent of the nauseating puritanism which abhors all that could ever so remotely hint at sexuality (or indeed the corporeal).

Apart from moral-mongering religious fanatics (atheists included) there is the problem of definition.

"Moral" (from: Latin mores=customs): where and when? Same for "Nude": is that my freshly shaven face (half an hour ago I rid myself of a three days' beard) or only when I was actually in the shower, unclad, naturally (naturely). "Obscene" is another one. Can I come to the conclusion that I find the description of π "Offensive"? I've already mentioned "Culture" and could prattle likewise for days on end.

In my not so humble opinion there is no slippery slope: there is a precipice and, although it might take a while, we, along with wikipedia, shall splatter. Sintermerte Sintermerte 18:24, 22 August 2011 (UTC)Reply

Once you're done hyperventilating, look up "censorship" in the dictionary you used to look up "moral" and get back to us. As far as "religious fanaticism", the panicky opposition to this proposal fits the definition far better. Sxeptomaniac 22:42, 22 August 2011 (UTC)Reply

Abuses by censors: self-censorship seems to open the door to 3rd-party censorship

From other comments on this page, there seems to be severe confusion about the intention of the feature and about its potential (and IMHO very likely) abuses. One common argument for this feature is that it is self-censorship, which (unlike 3rd-party censorship) looks like a good thing. Unfortunately, there are severe implications of the available technical solutions. It seems that this feature will almost inevitably be based on so-called cookies. It will essentially allow any man-in-the-middle who sits between the end user and the Wikipedia server (usually an ISP, which can be controlled by a government / a private company with a certain agenda / ...) to pretend that it was the end user who decided not to show controversial images, and the Wikipedia servers won't be able to detect the difference. (The only way I know to prevent such an attack is SSL, and even it can be abused in this particular case: the man in the middle can intercept SSL too, and while the user will know that the server certificate is wrong, he won't have any other option, so he'll need to live with it.) It means that by enabling users to filter themselves out, we will also be enabling 'in the middle' censors to make filtering much easier than they're doing it now. (It will also probably mean less legal protection against censorship: for example, if somebody filters out Wikipedia images in the US right now, they are likely committing copyright infringement, with a 'fair use' defense being quite uncertain; but if it is the Wikipedia servers doing the filtering, the copyright argument evaporates, making censorship much more legal.) This is certainly not a good idea IMHO. Ipsign 07:59, 16 August 2011 (UTC)Reply
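The mechanism described above needs very little machinery. A minimal sketch in Python, assuming a purely hypothetical cookie name (imgfilter) and category values, since no concrete implementation has ever been specified: an in-path proxy only has to rewrite request headers so that the servers believe the reader opted into filtering.

  # Illustrative only: 'imgfilter' and its values are invented for this sketch.
  FORCED_COOKIE = "imgfilter=nudity,violence"

  def inject_filter_cookie(raw_request: bytes) -> bytes:
      """Rewrite an outgoing HTTP request so it carries a 'user chose to filter'
      cookie, whether or not the real user ever chose anything."""
      head, _, body = raw_request.partition(b"\r\n\r\n")
      lines = head.decode("iso-8859-1").split("\r\n")
      rewritten, had_cookie = [], False
      for line in lines:
          if line.lower().startswith("cookie:"):
              had_cookie = True
              line = line + "; " + FORCED_COOKIE      # append to existing cookies
          rewritten.append(line)
      if not had_cookie:
          rewritten.append("Cookie: " + FORCED_COOKIE)  # or add one from scratch
      return "\r\n".join(rewritten).encode("iso-8859-1") + b"\r\n\r\n" + body

Applied to every request passing through such a middlebox, this makes third-party filtering indistinguishable, server-side, from a genuine user preference; as noted above, only end-to-end SSL makes the rewriting detectable at all.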

In order to distinguish between images, they have to be categorized. And there you will find the censors. --Eingangskontrolle 08:47, 16 August 2011 (UTC)Reply

Yes, even if this is intended for personal use, it will in fact be used by network managers and ISPs to censor Wikipedia for their users. MakeBelieveMonster 11:50, 16 August 2011 (UTC)Reply
Okay, so your scenario here is that a malevolent entity, with government-like powers, does broad-based man-in-the-middle attacks in order to... turn on a filtering system, where the user can get around the filter just with one click? There's nothing we can do about MITM attacks (other than recommending SSL, so at least you know you've been compromised) and if there is someone capable of an MITM attack then they are not going to use our click-through-to-see filtering system, they will just block content, period. NeilK 17:52, 16 August 2011 (UTC)Reply
FYI, the world is full of malevolent entities with government-like powers. They're called governments. (Brrrrummmm-Chhhh) ;)
The fundamental problem isn't the optional filtering switch. It's the background work of establishing a censorship-friendly categorisation system that can be read and reused by outside software to impose third-party filtering; the willingness to include the switch at all; the difficulty Wikipedia would then have in resisting legal demands to extend the censorship system once all the infrastructure was in place; and the crippling legal liabilities it might face if it had already broken its own principles on non-censorship, had the technical ability to impose broader censorship, and didn't use it. If WP implements a purely optional censorship system, then the next time some kid blows up their high school, the lawyers will be poring over the kid's internet records to see if the kid ever accessed anything at all on WP about explosives, and if so, they'll launch a lawsuit on the grounds that WP had all the software in place that would have let it block the content, and chose not to use it.
If you show that you have the machinery and willingness to edit, and you choose not to edit, then you tend to lose the "simple carrier"-style assumed absence of liability for the material that you carry. That's why it's critical to stick by the principle that your charter forbids censorship, and why it's so important not to develop these systems. It gives you the defence that you are forbidden to censor by your charter, and couldn't technically do it even if you wanted to. Introducing this system would remove both those defences and introduce a whole new set of potential vulnerabilities and liabilities for WP that I don't think the committee have really thought through. ErkDemon 00:01, 18 August 2011 (UTC)Reply
No. What they are saying is that all it takes is one malevolent user to inappropriately tag harmless content with labels that will trigger the censorship system. Honestly, if you are liable to be offended by images on a given subject, then you wouldn't really be viewing such an article in the first place. If you do encounter an 'offensive' image, it is probably because a troll has uploaded something inappropriate to an article whose subject suits your sensibilities, and I don't think they would be helpful enough to censor-tag their work for you. --2.26.246.148 20:52, 17 August 2011 (UTC)Reply
"That's naive though. Suppose that I need to learn about Naturalism (art) - so I type in "Naturism" - and find (to my horror), not some crappy porcelain figurine of a shepherd, but instead, right there at the very top of the page, to the right of the lede - a picture of half a dozen naked people. A simple typo or misunderstanding of a term (that the reader is looking up in an encyclopedia precisely because they don't yet understand it) can result in such problems. Consider that there are some companies out there who might fire someone for looking at the picture at the top of Naturism on company time when they are supposed to be researching Rococo figurines! So precisely because I'm against censorship of Wikipedia - and because I demand the right to have full-frontal pictures of nudists at the top of the naturist page - I'd very much like a "No Porn Please! filter." SteveBaker 19:10, 17 August 2011 (UTC)" Not to mention ErkDemon's point just above yours. Sdmitch16 02:50, 19 August 2011 (UTC)Reply

So we start with censorship against nudity and cruel pictures and end where? Political and religious censorship will follow soon. This is not really self-censorship, because someone will have to add a tag to each picture to categorize it. There the censorship already starts. 94.217.111.189 06:30, 21 August 2011 (UTC)Reply


Complicated editing

If this feature is implemented, then when I'm editing an article, I will need to think "hey, how will this article look if the user uses this filter? Or that filter? Or that combination of a dozen filters?" In many cases images are an essential part of the article, and making the article comprehensible without them (or worse, with an arbitrary combination of them) will be an editor's nightmare. And while I am not involved in editing sexual or religious articles, I am not sure that this tool won't evolve, for example, to allow filtering out "all images which are related to Microsoft Windows/Linux/..." - if that ever happens, then I will find my editing severely affected. Ipsign 07:23, 16 August 2011 (UTC)Reply

As you can see from the mock-ups, it will be immediately clear that an image is hidden. The box shape and size shouldn't change at all, and the caption won't be hidden, so this shouldn't make images look different at all. Cbrown1023 talk 15:14, 16 August 2011 (UTC)Reply
Because of the bug that is "being investigated" (see below), i cannot check the referendum questions yet, but the "What will be asked?" section on "Image filter referendum" does not seem to propose a question like "How prominent should the warning that an image is hidden be?" Are you making this statement as a promise by a member of the tech team working on this? Or as a promise by the WMF Board? A mockup is just a possible implementation, not a definite tech proposal. i agree that the mockup proposal seems sufficiently prominent that it's hard not to notice it. (In fact, it is probably so prominent that censoring authorities would not be satisfied, since it would encourage the curious to circumvent the filters.) But i don't see any promises that it would remain that prominent.
As for the main point being made here, i too would expect that in articles where a picture is important, there would be many editors who would feel that the article needs to be primarily readable by those with filters rather than those without. Given that "a picture is worth a thousand words", this effectively means several thousand extra words may often have to be added to articles, making them seem longer and more complicated to readers with filters turned off. Boud 21:35, 16 August 2011 (UTC)Reply

Simple editing: all reported images have a white space with the option "some users found this image offensive or inappropriate, click to show". Hide the image by default or not based on the users' set threshold – example: 1% of visitors reported this image as offensive. --unsigned
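A sketch of the threshold rule being proposed, assuming hypothetical counters (report_count, view_count) and the 1% example figure; no such reporting mechanism currently exists:

  REPORT_THRESHOLD = 0.01  # hide by default once 1% of viewers have reported it

  def hide_by_default(report_count: int, view_count: int,
                      threshold: float = REPORT_THRESHOLD) -> bool:
      """Collapse the image behind a 'click to show' placeholder for new readers
      once enough earlier readers have flagged it."""
      if view_count == 0:
          return False
      return report_count / view_count >= threshold

As the reply below points out, any rule driven by reader-supplied reports can be gamed by coordinated flagging.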

The only problem is that not only is that game-able, not only do the tools exist to game that, but there are parties who use those tools and actually thrive on gaming such systems; and these parties are active on the internet today. I'd put the odds of an attack on a reader-provided-input system to be a near certainty in the first 12 months of operation. --Kim Bruning 13:08, 21 August 2011 (UTC)Reply

Copyright concerns

Forgive me if I have got the concepts wrong, but the way I see the licensing is that an ISP may not alter the text without providing the original source (or a link to it). If we detect a middleman tampering with cookies to deny users the ability to see these images on a permanent basis, should we pursue those people for violation of the copyright? And if not, does that mean the WMF CC license is unenforceable, and pointless? Obviously data manipulation could be happening already, but a cookie-based solution is an open invitation for low-level tampering. Inductiveload 17:44, 16 August 2011 (UTC)Reply

Technically, such an ISP would not alter what is sent by the WP servers, but would alter what the user requests. No different from secretly altering (http or even DNS) requests for wikipedia.org with censopedia.example, say (which is of course also not a nice thing to do). On the other hand, already now I can imagine that the content of w:EICAR test file might not arrive in its original unfiltered form at every user's PC, depending on anti-virus policies along the line, a thing I am reluctant to call a copyright infringement. --Hagman 13:56, 21 August 2011 (UTC)Reply


Controversy in selection of categories, classification

I put a free text response in the voting page, and I am also sharing my implementation concerns here, since I don't know where else to address them (other than on the ballot). I am understanding of why this feature was proposed, though I don't approve of it; from the discussion it seems that this feature is unavoidable. I apologize if this is very long to read but, as I hold Wikipedia and free (to reuse) culture very dear to me, I took some time to write what I had to say.

Most of the sites I visit that accept user-generated creative contributions - en: Flickr, en: LiveJournal, en: DeviantArt, en: FanFiction.Net, for example - rely on the uploading user to properly classify their content; some have penalties if the contributing users don't do this correctly. This is a single setting, either a yes/no or an age scale, which indicates that the content may be unsuitable for younger audiences or contains graphic content.

The use of categories to label content presents challenges to the impartial and accurate labeling of existing content. Aside from the difficulty of manually reviewing millions of images, this process presents a situation where humans (with opinions and biases) are being asked to make subjective judgments across multiple categories. It also presents the opportunity for individuals and groups to request categories for their own purposes, causing a bureaucratic nightmare.

These problems, fortunately, can be muted by a judicious application of clear and sensible written policies outlining exactly how any image hiding feature is to be used. These policies should not be based merely on the "average person" (to quote the en : Miller test) or pure intent, but take into account the context of the image as well. Any resulting policies should reflect Wikimedia community consensus with adequate community input.

We should not forget Wikimedia's educational mission. I've seen enough drama to know how groups use the Internet as a venue to jockey for social and political influence - just as they do in the real world - through controversy-seeking, special accommodations, special rules, and efforts to assert and maintain power over societal norms, policies and beliefs. We don't need to set up another soapbox.

My response to the ballot, heavily edited for space, emphasis added:

First, I would like to express my concern that implementing this feature will turn Wiki editors and/or Foundation staff into the arbiters of decency, with controversy over (selection of) categories. I understand the purpose for the feature: just as search engines have ("SafeSearch"). I encourage the adoption of policies that guide classification (...) into narrowly-defined, conservatively-applied definitions of content that is generally applicable to a wide range of audiences.



The filter, if implemented, should be crafted with the intent of providing an enjoyable experience for Wiki audiences, in keeping with Wikimedia's educational endeavor, and not in response to controversy. (...) if the feature generates "image flagging wars", it has failed to serve its original intended purpose.

Regarding the question that "the feature be culturally neutral (...)." It is important that the Wikimedia Foundation not censor to the lowest common denominator in an attempt to appease every individual or group that may be offended. For example, epileptics may ask for a feature to hide flashing GIFs. Some readers cannot stand the sight of blood, even in an educational context. Television viewers might want to hide spoilers. (Certain religious) adherents object to (...) certain images and text. Adding numerous categories not only makes the filter difficult to understand and use, but lends itself to exploitation by interests that wish to abuse it as a venue for expressing opinions on what image subjects should be deemed controversial (...) which could include "immodest" uncovered women, illustrations of "God", time travel (illegal in China), even photographs of poop.

(...) perhaps we should look to the "viewer discretion" notices on television programming as an example of how to approach it. (...) With regards to implementation, I suggest that the description (in Image Filter settings) be phrased as "Hide" and "Show", instead of "Allowed" and "Blocked", to reflect that the image hiding feature is not a substitute for parental controls.

Books in the public library never came with flaps over pictures to hide them from readers that might be offended (...) If it's encyclopedic, why hide it? If it's not educational, why is it in Wikipedia?

Soapbox

  1. The proposal is another level of administration that will require more policing and generate more disputes over categorisation.
  2. If utilised it is effectively censorship. Wikipedia is supposed to be a reference work, ipso facto the proposal would be morally identical to censoring the dictionary; quite literally in the case of Wiktionary.
  3. This would be far too easy to extend to the censorship of pages or words in addition to images. Are we going to end up with a list of articles that don't get shown to people who have the "violence" box ticked?
  4. If this is meant for parents / guardians of children to filter what they can view then it's pointless. Children do not need to go to wikipedia to obtain sexual imagery.
  5. The use of the word report in the fourth question is inflammatory and, were the proposal to go ahead, should not be utilised elsewhere. Do you want people reporting the dictionary?
  6. Each censored picture would be reliant on others to categorise it; what happens when a picture of a bridge is categorised as being "violent"? Does that bridge then get removed from view until someone realises that this has occurred?
  7. Violence, and sex, happen. Often. By allowing people to remove subjects from view we are aiding a disassociation from occurrences that everyone should be aware of.

Vote! Kae1is 12:48, 20 August 2011 (UTC)Reply

conservative POV bias

Any system that attempts to distinguish one image from the next will require individual labelling. In turn, any such effort will, I argue, become dominated by a conservative cultural and religious POV.

I would posit, for example, that the vast majority of images labelled as 'objectionable' will be ones relating to human sexuality and will focus in particular on what social conservatives consider 'aberrant'. It sickens me to think that every depiction of homosexuality, the naked human form or gender non-conformity, whether sexually explicit or not, may have to bear a stigmatising label reflecting a particular prejudice. Such labelling may in turn help trigger external filtering software that already reflects a conservative bias, thus further harming access to Wikipedia.

While I fully admit the global political left has far from consensus views on what is acceptable free expression, the number of images that might fall afoul of "political correctness" or "hate speech" concepts will scarcely compare to the scope and willingness of social conservatives to push their POV. I offer Conservapedia as evidence of how far they will take their agenda and this current proposal is simply opening the door and inviting them in. Added to this situation is the irony that many forms of left censorship are on behalf of cultural and religious conservatism stemming from the developing world, the most prominent example being the self-censoring response to 'offensive' images of Mohammed.

Both coming and going, an image tagging effort will be a boon for conservative views and a blow against the disinterested presentation of knowledge. --Bodhislutva 04:09, 21 August 2011 (UTC)Reply

I agree with your assessment and I'd like to add that the idea of a catch-all category for "NSFW"/"objectionable content" is thoroughly incompatible with the notion of neutrality (which entails cultural neutrality). The idea, implicitly and explicitly professed by several people on this talk page, to actually host and maintain such a category or categories within Wikimedia is bordering on the obscene. --87.78.45.196 04:44, 21 August 2011 (UTC)Reply

We could introduce a Dutch POV bias instead? Images relating to Firearms and (most) fireworks blocked, some nudity permitted, sex blocked. (Is that roughly correct?). We could differentiate between different political leanings if you'd like. It'd be interesting to compare with what's objectionable or not in other countries. O:-) --Kim Bruning 14:08, 23 August 2011 (UTC)Reply

Other

Is the 'beyond-Perfect' proposal controversial?

I support the proposed filter as 'good enough' for complex reasons. But I'm also very interested in understanding where others stand on the issues.

What I call the 'perfect' system would be:

  • Infinitely Customizable: each user can specify any list of images or categories.
  • 1-Click-To-View-at-all-times: images are never denied, just temporarily hidden.
  • Only upon user request: only the user can turn on the image-hiding feature.

My 'Beyond-Perfect' system would also have two additional features:

  • Not developed or promoted by WMF: no WMF resources go to support it.
  • Active only on non-WMF servers / clients

Is there anyone who finds filtering so controversial that they'd 'Vote No' on what I've called the beyond-perfect proposal? --AlecMeta 22:11, 19 August 2011 (UTC)Reply

The problem is that it's circumventable. You can lock down what a user is logged in as, you can prevent them from changing that cookie, you can deprive them of JavaScript, and you can stop them from accessing specific parts of a website that allow undoing the filter, thus preventing your "1-click" from working. Samsara 22:52, 19 August 2011 (UTC)Reply
Making it easier for person A to censor person B's reading habits, that's definitely controversial.
I'm curious how controversial this is in the ideal, setting aside slippery slopes. Do we agree a reader 'has the right' to 'censor' their own reading, or is that itself a source of controversy? --AlecMeta 02:40, 20 August 2011 (UTC)Reply
This is an interesting area. It's probably in the class of things where it is a good thing for the individual but a bad thing for society as a whole. The reason it is bad is that it avoids challenging preconceptions with fact. This is part of the concern, I think, of the "filterbubble" meme (I haven't read the book). So does the reader "have the right"? Yes, it's in the nature of the license that they have the right to create derivative (censored) works. But nonetheless we individually and as a community have views about those derivative works. We are generally pretty relaxed about "Wikipedia for Schools" and Wikibooks for example. We are not so happy when people "print on demand" an almost random and un-quality-controlled set of articles as a "book", for a ridiculously high price - using our work to take money for an inferior product. We would probably be deeply disturbed if, for example, Holocaust revisionists created a version of Wikipedia that excluded or distorted all mentions of concentration camps. They would of course have the right to do it (in most countries), but we certainly would not wish to be a purposeful enabler of it. Rich Farmbrough 03:01 20 August 2011 (GMT).
I honestly don't see anything relevant to the filter. Nazis can still create their own encyclopedia using Wikipedia materials selectively without the filter. -- Sameboat (talk) 03:22, 20 August 2011 (UTC)Reply
"We certainly would not wish to be a purposeful enabler of it." And in fact the report recognises this risk and warns against it in another recommendation in the same section: Recommendation 10, which I encourage all to read. Rich Farmbrough 15:17 20 August 2011 (GMT).
And censors can create their own copy on their own servers, paid for with their own money. But money donated for free content should not be used. --Bahnmoeller 16:09, 20 August 2011 (UTC)Reply
Much wisdom in the concerns of resource use. I donated knowing this was still in the works. Most of our donors probably had no idea it was coming. In May 2010, when I had donated to 'protect wikipedia' and it was briefly announced we would start deleting offensive images-- in that case, I absolutely felt a sense of 'betrayal'-- that's not what I had donated to, indeed it was the exact opposite of what I had donated to.
I think the price tag will be very small, but we have to be very careful about resource usage. The language we use to solicit donations is 100% antithetical to the language we used in the image deletions. We need a 'sense' of how "average donors" feel. If someone gave us the system for free, I'm comfy with us using it. But using large chunks of our own funds on this particular project may not sit well. It's okay by me, but I can't speak for all the donors, and most importantly, I have no clue at all how this policy will affect future donations-- positively or negatively.
Donors don't 'own' us, they're not 'entitled' to dictate our future, but realistically, if they wanted to build a filter, they probably would have given their time and money to someone else, and that's very much something to consider. --AlecMeta 14:47, 23 August 2011 (UTC)Reply

A waste of server resources paid for with donated funds?

Wikipedia has only been able to scale as far as it has by placing servers behind Squid or Varnish caches so that a page is not re-rendered (a processor-intensive step) every time it is requested for reading, in normal view, by anyone other than a logged-in user. The last rendered version is stored on a separate front-end "reverse proxy" (I believe these are Squid on WMF projects) and simply retrieved in its already-prepared form from the hard disc cache on that server - a relatively quick and inexpensive operation.

Generate a multiplicity of bowdlerised versions of the same page, and the version with that "peace be unto him" guy's photo censored can't be served in response to an anon-IP request for the same page with those pesky vaginae (with their sharp teeth and tedious monologues) excised instead, in much the same way images of the human breast must be excised to avoid traumatising innocent babies. That means a greater number of these requests, instead of cache hits, become cache misses which require the MediaWiki origin servers to re-render the entire page in some censored, butchered form. This isn't just a violation of established (WP:CENSOR) policy (much like the one-sided e-mails are something beyond just a WP:CANVASS violation) but something which will actually increase server load, requiring that a greater number of Apaches be deployed to endlessly render and re-render multiple butchered versions of the same page. If those who were pushing for this were paying from their own pockets to run the site, that would be one thing, but WP servers are funded through donations and WP editors are regularly solicited for contributions. This being the case, more servers just to deal with an artificially-created page-render load caused by an ill-advised censorship scheme is not a worthy use of donated money and not something to which I would wish to contribute. --66.102.83.10 22:37, 20 August 2011 (UTC)Reply
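To put the caching objection in rough numbers, assuming hypothetically that the rendered HTML itself had to differ per filter setting and that there were a handful of independent on/off categories (the names below are invented; nothing about real cache keying has been published):

  FILTER_CATEGORIES = ["nudity", "violence", "medical", "religious"]  # hypothetical

  def cached_variants(n_categories: int) -> int:
      """Each independent on/off filter doubles the number of rendered versions
      of one page that the front-end cache must store (or miss on)."""
      return 2 ** n_categories

  print(cached_variants(len(FILTER_CATEGORIES)))  # 16 variants of one page, not 1

If hiding were instead done purely client-side on an unchanged cached page, this particular cost would not arise; the objection applies to any scheme that renders filtered variants on the servers.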

Effort better spent on improving articles?

I don't know why we have such flurries of activism for implementing new features when the basic quality of our articles is still so low, AND we're losing contributors. Yet the changes being proposed recently don't address the problems that exist in the community; we're merely pandering to the readers, and locking more articles from editing than ever before. I also see many contributors entering into a sort of excuse culture, where typical claims are "I can't write articles (but I want to be an admin)" (can't drive a ship but want to be captain? sure, here's your badge *pins*) and "I've no expertise in the field". If you can write a comment, you can write an article, and if you don't have expertise, well, neither did anybody else before they immersed themselves in their current field of expertise. So neither of those are valid excuses, and I suspect this goes for most others that have been tried. Samsara 22:02, 19 August 2011 (UTC)Reply

Avoiding people being offended by certain content enables them to contribute without worrying about what they might come across. So this proposal is furthering the goal of keeping contributors onsite for longer. Promethean 07:07, 20 August 2011 (UTC)Reply
On the other hand we have hundreds of active authors who now spend their time categorizing images, discussing the pros and cons, or fighting edit wars to please the admins. Just imagine how many articles could be improved or written in the meantime. --Niabot 16:32, 20 August 2011 (UTC)Reply
Define "offended". Speaking honestly, I find specific categories of religious images offensive. During the middle ages, it was considered a sacrilege to use living babies as models for the infant Jesus, so in many paintings of the Madonna & Child a dead baby -- which was either a stillbirth or often born deformed -- was used. This is why the infant Jesus looks bizarre in those medieval paintings. And then there is the issue of paintings of the Crucifixion, many of which are as grisly as a auto wreck; I wouldn't be surprised if medieval people with a BDSM fetish expressed it through representations of martyrdoms. If this measure goes through, I will make an effort to get those offensive images filtered out by default. -- Llywrch 21:58, 20 August 2011 (UTC)Reply

useful

in my view, this feature is very useful for slow speed internet users like me. -RAJASUBRAMANIAN from tamil wikipedia--தென்காசி சுப்பிரமணியன் 09:57, 20 August 2011 (UTC)Reply

Dude, that means you only like looking at pages with pseudo-porn and violence :D This isn't meant to block all photos, though admittedly, I suppose that might be a nice feature for people with dial-up or old monster comps. Rodgerrodger415 16:25, 20 August 2011 (UTC)Reply
If you want to block all photos because of slow internet, your web browser already provides that feature. Yes, all current browsers do. That said, we don't really know the implementation details of this feature, so it's possible that with the filter active you might be receiving MORE data from the site (i.e. you might get the original picture sent to your browser even though it doesn't display, but you will also get additional images for covering up the filtered ones, as well as way more client-side JavaScript to slow down your already slow computer even more). It is actually more likely that this feature will increase page load times even if you are filtering every image. Once again, this type of filtering shouldn't be Wikipedia's concern. It should be the end user's. Wikipedia is in no way responsible (nor should it feel it is) for tagging or providing the means for someone to self-censor themselves. This sort of feature is outside the scope of Wikipedia as a content provider. People like to mention that there are things such as Google SafeSearch, but fail to make the distinction that Google isn't a community-policed content provider, as Wikipedia is. Google has SafeSearch because the way they do page rankings (programmatically) is susceptible to mis-tagging and other such things that enable people to get their X-rated pages to come up for normal search terms. This was never an issue on Wikipedia, and it never will be, so it's irrelevant to mention its existence in the context of this discussion. Pothed 17:06, 20 August 2011 (UTC)Reply
To suppress all images, look at en:Wikipedia:Options to not see an image for instructions. Rich Farmbrough 18:50 21 August 2011 (GMT).

My old Schoolbooks

Hey, I've just taken a look into my old schoolbooks. I still have them: my old history book and my old biology book. The history book is a collection of violent scenes of all times - the horror of WW1 and WW2, the Holocaust, slavery, and so on. My biology book can't be called pornographic, but you can find explicit representations of the human body in it. Now I'm asking you, Sue and the Board: do you think Wikipedia should not be as educational as my old school books? Should Wikipedia not apply the same en:Didactic method that teachers expect of their students? As a student you can't look the other way if you really want to learn something. It's also a question of really understanding a topic. But it seems the Board of WMF knows better than all the pedagogues and teachers of the world.

Or is this nonsense a question of avoiding trouble with conservative powers in the USA? Hey, if you don't want to learn something, you don't have to! All the authors of Wikipedia spend their time collecting the knowledge of the world. But the board is not interested in all these questions of what the sense of this work is. I think they don't care. -- WSC ® 10:51, 20 August 2011 (UTC)Reply

Why did it take you so long

to add a feature that will allow me to hide the pornographic images on Wikipedia; somehow there is no Firefox add-on to do this yet (at least not effectively). Las tortugas confusas 13:55, 20 August 2011 (UTC)Reply

Because one person's pornography is often another's art -- and vice versa; there is no commonly agreed-to definition of "pornography". One example is that Oregon, which is part of the United States, has no legal definition for the word "obscenity" due to the free speech clause in its state constitution. And since there is no simple solution for this issue, it hasn't been tackled -- & many of us are now convinced that any solution would be worse than doing nothing. -- Llywrch 16:22, 20 August 2011 (UTC)Reply
"Because one person's pornography is often another's art " Which is why people can turn it on or off. Ottava Rima (talk) 02:49, 21 August 2011 (UTC)Reply
So you support my right to filter out paintings of dead babies representing the Christ child, or of the Crucifixion which are horrifyingly graphic? And are you willing to defend my right against those who don't understand my dislike for these images? -- Llywrch 03:22, 22 August 2011 (UTC)Reply
I'd just like to know where on wikipedia people are finding porn. I've been actively looking, and can't seem to come across any, so it's hard for me to imagine that someone would just stumble upon some without intention. I would argue the 'problem' this filter is aiming to 'fix' isn't really a 'problem' at all. It is something that this site (and every other content provider on the internet) simply should not be concerning themselves with. Self censorship is up to the individual and it's up to that individual to figure out how to make it work... or simply don't use the content provider. That's your choice. Pothed 17:13, 20 August 2011 (UTC)Reply
There is stuff that some people might classify as porn in Commons (e.g., pictures of genitalia, images of people from porn websites, probably even images of people engaging in some form of sexual intercourse); it's one of those problems which you only solve if you know the answer first. And let's be frank: if you're serious about finding porn, searching on tumblr is a better bet. (Hell, even a Google search is a better bet than sifting thru all of the possible categories in Commons.) -- Llywrch 22:28, 20 August 2011 (UTC)Reply
And someone who doesn't want to have any pictures of any nudity on their computer? Too bad for them? If their country bans it or they are at work, then too bad? We need to make sure that as few users as possible are able to access Wikipedia because you want to make a political statement against those you feel aren't as "liberated" as you? That is really selfish and incivil. Ottava Rima (talk) 02:50, 21 August 2011 (UTC)Reply
And if someone thinks wearing a skimpy bikini constitutes nudity (showing too much skin), won't they have the right to be upset if they find those images aren't filtered out? "Why are you using their standard of nudity and not mine?" Evil saltine 21:25, 21 August 2011 (UTC)Reply
Why do you want to force someone to look at something they don't want to look at? Ottava Rima (talk) 21:44, 21 August 2011 (UTC)Reply
Who is "forcing" anybody to use Wikipedia? --87.79.214.168 21:47, 21 August 2011 (UTC)Reply
I don't. I support optional filtering based on the image categories we already have, not creating a whole new set. Evil saltine 21:52, 21 August 2011 (UTC)Reply
It has been pointed out that the current set of categories will be used for the new set of categories. We already have the images put into the specific areas. Ottava Rima (talk) 21:54, 21 August 2011 (UTC)Reply
OK. So activating the "Nudity" filter will hide all images under Category:Nudity (and its subcategories)? That seems to make sense. Evil saltine 22:06, 21 August 2011 (UTC)Reply
What filter will hide all images of homosexuals, all images of interracial couples or all images containing background persons wearing skimpy swimsuits or "blasphemous" T-shirts (a point raised by Trystan)? Which current categories cover those? —David Levy 22:32, 21 August 2011 (UTC)Reply
There are already categories for homosexuality, swimsuit, etc. The article page shows that even a Pokemon image could be hidden. Ottava Rima (talk) 22:35, 21 August 2011 (UTC)Reply
Firstly, are you under the impression that every image of a homosexual person is categorized as such? That isn't close to true. Images directly pertaining to homosexuality (e.g. a photograph of a gay pride parade) fall under such categories, but we don't place "homosexual" tags on images simply because their subjects are gay. The planned system, however, will require us to (unless we reject the widespread belief that such images are objectionable).
Secondly, you appear to have overlooked the word "background." A photograph with an incidental person in the background isn't categorized accordingly. The planned system, however, will require us to do that. (See #Negative effects of using categories as warning labels on the usefulness of categories for a more detailed description.) —David Levy 23:09, 21 August 2011 (UTC)Reply
Are you honestly trying to say that a picture about "homosexuality" is the same as a guy in a picture who may or may not be gay? Wow, that is really weird. Ottava Rima (talk) 23:44, 21 August 2011 (UTC)Reply
No. That the two differ (and the latter currently isn't categorized) is my point. Many people object to the existence of homosexual people (such as Elton John, who's openly gay) and don't wish to be exposed (or have their children exposed) to the sight of one. This is one of countless "potentially objectionable" image subjects not covered by any category currently in use. Other examples have been cited. —David Levy 00:19, 22 August 2011 (UTC)Reply
Who, other than you, has "pointed this out"? It simply isn't true (or even feasible). The current image categories are neither designed to label objectionable content (much of which is incidental, and therefore not categorized) nor presentable in such a context (due to their sheer quantity, among other issues). —David Levy 22:32, 21 August 2011 (UTC)
After the porn dispute on Commons, the categories were reworked specifically to put the various types of pornography and other content together so that excess images would be removed. That has been a major issue over there for the past year or so. Where have you been? Ottava Rima (talk) 22:35, 21 August 2011 (UTC)Reply
You're missing the point. No one is suggesting that such categories won't play a role in the planned filter system. —David Levy 23:09, 21 August 2011 (UTC)Reply
(Responding to Ottava Rima @ 02:50, 21 August 2011) If someone doesn't want any pictures of nudity on her/his computer after reading Wikipedia, then that person should be careful about which articles to read in the first place. For example, I've worked on a number of articles about 5th, 6th & 7th century European history & didn't encounter one picture of a nude person. Hell, I've read or worked on a couple thousand articles relating to Ethiopia & not once did I encounter a picture of a nude person. Not even the pictures of topless native women which made National Geographic so popular with grade school boys back in my youth -- before the Internet brought pictures of boobies a few mouse-clicks away. -- Llywrch 04:45, 22 August 2011 (UTC)Reply
What? How does that make sense? So people aren't able to get pictures about, say, the herpes virus without being subjected to genitalia covered with it? People aren't able to look up some Greek mythological figure without seeing some representation of the individual having graphic sex? There are plenty of historical or "cultural" articles that have pictures containing nudity that are just not safe for work or school. Ottava Rima (talk) 16:39, 22 August 2011 (UTC)Reply

What we have here

is a failure to communicate and yet another complete failure of Wikimedia/Wikipedia governance.

And I'm not even talking about the "referendum" itself, but about a failure in a lot of everyday communication on Wikipedia. Instead of throwing a massively intrusive measure like an image filter at the supposed problem, we should at least try to look at the real problem, which plays out on Wikipedia talk pages all the time. Sometimes it works out, sometimes it doesn't.

Consensus is consensus. If you don't like an image in an article, take your concern to the talk page. If others provide overriding reasonings for why a particular image holds explanatory value for the article, that's that. Get over it. But deal with it locally, and deal with it yourself.

Take the example of arachnophobia. On the English Wikipedia, editors at one point agreed to remove actual spider images from the article, not because "people are offended/irritated" but because including actual spider images defies the purpose of the article, rendering it useless to arachnophobics who needless to say are the most likely to look this up on Wikipedia in the first place. On the English Wikipedia, this overriding rationale to replace the photos with non-phobia-triggering drawings prevailed in large part because enough people showed good judgment and didn't keep senselessly pointing to "WP:NOTCENSORED".

On the German Wikipedia by contrast, the situation is different. On the German arachnophobia article, those images are still there, because self-appointed anti-censorship watchdogs keep restoring those images and ignore or shout down anyone raising the point at the talk page. The same reasoning (that those photos defy the purpose of the article) does apply just as much on the German article, but it keeps getting ignored and shouted down. And that is the real problem. When true, discussion-based consensus doesn't prevail against a group of "established" "editors".

The answer to that real problem is of course not an image filter. The answer is better governance. Editors, and admins in particular should look at article histories, go to the talk pages and read the presented reasonings from all sides. Most of the time, a clear judgment is possible as to whether the inclusion of a particular image or type of image does or does not ultimately make sense in the respective article. That then is the consensus, whether or not it takes an RfC or RfAr to get people to accept it.

If we only had strong enough governance so that Wikipedia could be at least somewhat more discussion- and consensus-based and not the pure shoutocracy it has turned into, there would be no need for silly things like an image filter.

Some might point out that many people who want images filtered are only readers. But the answer we give to those people cannot seriously be to essentially say, as a project and community, "yeah, we are also not sure about those images, some want them and some don't, and we as an encyclopedic project are not able to make up our collective mind". What a declaration of bankruptcy! It's basically unconditional surrender to the exact kind of problem we implicitly claim we are here to solve when we call ourselves an encyclopedic project. --87.78.45.196 00:55, 21 August 2011 (UTC)Reply

Your example is stupid - in both Wikipedias the rules are the same. In the EN wikipedia a certain group of editors has got their way, in the DE wikipedia the majority was different. But this new "feature" would ignore the discussion about single articles and the appropriate image to illustrate each article, and will allow certain users behind the scenes on Commons to destroy hundreds of articles by censoring images. --Bahnmoeller 15:33, 21 August 2011 (UTC)Reply

"stupid". Oh well, I'm no stranger to putting a conversation ender near the beginning of my own posts, so I guess I can't complain. Anyway, it may not be the most suitable example, but it's the one I could immediately think of where a well-defined group of people (arachnophobics) strongly favor the omission of a certain type of images (photos of spiders) -- and where it makes sense to omit those images based on some people's objection.
In the EN wikipedia a certain group of editors has got their way -- No, in the English Arachnophobia article, reason prevailed. In the German article, reason was single-mindedly ignored by a bunch of editors who are a tad paranoid about "censorship".
But this new "feature" would ignore the discussion about single articles -- Yes, that is exactly my point, and one of the reasons I think the image filter is a uniquely bad idea. --87.79.214.168 20:40, 21 August 2011 (UTC)Reply
It's funny, I take a different approach. I think the spider images shouldn't have been taken out of EN, and one of the benefits of this filter will be that we can make purely editorial decisions without having to stress over the emotions of our readers. Put "the best" spider image at the top, but let readers who are upset by it choose to hide it, so that they instead see a shuttered image. This will remind them that they are missing out on information because of their preference, but it won't be so brutal as to shove a spider in their face against their will either. --AlecMeta 18:20, 22 August 2011 (UTC)Reply
No, you are wrong. Spider images serve no encyclopedic purpose on that particular article. The en.wp editors were correct in removing the image. The German Wikipedia is generally more authoritative and normative, and therefore the image was kept there much more to prove a point regarding "censorship" than because of any actual editorial considerations. --195.14.220.250 18:28, 22 August 2011 (UTC)Reply
Well, I don't mean to single out and judge that specific case, I wasn't part of the discussion, I don't mean to say it was 'wrong'. And we shouldn't be troubled that two different groups of people reached different conclusions about a content decision-- that's the way it always works. I accepted the premise that we removed encyclopedic pictures because they were causing negative emotions-- if I didn't accept that premise, it wouldn't be a good example/metaphor. If it was just boring run-of-the-mill decision, then just consider a different example. :) --AlecMeta 14:27, 23 August 2011 (UTC)Reply

Are we addressing the right problem?

Much of this debate appears to be emotional, or about values that clearly vary from person to person. Could someone at the Wikimedia Foundation (or in some independent and trustworthy organization) provide basic, 'objective', neutral information about the following simple questions?

- Who is actually asking or pushing for this 'Personal Image Filter' in the first place, and why is this issue coming up now, since Wikipedia and sister projects have been working well for years without it?

- As of this writing, there are 3,715,536 articles in the English version of Wikipedia. What fraction of these pages (or of other Wikimedia projects) could possibly be considered controversial (for obvious categories, such as explicit sexual references, extreme violence, etc.), even roughly speaking?

- What clear quantitative, incontrovertible evidence is there that governments (or 'authorities') are actually controlling or restricting access by their populations to Wikipedia and associated projects?

- Why are the current practices of 'Neutral Point of View' and 'Least Astonishment' not sufficient for Contributors and Editors to include only materials that are appropriate to the context of their article in an encyclopedia, book, definition, etc? In other words, why is this a 'Viewer' issue and not simply a 'Contributor' issue, or possibly a quality control issue?

Since there are millions of public sites and pages dedicated to potentially offensive subjects, I would not expect curious people (or children, or random visitors) to search Wikipedia for these materials, nor for them to find much offensive content (compared to the bulk of the web outside Wikimedia). To the extent our goal is to generate open access reference and educational materials, are we not losing a lot of time and energy in trying to address a side issue?

Michel M Verstraete 22:35, 21 August 2011 (UTC).Reply

The problem developed because Wikipedia has an absurd amount of pornography and that drew a lot of complaints, especially with our high number of child users (and it being illegal to allow them to see such images). The surveys showed that there were over 1000 white penises but only 1 black penis, negating any evidence that there was truly an "encyclopedic" reason for them. However, a vocal handful of users, a minority, caused an uproar when the porn started to be trimmed down. It was suggested that we could create a filter system so that people could have the porn and others could ignore it. It was a compromise. Ottava Rima (talk) 23:48, 21 August 2011 (UTC)Reply
Can you give any sources for your claims that:
  • "... Wikipedia has an absurd amount of pornography and that drew a lot of complaints, ..."
  • "... there were over 1000 white penises but only 1 black penis, ..."
  • "... a handful of vocal but a minority of users caused an uproar when the porn started to be trimmed down."
  • "It was a compromise."
Otherwise I will have to ignore your opinion entirely, since it seems way overextended and one-sided. --Niabot 00:49, 22 August 2011 (UTC)Reply
The statistics were all in 3 reports. I assumed that you bothered to read them before making such responses, being uncivil, and vandalising here. I guess I assumed too much. Ottava Rima (talk) 01:25, 22 August 2011 (UTC)Reply
Give me a link to those reports and cite the related parts, provided they aren't written by a good friend of Sue or his daughter. PS: I'm curious why you got banned indefinitely from EN and now proudly claim that others are uncivil or vandals. --Niabot 01:35, 22 August 2011 (UTC)Reply
"Give me a link to that reports and cite the related part of it" No. It is on the article portion of the page. You have proved that you are here inappropriately because you failed to actually look and see the basis of the filter and what the filter is supposed to do. Merely jumping in here and making all sorts of ignorant comments is not appropriate behavior. And Niabot, there are multiple websites right now watching this discussion and making it very clear that you have a major track record of inappropriate behavior. Unlike you, I wrote articles that were so respected that a Foundation Board member and an Arbitrator proxied them onto Wikipedia during my ban, and the articles were on some of the most important topics [1], [2], and [3]. You've merely put together a handful of stub like articles on obscure anime or uploaded lots of hentai that is dubiously within CC-BY standards. Your block log [4] is quite impressive. It seems rather clear that you want to force as many people to look at as much porn as possible for whatever reason. Ottava Rima (talk) 16:46, 22 August 2011 (UTC)Reply
(1) Does file:Human vulva with visible vaginal opening.jpg qualify as "porn"?
(2) Who is "forcing" anybody to use Wikipedia? --195.14.220.250 16:53, 22 August 2011 (UTC)Reply
Nice insults. I didn't expect anything better from you. A "handful" is now comparable to dozens. Hentai is dubiously within CC-BY standards? I laughed again, a little. I was effectively blocked two times on the German Wikipedia; all other blocks were removed soon afterwards because they were false. Not a single one was related to a sexual topic. To me it seems very clear that you blatantly insult others with false claims and that you would do anything to harm freedom or the project. That's my opinion and last word. I don't see the need for any further discussion based upon lies. --Niabot 17:46, 22 August 2011 (UTC)Reply
1. There were no insults so don't pretend otherwise. 2. The hentai is from a dubious source and randomly pulled from the internet. 3. And not being blocked for a sexual topic doesn't mean you don't deserve to be. We are a project with children and your mere interacting with them while having explicit images on your user page at Commons is a serious behavioral problem. Ottava Rima (talk) 02:44, 23 August 2011 (UTC)Reply
What about some research? Look here: Category:Intact human penis. How many are black?
Whether the amount of porn on Commons is "absurd" I can't decide. But the number of white men who like to show off their dicks clearly is :-) Adornix 15:29, 22 August 2011 (UTC)Reply
I can recall deletion requests against mostly black penises. But for 1000 to 1 we should have at least 1000 pictures inside this category. ;-) --Niabot 16:29, 22 August 2011 (UTC)Reply
the number of white men who like to show off their dicks clearly is [absurd] -- That is correct. But why are we trying to superficially address this problem with an image filter rather than showing these people the door and deleting the educationally meritless images? --195.14.220.250 17:05, 22 August 2011 (UTC)Reply
    • About some so-called facts:
  • "... Wikipedia has an absurd amount of pornography and that drew a lot of complaints, ..."
"absurd" is a matter of one's own standard. For that matter, so is "pornography" All in all, as compared to what's in the online world in general, I've seen very little here--but, then, I don't go looking for it. And the complaints represent what proportion of the readership? I suppose between one in ten thousand and one in a hundred thousand, Unless, like the group preparing the report, you go out and try to evoke them.
  • "... there were over 1000 white penises but only 1 black penis, ..."
If so, it is a problem. The solution is to add more racial diversity--just as usual with cultural bias.
  • "... a handful of vocal but a minority of users caused an uproar when the porn started to be trimmed down."
"handful" is also a matter of one's own standard. To me, they include a very high proportion of the people here whose work I respect. Considering that the original trimming was the heavy -handed work of a single editor, I'd expect no less than an uproar. As I recall, most of the images deleted by that editor at that time were restored.
  • "It was a compromise."
That at least is apparently true. But there are some things we do not compromise about. The most important is NPOV, and that the encyclopedia is free of the selection & classification of any band of expert or presumed representative editors is the basis of our NPOV. The WMF does not own the content. It doesn't even own the software. All it owns is a small amount of hardware and an extremely valuable trademark. Of all sorts of intellectual property, trademarks are the sort of thing most vulnerable to compromise and dilution. Compromise NPOV, and you lower the value of the trademark, because that's the principal thing it stands for. DGG 04:42, 23 August 2011 (UTC)Reply
Neutral point of view does not require violating the law by providing those under 18 in the United States with pornographic imagery. The WMF is legally responsible for the content regardless of "ownership" because it would qualify as a distributor. Pornography is also legally defined, and that standard has been upheld by the Supreme Court. You know all of this, so I am surprised that you would say the things you said above. Ottava Rima (talk) 05:39, 23 August 2011 (UTC)Reply
As a side question, whose work are you claiming that you respect? I have spent time looking into who posted, and the ones who oppose the referendum either don't edit en.wikipedia or haven't produced much content. I find it odd how you can respect them, when many of them just upload graphic sexual images and do little beyond that. It is also odd that you are saying that the problem of "1000 white penises" means you should have more non-white penises, when it was obvious that the real problem was that the images had nothing to do with being encyclopedic but were uploaded by a small, undiverse group simply to put them on their user pages in an exhibitionist manner. The worst part is that we allow these people to interact with those under 18. If the WMF was being responsible, every single one of them would be indeffed under a child protection standard. Ottava Rima (talk) 05:42, 23 August 2011 (UTC)Reply


Michel M Verstraete raises lots of great questions. I'm not with the foundation in any way, but let me try to answer those I can:
"Who is actually asking or pushing for this 'Personal Image Filter' in the first place" -- The only people I know who really really want it are Muslim editors of the English-language article on Muhammad. We can't be NPOV without images of him, but there's something very sick and wrong about making old people who barely speak English learn about computers or multiculturalism before they can even happily read the article on their favorite subject. I wish they already had a browser that they could happily use, I wish they had learned multiculturalism before they came here. But the image issue takes what should be a positive experience and turns it into a negative one. People on talk can sometimes be very nice to the people who are upset, but occasionally the very worst of our commmunity replies, creating what could be 'intensely memorable, intensely negative experience". I've talked a lot with that group, and it's important to me that we find a way to 'have our cake and eat it too'-- we know it upsets them, we know how to fix it, and we know how to do it while still "modeling our values" too. So, even though I was one of the biggest opponents of the last bout of censorsip, I think at this point you could reasonably say I'm one of the people 'pushing for this', although I myself would never use it, I do want our low-english-literacy Muslim readers to be able to have access to it. And of course, I want it for people in their same situation who I just haven't personally interacted with, even though it was my experience with that particular group of readers that convinced me we should do something minimal to help them avoid offense.
why is this issue coming up now, since Wikipedia and sister projects have been working well for years without it? -- I have a GREAT answer to this. The only problem is that it's an answer I made up that doesn't reflect the board's point of view. But my answer is: We need it now because we're moving beyond Wikipedia now. This filter wasn't essential to Wikipedias because Wikipedias operate exclusively on NPOV. But the encyclopedia is only the very beginning of a library, and usually it's one of the least-controversial. Our future volumes may require us to be far more offensive and controversial than we had to be just to beat Britannica. In the future, we're not just trying to beat Britannica, we're trying to beat all books. --AlecMeta 17:15, 22 August 2011 (UTC)Reply
The only people I know who really really want it are Muslim editors of the English-language article on Muhammad -- Ah? I was wondering about that. I thought the main problem for Muslims was not primarily looking at depictions of Muhammad, but much more fundamentally the fact that there are depictions of Muhammad. --195.14.220.250 17:19, 22 August 2011 (UTC)Reply
Well, there's plenty of anger about the mere existence of the images, but our hands are totally tied there. Some people want deletion, but we can't give those people what they want, and we can't even seriously consider it, in my estimation. If someone's only problem is that we're giving people content that they want, then they, in fact, have no legitimate problem with us-- sharing information is what we do and what we are.
But a much larger group, and a lot of the raw emotions, come not from abstractly knowing about the images, but from personally encountering them unexpectedly.
When people come and say "this image upsets virtually all of the people around me, so I don't want to show it on my computer screen unless I actually click on it, so I won't upset the people around me"-- we can, in fact, help our readers with that much smaller problem. And we can do it while staying 100% true to ourselves.
I was raised by tv-- I have a very difficult time imagining what it's like that total strangers could have so much power over your emotions just by showing you a single image that wasn't intended to offend. But the fact of the matter is we're writing for Earth-- all of Earth, and believe it or not, we do, in fact, have an unbelievable power over some of our global readers. We can upset or enrage them just by writing a good article. Or we can not upset them, simply by using this feature. And we don't actually want to intentionally upset people over and over and over if something this simple would really solve their problem.
There will be lots of people who aren't happy with this system because they'll know the images 'exist'. But some people will be satisfied. Even if they're just a tiny minority, that's okay- it's kind of a tiny feature. --AlecMeta 17:32, 22 August 2011 (UTC)Reply
Maybe I don't quite understand this type of self-censorship. To me, these people are rather like little children who "hide" by closing their eyes. And more importantly, it's weird to help people do that to themselves. It implies that sticking your head in the sand is healthy and good. Also reminds me a bit of the Formerly Obese Man from the Onion Movie.
I for one believe that maybe the core idea of encyclopedias is that, while everyone is entitled to their own opinion, nobody is entitled to their own reality. When will we introduce an option to view the Conservapedia version of an article instead of the Wikipedia version? --195.14.220.250 17:47, 22 August 2011 (UTC)Reply
Why are the current practices of 'Neutral Point of View' and 'Least Astonishment' not sufficient - again, this is just my answer: Where we're going, there won't always be a single NPOV. Most books in a library aren't written neutrally. Sometimes art is intentionally astonishing. "A Clockwork Orange" is a hard-to-watch film because it is so upsetting, and yet, it educates about the horror of violence in a way that few other films do. Tomorrow's "A Clockwork Orange" may be released under Creative Commons and it may be hosted on Commons-- we want to buy more intellectual freedom for our contributors, and a filter will help buy it. --AlecMeta 17:15, 22 August 2011 (UTC)Reply
It won't. That's a fact that the American Library Association accepted a long time ago. [5] --Niabot 17:51, 22 August 2011 (UTC)Reply
Well, see, let's think about that for a second. ALA dramatically opposes a library endorsing a rating system-- the library tries to be neutral. So consider w:Index Librorum Prohibitorum, a list of books that were prohibited by the Catholic Church, the last edition of which appeared in 1948. FORCING library patrons to use it would be horrendous. But it would also be bad for a university library to refuse to let its patrons consult the w:Index Librorum Prohibitorum or access it.
We can have a whole section of our library that is "Lists of prohibited books", so long as we never have one special default list. Indeed, knowing what's objectionable is just another kind of knowledge, as long as it's not imposed-- with that knowledge, we could even have our own "banned images week" to promote high-quality images that have been objected to. --AlecMeta 14:38, 23 August 2011 (UTC)Reply

Interim vote totals

Just for the sake of information, there have now been over 20,000 votes registered in this plebiscite, which makes it the highest degree of participation in any Wikimedia poll/vote/referendum ever. There is still a week to go. Risker 03:10, 23 August 2011 (UTC)Reply

But the question is, is the software sophisticated enough to determine how many of those are done from the same IP? And with the qualification only being "10 edits", how many of those are from accounts with only 10 edits or under, say, 100? Ottava Rima (talk) 03:37, 23 August 2011 (UTC)Reply
In answer to your first question, yes, vote administrators have access to IP data. As to the second point, that is not a key criterion that will be reported by the vote administration process, although I am sure someone with API access can do the calculations across all 20K+ votes if they are so motivated. I'd suggest they wait until the polls close, though; we have no idea how many more votes are yet to come. Risker 03:47, 23 August 2011 (UTC)Reply
Well, I have a suspicion that the loose restrictions are a reason why there have been over 20,000 votes so far. I would be willing to bet that at least 50% of the votes were cast by those with fewer than 100 edits. I hope someone who knows scripts will be able to parse through the info, especially if the WMF is going to be relying on it later. Ottava Rima (talk) 04:13, 23 August 2011 (UTC)Reply
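(As an illustration only, not part of the vote administration process: a minimal sketch of how such a tally could be scripted against the standard MediaWiki API, assuming one somehow had a plain list of voter usernames -- "voters.txt" below is a hypothetical input, and the sketch checks edit counts on a single wiki rather than each voter's home wiki.)

import requests

API = "https://meta.wikimedia.org/w/api.php"

def edit_counts(usernames):
    # Query the standard list=users API in batches of 50, yielding (name, editcount).
    for i in range(0, len(usernames), 50):
        batch = usernames[i:i + 50]
        data = requests.get(API, params={
            "action": "query",
            "list": "users",
            "ususers": "|".join(batch),
            "usprop": "editcount",
            "format": "json",
        }).json()
        for user in data["query"]["users"]:
            # Missing or suppressed accounts have no editcount; count them as 0.
            yield user.get("name"), user.get("editcount", 0)

if __name__ == "__main__":
    with open("voters.txt") as f:  # hypothetical list of voter names, one per line
        voters = [line.strip() for line in f if line.strip()]
    under_100 = sum(1 for _, count in edit_counts(voters) if count < 100)
    print(f"{under_100} of {len(voters)} voters have fewer than 100 edits")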
Actually, the huge leap in the number of votes is most likely related to editors with email enabled actually reading up on the issue and expressing their opinion. The objective here is to encourage many opinions, hence the low threshold for eligibility. This tool is designed with *readers* in mind, so the edit count is less relevant for this plebiscite than would be the case for election of Trustees, for example; many people read one of the Wikimedia projects regularly without making many contributions, for a multitude of reasons. Risker 04:22, 23 August 2011 (UTC)Reply
Polls only work when they attain a level of objectivity and negate the ability to game them. Giving those like sock masters the power to decide this matter isn't what I would see as a good thing. By raising a standard, we negate the inappropriate influence of such individuals by forcing them to perform far more work at pretending to be legitimate than they most likely would be interested in performing. I'd rather not see those like Grawp be the primary decider of what the editors of Wikipedia feel. Ottava Rima (talk) 04:31, 23 August 2011 (UTC)Reply
Yeah, whatever the outcome, you'll claim victory. We get it. Also, what the editors of Wikipedia feel is that you should be banned there, which you are. Just saying. --78.35.232.131 04:34, 23 August 2011 (UTC)Reply
I find it interesting that you assume that the sock masters would all be voting against the proposal. That is quite telling, especially with your logged out editing above. Ottava Rima (talk) 05:43, 23 August 2011 (UTC)Reply
LOL, that is what you are clearly assuming, not me or anyone else. Your projection is quite telling. Unfortunately for you, the one person it tells nothing is you. --213.196.209.190 13:08, 23 August 2011 (UTC)Reply
You just switched to yet another IP, which verifies my statement that a small handful of people are using sock puppets and changing IPs to make it seem like there are far more of them than there actually are. And I find it interesting how you say "projection" as if I would be logged out editing or socking, which is laughable. Ottava Rima (talk) 14:07, 23 August 2011 (UTC)Reply
Not happening, and not that important if it does. IP users are first class citizens here, let them be or try to change their minds-- don't overly fixate on the signatures. Each comment comes from a human mind-- discuss the issues raised or discuss the concerns in the abstract, but singling out specific comments and using their login status to dismiss their arguments is... counterproductive. --15:25, 23 August 2011 (UTC)
there have now been over 20,000 votes registered in this plebiscite, which makes it the highest degree of participation in any Wikimedia poll/vote/referendum ever.
This is very, very exciting and it's going to make it a lot, lot easier to understand this issue. THIS is why we had a referendum-- this is the largest feedback we've ever had about any issue ever. The questions are confusing, but that's okay. Referendums work at getting massive feedback from readers and editors across cultures. They are a new technology the board has just started using, and despite all the handwringing that the numbers will be misused, I think it's a GREAT step in the right direction! Regardless of what the outcome is, just getting this many people to take part in a governance decision is great. --AlecMeta 15:15, 23 August 2011 (UTC)Reply