Talk:Image filter referendum/en

This is an archived version of this page, as edited by Wnt (talk | contribs) at 21:04, 17 August 2011 (→‎If I wanted to design something that worked...: new section). It may differ significantly from the current version.
 Unless someone makes a case for a formal referendum process that is approved as a standing part of WMF governance, no outcome will be legally binding on the WMF without a Board resolution.  However, any group within the WMF can set up a community referendum and commit to being bound by its outcome... which I believe is what's going on here.  
Certainly the discussion and revision process of RfCs, rather than a focus on voting, is desirable. Discussion of concerns and suggestions raised, and the outreach involved in responding to them, will be as valuable as the vote itself. I don't know how specific the referendum text will be, but I hope that the first outcome of holding one is feedback that gets incorporated into the proposed implementation. SJ talk | translate   04:02, 1 July 2011 (UTC)[reply]
I'm a little confused - for whom will the outcome be binding? aleichem 01:50, 24 July 2011 (UTC)[reply]

Why do we need to vote?

I'm just wondering why a vote is needed here. It seems like such a no-brainer to enable this sort of system; are there significant concerns that have been raised by others somewhere? fetchcomms 03:11, 30 June 2011 (UTC)[reply]

I agree; considering it is a personal choice anyways, where's the possible damage? I doubt that anyone will come complaining over the fact that they could activate this for themselves. Ajraddatz (Talk) 03:19, 30 June 2011 (UTC)[reply]
I am amazed that such a system is even being considered, let alone brought to a vote, as though there was any chance of this gaining consensus. Such a system is a blatant annihilation of Wikimedia's neutrality, placing preference towards the viewpoints of certain cultures. It is impossible for such a filter to be anywhere close to neutral, short of having commons:Category:Topics as the start point for settings. My guess is that any potential filter the foundation has in mind will satisfy the opinions of a small group of people, mostly Americans and Europeans, and ignore everyone else. --Yair rand 03:29, 30 June 2011 (UTC)[reply]
I'm also thinking I'll be against this proposal, but how about we wait and see what's actually being proposed before we start engaging in any wild speculation? Craig Franklin 08:26, 30 June 2011 (UTC).[reply]
Well, the comments above show why we need a discussion/vote to bring such a system into projects. --Bencmq 08:34, 30 June 2011 (UTC)[reply]
There will definitely be opposition to this proposal, myself included. I wouldn't be so quick to call the result just yet. Blurpeace 09:13, 30 June 2011 (UTC)[reply]
I'm not quite sure how it affects Wikimedia's neutrality. It says right on the main page for this, "allow readers to voluntarily screen particular types of images strictly for their own account." From that, I presume anonymous readers would still get the same content they do now, and if you don't want to filter anything (or filter everything), you would be free to do so with your account. I believe this is about giving people the tools to filter the content they don't want, rather than deciding en masse that certain content should be hidden/shown. ^demon 17:24, 30 June 2011 (UTC)[reply]
"giving people the tools to filter" - it's not about what you do on the receiving end. It's about how "they" tag (or not tag) files at source. There are 10 million files on commons. There are no human resources to assess them (have you seen the backlogs?). But there are well-known problem areas: I'm afraid that added censorship rights will become a POV-magnet for determined POV-warriors. Welcome to filter-tag wars. Larger wikipedias have some experience of handling conflict areas, commons is totally unprepared. Let's hope that the new widget will be as helpless and useless as their "upload wizard". But if, indeed, they roll out something working, prepare for the worse. NVO 18:04, 30 June 2011 (UTC)[reply]
People will not be free to filter what they want, such a system would be pretty much impossible to make. They will be free to filter a set bunch of groups of images that the WMF have received the most complaints about from our current English readership (well actually, probably not the most complaints, the Mohammed images have received loads, but I doubt the filter's going to include an option for filtering that simply because it would make Wikimedia look like the non-neutral group it will have become). There are endless amounts of groups that have their own filtering needs which will obviously not be satisfied by this system. --Yair rand 22:16, 30 June 2011 (UTC)[reply]
Why would this be impossible to make? I don't see why everyone shouldn't be able to choose an arbitrary set of [current] Commons categories that they would prefer not to see. Would you be opposed to such a system? SJ talk | translate   03:29, 1 July 2011 (UTC)[reply]
If it really gave no preference towards certain cultures (making it just as easy to hide images in Category:Tiananmen Square protests of 1989 or Category:Asparagus as images from any other categories, perhaps having selection beginning from Category:Topics as mentioned above), then it would not be a problem with NPOV, in my opinion. --Yair rand 04:36, 1 July 2011 (UTC)[reply]
It would be better as you suggest, but we would still have the problem that we don't agree on categorization. Nemo 10:42, 1 July 2011 (UTC)[reply]
Yair: thanks for clarifying. Nemo: true, but we already have that problem with applying categories. [re]categorizing media would become more popular as a result, but that might be a good thing just as it is with article text. SJ talk | translate   03:05, 3 July 2011 (UTC)[reply]
I find myself agreeing with the opposition on this issue. Regardless of who decides what is to be subject to censorship, and regardless of the fact that it is opt-in, it still puts the Wikimedia Foundation in a non-neutral position. Taking an example from my own area, what if we deemed images of the Tiananmen Square protests of 1989 or images of leaders of a certain religious group banned by the Chinese government to be "controversial", and therefore added them as an option on the filter? We would rightly be labeled as non-neutral for supporting the Chinese government's view of what is and is not controversial. I don't want us ever to be in a position like that. We are a neutral provider of knowledge. Individuals already have the ultimate filter built in: they can simply not look at what they don't like.--Danaman5 23:44, 30 June 2011 (UTC)[reply]
Things are controversial if some people think they are. So yes, we should add a tag to those images if some Chinese think they are offensive. Anyone should be able to come in and declare an image offensive. Only then would it be neutral. Of course, a lot of images would be deemed controversial that way. Zanaq 09:38, 1 July 2011 (UTC)[reply]
Agree with Zanaq. If someone thinks a picture of a naked person is controversial, that person should be allowed to add a filter, a symbol or whatever s/he likes to change the picture or the way it's linked. You want to give work to some filter committee so they have no time to improve Commons, Wikipedia or whatever project? Not a brilliant idea, it seems to me. Cheers, 79.53.129.80 11:28, 1 July 2011 (UTC)[reply]
Hello, may I add a thought on it?

The thing is, any encyclopedia that hosts pornographic images is misguided and betrays trust in the first place; otherwise it would be merely an adult site, not a genuine encyclopedia of merit. Wikipedia requires morality regardless of the wide spectrum of user opinions or other filters. Users who post anything biased should expect to have their item debated in the sandbox. Regarding China: if a person can choose what not to see, a country should be allowed to do so as well. In China the people did not get to vote for any other party, so that is their choice. Make ready-made choices for the whole of WP and its images, aware that some nations' governments need an option to choose to block things. Where WP is obviously not allowed and would be banned due to its open content, WP ought to offer a strong way to securely select a PG or China-friendly content filter, or another dictator-styled filter, set in stone and compulsorily mandated in the WP architecture as a location-based filter, including for anonymous users. Otherwise whole countries of people get none of the WP articles because of an all-or-nothing approach. A one-size-fits-all WP policy means WP may deny people their right to see what is allowable in the place they live; simply because of inadequate filters, people must wait. There could also be supplementary personal filters, even a Saintly or an Atheist filter, although those may remove most of the articles on human history. A China-tickable filter, or a Vegan-friendly locked-down filter made by WP, could be popular selections and bring new users. Basically, this might allow people to create a gallery of custom filters, like Mozilla add-ons for Firefox. The possibility is about better use of WP articles, for example by widgetising several search terms so a search follows the right learnt link rather than fanning out everywhere. Marsupiens 06:12, 16 August 2011 (UTC)[reply]

This has been talked about for years. The idea that has emerged is that there would be multiple, separate filters, so you could pick and choose. The most commonly requested ones are for pornographic images (requested by basically everyone in the world except white males), religiously sensitive images (notably but not exclusively images of the Prophet Mohammed), and violent images (e.g., the picture of a man in the act of committing suicide).
I suspect that most of us would not choose to have any of them turned on, but the opposition's story to this point has generally been that if people don't want to see these images, then they should write their own third-party software (maybe a browser plug-in) because WMF should never-ever-ever offer options even remotely like Google's "safe search". In this story, enabling non-adepts to voluntarily customize their experience (e.g., by permitting a person with PTSD to suppress images that might trigger symptoms) is pure, evil censorship. WhatamIdoing 15:44, 1 July 2011 (UTC)[reply]
I'm a white male but would happily support allowing people to opt in to various filters, including grades of porn. But we need the classification to be transparent and the filtering to be genuinely a matter of individual choice - otherwise we would be accused of bias. Some people would define porn very narrowly, others would include bare arms, legs or even female faces. I suspect it would be impractical to create a filter for every individual request, though we could go a long way towards that if any category could be filtered out by any editor. I know someone who would opt to filter out images that contained spiders. I think we should also allow groups of Wikimedians to create custom filters that people could choose, as opposed to going through a ream of options. So presumably a moderate Shia filter would set the porn threshold at bathing suit, ban the cartoons of Mohammed, and perhaps images of pigs and pork? Whilst a "western cosmopolitan child-friendly filter" would enable you to filter out graphic violence and set the porn filter at bathing trunks. The difficult questions we have to resolve before the referendum include:
  1. What do we do if particular governments say they would be willing to unblock Wikimedia if we allow them to filter out what they consider to be porn or religiously offensive (I'm assuming that we won't cooperate with countries that want to filter out images that might be politically controversial).
    Suggestion: say No, just as we do today.
  2. What happens in discussions on Wikipedia if some people want to illustrate an article with images that others can't see because they have opted to filter them out? My preference is that we enable "alternate image for those who've filtered out the main image".
    I believe the current proposal is to have the image visibly hidden, the way sections are rolled up; so it is clear that an image is there but not being shown, in case the reader wants to see it.
    If it is planned to work that way, I wonder how it will work with anything besides the mere use of images for illustrating articles. In the Portuguese wikipedia we use a painting of a naked woman as the symbol for our Art portal, embedded in a template and displayed at every artwork article. I'm sure the more puritan would like it to be hidden from view as well.--- Darwin Ahoy! 12:16, 3 July 2011 (UTC)[reply]
  3. What do we do when people disagree as to whether a particular image shows enough breast to count as topless whilst others point out that there appears to be a diaphanous top? Do we give filters the right to fork?
    This is already an issue that comes up in categorization of images. Policies tend to be vague at present; if anyone can create their own filters then you could define a combination of descriptive categories, each of them relatively neutral, that fit what you don't want to see. (There could still be disagreements about how many different similar categories are appropriate for Commons. For instance, different people with w:ailurophobia will draw different lines around what counts as 'a cat', from the perspective of their reflexive fear.)
  4. Do we use this as an opportunity for outreach to various communities, and if so are we prepared to set boundaries on what we will allow filters to be set on? For example, I don't see that a porn filter is incompatible with being an educational charity, but I would be horrified if a bunch of creationists tried to block images of dinosaurs or Neanderthals, especially if they did so as part of a "Christian" filter (I suppose I could accept individuals knowingly blocking categories such as dinosaur). WereSpielChequers 09:54, 3 July 2011 (UTC)[reply]
    I can't think of a neutral way to set such boundaries. Personally, I would be fine with any arbitrary personal choices people make in what images they want to see -- dislike SVGs? fine! -- just as in what pages they watchlist. I see this as a tool to support friendlier browsing, not an occasion to set new official boundaries. SJ talk | translate   11:42, 3 July 2011 (UTC)[reply]
    The reason we need to vote is because many people, myself included, are against any violation of the basic core principle of NOT CENSORED. There are three legitimate ways to avoid images: 1/ turn them off in one's browser, 2/ fork the encyclopedia to a version that omits NOT CENSORED, 3/ devise an entirely external feature making use of our ordinary metadata. What is not acceptable is for the WMF to interfere with the content in this manner. The fundamental reason for NOT CENSORED is the general commitment to freedom of information as an absolute good in itself: freedom means not just non-removal of information, but also not hindering it. The practical reason is that no two people will agree on what should be censored. DGG 13:53, 16 August 2011 (UTC)[reply]
But how can this be censorship when there is a button on every hidden image to show it - and when each individual has the ability to simply turn off their filters? Censorship is when we hide an image in such a way that people can't see it when they want to do so. I would be strongly opposed to censorship - but this isn't that - this is giving people control of what they prefer to see...it's no different from allowing them to close their eyes or look away if they see something distasteful in the real world. SteveBaker 18:54, 17 August 2011 (UTC)[reply]

Vote is needed to trash the proposal

The proposed Image Filter project is a precursor to imposed censorship, as it involves the subjective categorization of images, e.g.: an image of a girl on a beach in a bikini, which would be inoffensive to some, would be categorized as objectionable or even pornographic by religious fundamentalists. The very basis of this project is contrary to Wikipedia's core values, one of which is NO CENSORSHIP. And as noted previously above, if the project is implemented, then '....welcome to image tag wars'.

This proposal is COMPLETELY UNWORTHY OF WIKIPEDIA, which should abide by the firm policy of no censorship. Readers of our projects who view articles on masturbation or the Nanking Massacre should reasonably expect to see images which are sexually explicit or graphically violent; they and their children should not view our works if they can be so offended, since our works and their images are based on codes of verified reliable sources and neutral point of view. Parents are wholly responsible for what materials their children access via the Internet –Wikipedia is not their babysitter.

As to 'surprise' images of nudists riding bicycles (and similar objections): if such images are not the norm (i.e. most people do not ride bicycles in the nude), then the image is misplaced or irrelevant to the article and should be expunged. If and when an article on 'Nude bicyclism' is created, then the images of nude bicyclists are entirely appropriate to that article. The argument of 'surprise' is a complete red herring submitted largely by those of fundamentalist right stripes. Harryzilber 16:20, 17 August 2011 (UTC)[reply]

I absolutely agree. I think that it's despicable that the board would take such a heavy-handed approach to such a controversial topic, possibly against consensus. I marked 0 for the first question on the ballot and I encourage others to do so as well. — Internoob (Wikt. | Talk | Cont.) 17:08, 17 August 2011 (UTC)[reply]
That's naive though. Suppose that I need to learn about Naturalism (art) - so I type in "Naturism" - and find (to my horror), not some crappy porcelain figurine of a shepherd, but instead, right there at the very top of the page, to the right of the lede - a picture of half a dozen naked people. A simple typo or misunderstanding of a term (that the reader is looking up in an encyclopedia precisely because they don't yet understand it) can result in such problems. Consider that there are some companies out there who might fire someone for looking at the picture at the top of Naturism on company time when they are supposed to be researching Rococo figurines! So precisely because I'm against censorship of Wikipedia - and because I demand the right to have full-frontal pictures of nudists at the top of the naturist page - I'd very much like a "No Porn Please!" filter. SteveBaker 19:10, 17 August 2011 (UTC)[reply]
You are very easily horrified - but that experience may broaden your horizon. Perhaps you should teach your computer not to accept certain words you type into search fields. --Eingangskontrolle 20:25, 17 August 2011 (UTC)[reply]
If your company fires you for reading Wikipedia then you're likely in Communist China, in which case you'll probably have many more important issues to deal with. Harryzilber 20:33, 17 August 2011 (UTC)[reply]

What is the question being asked?

In referendums, there is normally a single yes/no question for those being polled to answer. The announcement doesn't give any hints as to the phrasing of this question - is there a draft available, and/or will the draft question be posted before the poll for analysis/comments/discussion? Or will there be multiple questions or multiple answers so that a diverse range of voter viewpoints can be sampled? Thanks. Mike Peel 20:51, 4 July 2011 (UTC)[reply]

I also believe that there should be a suggested implementation (or set of implementations) set out prior to this referendum, given the range of different possibilities available for such a filter. Is this the plan, or will the referendum be on the general issue rather than the specific implementation? Mike Peel 20:59, 4 July 2011 (UTC)[reply]
The referendum text will be published in advance. It will likely be a yes/no question (perhaps with an option for "I don't feel that I have enough information", as with the licensing migration). The referendum is on the general concept, as much as is humanly possible. Philippe (WMF) 01:25, 5 July 2011 (UTC)[reply]
I appreciate the idea that we are allowed to see the referendum text before the actual casting of votes. However, in my humble opinion, we should also see the text - in its final form, or (if N/A) as a draft - early enough to be able to ponder and discuss it. Therefore, I hope that "In advance" means "Asap", and, more particularly, at least four weeks before the start of the referendum.
Among other things, it may take some time to translate it into the main languages; and, when it comes to finer nuances, most non-native speakers of English would benefit from having access to a good translation. JoergenB 19:21, 5 July 2011 (UTC)[reply]

Alternative ideas

low-res thumbnails

I've commented on this before, but will repeat: I think it would be more generally beneficial to allow users a setting to override page settings about the size of thumbnails, so that, for example, you could decide for all thumbnails to be shown at 30-pixel resolution (also for all images to be shown as thumbnails) regardless of the Wiki code. This would help low-bandwidth users as well as those with specific objections. My hope is that at some low resolution - 20 pixels if need be - there is simply no picture that will be viewed as intensely objectionable. I wish your referendum would investigate in this direction rather than pressing for people to "neutrally" place ideological ratings on specific images. Wnt (talk) 23:56, 30 June 2011 (UTC)[reply]
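A minimal sketch of how such an override could work purely on the reader's side, assuming the conventional Wikimedia thumbnail URL pattern (".../thumb/.../300px-Name.jpg"); the 30-pixel cap and the MAX_WIDTH constant are illustrative assumptions, not part of any proposal:
 // Reader-side sketch (TypeScript): force every thumbnail down to a fixed width.
 // Assumes the conventional ".../NNNpx-<file>" suffix on thumbnail URLs;
 // MAX_WIDTH is an illustrative value, not part of any proposal.
 const MAX_WIDTH = 30;
 function downscaleThumbnails(doc: Document): void {
   for (const img of Array.from(doc.querySelectorAll<HTMLImageElement>("img"))) {
     const match = img.src.match(/\/(\d+)px-[^/]+$/);
     if (match && Number(match[1]) > MAX_WIDTH) {
       // Rewrite e.g. ".../300px-Foo.jpg" to ".../30px-Foo.jpg" and shrink the element.
       img.src = img.src.replace(/\/\d+px-([^/]+)$/, `/${MAX_WIDTH}px-$1`);
       img.removeAttribute("srcset");
       img.width = MAX_WIDTH;
     }
   }
 }
 downscaleThumbnails(document);
Since Wikimedia's image servers generate thumbnails at the requested width, rewriting that segment of the URL is normally enough to fetch a genuinely smaller file rather than merely shrinking a large one.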

This would be an interesting low-bw option, but seems like a separate issue. If an image is clear enough to be better than no image (the ultimate low-bw option, after all), it is clear enough to be controversial. 'Placing ideological ratings on images' would be an unfortunate outcome. SJ talk | translate   03:29, 1 July 2011 (UTC)[reply]
If we really need to take into consideration the wishes of the more squeamish readers/editors, this seems to me the only viable option. I can see quite a few advantages:
  • It would avoid all the work to set up the tagging and the software to implement it. Energies that could be better invested elsewhere.
  • Would avoid all the editing wars that I already see ahead, and the consequent loss of time and energy.
  • Would be an advantage for people accessing Wiki with low connection/old machines.
As the only (tiny) drawback, the articles would not look as sleek and appealing as they do now. It seems to me a win-win solution. --Dia^ 07:48, 16 August 2011 (UTC)[reply]
The cultural/religious objections to the depiction of Mohammed have no limitations on image resolution. I agree that it makes no sense to object to such low resolution images - but I also agree that it makes no sense to object to full-resolution images either. The point isn't whether the objection seems rational or not - the point is that there is a very real objection for some people. It is perfectly possible for an organization to say "No looking at pictures of naked people on company time" - without having the common sense to say "No looking at pictures of naked people at high enough resolution to see anything interesting" - and for a religion to say "Even a 1x1 pixel image of Mohammed is objectionable because it is the intent of the depiction rather than the actual light falling onto the human retina that counts". SteveBaker 19:17, 17 August 2011 (UTC)[reply]

(assisted) browser plug-in

I agree with user Koektrommel "If people want to filter, fine, do it on your own PC." (somewhere below in this discussion), but also see user WereSpielChequers' statement "As for the idea of people filtering on their own PCs, what proportion of our readers are technically capable of doing that, 10%? 1%? We need a solution that everyone can use, not just the technoscenti.".
Why not go an intermediate way: make it easy for the user/browser plug-in to filter, but keep it local on the user's PC? If every picture sends its category/tags with it, a local browser plug-in can filter easily. How complicated this plug-in (or a competing one) is to use is a problem for the plug-in developers (and for the user, who has to choose an easy-to-use one). (A reader unable to install a plug-in should perhaps think about taking a lesson on using the internet...)
Tagging and categorising of pictures remains a task of the community; but perhaps it could be less 'absolute', e.g. relative values like: image-showing-almost-naked-woman: "67% for ~yes, she's close to naked~ / 33% for ~normal light clothes~" (123 users rated this picture)
--129.247.247.239 08:57, 16 August 2011 (UTC)[reply]
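A minimal sketch of the client-side half of such a plug-in, assuming (purely for illustration) that each image exposes its category names in a hypothetical data-categories attribute; the attribute name and the example blocklist are assumptions, not an existing MediaWiki feature:
 // Local-filter sketch (TypeScript): nothing is decided server-side.
 // Assumes each <img> exposes its categories via a hypothetical
 // "data-categories" attribute; the blocklist lives only in the reader's browser.
 const blockedCategories = new Set(["Nudity", "Violence"]); // example, user-chosen
 function applyLocalFilter(doc: Document): void {
   for (const img of Array.from(doc.querySelectorAll<HTMLImageElement>("img[data-categories]"))) {
     const categories = (img.dataset.categories ?? "").split("|");
     if (categories.some((c) => blockedCategories.has(c))) {
       img.style.filter = "blur(20px)"; // obscure, but keep layout and caption intact
       img.title = "Image hidden by your local filter - click to show";
       img.addEventListener("click", () => { img.style.filter = ""; }, { once: true });
     }
   }
 }
 applyLocalFilter(document);
Everything here runs in the reader's browser: the blocklist never leaves the PC, and the server ships the same page to everyone.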


Congratulations on your retrograde help in implementing image tag wars. WP should have no involvement in subjective image tags, as it will be the beginning of the end of our works. If an image is not defined as illegal by law, it should automatically be permitted to reside on Wikipedia unhindered and untagged by people's opinions of whether a woman's nipple is exposed or covered, or whether the woman exposing the nipple is a 'naturalist' or a 'slut'.
If an image is inappropriate to an article, by all means remove it, however Wikipedia should have zero involvement in image tags and filtering software. Wikipedia is not your nanny. Harryzilber 16:43, 17 August 2011 (UTC)[reply]

All images on or off

Any kind of filtering system seems to me a waste of project resources and a morass. Instead, why not simply give readers the ability to turn all images on or off by default? And perhaps the ability to turn any given image back on, based on the reader's own evaluation of the caption and the article's subject. No tagging needed. No discussions required. Much easier to implement. Culturally neutral. Simple. Barte 11:21, 16 August 2011 (UTC)[reply]

That is already included in every browser's settings! In Firefox it is Tools > Preferences > Content > remove the check mark from "Load images automatically". Finished! So an easy solution is already there. --Dia^ 13:02, 16 August 2011 (UTC)[reply]
That's not the point. If we're considering a quick "hide or show" system, this has to be done on the Wikimedia projects, not by the browser. As you're probably browsing different websites simultaneously, you may want/need a specific website to hide its image content, while you'll want another site to keep the images visible. Cherry 14:09, 16 August 2011 (UTC)[reply]
Please choose another browser. Opera has such functionality built in; Firefox likely has either a plugin or a Greasemonkey script somewhere. For IE and Chrome there will also very likely be extensions for that purpose (site-specific preferences, plugins, addons and similar features handle these!). Mabdul 20:22, 17 August 2011 (UTC)[reply]

User option to display immediately

Is it intended there will be a user option to bypass this feature and display all images immediately always? DGG 13:45, 16 August 2011 (UTC)[reply]

Displaying all images immediately is the default behavior. Images will only be hidden if a reader specifically requests that they be hidden. Cbrown1023 talk 15:11, 16 August 2011 (UTC)[reply]

At the moment it's "opt in" - but you can already hear the calls for "opt out". --Eingangskontrolle 20:27, 17 August 2011 (UTC)[reply]

Committee membership

I'm curious to know the answers to the following questions, if answers are available: How has the membership of this committee been decided? Who has it been decided by (presumably by some group at the Foundation - I'm guessing either the board, or the executive)? What steps are being taken to ensure that the committee is representative of the broad range and diversity of Wiki[p/m]edia's editors and users? Thanks. Mike Peel 22:36, 30 June 2011 (UTC)[reply]

As an add-on comment to my last: I see that there are 2 US, 2 Canadian, 1 UK and 1 Iranian on the committee at present - which is very much skewed to English and the western world. E.g. there's no-one who speaks Japanese, Chinese or Russian, and no-one who is located in the southern hemisphere. If this is a conscious choice, it would be good to know the reason behind it. Thanks. Mike Peel 07:59, 1 July 2011 (UTC)[reply]
In fact this looks pretty much an en.wiki thing. Nemo 10:39, 1 July 2011 (UTC)[reply]
Why does it matter?
The committee's job is to let you have your say about whether WMF ought to create a software feature that offers individual users the ability to screen out material they personally don't choose to look at.
The committee's job is not to decide whether Foroa (below) has to look at pictures of abused animals, or which pictures constitute abused animals, or anything like that.
Unless you think this group of people will do a bad job of letting you express your opinion on the creation of the software feature, then the composition is pretty much irrelevant. WhatamIdoing 20:28, 1 July 2011 (UTC)[reply]
My main worries are the accessibility of committee members from a linguistic point of view (i.e. if non-en/fa/az/es speakers have questions about the process), and the risk of perception of this vote as a US-centric stance (which could lead to voting biases, e.g. "it's already been decided so why should we bother voting", or "let's all vote against the American world-view!"). Mike Peel 21:03, 1 July 2011 (UTC)[reply]
The membership of the committee was largely put together by me, but informed by many other people. It's also not totally done yet, so if someone has interest, I'd love to talk to them. I've added a couple of Germans, and you're right that the southern hemisphere is under represented. You are, of course, correct in everything you say there - but we're going to try out the infrastructure that's being built for multi-lingualism by the fundraising team and hope that helps to cover some of the gaps. We will be calling heavily on the translators, as always. Philippe (WMF) 21:36, 1 July 2011 (UTC)[reply]
Geez, let's put in a token German and a token Iranian. The whole "committee" is a joke. It consists mainly of people of Anglo-Saxon heritage who will decide for the rest of the world what they can see. My oh my. In the last 10 years living outside my own country I have started to hate native English speakers. They are arrogant and think that simply because they speak English they are more than the rest of the world. And now we get a committee of mostly North Americans deciding for the other 4.5 billion people in the world .... great. Just fucking great. Waerth 02:23, 2 July 2011 (UTC)[reply]
I don't think we're doing a "token" anything. There was a concerted attempt to bring in people of varying backgrounds who also brought skills in helping with the election. I find it unsurprising that some of the people who came to our attention were Germans, given the size of the German language Wikipedia. It's disrespectful to imply that they are tokens: they're not. They're intelligent people who bring a great deal of value. The same with Mardentanha, who - in addition to being Iranian - has been a member of the Board of Trustees steering committee and thus brings practical knowledge to the table. Philippe (WMF) 11:46, 3 July 2011 (UTC)[reply]
Waerth, I don't think you understand the proposal. The committee has one job. The committee's job is "Find out whether people want a user preference so that they can decide what they see on their computer".
The committee does not "decide for the rest of the world what they can see". They may decide whether we will add one more button to WMF pages, but you would be the only person who decides whether to click that button on your computer. I would be the only person who could decide whether to click that button on my computer. The committee does not decide "for the rest of the whole world what they can see". WhatamIdoing 19:06, 5 July 2011 (UTC)[reply]
You are wrong. Whether or not to do it was already decided. We can only elect other board members for the next term who promise to revoke this motion. --Eingangskontrolle 20:31, 17 August 2011 (UTC)[reply]

As a suggestion: try including an invitation to join the committee within the announcement itself - and particularly the translated versions of it. Saying "if someone has interest, I'd love to talk to them" is fantastic - but not particularly useful for diversifying the languages spoken by the community if it's only said in English. It's also rather hidden if it's just said on this page... Mike Peel 20:26, 4 July 2011 (UTC)[reply]

Cost

Will the information provided include also an estimate of the costs the WMF will need to face to implement the filter? Nemo 07:35, 2 July 2011 (UTC)[reply]

A fair question, since one of the tradeoffs may be between a quick inflexible solution that helps some users but frustrates others, vs. a better solution that takes more energy to develop. I'd like to see such an estimate myself (of time involved, if that's easier than cost). SJ talk | translate   11:42, 3 July 2011 (UTC)[reply]
I'll see if we can get a scope for that, yes. Philippe (WMF) 00:36, 4 July 2011 (UTC)[reply]
Correct me if I'm wrong, but I thought there was a US law that limits publication of prices for services. --95.115.180.18 22:11, 1 August 2011 (UTC)[reply]
No, there's not. It would be a violation of the business' free speech rights. The closest thing we have to that is that the (private) publishers of the phone books usually refuse to print prices, because they're worried about fraud/false advertising. WhatamIdoing 22:43, 14 August 2011 (UTC)[reply]

Today we have severe problems with the servers - the money would be better spent there. If someone feels the need for such a tool, he should develop it and offer it as an extension of Firefox. No money from donations for free content should go to such an attack against free content. --Eingangskontrolle 15:21, 16 August 2011 (UTC)[reply]

I think every developer and ops person at the WMF (and volunteers) agrees with you that we would rather be doing other things. However, in practice, the question we need to answer is: is it easier to achieve our goals with an opt-in image filter, or without one? The alternative could be that many libraries and schools would, sooner or later, block Wikipedia and Wikimedia Commons. That said, I think the thread OP is totally right: we should get a work scope here. NeilK 18:01, 16 August 2011 (UTC)[reply]
The proposed solution is not that great for schools and libraries (or indeed anyone) that want to censor. The user can change the settings, which is the whole idea. Rich Farmbrough 22:30 16 August 2011 (GMT).

Concerns

Slippery slope danger

This is how all censorship starts... slowly. First it's opt-in, then opt-out, and then it will be mandatory at the request of foreign governments / organizations, since the mechanism is already in place. Very dangerous. If people want to filter, fine, do it on your own PC. Don't give organizations / governments the opportunity to easily implement a blacklist for their people. Koektrommel 07:58, 3 July 2011 (UTC)[reply]

That's a huge extrapolation based on a very preliminary step - and not one that I think is justified. Given that it would already be technically feasible (and easy) for foreign governments to apply such censorship if they wanted (e.g. see the Great Firewall of China or the Internet Watch Foundation), as well as the finances for technical development available to governments / organisations in favour of censorship (which are several orders of magnitude larger than those available to Wikimedia), I seriously doubt that this would make it significantly easier for them to implement... Mike Peel 20:22, 4 July 2011 (UTC)[reply]
I don't think that it is within Wikipedia goals to provide a tool to make job of censors easier. Ipsign 06:57, 16 August 2011 (UTC)[reply]

Esto no es una herramienta que aumente la libertad, sino que la recorta. De aprobarse será la propia wikipedia, que no censura contenidos ni temáticas, la que tristemente proporcione el instrumento de censura a los censuradores. Lamentable. Wikisilki 19:01, 5 July 2011 (UTC)[reply]

Translation: "This is not a tool to increase freedom, but to limit it. If approved, it will be Wikipedia, which does not censor content or themes, which sadly provides a tool for censoring to the censors. Lamentable." —translated by Sj.

I concur. Implementing this feature would clearly be the first step towards censorship on Wikipedia. The next step along this path will be to block whole pages (for example, by adding the category 'sexuality'), which will allow schools to inject a cookie with 'Display-Settings: sexuality=denied' and block such content for all their students. While that might be the intention, it won't stop there: the next step will be to create a category 'Display-Settings: opposed-by-chinese-government=denied' and mark Tiananmen Square protests of 1989 with it, which will effectively allow the Chinese government to simply inject the appropriate cookie at the firewall and block content which they don't want Chinese people to see. It is also *very important* to understand that limiting the feature only to 'logged-in' users will *not* fix this problem (regardless of implementation, it will be *very easy* to enforce at the firewall level that all users from China are always logged in as the very same user). Ipsign 06:57, 16 August 2011 (UTC)[reply]

It sets a precedent that I really don't want to see on Wikipedia. --Dia^ 07:54, 16 August 2011 (UTC)[reply]

I don't want that tool; it's the key to introducing censorship. If a certain image cannot be deleted, it will be added to all possible categories to prevent its presentation as far as possible. If someone has problems with uncensored images, he/she should read only their own publications.

Abuses by censors: self-censorship seems to open the door to 3rd-party censorship

From other comments on this page, there seems to be severe confusion about the intention of the feature and its potential (and IMHO very likely) abuses. One common argument for this feature is that it is self-censorship, which (unlike 3rd-party censorship) looks like a good thing. Unfortunately, the available technical solutions have severe implications. It seems that this feature will almost inevitably be based on so-called cookies. It will essentially allow any man-in-the-middle who sits between the end-user and the Wikipedia server (usually an ISP, which can be controlled by a government/private company with a certain agenda/...) to pretend that it was the end-user who decided not to show controversial images, and the Wikipedia servers won't be able to detect the difference (the only way I know to prevent such an attack is SSL, and even it can be abused in this particular case - the 'man in the middle' can intercept SSL too, and while the user will know that the server certificate is wrong, he won't have any other option, so he'll need to live with it). It means that by enabling users to filter themselves out, we will also be enabling 'in the middle' censors to filter much more easily than they can now (it will also probably mean less legal protection against censorship: for example, if somebody filters out Wikipedia images in the US right now, they are likely committing copyright infringement - with a 'fair use' defense being quite uncertain - but if it is the Wikipedia servers doing the filtering, the copyright argument evaporates, making censorship much more legal). This is certainly not a good idea IMHO. Ipsign 07:59, 16 August 2011 (UTC)[reply]
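A minimal server-side sketch of why this is hard to detect, assuming (hypothetically) that the filter preference travels in a cookie named hideImageCategories; the name and value format are invented for illustration:
 // Server-side sketch (TypeScript): the only thing the server ever sees is the Cookie header.
 // A value set by the reader and a value injected by a proxy in the middle arrive
 // as exactly the same bytes, so they are indistinguishable at this point.
 // The cookie name "hideImageCategories" is a hypothetical example.
 function categoriesToHide(cookieHeader: string | undefined): string[] {
   if (!cookieHeader) return [];
   for (const part of cookieHeader.split(";")) {
     const [name, value] = part.trim().split("=");
     if (name === "hideImageCategories" && value) {
       return decodeURIComponent(value).split("|");
     }
   }
   return [];
 }
 // Same result whether the reader or an ISP-level rewrite set the cookie:
 console.log(categoriesToHide("hideImageCategories=Nudity%7CViolence")); // ["Nudity", "Violence"]
Whether the header was set by the reader's preference dialog or rewritten by a proxy along the way, the bytes that reach this function are identical, which is the crux of the objection above.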

In order to distinguish between images, they have to be categorized. And there you will find the censors. --Eingangskontrolle 08:47, 16 August 2011 (UTC)[reply]

Yes, even if this is intended for personal use, it will in fact be used by network managers and ISPs to censor Wikipedia for their users. MakeBelieveMonster 11:50, 16 August 2011 (UTC)[reply]
Okay, so your scenario here is that a malevolent entity, with government-like powers, does broad-based man-in-the-middle attacks in order to... turn on a filtering system, where the user can get around the filter just with one click? There's nothing we can do about MITM attacks (other than recommending SSL, so at least you know you've been compromised) and if there is someone capable of an MITM attack then they are not going to use our click-through-to-see filtering system, they will just block content, period. NeilK 17:52, 16 August 2011 (UTC)[reply]
No. What they are saying is that all it takes is one malevolent user to inappropriately tag harmless content with labels that will trigger the censorship system. Honestly, if you are liable to be offended by images on a given subject then you wouldn't really be viewing such an article in the first place. If you do encounter an 'offensive' image it is probably because a troll has uploaded something inappropriate to an article whose subject suits your sensibilities, and I don't think they would be helpful enough to censor-tag their work for you. --2.26.246.148 20:52, 17 August 2011 (UTC)[reply]

This would be a waste of time

Another waste of time by the Foundation. Wikimedia projects are not censored, except projects like :ar. They think they should not have pics of their prophet. Some Christians might like to censor the Piss Christ. Ultra-Orthodox Hasidic Jews don't like pictures of women. On :nl some don't like pictures with blood. Some classification scheme has to be in place, some review process, and people have to invest time doing that. This time is wasted and cannot be spent spreading free knowledge. And who will make these decisions? Zanaq 09:18, 1 July 2011 (UTC)[reply]

As I understand the concept, the 'classification scheme' would be categories for media, which already exist. The 'review process' for creating and updating categories also exists - it happens every day when media are uploaded. Category information is generally considered useful metadata and free knowledge in its own right. I could imagine people trying to create "useless" categories that have meaning only to them, but that seems like a rare case [and, again - is something that already happens today with category creation]. SJ talk | translate  
Free content, or free information, is any kind of functional work, artwork, or other creative content that meets the definition of a free cultural work. A free cultural work is one which has no significant legal restriction on people's freedom. Where do i get a refund? aleichem 09:35, 1 July 2011 (UTC)[reply]
I agree with this definition, but I'm not sure what point you mean to make. Could you describe it in more detail? For instance, if you are concerned about excluding free content from the projects, the filter proposals would not do any of that. (Notability guidelines, in contrast, exclude the vast majority of all free content from the projects - in what could more accurately be named 'censorship'.) SJ talk | translate   03:53, 3 July 2011 (UTC)[reply]
  • Great Wall of China within Wikimedia? That attempt deserves a thumbs-down button. --Matthiasb 09:33, 1 July 2011 (UTC)[reply]
    • User side filtration is NOT the same as censorship. Bulwersator 09:38, 1 July 2011 (UTC)[reply]
      • Just to be clear: I do not think this is exactly censorship. I do think it is unworkable and contrary to the goals of most (if not all) WikiMedia Projects. Zanaq 09:53, 1 July 2011 (UTC)[reply]
        • When issuing the referendum, hopefully it will contain details on how it will be done, what committees will be responsible for what, how a user can get protection (against themselves), use cases (I don't want to see pictures of abused animals, Bin Laden and Scientology), default values, uploader and user training, ... --Foroa 16:53, 1 July 2011 (UTC)[reply]
          • This always ends up with the situation where one group tells the other group what it is allowed to see. We accuse foreign governments of doing that. Choosing always ends up in culture and politics, two things Wikipedia should stay far away from. Edoderoo 13:15, 2 July 2011 (UTC)[reply]
            • This is a user side opt-in filter. I cannot understand how this becomes censorship. --Bencmq 14:47, 2 July 2011 (UTC)[reply]
              • I have actually seen very little info so far. I may hope you are right. An opt-in filter is self-censorship, I won't care. Any other option is an area I'm even not ready to discuss about. I can't tell what others can see or not... No one can... Edoderoo 16:25, 2 July 2011 (UTC)[reply]

┌──────────────────┘

  • Although I do agree that there will be inevitable problems - if the filter is based on categories, there may be disputes over whether certain images should be put into this or that category. But let's wait and see.--Bencmq 16:51, 2 July 2011 (UTC)[reply]
    This would be an opt-in filter. SJ talk | translate   03:53, 3 July 2011 (UTC)[reply]
  • This is how all censorship starts... slowly. First it's opt-in, then opt-out, and then it will be mandatory at the request of foreign governments / organizations, since the mechanism is already in place. Very dangerous. If people want to filter, fine, do it on your own PC. Koektrommel 07:54, 3 July 2011 (UTC)[reply]
    • I'm not aware of any examples where censorship has come in via that gradual way. I suspect the typical pattern is more that if organisations do things that upset people and refuse to compromise by allowing some sort of opt out or rating system, then eventually there is an overreaction, censorship ensues and people regret not agreeing to a reasonable compromise. As for the idea of people filtering on their own PCs, what proportion of our readers are technically capable of doing that, 10%? 1%? We need a solution that everyone can use, not just the technoscenti. WereSpielChequers 17:12, 3 July 2011 (UTC)[reply]
    Then you need a system that does not require an account. As far as the slippery slope goes, suppose we start with an opt-in system and discover that it isn't all that effective as the vast majority of readers do not register and log in. So what then? Do we default to "reasonable" censors for IP readers, encouraging them to register if they want to opt out?? And straight down the rabbit hole we go.
    Before we even consider this referendum, WMF needs to understand the requirements. And the requirements are a system that (a) does not require registration (so has to be based on cookies rather than on-wiki preferences) (b) is easy to use and (c) can easily be advertised without running afoul of the "No disclaimers in articles" prohibition on en and likely other projects. And even then, it will be completely ineffective in many areas. Muhammad pictures for one, as many Muslims won't be satisfied unless we censor them for everyone. The "you have the capability to disable them for yourself" response has gone in one ear and out the other for years.
    For all the battles it will cause, a simple categorization system for images does have value. But I have to say that I'm in with the group who thinks the actual tools for censoring images should be left to third parties. Resolute 14:24, 4 July 2011 (UTC)[reply]

This would be a distraction from Foundation principles

The Wikimedia Foundation, Inc. is a nonprofit charitable organization dedicated to encouraging the growth, development and distribution of free, multilingual content, and to providing the full content of these wiki-based projects to the public free of charge. The Wikimedia Foundation operates some of the largest collaboratively edited reference projects in the world, including Wikipedia, a top-ten internet property.

Any type of filter runs contrary to this, even if opt-in. These projects were designed to take the sum of human knowledge and make it freely available. The donated money, if any at all is used here, which the Foundation uses to further that goal, should be put to better use. NonvocalScream 19:57, 4 July 2011 (UTC)[reply]

That would depend on whether the content was standing in the way of growth, development or distribution - which it seems that it is at the current time. The content would remain completely freely accessible to everyone even with this option in place. As such, I don't believe that it runs contrary to the Foundation's principles. Mike Peel 20:17, 4 July 2011 (UTC)[reply]
What is standing in the way of growth, dev, and dist at the current time? Filters should be provided at the client side, not the server side. The projects or the Foundation have no business wasting resources in such an elegant way. NonvocalScream 20:27, 4 July 2011 (UTC)[reply]
Let's see ... how about contributions from schools, universities and organisations where Wikipedia is currently blocked due to its controversial images? People in countries where national sensibilities are sufficiently offended by Wikipedia's content that they don't edit (e.g. pictures of Muhammad)? As a very controversial statement, what is worse: Wikipedia not being editable in China due to an image of Tiananmen Square, or not having much more content about China that is freely accessible to everyone on the internet?
Given that most large internet sites provide filtering on the server side, not just leaving it up to the client side, it's clear that there is demand for a server-side solution. Simply based on that, I would argue that it is worth the Foundation running a poll/referendum of both its editors and readers to see whether Wikimedia should operate a similar server-side solution. Mike Peel 20:37, 4 July 2011 (UTC)[reply]
The Foundation can run the poll/ref. I think that conversation here, now, before the poll is healthy.
I maintain that the Foundation, and its projects really have no business expanding into this type of area. Regardless of what country won't display because of "this" image, it is not in the ambit of the WMF to do this thing. It is totally out of scope. NonvocalScream 20:42, 4 July 2011 (UTC)[reply]

As long as readers can choose to see the image with one click, I don't believe Wikipedia loses any freedom with this image filter. --NaBUru38 18:43, 15 August 2011 (UTC)[reply]

I agree, but unfortunately it won't work this way. Filtering will almost inevitably be based on cookies, and there is nothing easier than to inject a cookie at the firewall level, pretending that it was the user who decided not to see the image. Ipsign 07:39, 16 August 2011 (UTC)[reply]
If a malicious party controls the firewall, the game's already over. They can block all of Wikipedia, or all images, or pages in some categories, or pages containing particular words. Bypassing those firewalls is out of scope. 59.167.227.70 09:30, 16 August 2011 (UTC)[reply]

I hardly understand why it would be up to Wikimedia itself to set a filter option. If a filter tool is seen as useful, there are plenty of technical means run by third parties. We can suggest some to people who need or want them. If you don't want to be offended, you'll probably want that for other websites too. Then you don't need a website-specific preference; you need a tool that will work with loads of websites. I don't think this filter thing is a legitimate goal with regard to Wikimedia principles.
Plus, it could be counter-productive and dangerous. Wikimedia is, in many ways, one of the Internet's White Knights, one of those you can count on, because you know they are promoting things that almost everybody will, in the end, call "good" or "valuable". Were Wikimedia to promote a system that some will call "censorship", the whiteness would be gone. And Grey Knights of uncertain morals may not deserve the time and money that a lot of people agree to give to White ones. This filter thing is the best way to ruin Wikimedia's reputation. Think about how good we'll look in press articles. Cherry 14:32, 16 August 2011 (UTC)[reply]

Complicated editing

If this feature is implemented, then when I'm editing an article I will need to think: "hey, how will this article look if a user uses this filter? Or that filter? Or that combination of a dozen filters?" In many cases images are an essential part of the article, and making the article comprehensible without them (or worse, with an arbitrary combination of them) will be an editor's nightmare. And while I am not involved in editing sexual or religious articles, I am not sure that this tool won't evolve, for example, to allow filtering out "all images which are related to Microsoft Windows/Linux/..." - if that ever happens, then I will find my editing severely affected. Ipsign 07:23, 16 August 2011 (UTC)[reply]

As you can see from the mock-ups, it will be immediately clear that an image is hidden. The box shape and size shouldn't change at all, and the caption won't be hidden, so this shouldn't make images look different at all. Cbrown1023 talk 15:14, 16 August 2011 (UTC)[reply]
Because of the bug that is "being investigated" (see below), i cannot check the referendum questions yet, but the "What will be asked?" section on "Image filter referendum" does not seem to propose a question like "How prominent should the warning that an image is hidden be?" Are you making this statement as a promise by a member of the tech team working on this? Or as a promise by the WMF Board? A mockup is just a possible implementation, not a definite tech proposal. i agree that the mockup proposal seems sufficiently prominent that it's hard not to notice it. (In fact, it is probably so prominent that censoring authorities would not be satisfied, since it would encourage the curious to circumvent the filters.) But i don't see any promises that it would remain that prominent.
As for the main point being made here, i too would expect that in articles where a picture is important, there would be many editors who would feel that the article needs to be primarily readable by those with filters rather than those without. Given that "a picture is worth a thousand words", this effectively means several thousand extra words may often have to be added to articles, making them seem longer and more complicated to readers with filters turned off. Boud 21:35, 16 August 2011 (UTC)[reply]

Technical limitations

If this feature is implemented, some technical difficulties may arise. For instance, if the feature relies on cookies, what will we do about people whose browsers autodelete cookies on closing? This isn't just a personal choice: in many public places, browsers are set to "autodelete cookies after closing the browser". Cherry 14:53, 16 August 2011 (UTC)[reply]

If they consistently run into issues like that and usually delete their cookies, then they should login and have their preferences saved with their account. Cbrown1023 talk 15:13, 16 August 2011 (UTC)[reply]
This also goes the other way: many internet cafes or other public access venues will not turn off browsers between different users. So one user choosing to turn on a filter will put the next user in an opt-out mode, instead of opt-in. Cafes could presumably also hardwire some cookies to be non-deletable. Non-techie users will try clicking to opt out, but won't know why they cannot opt out or find a method. Boud 21:41, 16 August 2011 (UTC)[reply]

Additional maintenance overhead

Who is going to set up the huge infrastructure that is required for this, and more importantly, who is going to keep it running in an unbiased manner? All the users who think this is stupid won't want to spend hours arguing with religious types (Danish cartoons, 15th century engravings of Muhammad: blasphemy or encyclopaedic?), people who oppose homosexuality (men holding hands: OK or not?), people who think that a picture of a medical condition will scar a child for life ("Think of the Children!" will be a very common refrain), people who have certain political ideologies (a map with Taiwan and Tibet as distinct from China offends User:CCP), people who don't like pictures of any weapons, people who wish to use the system to further wiki-battles (User:Alice has her images systematically tagged by User:Malloy), people who have nothing better to do than annoy others (much more time-consuming than vandalism, and much more fun to watch unfold). We have enough of that with the existing CV and PROD set-ups. We will need a clearing-house, individual discussions for each image, and a process to watch for abuse. The backlog will grow since no-one will agree on anything even vaguely controversial (buttocks from the side: nudity or not?). The users who are capable of running a neutral system will probably get bored and move on, letting the crazies run the show.

This proposal will generate huge amounts of discussion for every borderline image. There are thousands out there who love nothing more than a pointless and time-consuming fight that will need to be arbitrated and mediated by people who really have better stuff to do. It will suck up thousands of user-hours (more than it has already) on useless bickering. Inductiveload 17:44, 16 August 2011 (UTC)[reply]

Copyright concerns

Forgive me if I have got the concepts wrong, but the way I see the licensing is that an ISP may not alter the text without providing the original source (or a link to it). If we detect a middleman tampering with cookies to deny users the ability to see these images on a permanent basis, should we pursue those people for copyright violation? And if not, does that mean the WMF CC license is unenforceable, and pointless? Obviously data manipulation could be happening already, but a cookie-based solution is an open invitation for low-level tampering. Inductiveload 17:44, 16 August 2011 (UTC)[reply]

Filtering

Wikipedia is an open encyclopedia. Much of the content, if not all, is made for adult educational purposes. Filtering may cause editors to become careful and tone down controversial subjects. I am against using obscenity or explicitly gory or sexual content. However, let the editors be the filterers. This is an educational site for the world, not an elementary school. Let parents filter for their own children. The answer is not a filtering system, which would only interfere with content searchability, in my opinion. A filtering system would give POV to many articles. For example, an article discussion of Sally Hemings and Thomas Jefferson could be viewed as obscene, rather than as historically accurate research. An open statement on Wikipedia's front page that Wikipedia contains adult language and content would be appropriate, rather than a burdensome filtering system. Cmguy777 01:40, 17 August 2011 (UTC)[reply]

Definitions

Educational materials

I don't understand what "educational materials" can mean in this page. Materials to educate the voters? Nemo 10:47, 1 July 2011 (UTC)[reply]

So that they could make the right choice... ;-) --Mikołka 10:58, 1 July 2011 (UTC)[reply]
An image or entire set of filter options might help suit different groups. — The preceding unsigned comment was added by Marsupiens (talk)
this is a little late, but "educational materials" might be an inadvertent US-centric edit -- in the US every voter gets a packet of information about the upcoming vote, who is running, what the full text of the ballot referendums are, etc. before every election. I always assumed this happened other places too until my UK friends were in town before an election and expressed amazement. At any rate, that's the kind of educational materials that are meant -- to answer the question "what's going on"? -- phoebe | talk 18:37, 31 July 2011 (UTC)[reply]


Phoebe, thanks for the explanation. Yes, it's a US-centric thing; I've never heard of it anywhere in Europe, for example. --Dia^ 07:59, 16 August 2011 (UTC)[reply]

"servers hosted by a neutral third party"

From the content page: "The referendum [...] will be conducted on servers hosted by a neutral third party." Um. Well. What? Will users' login information be shared with this "neutral" third party? If not, how can the vote be held on external servers? Why would a vote like this be held not only off wiki, but off Wikimedia? How could a third party outside Wikimedia possibly be neutral? --Yair rand 13:38, 3 July 2011 (UTC)[reply]

This means simply 'conducted as the Board election was' - by a third party specialized in overseeing fair votes. There may not be a need for such precautions in this case [I would prefer an open-ballot vote on Meta, myself], but it is often seen as an unbiased choice for running the technical process of voting. SJ talk | translate   15:09, 3 July 2011 (UTC)[reply]
Sj is correct. The election is held using the Securepoll extension, which passes a user who's already logged in to a WMF wiki (and verified as meeting voting requirements) to the servers of the third party for voting. Philippe (WMF) 00:40, 4 July 2011 (UTC)[reply]
Thank you for the explanation. --Yair rand 02:53, 4 July 2011 (UTC)[reply]
Is there some sort of technical explanation of the mechanism, other than the source code? If IP addresses and user names are not connected on the third party's servers, then I suppose some cryptographic key is included in the URL or body of the request to the third-party server. A brief explanation of how this is done and what information is exposed to the third party would be important. --LPfi 16:39, 5 July 2011 (UTC)[reply]

Interesting that we are not asked yes or no. The decision is already made and we are asked only about minor details. --Eingangskontrolle 08:54, 16 August 2011 (UTC)[reply]

Design ideas

suggestion from Dcoetzee

I'm pasting this from a mail I sent to Philippe the other day on this topic.

I can't claim to represent general community consensus, although I have some idea what that consensus is like. In particular, in discussions of image filtering the community has made very clear that they don't want WMF projects to have any direct involvement in the construction or maintenance of blacklists, which are lists of articles/files or categories that are blocked for a particular user.

I have a vision for what I think a personal image filtering solution should look like, but not the time and resources to implement it. Basic outline follows.


Customer

Persons interested in self-censorship and censorship of small children by parents or guardians. In particular, censorship of teenagers or technically adept persons is not a goal for me. This is helpful because it means strong technical measures against circumvention are unnecessary.

Requirements

  1. If an article/page/entry is included in the user's blacklist, and is not in their whitelist, attempting to view it will produce an error message and show no content. If the user is using the software in self-censorship mode, they may proceed to view it regardless. Otherwise, a password is required to proceed to view it.
  2. If a file is included in the user's blacklist, and is not in their whitelist, it will be hidden, showing in its place an icon indicating that it was blocked. If the user is using the software in self-censorship mode, they may click it to reveal it. Otherwise, a password is required to view it.
  3. A user may at any time add a page or file they're looking at to their blacklist, hiding it and preventing it from appearing in the future.
  4. When a password is entered to view a page/file, either a time limit is given, or the page/file is permanently added to the user's whitelist.
  5. Both blacklists and whitelists may be specified using a combination of individual pages/files as well as categories of pages/files, which will include subcategories. Whitelists take priority over blacklists.
  6. A public distribution center is available (ideally on a third party/unaffiliated site) where users can share their blacklists/whitelists, and import those of others. This allows communities to specialize the software to meet their community standards, while still being able to individually adapt to their personal standards and the needs of the specific child. Several blacklists can be imported and merged into one.
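A minimal sketch of how requirements 1, 2 and 5 could be evaluated for a single page or file is given below; it assumes the item's categories (including subcategory ancestors) have already been collected, and the type and function names are illustrative only.

<syntaxhighlight lang="typescript">
// Sketch of the blacklist/whitelist decision in requirements 1, 2 and 5 (illustrative only).
// "labels" holds the item's own title plus all of its category ancestors;
// list entries may be individual titles or category names.

interface FilterLists {
  blacklist: Set<string>;
  whitelist: Set<string>;
}

function decide(labels: Set<string>, lists: FilterLists): "show" | "hide" {
  // Whitelists take priority over blacklists (requirement 5).
  for (const label of labels) {
    if (lists.whitelist.has(label)) return "show";
  }
  for (const label of labels) {
    if (lists.blacklist.has(label)) return "hide";
  }
  return "show"; // unlisted content stays visible
}

// Example: a file in a blacklisted category is hidden unless the file itself is whitelisted.
const lists: FilterLists = {
  blacklist: new Set(["Category:Example topic"]),
  whitelist: new Set(["File:Example.jpg"]),
};
decide(new Set(["File:Example.jpg", "Category:Example topic"]), lists); // "show"
</syntaxhighlight>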

Architecture

  • Filtering can be done either on the server side (with the cooperation of Mediawiki software, by associating blacklists with user accounts) or on the client side, using web browser extensions. I favor the latter approach because it avoids Mediawiki ever having to deal with blacklists, which is a point of contention in the community. This also avoids trivial circumvention methods like logging out or clearing cookies.
  • Blacklists/whitelists can be stored on local storage. Initial blacklists/whitelists are not shipped with the web browser extension, but are instead selected from the above-mentioned public distribution center.
  • Categories can be extracted using the Mediawiki API.

Design notes

  • To avoid circumventing the filter by editing to remove categories, recent removal of categories can be ignored. If the edit stands and is not reverted as vandalism it will eventually be accepted.
  • In addition to retrieving the categories of each article/file used in the article, it's necessary for each category to walk up the category tree to get its ancestors in case they're listed in a blacklist/whitelist, which can lead to a lot of Mediawiki API calls per page load. This can be mitigated by caching categories.
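To make the caching point concrete, here is a rough sketch of the ancestor walk against the public MediaWiki API (action=query&prop=categories); the depth limit and the module-level cache are assumptions for illustration, not part of the proposal.

<syntaxhighlight lang="typescript">
// Rough sketch: walk up the category tree via the MediaWiki API, caching direct parents.
const API = "https://commons.wikimedia.org/w/api.php";
const parentCache = new Map<string, string[]>(); // title -> direct parent categories

async function parentCategories(title: string): Promise<string[]> {
  const cached = parentCache.get(title);
  if (cached) return cached;
  const url = `${API}?action=query&prop=categories&cllimit=max&format=json&origin=*` +
              `&titles=${encodeURIComponent(title)}`;
  const data = await (await fetch(url)).json();
  const pages = data.query?.pages ?? {};
  const parents: string[] = [];
  for (const id of Object.keys(pages)) {
    for (const cat of pages[id].categories ?? []) parents.push(cat.title);
  }
  parentCache.set(title, parents);
  return parents;
}

async function ancestorCategories(title: string, maxDepth = 5): Promise<Set<string>> {
  const seen = new Set<string>();
  let frontier = [title];
  for (let depth = 0; depth < maxDepth && frontier.length > 0; depth++) {
    const next: string[] = [];
    for (const t of frontier) {
      for (const parent of await parentCategories(t)) {
        if (!seen.has(parent)) {
          seen.add(parent);
          next.push(parent);
        }
      }
    }
    frontier = next;
  }
  return seen; // candidates to match against blacklist/whitelist entries
}
</syntaxhighlight>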

I pointedly avoid the idea of "labelling" specific articles or files with a "type" of content, although a blacklist may be intended for a specific type of content. I also strongly oppose the idea of having a single one-size-fits-all blacklist, or even just a small number of them to choose from - I think it's essential to let the user community build lists to their tastes, and to force users to choose among a wide, fine-grained collection to discourage overly broad filtering.

Any feedback is welcome. Dcoetzee 07:48, 2 July 2011 (UTC)[reply]

I heartily agree with and support Dcoetzee's proposal above. It will be much less contentious and more productive if the whole debate about what is censored and what is not stays out of Wikimedia, as well as the means of implementing it.--- Darwin Ahoy! 02:00, 3 July 2011 (UTC)[reply]
I don't like the idea of the filter being enabled "by parents or guardians". If a small child is considered grown up enough to surf the web, they're certainly smart enough to control the filter as requested by their parents or guardians. It seems an unnecessary complication (management of passwords?!) with great danger of abuse, e.g. by network managers. Nemo 07:37, 3 July 2011 (UTC)[reply]
A password seems too much. That would conflict with the idea that no readers are prohibited from seeing any image. Better for such a system to simply let readers change the default display setting (hidden or not) for categories of images. SJ talk | translate   11:42, 3 July 2011 (UTC)[reply]
I see nothing wrong with this if it is based on existing content labelling standards. This would allow existing parental control software to work immediately with such labels. It's not our job to determine whether these labels work, or whether they can easily be circumvented by people who want to bypass them. But at least those who want such filters for themselves will be satisfied. That is enough, and if it allows more people to visit or contribute without fearing they will see such images, we will gain something. (Note that, in any case, content filters based on the textual content of articles already work, and we have done absolutely nothing to prevent this; but if the existing software can't easily parse images, all it will finally do is block images from Commons completely, whatever they show in any article: this is already occurring on third-party sites that reindex and selectively republish the content of Wikimedia sites.)
We are NOT proposing any kind of censorship, only content labelling based on existing standards. Censorship only occurs elsewhere, and we have absolutely no control over those third-party sites or software. Users will set up their own filters themselves, or will have to live anyway with these third-party sites and software, or with their national legislation, if it is applicable to them (but we may, and probably should, still help them comply with their law). verdy_p 16:56, 3 July 2011 (UTC)[reply]
Wrong. Categories should be broad and small in number. One either doesn't mind seeing photos of sexual activity, or of chopped-up body parts, or one doesn't. Broad banding is preferred, unless one really wants to give users the choice of saying "don't mind porn, but none of that gay stuff". Do you really want them to be able to say no to Piss Christ but yes to piss Mohammed, or no to pictures of dead Westerners but yes to pictures of dead Africans? John lilburne 17:11, 3 July 2011 (UTC)[reply]
John - Can you articulate why it would bother you if someone else chose to see one of those complicated sets of categories? SJ talk | translate   18:59, 4 July 2011 (UTC)[reply]
Principle of least astonishment. Imagine someone who is taken aback upon seeing a sexually explicit image, decides to use this feature, and finds themselves looking at a menu that catalogues every conceivable sex act or fetish. It could be a double-take aback. ~ Ningauble 19:39, 4 July 2011 (UTC)[reply]
That's a good principle for interface design - as is broad banding. But that doesn't necessarily preclude having an option to create a custom set of categories (or choose from a longer user-defined list) for one's own use -- which is how I read John's comment.

Let's ban all works of art with nudity. Just to be on the safe side. --Eingangskontrolle 15:32, 16 August 2011 (UTC)[reply]

"We are NOT proposing any kind of censorship, only content labelling based on existing standards." So does that mean you are proposing a special "content labelling" for articles like en:Tiananmen Square protests of 1989, en:Falun Gong, en:Tibetan independence, en:Taiwan independence, en:corruption, en:police brutality, and en:anarchism? These are considered to be unacceptable articles according to a well-established and reliably-sourced (RS'd) existing standard. The NPOV name for that existing standard includes the word "censorship" (whether i agree or not is irrelevant). Boud 22:10, 16 August 2011 (UTC)[reply]

Questions about the design, existing alternatives

Hello there, I will just add a few comments regarding the upcoming referendum. As to my background: I am a Computer Scientist by profession, so I can technically judge what is involved. As a quick outline:

  • A user (either anonymous or named) is able to exclude certain images from search results. In the case of anonymous users, the preferences are stored inside the user's session; closing and re-opening the browser will reset the settings. Users who log in to edit can store their settings in their profile; their settings will not be reset when they close and re-open their browser.
  • Architecture-wise, blacklists and whitelists can be used. To be determined: do these act on groups of images, or on single images? Is it possible to whitelist single images in a group that is blacklisted? To blacklist single images of a whitelisted group?
  • How does the software identify the images to "filter"? There are two options. At load time, the software analyses the image, and the result of this analysis is used for filtering; this incurs extra costs in computing time and memory, and there are different algorithms which yield different results. The other option is static tagging. This option has the drawback that some people need to decide which tags to use ("tag wars" have been cited above). Also, the behaviour needs to be specified if an image does not have any tags; the blacklist/whitelist approach can be used.
  • There are programs on the market that implement a client-side proxy, and that probably cover 80-85% of what this development will achieve. I currently see no benefit in implementing this solution on the server. The solution where the filtering is done dynamically (i.e. no static tags), and on a per-image basis, would probably be superior to the client-side filtering. This, however, comes at the cost of additional CPU and memory usage, as well as false positives/false negatives.

To summarize:

  • If the solution of static tagging is chosen, we have the problem that images need to be tagged, and "agreement" over the tags to use needs to be reached in some way. Also, the behaviour in the case of an untagged image needs to be defined. We also need to define the granularity: is it possible to "whitelist" individual images of a group that is "blacklisted" (or to "blacklist" individual images of a whitelisted group)? Finally: how do we determine the "tags" (or groups of tags) to use?
  • If we tag dynamically, we incur extra costs in CPU and memory use of the system. We need to reach agreement over the algorithms to propose for identifying images; we need to implement those algorithms, which may be technically difficult; and we may need to think about caching the results of calculations to reduce CPU load. Also note that the algorithms use stochastic information: there will be false positives and false negatives.
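Purely to illustrate the open questions in the static-tagging bullet (this is one possible answer, not a recommendation): a resolution order in which single-image entries override group entries, and untagged images fall back to a configurable default, might look like this.

<syntaxhighlight lang="typescript">
// One possible answer to the granularity and untagged-image questions (illustrative only):
// the most specific rule wins, and untagged images use a configurable default.

interface TagRules {
  imageWhitelist: Set<string>;  // individual files: highest precedence
  imageBlacklist: Set<string>;
  groupWhitelist: Set<string>;  // tags or categories
  groupBlacklist: Set<string>;
  untaggedDefault: "show" | "hide";
}

function shouldShow(file: string, tags: string[], rules: TagRules): boolean {
  if (rules.imageWhitelist.has(file)) return true;   // single image overrides its group
  if (rules.imageBlacklist.has(file)) return false;
  if (tags.length === 0) return rules.untaggedDefault === "show";
  if (tags.some(t => rules.groupWhitelist.has(t))) return true;
  if (tags.some(t => rules.groupBlacklist.has(t))) return false;
  return true; // tagged, but no rule matches
}
</syntaxhighlight>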

Both approaches have their benefits and drawbacks. Neither is "quick to implement". So given that client proxies ("filters") out there probably cover 80-85% of the requirement the usual client needs ("don't show images of nude people of the opposite sex"), where is the use case that would justify 3-5 people working 3-6 months to get the extra 15-20%? --Eptalon 09:12, 4 July 2011 (UTC)[reply]

Please provide more data to explain what "80-85%" means to you here. (list some of the clients you have in mind, and the use cases you feel constitute the 100%). If there are client-side tools that are aware of Commons image categories [or can be customized to be] that would be a useful data point. (And pointing to an open-source client-side option for readers who want one, that is known to work smoothly with WM projects, would be in line with the goal here). Two use cases, for discussion purposes:
  1. You're browsing wikipedia, possibly on someone else's machine, and want to toggle off a class of images. [for instance: giving a WP demo at work, or in Saudi Arabia, &c.] It may not be possible to install your own client software, and you'd like to be able to set this in under a minute.
  2. You come across a specific image you don't want to see again (and checking, find it is part of a category of similar images), and want to hide it/them in the future.
SJ talk | translate   15:42, 4 July 2011 (UTC)[reply]
Client-side proxy filters are aimed at parents worried that their children might see the wrong type of image; AFAIK most of them work with whitelists/blacklists of sites; they do not do an on-access scan of the image. In addition, they might do "keyword scanning" in the text (to filter hate sites, and similar). The customisation of these products lies in being able to select "categories" of sites to block/allow, perhaps on a per-user basis. Our "static category blacklist/whitelist" approach would in essence do the same thing, except that to achieve it we need to do development work, and at best we match the functionality of a USD 50 product. In addition, load is placed on our servers to do the filtering work (plus possible problems with the categorisation). Using the dynamic approach will mean even more load on our servers, the possibility of "false positives"/"false negatives", the difficulties in finding training data (note: that data cannot be used later on), etc. In short: a lot more (difficult) work. We may exceed the USD 50 product in functionality, but we would have 3-5 people developing for 4-6 months. I really don't know if I want to spend up to 250,000 USD (24 man-months) to "not see an image again" - it seems out of proportion.--Eptalon 19:29, 4 July 2011 (UTC)[reply]
Point of clarification... $250,000? NonvocalScream 19:36, 4 July 2011 (UTC)[reply]
Let's be very careful about throwing around dollar figures. That hasn't been scoped, so I think it's dangerous to introduce false numbers to the equation at this point. Philippe (WMF) 19:39, 4 July 2011 (UTC)[reply]
Duration of project: several people, 4-6 months (for the dynamic approach, not using static tags). --Eptalon 20:15, 4 July 2011 (UTC)[reply]
According to whom, by what metrics and judging by the speed of what resources? How about we let the folks who are designing the thing scope it out, once it's, you know, designed? Philippe (WMF) 23:59, 4 July 2011 (UTC)[reply]
Eptalon, you seem to assume that everybody's got a web proxy. I don't. If I really, really don't want to see the infamous picture of the guy jumping to his death from the Golden Gate bridge, my options at the moment are:
  1. Don't read Wikipedia (because any vandal could add it to any page at any time),
  2. Especially don't read pages where that image might logically be present (bummer if you need information about suicide), or
  3. Figure out how to manually block all versions of that image in every single account (five) and every single browser (two) on every single computer (four) I use—which will effectively keep that image off my computer screen, but not any others like it.
This proposal would let me control my computer by clicking a "don't really feel like seeing images of dead bodies today, thanks anyway" button. The images would appear "hidden", and I could override the setting any time I felt like it by simply clicking on the "This image hidden at your request because you said you didn't feel like seeing any images of dead bodies today" button. There is nothing here that would let some institution control my computer. WhatamIdoing 19:23, 5 July 2011 (UTC)[reply]
I don't have one (or use any filtering software); your proposal shifts the problem, though. You need to agree with other people about the categories. In the 16th century, a painter called Lucas Cranach the Elder painted a woman in front of a tree, wearing a necklace (the painting is called 'Venus'). In the same century, Michelangelo made his statue of David. In the 19th century, Jules Joseph Lefebvre painted a woman with a mirror ('Truth'). To me, all these works are works of art, and as such, limiting their audience does not make sense. In the 1990s, a museum in London used Cranach's painting as an ad for an exhibition; they showed it on posters in the London Underground - and there was an outcry.--Eptalon 14:27, 6 July 2011 (UTC)[reply]
@Eptalon: I think there is a very real advantage to doing a system specific to Wikipedia that takes advantage of our category structure: filtering can be made more precise, which means not just missing fewer of the things that people want to block, but, more importantly, avoiding blocking educational materials that young readers need access to in order to learn. This is our chance to give people a solution that isn't as conservative and overzealous as every other generic solution on the market. Dcoetzee 22:31, 16 July 2011 (UTC)[reply]

Logging in to permanently save

Just to point out: the concept presented in File:CC-Proposal-Workflow-Anon-FromNav-Step3.png (logging in to permanently save options) wouldn't necessarily work in the classic scenario of parents applying filters to their children's computers, as:

  1. Their children would then edit using the accounts that have been logged in (pro: would work well in terms of accountability, con: what happens if the account is subsequently blocked for vandalism? Note that the account would most likely pass semi-protection due to the length of time since creation - although it might not pass in terms of number of edits, at least to start with.)
  2. Their children could simply log out, and/or log in with their own accounts, thereby bypassing the filter.

Mike Peel 21:48, 23 July 2011 (UTC)[reply]

Protection levels

Just as many security software applications have multiple security levels, I propose the same for this image filter. There should be a few levels per tag, 3 or 4, so decisions are easy. For example, 0 for no sexual content, 1 for light clothes like here, 2 for underwear and 3 for visible genitals. --NaBUru38 18:39, 15 August 2011 (UTC)[reply]
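A tiny sketch of how such per-tag levels might be applied (the tag names and the 0-3 scale are taken from the comment above; everything else is an illustrative assumption): an image is hidden whenever its level for some tag exceeds the threshold the reader has chosen for that tag.

<syntaxhighlight lang="typescript">
// Sketch of per-tag level filtering (illustrative only).
type Tag = "sexual" | "violence"; // example tags

// Reader preference: the maximum level they are willing to see, per tag.
type Thresholds = Partial<Record<Tag, number>>;

// Image labelling, e.g. sexual: 0 none, 1 light clothes, 2 underwear, 3 visible genitals.
type ImageLevels = Partial<Record<Tag, number>>;

function isHidden(image: ImageLevels, prefs: Thresholds): boolean {
  return (Object.keys(prefs) as Tag[]).some(tag => {
    const limit = prefs[tag];
    const level = image[tag] ?? 0; // an unlabelled tag counts as level 0
    return limit !== undefined && level > limit;
  });
}

// A reader who accepts up to level 2 ("underwear") would not see a level-3 image:
isHidden({ sexual: 3 }, { sexual: 2 }); // true, i.e. hidden
</syntaxhighlight>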

Opt-out of seeing the "hide content" tabs

If we were to implement this, I would like to see a user-configuration option (which obviously applies only to registered users) that allows me to turn OFF the "hide content" tabs so that I don't see them. That way my limited screen space is not cluttered with things that I will never click. Of course the default for that option can be "show tabs", because I'll only have to turn them off once. Mitch Ames 10:07, 16 August 2011 (UTC)[reply]

If we do decide to implement this proposal, I too think this would be necessary. I wouldn't want to be reminded every time I see an illustration that Wikipedia is implementing self-censorship. DGG 16:13, 16 August 2011 (UTC)[reply]
Even better: the "hide content" tabs could be made opt-in too. The opt-in for the hiding-content system could be located at the bottom together with the * Privacy policy * About Wikipedia * Disclaimer * Mobile view links. Otherwise, we constantly get the idea that we should feel guilty about looking at Commons images shown on the Wikipedia. "Edit" and "hide content" are very different: one is about correcting errors (or completing, clarifying, etc.), the other makes us think about censorship. Boud 22:29, 16 August 2011 (UTC)[reply]

Block all images option

The only image filter that I would really like to see would be one that blocks all images, as long as I could unblock them by a simple click (or right-click then select). (I know that this can also be done in the browser or with plugins such as Adblock, but each presents problems: many web sites use images for links, so blocking images can make navigation difficult, and sometimes collapsed images distort the web page. Unblocking images blocked by a wildcard filter in Adblock can sometimes be a pain too.) I spend most of my time in Wikipedia reading history articles, where images add little information content and seem to be there mostly to enhance the user experience. For people with a slow internet connection (I was still on half-speed dialup a few months ago), not having to load images really helps navigation speed. Does loading images each time an article page is visited use Wikipedia's system resources?--Wikimedes 07:09, 17 August 2011 (UTC)[reply]

Different idea for categorization

Instead of trying to categorize the infinite ways that people can find images offensive, how about allowing people to block images on all pages from a certain project or project category? People who don't want to see images of battlefield carnage could block images on Military and Warfare or death pages, people who don't want to see depictions of Mohammed or the Crucifixion (very violent) could block images on religion pages, people who don't want to see images of tantric sex could block images on religion or Sexology and sexuality pages, etc. I could block images on history and society pages, where they're not usually necessary, but leave them in Technology or Medicine pages, where they're often useful. Granted, if a reader's goal is to block offensive images, this type of filtering system might be even more porous than one based on objectionability. But a filtering system based on objectionability means that the Wikimedia Foundation is taking a position on what's objectionable, to the extent of creating objectionability categories. A filtering system based on project categories would avoid that.--Wikimedes 07:09, 17 August 2011 (UTC)[reply]

Voting process

Transparency of voting

Please consider using "open voting" in such a way that we can see/validate the votes, such as in steward elections. I do not trust a third party as currently proposed.

Also, please consider placing a sitenotice. I do not believe that VP posts are enough to inform every member. NonvocalScream 20:47, 4 July 2011 (UTC)[reply]

Open voting works well for voting from Wikimedia user accounts, but I would imagine it won't work as well for votes from readers (assuming that readers will also be eligible to vote - which I would view as a prerequisite). Posting a site notice (along the lines of the fundraising banners) should also be a prerequisite to this referendum in my opinion. Mike Peel 20:54, 4 July 2011 (UTC)[reply]
You're moving too fast. :-) The only thing that's been published so far is the call for referendum, which has been posted to village pumps and mailing lists. That is not the only notice we are going to be doing throughout the process — it's the first of many. We will be using a CentralNotice during the actual vote period, as far as I know. Cbrown1023 talk 01:22, 5 July 2011 (UTC)[reply]
Voting will be verified as usual for the Board of Trustees elections (meaning, the committee will verify them using SecurePoll). The third party will not be responsible for that. The third party, incidentally, is Software in the Public Interest, as usual. Philippe (WMF) 01:23, 5 July 2011 (UTC)[reply]
Ok, I understand Cbrown's remark and I'm comfortable with that response. Could you help me understand the advantage to using the closed voting system, versus the open one? NonvocalScream 04:17, 5 July 2011 (UTC)[reply]
It's the same advantage that convinces us to use secret ballots in the real world: A greater likelihood that people will freely vote for what they individually believe is best for themselves and the group, without fear of public censure or retaliation. WhatamIdoing 19:27, 5 July 2011 (UTC)[reply]
And automatic checking is less time-consuming than hunting for voters with their second edit, etc. Bulwersator 06:49, 7 July 2011 (UTC)[reply]

Ability to change our vote

I see no hint on my Special:SecurePoll/vote/230 page that once I have voted I'll be able to come back and change my vote. I'd like to be able to do that in case I hear more discussion which changes my mind. Without that guarantee (which should be stated, since it is not the way most real elections work), folks may delay while they research the vote, and end up forgetting to vote. Can you change the ballot to make it clear one way or the other? Nealmcb 18:01, 17 August 2011 (UTC)[reply]

You can come back. You will get a warning message and you can vote again. The final vote will overwrite older ones. --Eingangskontrolle 20:43, 17 August 2011 (UTC)[reply]

Status of referendum text

There's less than a month until this vote is supposed to start (August 12). A few people have asked to have some advance notice on the text of the referendum prior to the vote (plus I imagine it needs to be translated). When will the text be available? --MZMcBride 07:00, 19 July 2011 (UTC)[reply]

Hopefully in the next 24 hours, if all goes well. Thanks :) Philippe (WMF) 18:53, 19 July 2011 (UTC)[reply]

Vote/Referendum Question

When can we see the proposed wording of the question? Kindly, NonvocalScream 17:43, 23 July 2011 (UTC)[reply]

You've probably already seen it, but: Image filter referendum/Vote interface/en. Cbrown1023 talk 18:56, 15 August 2011 (UTC)[reply]

Suggest splitting question

I would suggest splitting "It is important that the feature be usable by both logged-in and logged-out readers." into two questions: one asking about logged-in readers, the other asking about logged-out readers. Otherwise, what would anyone that thinks that the feature should only be enabled for logged-in readers vote? Alternatively, if this question is intended to ask "should this feature be enabled for logged-out as well as logged-in users?", then the phrasing should definitely be improved... Mike Peel 21:10, 23 July 2011 (UTC)[reply]

suggestions welcome :) the latter is what was meant. Basically, you can do a lot with account preferences, as you know; it takes a bit more hackery to do it for anons... but that's 99% of our readers, who are the target audience. -- phoebe | talk 18:45, 31 July 2011 (UTC)[reply]
Actually, it's all of us on occasion, because you can't stay logged in for longer than 30 days. That means that even the most active user is going to have at least one moment every 30 days when s/he is logged out. WhatamIdoing 19:23, 4 August 2011 (UTC)[reply]
I arrived here with the same question that Mike Peel asked. Votes based on the current wording may not produce as clear results as if it were split into two questions. Rivertorch 05:01, 10 August 2011 (UTC)[reply]
I agree with Mike Peel, that question should be split in two. -NaBUru38 18:40, 15 August 2011 (UTC)[reply]
I guess it's too late to implement this change now, which is a shame. Would anyone on the committee be able to explain why this wasn't implemented? Mike Peel 19:11, 16 August 2011 (UTC)[reply]

Importance vs. should/shouldn't

"It is important for the Wikimedia projects to offer this feature to readers." - again, there are two separate questions here that have been rolled into one. (a) should the Wikimedia projects offer this feature to readers?, and (b) how important is it to offer this feature? Although answering (b) implies support of (a), asking about importance is a very separate question from whether the feature should or should not be enabled. If the majority of the community believe that Wikimedia shouldn't allow this feature, then the importance doesn't particularly matter. (in contrast to: if the feature is rated as low-importance, then that could mean that the WMF still funds its development and implementation but doesn't rate its development as highly as if it were high importance). Mike Peel 21:15, 23 July 2011 (UTC)[reply]

Note that I don't believe that this should be subject to Wikimedia's standard rule of consensus: if e.g. 10% of voters believe that this feature should be enabled, then that's probably a sufficient amount of people (i.e. representing sufficient demand) to make this worthwhile implementing - particularly if those voters come from under-represented parts of the world. Mike Peel 21:29, 23 July 2011 (UTC)[reply]
We are not asked whether we want this feature or not. It will come anyway, as there is no way to stop it through this farce. --Eingangskontrolle 09:00, 16 August 2011 (UTC)[reply]

Voting eligibility

I have to admit that I simply can't understand the rules for eligibility to vote for this election. My questions include:

  • Why can't Wikipedia readers vote? (Should this really be restricted to the editing community?) - this is my key point in leaving this message. I know that this presents technological difficulties, but these should be the same technological difficulties that are present in implementing the image filter, so they should inherently be solvable (otherwise this exercise is somewhat moot)...
  • Why should mediawiki developers, WMF staff and contractors, and board members be eligible to vote if they don't meet the 'editor' criteria? Either they should have sufficient experience on-wiki to meet these criteria, or they will most likely be working on topics unrelated to this and hence their vote shouldn't necessarily count
  • If WMF staff, contractors and board members are eligible to vote, then shouldn't the same apply to other Wikimedia staff, contractors and board members - i.e. including those of the Wikimedia chapters?

P.S. my input here is meant to be constructive, and I hope it comes across as being in that spirit. Apologies if it comes across otherwise... Mike Peel 21:27, 23 July 2011 (UTC)[reply]

We did try to come up with a way to include Wikipedia readers; however, it is nearly impossible to ensure a representative result. For example, certain countries have minuscule IP ranges, and it would be impossible to tell if votes received were from hundreds of different readers, or a handful of readers voting multiple times; the same could be true for readers from large institutions who operate through a single or very small IP range. As to the remainder of the criteria, I believe this is intended to be the standard group of criteria, with the only variation between votes/referendums being the minimum number of edits. The intention is to be as inclusive as possible while still doing our best to ensure each person casts only one ballot. Risker 23:23, 24 July 2011 (UTC)[reply]
How about using cookies to identify individual computers? There is still the potential for people to abuse that, by deleting the cookie / resetting the browser, but it largely avoids the IP address issue. You could also (temporarily) record additional information such as IP address, internet browser, operating system version, etc., in combination with looking for identical votes (e.g. the same rating for all questions each time), which would help identify multiple votes from the same location. Since there should be a very large number of people voting, a small number of people voting 2-3 times won't matter that much (it'll be well within the noise / uncertainty) - so you'd only need to pick out cases where someone might have voted over 10 times or so.
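Purely as an illustration of the heuristic described in the paragraph above (the fingerprint fields, the identical-answers check and the threshold of roughly ten repeats all come from that comment; this is not how SecurePoll actually works), the flagging step could be as simple as grouping ballots by a coarse key.

<syntaxhighlight lang="typescript">
// Illustrative sketch of the duplicate-vote heuristic suggested above (not SecurePoll).
interface Ballot {
  ip: string;
  userAgent: string;
  cookieId: string;   // hypothetical per-browser cookie identifier
  answers: number[];  // the rating given to each question
}

function flagSuspiciousGroups(ballots: Ballot[], threshold = 10): Ballot[][] {
  const groups = new Map<string, Ballot[]>();
  for (const b of ballots) {
    // Identical fingerprint AND identical answers, as suggested in the comment.
    const key = [b.ip, b.userAgent, b.cookieId, b.answers.join("|")].join("#");
    const group = groups.get(key) ?? [];
    group.push(b);
    groups.set(key, group);
  }
  // A handful of repeats is tolerated as noise; only heavy repetition is flagged.
  return [...groups.values()].filter(g => g.length > threshold);
}
</syntaxhighlight>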
Re "standard" - I don't believe that there is a standard yet, since this is only the second referendum that I'm aware of (the first being the license migration), which means this is part of setting the standard. Either way, the voting criteria should be thought through each time to make sure they're appropriate. I'm obviously all for being as inclusive as possible, but I'm not sure that the additional criteria are inclusive in a balanced way. Mike Peel 08:42, 25 July 2011 (UTC)[reply]
I don't think this sounds like a good idea at all. Cookies can be deleted very quickly, a new IP address can easily be requested, and there is no law that says that a browser has to send the same identifying information all the time. While each of these may seem like a layer of extra protection, the problem is that the one hacker defeating the vote will know how to get around them all. Of course, it is possible that an editor could start - or hack into - lots of extra accounts to do the same thing, but at least that actually sounds like work.
Also, I really think that someone who only reads, never edits, wouldn't appreciate the issues involved. If all you do is read articles you have no idea of the kinds of rancorous debates that can get started over one word in an article. This "filter" idea might sound perfectly practicable to someone in that position, who doesn't realize how much drama would erupt over any borderline categorization. Wnt 19:56, 30 July 2011 (UTC)[reply]
I'd like to know the opinions of the logged-out readers, but it sounds technologically difficult. Because of these difficulties, I probably wouldn't treat the two kinds of votes as being equivalent. That is, it might be nice to know that about ___ percent of unregistered users thought this or that, but you'd want to take that with a huge grain of salt, whereas the same statement about registered users could, I think, be relied on as fairly precise. WhatamIdoing 19:27, 4 August 2011 (UTC)[reply]

I don't see how it would be a standard that "Developers", "staff, contractors" of the Wikimedia Foundation, and "Board members and advisory board members" can decide about the content of Wikipedia. How many contractors does the Wikimedia Foundation have, and who are they? Is there a list? If I give money to the Wikimedia Foundation, do I become a contractor then? --Rosenkohl 13:06, 17 August 2011 (UTC)[reply]

Quantification of representation of the world-wide populace

It would be very useful and interesting to quantify what fraction of different countries hold that this is an important (or unimportant) feature, particularly if it turns out that the countries with low representation in this poll and in Wikimedia's readership believe that this is a key issue. It is crucial that this is quantified given Wikimedia's pre-existing biases, which could mean that the poll will simply measure the selection bias present in those questioned.

If e.g. it turned out that American (or British/European) editors thought that this feature was unimportant, but that African/Asian cultures thought it was very important, then it would obviously be very important to implement it for Wikimedia's future growth (even if that implementation happens on a geo-located basis). If the vote came back simply saying that this was not important to Wikimedia's current editing populace, then it could be argued that this just shows a western bias rather than presenting meaningful results. Mike Peel 21:36, 23 July 2011 (UTC)[reply]

I totally agree. I hope we can figure out a way to measure this while keeping anonymity etc. Side note: this is a complex (but hopefully solvable) type of problem -- how to take a global, representative referendum on any question that affects the projects? this is something we need to figure out how to do better, now that we are collectively so big. I'll be happy if this referendum is a test for good process. -- phoebe | talk 18:32, 31 July 2011 (UTC)[reply]
Similarly, although perhaps somewhat less importantly, if the feature is strongly supported by women, then I hope that fact would be considered relevant to Wikipedia's future growth. WhatamIdoing 19:29, 4 August 2011 (UTC)[reply]

Transferring the board

Maybe it would be wise to transfer the board to a more neutral country like Switzerland? aleichem 23:45, 24 July 2011 (UTC)[reply]

ha :) which board? The WMF board doesn't actually really have a home (members are from the US, Europe & India at the moment); but we do govern the WMF which remains US-based. At any rate, the board's deliberations are more influenced by wikipedian-ness than they are by nationality :) -- phoebe | talk 18:29, 31 July 2011 (UTC)[reply]
which remains? on what authority? aleichem 01:31, 7 August 2011 (UTC)[reply]
The Wikimedia Foundation is a legal entity located in the USA, moving it to Switzerland would mean moving it to a country with laws much closer to European Union Privacy law. I for one am not convinced that this could be done without deleting a lot of information we hold on living people. The Foundation is ultimately governed by a board and several members of the board are elected by the community, either directly or via the chapters. So I suppose that if the community wanted to move the legal registration from the USA to Switzerland or indeed any other country the first thing to do would be to present a case for this and the second thing would be to elect board members who favoured the idea. At the moment I think that is a long way off, and judging from the recent elections there is little if any support for fundamental change to the WMF. WereSpielChequers 08:07, 9 August 2011 (UTC)[reply]

Inadvertent overwrite with telugu translation

Admins: Please restore the English version. Sorry for the inadvertent overwrite.-Arjunaraoc 12:01, 25 July 2011 (UTC)[reply]

You don't need to be an admin to revert changes. ;-) I've reverted the page back to the English version. Mike Peel 13:07, 25 July 2011 (UTC)[reply]

Neutrality

An example of what a completely non-neutral filtering tool would look like.

The FAQ for this referendum says that one of the principles behind the creation of the filtering tool is that "The feature is to be culturally neutral, and all-inclusive". However, the mock-up designs displayed show completely non-neutral filtering options. The IFR page says that "These are subject to change based on the outcome of the referendum and the realities of feature development, but it's likely that the final product would look very similar to these mock-ups". I think it would be very helpful if this were made clearer, explaining that these are just images showing what the interface style might look like, and that the actual feature will be neutral, giving no possible filtering setting any higher availability than any other conceivable filter, and not including any options at all like those given in the images displayed - assuming that this is the case, of course. If this is not the case, the FAQ must be corrected before the start of the referendum, in order to allow people to understand whether this feature would abolish NPOV before they vote. --Yair rand 22:29, 26 July 2011 (UTC)[reply]

Hi Yair. Thanks for your comment, but I think you're likely misreading the intention behind these (admittedly vague) statements. The board specifically noted neutrality in design as a core principle -- we were thinking of things like category names. "Nudity" is pretty objective and neutral, for instance, as a descriptive category; "bad content" is not. This is important. That doesn't mean, however, that we can't pick certain categories of images to make hideable, as those that are commonly controversial across many cultures. I don't quite know what you mean about interface design; but I disagree quite strongly that the feature would "abolish NPOV" -- remember in this proposal nothing goes away permanently, and editing standards remain the same. -- phoebe | talk 18:18, 31 July 2011 (UTC)[reply]
How is that relevant? The reason (well, one of them) behind rejecting repeated proposals for temporary/permanent hiding of certain "objectionable" content was that "objectionable" means something different for everyone and hiding certain content would be giving preference to certain groups in violation of NPOV, which, as you helpfully pointed out, is exactly what this image filter will do, and not, as SJ suggested above might be the case, give options to hide any such images in any neutrally selected categories not giving preference to select groups. --Yair rand 20:32, 31 July 2011 (UTC)[reply]
  • There will be no permanent hiding of anything. Every single reader will be able to see every single image he wants to see.
  • This is not a one-size-fits-all filter system. You decide what categories of material you find objectionable (if anything). It doesn't matter what everyone else wants to hide. Consequently, it doesn't matter if "objectionable" means something different for every reader.
  • Putting a 'click here to see the image' button on your screen doesn't impair NPOV at all. Failing to load the picture of the man jumping off the Golden Gate Bridge when the page loads the first time does not "give preference to certain groups" or "violate NPOV". WhatamIdoing 19:37, 4 August 2011 (UTC)[reply]
  • It is a one-size-fits-all filter system, though, because the filtering categories with which the end user is presented, as well as the content of those categories, will be decided by the community. That will inevitably mean edit wars and non-neutrality. I too would much prefer the system that SJ suggested, but that is not what we will be voting on here.--Danaman5 00:57, 7 August 2011 (UTC)[reply]
The fact that multiple options will be presented means that it's not a one-size-fits-all system. You could choose to see everything except one type of images, and I could choose to see everything except one different type of images, and the next person could choose to see everything except four types of images. The ability to configure it means that it's not one-size-fits-all. It might well be a 5040-sizes-fit-most system (that's the number of combinations available in the sample above), but it is not a one-size-fits-all system. WhatamIdoing 22:49, 14 August 2011 (UTC)[reply]
You are wrong. The user can decide that he does not want to see "extreme politicians". But who fits this description? Some will say Hitler, Stalin, Goebbels. Others will include Mao, Lenin and Mubarak. And still others will include Palin, Obama or George W. Bush. I predict a censors' war. --Eingangskontrolle 09:10, 16 August 2011 (UTC)[reply]
Contrary to what was said above, nudity is not an all-or-none concept. Are pictures that deliberately leave that fact ambiguous nudity? Is an image with the genitals covered by a large X nudity? Do we propose to differentiate between male and female nudity? And similarly with all the other categories. Anyone who thinks this is hair-splitting should see current discussions of almost anything controversial on Wikipedia. There are no world-wide standards, and any attempt to create one is likely to drift either towards the current US view of people like most of our editors, or towards the most inclusive position. Or possibly towards the least inclusive--certainly at least some people opposed to the general concept will try to move towards that one. DGG 16:25, 16 August 2011 (UTC)[reply]
The filter system is POV. What is obscene to one person may not be obscene to someone else. A filtering system, in my opinion, is all about controlling content in an adult educational format. I do not believe that a fundamental change on Wikipedia should be dictated by a few persons who are offended by some photo. I have yet to find an article on Wikipedia that is explicitly gory or contains graphic sexual content. There is currently no need for a filtering service on Wikipedia. A filtering system would destroy any neutrality in an article, since, by its very nature, photos from an article would not be shown if considered obscene by some unknown persons who could be offended by them. There are articles for which children, in my opinion, need to be supervised by an adult. Again, let the choice be for the parents to make, rather than a biased filtering system. Cmguy777 02:35, 17 August 2011 (UTC)[reply]

What does this mean?

The last question: "It is important that the feature be culturally neutral: as much as possible, it should aim to reflect a global or multi-cultural view of what imagery is potentially controversial."

Does this mean that the categories would be drawn according to some kind of global average? Does it mean more categories would be created otherwise? I don't know what effect a vote either way would have on development. Wnt 19:49, 30 July 2011 (UTC)[reply]

We were thinking, with this and a few other points, of sort of a straw poll on what principles are most important. I'm not yet sure what it would mean specifically for design, but the answer to this would help everyone know what's important to the wikimedia community at large (I can assume that many people agree with this principle, for instance, but not everyone does -- I expect a truly global standard would offend both those who think there is nothing controversial in our image collections, and those who think there is a great deal of controversial material). -- phoebe | talk 18:27, 31 July 2011 (UTC)[reply]
I take a culturally neutral feature as being one that can be tuned to suit people of many diverse cultures: some might choose to filter out images that I'm happy to look at, others might choose to see photos that I'd rather not see. That means a much more sophisticated filter than a simple on/off button, and consequently so many user-chosen lines that there is no need to debate where one draws the line. If instead we were debating a simple NSFW option of excluding material deemed Not Safe For Work, then yes, we would need to work out a global average of prudishness. But we aren't, so we don't need to. WereSpielChequers 07:54, 9 August 2011 (UTC)[reply]
Given the complexities of designing a system to meet fine-grained requirements, the conclusion follows that we should have no such system at all. If anyone wants one, they can design their own at the browser level, and the WMF should take the neutral position of neither helping nor hindering them. Nothing has to be done or developed here. It's the business of other entities entirely. (The only other logically consistent position is to regard freedom of expression as such an absolute good that we should interpret our role as hindering the development of any censoring apparatus. It would still be legal under the CC license, of course, just that we would arrange the system to make it as difficult as possible. This is what I myself would ideally want, but I accept the need to make our content accessible to those with other views than my own, and consequently not actively hindering outside filtering.) DGG 03:33, 17 August 2011 (UTC)[reply]

Meaning of the Referendum

I read the questions as listed now as "We've decided to implement this feature, so how should we do it?", and not as "Should we implement this feature?" Is that a correct reading?--Danaman5 06:06, 4 August 2011 (UTC)[reply]

I would really like an official response to this from someone involved in planning this referendum. Is it already a certainty that this feature will be implemented, or not? If not, the questions really need to be revised.--Danaman5 01:00, 7 August 2011 (UTC)[reply]

  • Hi Danaman5, yes, the Board passed a resolution asking for this kind of feature to be developed; so that's a given. The questions are asking for community input in how it is developed and implemented. -- phoebe | talk 05:00, 15 August 2011 (UTC)[reply]
Rely on "the" category system? Many wiki projects have their own image uploads and their own categories, so we are dealing with many category systems. Currently the Commons category system is really quite an internal thing, the images displayed in an article on any other project are not affected in any way by what we do to the category system, images are recategorized all the time, category structures are modified all the time, if filtering relies on the commons category system we can now cause huge disruption very easily in our everyday editing. We are currently going backwards as regards getting new uploads properly categorized, I doubt the extra energy that will go into maintaining categories that comply with the filtering requirements will help that problem. I suppose an optimist would hope that the new emphasis, put on Commons categories, would mean that more people would come to Commons to help. But I expect that the filtering system will attract zealots trying to censor material by categorizing images that they (and no one else) want hidden, and given the current backlog in categorization, they will not be rapidly reverted. --Tony Wills 09:22, 9 August 2011 (UTC)[reply]
Ok, I see from the Personal image filter page that the idea is to have a separate flat category branch for filtering; that would allay my fears (it might cause other problems, but wouldn't impinge on the current categorization of files). --Tony Wills 10:59, 9 August 2011 (UTC)[reply]

Combination with content filters

When designing this feature, in case it gets accepted, perhaps you could also find a way to allow content filtering software to integrate it into their products? Or perhaps a little program that automatically enforces certain filters, protected with a password? This would mean that parents who don't know how to set up the filtering themselves, or who want to prevent their children from disabling it, can also benefit from this. Adelbrecht 19:44, 5 August 2011 (UTC)[reply]

This is a thorny but important issue. Currently we don't have supervised accounts where parents can set preferences for their children, hence this proposal, where an 8-year-old would be only a click away from overriding a filter. I would anticipate that any content filtering software should be programmable to use these filters - the bigger question is which content filters filter out the whole of Wikimedia, and which only filter out certain images or categories. WereSpielChequers 04:49, 9 August 2011 (UTC)[reply]

IP editors

The current mock-ups show the edit filter option appearing even if you are logged out. I'm uncomfortable about this as it implies that one person using an IP could censor the viewing of others. I would be much less uncomfortable if this was cookie-based, so I could make decisions as to what appeared on my PC. I don't like the idea that the last person to be assigned my IP address gets to censor my viewing. So my preference is that filters are only available if you create an account and log in, or via a session cookie on that PC, not a permanent cookie. WereSpielChequers 04:49, 9 August 2011 (UTC)[reply]

I believe it would be handled based on cookies, not based on an IP address. You're right that having the settings associated with an IP address just wouldn't make sense. Cbrown1023 talk 21:16, 9 August 2011 (UTC)[reply]
Thanks for the explanation. I can see an advantage that schools will be able to set a filter, but of course anyone will be able to change the filter for the whole school IP. Libraries and internet cafes will be in a similar situation. I foresee this leading to lots of queries. WereSpielChequers 14:58, 10 August 2011 (UTC)[reply]

How will "the community will decide the category structure and determine [image categorization policy]"?

What images will be hideable?
Hideable images will be grouped into categories. The community will decide the category structure and determine which images should be placed within those categories.

As others have said above, (I redundantly reiterate their sentiments merely to bump the issue), this will just lead to categorization edit-wars and contentious, unproductive debates over which categories are worthy of being censorable (or possibly significant server strain to accommodate ridiculously flexible censorship options). I could perhaps envision semi-private categories created by subcommunities, but this would have obvious ownership+neutrality issues. On the whole, I think the entire thing is much better left up to third-party content filter providers; there is no need for Wikimedia to so entangle itself. I, for one, shall therefore vote this down. There are the best of intentions at work here and some good ideas, but I fear the actual plan of execution is fatally marred and frustratingly vague. Good day, --Cybercobra (talk) 07:27, 9 August 2011 (UTC)[reply]

I tend to agree. However worthy the idea, I don't think our category systems are robust enough to make filtering reliably workable. Some of the ideas about the category system (which I take to really mean the Commons category system) assume that mis-categorization, or malicious (re)categorization, would be noticed and fixed in short order. I don't think this can be expected to happen reliably: there aren't enough eyes at work here, and as far as I know you can't "watchlist" the contents of a category. I think any filtering or censorship needs to be done outside the current projects. If there aren't already such projects, I think a child-safe Wikipedia project would be one approach. --Tony Wills 09:39, 9 August 2011 (UTC)[reply]
I see from the Personal image filter page that the idea is to have a separate flat category branch for filtering, which would allay my fears (it might cause other problems, but wouldn't impinge on the current categorization of files). --Tony Wills 10:59, 9 August 2011 (UTC)[reply]
Even if the categorisation is currently incomplete, that isn't an argument not to do this. If someone wants to filter out penises and the filter only catches 95% of them, that is a job more than 95% done, not least because a complaint about one of the remaining 5% is easily resolved with HotCat, rather than, as at present, with an argument over whether something is encyclopaedic or educational. WereSpielChequers 18:05, 9 August 2011 (UTC)[reply]
An interesting side effect of this sort of filtering is that editors may be less inclined to self-censor, and will add more provocative images to articles on the basis that those who do not like them can filter them out. --Tony Wills 22:55, 9 August 2011 (UTC)[reply]
Or the opposite could happen with people choosing to illustrate articles with images that people are willing to see. But my suspicion is that the vast majority of current editors will censor few if any images that are actually used in articles. This is something that a small minority of editors are very keen on and which will have little or no effect on editors who choose not to use it. WereSpielChequers 02:18, 10 August 2011 (UTC)[reply]

Please be sure to release ballots

The data is invaluable. To 'announce the results', release the full dataset, minus usernames. Subtle trends in the data could affect the implementation. E.g. if target audiences such as Brazil or India overwhelmingly want this feature, that alone might be a good reason to build it. --Metametameta 10:03, 10 August 2011 (UTC)[reply]

Well, I'd certainly like to see the (properly anonymised, of course) dataset, if only because I'm a sick person that enjoys analysing stacks of data. Disagree with your example though, the opinion of someone in India or Brazil should not be considered any more important than the opinion of someone in Zimbabwe, Australia, or Bolivia (to give some examples). Craig Franklin 08:03, 16 August 2011 (UTC).[reply]
Some countries will undoubtedly have skewed results because they restrict (or prohibit altogether) visiting Wikipedia. In this case, their sample size will be smaller on average per capita. At the same time, those with the ability to circumvent government blocking of "sensitive" websites will no doubt vote against this image filter because they will view it as oppressive. OhanaUnitedTalk page 15:30, 16 August 2011 (UTC)[reply]

Referendum start moved to Monday, August 15

The start of the referendum has been moved to Monday, August 15, and all of the other dates will be bumped by around 3 days. Instead of rushing to put the technical aspects together quickly, we decided it would be better to just push the start date back a little and do things right. Updated schedule:

  • 2011-08-15: referendum begins.
  • 2011-08-17: spam mail sent.
  • 2011-08-30: referendum ends; vote-checking and tallying begins.
  • 2011-09-01: results announced.

This gives us more time to finish up the translations and look at the proposals. :-) Cbrown1023 talk 19:01, 12 August 2011 (UTC)[reply]

Hi. I've got some questions on IRC about why the referendum has not started today as scheduled, and I can't find any information on-wiki regarding this. Any ideas? Best regards, -- Dferg 15:34, 15 August 2011 (UTC)[reply]
Generally we are just waiting on the rollout of notices and making sure the voting software is coordinated... I think we are suffering from having people in lots of different time zones, so e.g. while it has been the 15th of August for a while in some places, it is still only 9:00 am in San Francisco :) We are planning to get it out *sometime* today. I don't know if there have been any other technical delays. -- phoebe | talk 16:32, 15 August 2011 (UTC)[reply]
Indeed. It's going to start sometime today. :-) Cbrown1023 talk 16:38, 15 August 2011 (UTC)[reply]

Sigh ... I guess I'll be dismissing more global notices over the next two weeks :/. Please no big banners. fetchcomms 19:16, 15 August 2011 (UTC)[reply]

Self-censorship is fine, thus this is a great idea

Hopefully this will decrease attempts by people to censor content seen by others, such as we see here [1] Doc James (talk · contribs · email) 21:32, 14 August 2011 (UTC)[reply]

Precisely. I can't understand why so many people are opposing a plan that will increase users' freedom to avoid content they don't want. Nyttend 04:51, 16 August 2011 (UTC)[reply]
Because it can and will be abused by censors (there is nothing easier than injecting a cookie saying "I don't want to see this specific kind of content" at the firewall level - which will have nothing to do with user preferences). Ipsign 07:13, 16 August 2011 (UTC)[reply]
Well my concern at this proposal is quite simple: how will the filters be implemented? How will images be determined to fall into a given category? Consider Jimmy Wales' misguided attempt to delete "pornographic images" from Commons, which resulted in him selecting several acknowledged works of art -- & amazingly, none of the images of penises. The reason filters haven't been implemented yet, despite on-&-off discussion of the idea for years, is that no one has proposed a workable way to separate out objectionable images -- how does one propose criteria that define something, say, as pornographic without at the same time including non-pornographic images? (Dissemination of information about birth control in the early 20th century was hampered because literature about it was treated like pornography.) Explain how this is going to work & convince me that it is workable -- then I'll support it. Otherwise, this proposal deserves to be filed here & forgotten. -- Llywrch 06:05, 16 August 2011 (UTC)[reply]

Scope

If I understand this proposal correctly, the settings are to apply to each user's own account(s) and are customisable. I do not see this to be a problem as it would be entirely a matter of personal choice. However, assuming that non-logged-in users are able to see the full gamut of images from the smutty to the explicit, I fail to see the point: creating such a system, which may or may not include the rather elaborate suggested protection system requiring passwords, seems to be technology for its own sake. Something that can be circumvented by simply being logged out has to be pointless. --Ohconfucius 04:56, 16 August 2011 (UTC)[reply]

Agreed! --81.173.133.84 08:12, 16 August 2011 (UTC)[reply]
Logged-out users are not required to see the full gamut of images. The mock-ups on the attached page all "show how the hider might look to an anonymous user", not to a logged-in user. It's likely that the system would be cookie-based for logged-out users. If readers want something more permanent, they should create an account and have their settings saved. Cbrown1023 talk 15:17, 16 August 2011 (UTC)[reply]
If I understand the proposal correctly, it is meant to be entirely voluntary. Someone who wants to circumvent the image blocks can simply change his preferences or not opt in in the first place. What does circumventing by logging out have to do with anything?--166.250.0.49 05:39, 17 August 2011 (UTC) Looks like I wasn't logged in.--Wikimedes 08:17, 17 August 2011 (UTC)[reply]

Self-censorship?

(durante décadas la gente ha luchado por liberarse de censura disctatorial, por poder ofrecer una educación integra y laica a todos. como es que aquí se puede considerar que hay "imagenes controvertidas". en dicho caso se debe someter a modificación el artículo cuando por supuesto no se refiere a manipular la historia, no considero factible por ejemplo censurar imagenes historicas de guerra, educación sexual, fotos de comunidades nativas, imagenes de como las rebeliones, torturas. en ese sentido cada uno es responsable por lo que quiere ver y puede en su propio ordenador o movil adaptar un control paternal sobre las imagenes. si el problema son los niños... tendran miedo porue no se sienten capacitados para dar una explicación de acuerdo con la edad de ellos, por lo que deberían buscar a un profesional para que les ayude a dialogar. Son los niños los que deben entender mas el mundo que les rodea y sus origenes, para así crear un futuro mejor y más unido en la paz. No los dejen en la ignorancia, ese es el objetivo de este proyecto, conocimiento al alcance de todos libre de la enferma censura.)

For decades people have fought to free themselves from dictatorial censorship, and to be able to offer a complete, secular education to everyone. How can it be that "controversial images" are even considered here? In such a case the article itself should be put up for revision, provided of course that this does not mean manipulating history; I do not consider it acceptable, for example, to censor historical images of war, sex education, photos of native communities, or images of rebellions and torture. In that sense, each person is responsible for what they want to see, and can set up parental controls over images on their own computer or mobile device. If the problem is children... parents are afraid because they do not feel able to give an explanation suited to their children's age, so they should seek a professional to help them talk it through. It is children who most need to understand the world around them and its origins, so as to build a better future, more united in peace. Do not leave them in ignorance; that is the goal of this project: knowledge within everyone's reach, free of sick censorship.

Waste of time

Here are the comments that I made in my vote:

There is very little to be gained if this feature were introduced. The way Wikipedia is written, it would be unusual for offensive images to appear where they are not expected. To quote the example used in this poll, a reader will not get a picture of naked cyclists when viewing the bicycle article. Editors generally use images that are applicable to an article, since they are conscious of the wide range of sensitivities of the readers of Wikipedia. Images that are offensive to some may be present on an article about some aspect of sex, for instance, but in these cases the reader may be unwilling to view the article.

Implementing this proposed feature will place yet another burden on the overworked editors. Rather than placing this burden on editors, the vast majority of whom are volunteers, a better option is to have a clear warning on every page stating that Wikipedia is not censored in any way. This could easily be added to the footnote containing other disclaimers that currently appears on all Wikipedia pages.

If parents or guardians wish to control the browsing habits of children in their care, they can use other means of control such as supervision or content filtering software.

The internet contains huge amounts of easily accessible material that is considered by some to be offensive. Those who are offended by certain images must take responsibility for how they browse the internet in order to avoid offence. Alan Liefting 07:11, 16 August 2011 (UTC)[reply]

You may think that it makes sense to not put naked cyclists on a bicycle article, but situations like that are not at all uncommon. Did you know that the German Wikipedia once ran w:de:Vulva as its featured article of the day and included a picture of a woman's vulva right there on the Main Page? Any reader who went to dewiki's Main Page saw that image. Just because we both think something like that isn't the best thing to do, doesn't mean that that's not going to happen. About putting burdens on editors, it doesn't require doing anything new. It's designed to work with our existing category system. People won't have to do anything new — they already maintain our category system. Cbrown1023 talk 15:21, 16 August 2011 (UTC)[reply]
About w:de:Vulva on the German article of the day: an endless discussion (de) and bad press (de) were the result, which led to interest in this proposed image filter. --Krischan111 23:40, 16 August 2011 (UTC)[reply]

Pictures of Muhammad

How will the Wikimedia Foundation prevent people from voting for the option to remove pictures of en:Muhammad?

  • The current version of the important article ar:محمد already seems to contain no picture of Muhammad, so it is effectively censored.
  • Some people may argue that they feel hurt by depictions of Muhammad and that, for them, these are images of violence.
  • The voting system is very vulnerable to abuse, e.g. by voting with multiple accounts, or by external organizations orchestrating votes.
  • You want to ask users to give their view on how important it is "that the feature be culturally neutral (as much as possible, it should aim to reflect a global or multi-cultural view of what imagery is potentially controversial)". What will happen if the average outcome for this question turns out to be less than 10? What will happen if the outcome turns out to be below 5?

--Rosenkohl 11:03, 16 August 2011 (UTC)[reply]

Won't it be the users themselves who choose which types of images they want to hide, and which to view? Depictions of Muhammad may be tagged as such (instead of as 'violence', etc.), so that readers who may be offended by such images may filter them out. --Joshua Issac 12:04, 16 August 2011 (UTC)[reply]
How about any image of humans? We are all God's image and therefore not to be portrayed. --Eingangskontrolle 12:22, 16 August 2011 (UTC)[reply]
With a well-implemented system, (the few) people who are offended by pictures of humans could choose to not view any pictures of humans. Those who are offended by pictures of Muhammad can choose not to view any pictures of Muhammad. And the rest of us can go merrily on our way, unaffected. What other people choose to view or not view will have no effect on the rest of us. B Fizz 18:39, 16 August 2011 (UTC)[reply]
The rest of us ("we" the authors of Wikimedia projects) will have to do the task of categorizing the pictures of humans, or the pictures of Muhammad, for those who feel offended by these pictures to be able to turn them off, and will be the authors of the resulting categories. I don't see how this effect is negligible for us or for the encyclopaedia, --Rosenkohl 20:59, 16 August 2011 (UTC)[reply]

What does "culturally neutral" mean?

I'm afraid I don't understand what is meant in the questionnaire by the words "culturally neutral" and "global or multi-cultural view".

Some readers will probably think that something being "culturally neutral" would mean that it is applied to all persons in the same way, regardless of their particular culture; so that a "culturally neutral" filter would never filter any image on the grounds that it may offend the culture of a reader; e.g. that a "culturally neutral" filter would never filter pictures of Mohammad (this at least was what I thought when reading the questionnaire for the first time today).

However, now I suspect that "culturally neutral" here has exactly the opposite meaning: that a filter takes account of the culture of the reader, regardless of which culture this is; so that a very "culturally neutral" filter would allow people to choose not to see pictures of Mohammad.

The same applies to the meaning of "global" and "multi-cultural". I would think a "global" encyclopaedia provides the same knowledge, content and pictures for all readers, regardless of their particular local place, country, or culture. However, in the questionnaire "global or multi-cultural view" may mean the exact opposite: that a "global" and "multi-cultural" encyclopaedia would provide each reader with a different kind of knowledge, content and pictures, adjusted to their local place, country, or culture.

So I think the wording of the questionnaire is highly misleading, if comprehensible at all, --Rosenkohl 13:08, 16 August 2011 (UTC)[reply]

Tired of censorship

Many are becoming increasingly concerned about the range of unpleasant or explicit material available on the internet. However, many are also becoming rather concerned about the limitations on free speech and the growing acceptance of censorship. How does one reconcile these two? I acknowledge that Wikipedia is a community, but it is also an encyclopaedia.

When I was in school, I remember watching holocaust documentaries, etc. But it was important to have seen these things, however unpleasant. It was important to our education that we could understand. Had these images been censored, we would have felt cheated of a proper history education. If censorship of that kind is not practiced in a school, I dread to imagine the day it is accepted for an encyclopaedia.

We must consider the context of these images. They exist in an encyclopaedia (albeit online) and are almost certainly relevant to the subject. Some images may be unpleasant, but we shouldn't simply filter them out because of it. If a child were to look in an encyclopaedia, they may find the same materials. Equally, there are academic books and journals available in every library containing the most horrific of images, but they are included because of their educational value. Filtering out relevant (or potentially highly relevant) images seems to be more harmful than helpful.

Whilst this would only affect those who choose to censor themselves, we are still permitting censorship and promoting ignorance. If the image is relevant to the subject, then the end user is placing restrictions on their own learning and filtering out content because of personal intolerance. Wikipedia aims to educate people, but it can't do that by filtering out their access to useful information (in the form of images), regardless of whether people make that choice. To paraphrase Franklin, those who would give up essential liberty don't deserve it.

Alan Liefting makes a good point. Why not simply have a content warning? If one watches a documentary, there is often a warning that the piece contains sex, violence, etc. The viewer is advised to use their discretion in watching the documentary, but the content is certainly not censored. Why should an encyclopaedia be any different?

Wikipedia was built on ideas of freedom of speech. Perhaps it's about time we defended it and took a harder line on censorship. If parents or particular institutions wish to block content, they are free to do so through various software, DNS services, etc. If they want to censor content, that's their decision, but we shouldn't be doing it for them. Zachs33 11:15, 16 August 2011 (UTC)[reply]

I agree with you that sometimes people need to see certain things, and that we shouldn't exclude an important image just because it might offend some people. That's why this feature is good: if someone decides at some point they don't want to see naked images or images of violence, they don't have to. If they then found themselves at "Treatment of concentration camp prisoners during the Holocaust" and noticed that there was a hidden image of some naked prisoners being hosed down, they could make a conscious decision to see that image. They'd read the caption and be like, "Okay, I hid this by default, but I want to see the image in this case." A little bit of a delay removes the surprise and empowers the reader to choose what they want to see. It's not our job to force them to see that, but it is still our job to make sure it's there for them if they're interested. It will always be clear exactly where an image is hidden, and it will always be easy to unhide a specific image. Cbrown1023 talk 15:30, 16 August 2011 (UTC)[reply]
There's an old expression - "freedom to choose". It's often true to say that there is freedom in choice, but it's not always realistic. Should we be encouraging users to censor information from themselves? An image is a form of information. What about textual content that describes unpleasant torture or treatment of Jews? Should users be able to choose to filter out unpleasant paragraphs? Of course, this would not be feasible, but would we support such a change if it were possible? Of course not. The Blindfolds and stupidity? topic (among others) discusses self-censorship and the reality of restraint in such a choice. I find the idea of filtering 'controversial content' to be rather frightening. I'm sure there are many Holocaust deniers on the internet. The potential for content (of any kind - whether it's text, an image or otherwise) to be flagged as controversial because of such people is far more offensive than any image. Many users have jumped into this discussion out of principle. We're talking about the principle of free speech and the potential problems should we permit such a proposal. Zachs33 16:48, 16 August 2011 (UTC)[reply]
My stance on the issue is to allow the ignorant to remain in ignorance if they so choose. Taking the Jewish torture content example, imagine someone wishes to remain in ignorance of Jewish torture. Now there are two possibilities: 1) we leave Wikipedia as it is, and this person avoids Wikipedia because they know it contains content about Jewish torture. Perhaps they use Wikipedia for a while, but then they see something about Jewish torture and are immediately repelled. Or 2) we provide a filtering tool, and the person uses Wikipedia with the tool to filter out Jewish torture content. The person may occasionally see an area that is blocked out and says "this content is hidden by your filtering preferences". They probably know that it is content about Jewish torture, but they are not repelled because the content itself isn't there.
By providing the filtering tool, we increase the likelihood that this person will continue to use Wikipedia. You might say the ideal situation is to force this person to understand the truth about Jewish torture, and that may be true. But option #1 will not provide that outcome. Option #2 is, imho, the lesser of two evils, and a step closer to the ideal. B Fizz 18:53, 16 August 2011 (UTC)[reply]
You have a point in that we don't want to become too paternalistic, but I'm not sure it's fair to use the word 'force' in this context. A primary aim of Wikipedia is to educate. If someone does not use Wikipedia because, in reading an article, they wish to selectively ignore potentially entire elements of that article, there is perhaps no hope in educating them. In the context of your example, we're talking more about intolerance than ignorance. We should perhaps tolerate intolerance, but we certainly shouldn't encourage it (by providing a mechanism such as the one proposed). If they want to read through an encyclopaedia crossing out the parts they don't like, one wonders why they're reading it in the first place. Zachs33 20:15, 16 August 2011 (UTC)[reply]

Translation problems

Though the Ukrainian translation of the questions is ready, this page is not in Ukrainian. How can this be fixed? --A1 11:30, 16 August 2011 (UTC)[reply]

Thanks for translating, A1! The vote interface translations do not show up automatically, unfortunately. They are routinely exported from Meta-Wiki and imported to the voting site. The Ukrainian translation will be included in the next export. Cbrown1023 talk 15:31, 16 August 2011 (UTC)[reply]
Could the same be done for the French translation, which was probably produced using "Google Translate" or a similar tool... French-speaking sites currently have this insufferable translation. Thanks in advance. Loreleil 10:22, 17 August 2011 (UTC)[reply]

Voting instructions

The voting instructions say "Read the questions and decide on your position." But the main questions I see are "Why is this important?", "What will be asked?", and "What will the image hider look like?", and I don't think I need to form a position on these. It's very confusing. Am I voting on how much I support or oppose each of the bullet points in the "What will be asked?" section, or how much I support or oppose the proposal as a whole? It's rather unclear. Either way, these are "questions" I can "read", but proposals (or principles) I can support or oppose. Could this be reworded for clarity? Quadell 11:33, 16 August 2011 (UTC)[reply]

No, the acclamation has already started. There is no "no" option. You can just express that it is not important. --Eingangskontrolle 15:04, 16 August 2011 (UTC)[reply]

The guiding principles in the What will be asked? section are what you will be commenting on. Cbrown1023 talk 15:32, 16 August 2011 (UTC)[reply]

Another blow against freedom

A reason given for this change in your introduction is "the principle of least astonishment". In four years of using Wikipedia several times daily I have never accidentally come across an image that astonished me. That's a pity. Astonishment is a valuable ingredient of learning. I fear that the people who are pressing for this change are not those who are afraid they may see a picture of a nude cyclist (an example that you give), but those who wish to prevent others from seeing it. It won't work. My mother, an astute psychologist, forbade me to read any of the books in my brother's room as they "were too old for me", with the result that I had read them all by the age of ten. Labelling an image as hidden will only entice people to look and find out why. When the people pressing for the change discover that it doesn't work, they will press for stronger controls. And then for even stronger controls. Freedom is surrendered in a series of tiny steps, each of which appears innocuous on its own. No doubt you will assure readers that you won't follow that path. You won't be able to keep that promise. Once on the slippery slope it's very difficult to get off. Apuldram 12:01, 16 August 2011 (UTC)[reply]

Images are editorial choices

In Wikipedia, images are editorial choices made by the contributors; I see no reason to offer a self-censoring tool for only this part of those editorial choices. Moreover, an image illustrates a topic but should (more realistically, may) also be commented on and described in the article itself; removing the illustration will make that part of the article useless or really weird.

IMO it's not our job as a community to implement a parental control solution. It's up to the parents. Does anyone want those kinds of censorship tools in a museum or castle, so as not to display statues of naked men, women and children? To sum up, a waste of time and money IMO. --PierreSelim 12:34, 16 August 2011 (UTC)[reply]

The proposal isn't to be automatic; it is instead for people who don't want to see such things to be able to choose to not see them. Images aren't permanently removed; even if one person hides the picture for themselves, it will still be there for the rest of us -- and it will still be available even for the person who has hidden it. Nothing editorial is being interfered with. I just want to make sure the proposal is clear :) -- phoebe | talk 15:57, 16 August 2011 (UTC)[reply]
If people don't like to see certain images, they should not use the internet. --Eingangskontrolle 16:32, 16 August 2011 (UTC)[reply]
That is the most asinine argument I've heard yet. The purpose of this feature is to allow more people to comfortably use Wikipedia. Your counterargument is that these people shouldn't be using the internet at all? How does that, in any way, help Wikipedia's goal to reach as many people as possible? That should be a major goal of Wikipedia, anyways. PierreSelim said "it's not our job ... to implement a parental control system". Sure, it's not our job, but why not provide one anyways? It will be convenient for parents, and will leave everyone else unaffected. You don't have to opt in if you don't want to. B Fizz 18:19, 16 August 2011 (UTC)[reply]

The system is not a parental control system; it doesn't take power away from the reader and put it somewhere else. The power remains with the reader, and that's as it should be. I'm not even sure, off-hand, whether it would even be possible for Wikimedia to create a parental control system. How would it identify children and parents, given that the overwhelming majority of readers are anonymous, and accounts are just screen-names with an optional email address?? Parental controls have to be on the relevant computer to work. The only link with parental controls is that parental control software might eventually use WP classification of images to enforce controls locally (but they can already do this using categories of images and pages, if they really want). Rd232 18:31, 16 August 2011 (UTC)[reply]

What does the referendum mean?

[Image caption: An image many will avoid seeing.]

The official start of censorship in Wikipedia. --Eingangskontrolle 12:31, 16 August 2011 (UTC)[reply]

Don't be naive. Conservatives won't be offended by evolution enough to opt out of being forced to view illustrations or evidence of it. This is about penises:

[Image (300px thumbnail): Something children should have the freedom to choose not to view.]

--Michaeldsuarez 13:16, 16 August 2011 (UTC)[reply]
Michaeldsuarez: No doubt you are tempting users to remove the image to prove your point. I don't particularly wish to see the image and am certainly tempted to remove it (although I shall refrain from doing so), but that's because I don't wish to view the kind of content to which it belongs. If I were looking at articles on masturbation techniques, to which the image belongs, I would have no problem with the image. A child will only come across the above image if viewing a relevant article, which will be of an explicit nature, both in images and in text. My only problem with the image is that it does not belong on this talk page as it lacks relevance. I understand your point, but it's rather weak. Zachs33 13:44, 16 August 2011 (UTC)[reply]
This is a free image, not a fair use image. If I wanted to, I could post 50 of them on my userpage. It's more relevant than the finch image inserted earlier. There are ways for children to wander onto enwiki's masturbation article. There are links to that article from the "marriage" article, the "Jock" disambiguation page, the "List of British words not widely used in the United States" article, etc. The only image I've ever removed from an article was a drawing of a little girl by a non-notable pedophilic DeviantART artist. I'm not against images such as the one I've embedded above. I won't be censoring myself. I'm an Encyclopedia Dramatica sysop. What I'm against is imposing my will and beliefs on others. Users ought to have a choice. --Michaeldsuarez 14:00, 16 August 2011 (UTC)[reply]
I've seen much tamer images deleted from Wikipedia articles - when people don't like an image, they start with the whole "what does it add to the article" line. Heck, an image of a brown smear was removed from en:santorum (neologism), though the mastergamers there managed it by repeatedly extending article protection until it was deleted as an unused Fair Use image. I've also seen much tamer images deleted from userpages (e.g. Commons:User:Max Rebo Band). So this is pretty much a strawman as I see it. Wnt 14:32, 16 August 2011 (UTC)[reply]

Hello, I've changed one file in this section to an internal link. I don't believe this file is offensive or off-topic, but I think it is distracting for the purpose of this talk page. Since I had started two new sections further above on this talk page today (and before this file had been added to this talk page), my hope is that there will be a higher chance of useful responses on this talk page when the file is not directly displayed, greetings --Rosenkohl 14:37, 16 August 2011 (UTC)[reply]

Sigh. Actually I thought it was pretty good bait. And there are people who thought it was unacceptably indecent for Elvis to waggle his hips. There is a time coming when Adam and Eve will be able to dance through the Garden of Eden unclad and unashamed. Wnt 14:43, 16 August 2011 (UTC)[reply]
A clear and concise example of what images this proposal is really trying to keep children from unwittingly viewing is censored, while the deceptive, scare-tactic "OMG. They're going to censor Darwin" image is to be kept? --Michaeldsuarez 14:52, 16 August 2011 (UTC)[reply]
I chose this image because it is not obvious that it will be endangered. But it is. First we will filter the images with 99% disagreement, then 90%... then your example... and then we will mark only the images that find the acceptance of the remaining board members (there will be no more users left in this project). --Eingangskontrolle 21:04, 17 August 2011 (UTC)[reply]
Well, what undermines your point is that you're using an image which is hard to use in an encyclopedia article; if you were sampling our fine (but too small) collection on Category:Gonorrhea infection in humans it would be easier to see this - indeed, in the U.S. such images are shown to young children by the government as sex education. So if your image isn't encyclopedic why is it on Commons? Because it's useful for a German Wikibooks page.[2] Now someone viewing that page probably wants to see the pictures.
In any case, bear in mind that our opposition to this image hiding proposal doesn't mean opposition to all image hiding - I have, several times for each, proposed a simple setting to allow a user to thumbnail all images at a user-defined resolution (can 30 pixels really be all that indecent?), and an alternate image hiding proposal where WMF does nothing to set categories but allows users to define their own personal categories and lists and share them on their own, so there is no need for WMF to define or enforce categories. Also note that there are many private image censorship, ehm, "services", which a person can subscribe to, and which are in fact, alas, taxpayer-subsidized in schools and libraries. Wnt 15:11, 16 August 2011 (UTC)[reply]

Don't be naive. They (the creationists) control school districts and they try to control the media by removing books from public libraries. And they will tag images they don't like to be shown. It's a shame that there was not a single "no" to this proposal. --Eingangskontrolle 15:00, 16 August 2011 (UTC)[reply]

Have you read any of the comments on this talk page? The opposition is extremely vocal. There are plenty of no's here. It's religion that's constantly censored. You can't have rosary beads: [3], [4]. You can't even voluntarily pray in school. I could understand forbidding mandatory school prayers, but if an individual makes a personal choice to pray for himself or herself, why should we stop them? Couldn't they give students the choice to use their recess time to either pray or play? Wouldn't that be an acceptable compromise? In addition, Creationists are far from having the ability to take over the wiki. --Michaeldsuarez 15:45, 16 August 2011 (UTC)[reply]
Stop the madness. Let's not drag the creationism battle here. It's off topic. Philippe (WMF) 18:36, 16 August 2011 (UTC)[reply]
Apart from objections to some medieval paintings of Muhammad, I haven't seen much effort to censor religion here. Certainly I agree that students should be able to dress as they wish, and say what they will, but what can I do about that on Wikipedia? Wnt 19:40, 16 August 2011 (UTC)[reply]

When did personal preference become synonymous with censorship?

I'm going to go against the grain here and suggest that the implementation laid out here is inadequate and wrong-headed. The principle of "least surprise" doesn't suggest that people should be given the chance to block an offensive image after it has been seen; it suggests that people not be surprised by such images. This suggests an opt-out (opting out of filtering) system, rather than an opt-in system. Anything less will not satisfy the critics of Wikipedia's use of images (who are acknowledged to have a valid point by the very development of this system).

Since images are necessarily going to be sorted into categories of offensiveness for any such system to work, it is simple to restrict display of all possibly offensive images unless the user takes deliberate action to view them. This would be a preference for logged-in users and a dialog for non-registered readers, just as in the current proposal. An inadequacy of the current proposal is that it does not tell the user why a particular image has been hidden. I suggest that this is necessary for a reader to make the choice of viewing or not viewing a hidden image. As a quick example, suppose that an image is both "sexual" (i.e., contains genitals) and medical. As a non-registered user, I may have no problem with sexual images but may be squeamish about medical images. Although it may be apparent from context, there is no reason to deprive the user of information which helps them make an informed choice.

I understand that if the WMF were to implement an opt-out system there would be a very loud hue and cry from the same people who are here crying "censorship". Giving someone the ability to choose not to see things that may offend them is not censorship. Certain comments here read to me as though the people making the comments feel that they know better and everyone should accept their particular worldview. Clearly people are upset that others do not wish to see images of Mohammed and feel that allowing users to block those images is some form of personal defeat. The WMF will never convince the hardcore "anti-censorship" crowd and any opt-in or opt-out system will be met with equally strident reaction. The WMF needs to look past the comments of resident ideologues and examine the concerns raised by its critics and casual users. Opt-in filtering does not answer their concerns nearly as well as opt-out filtering would. Growing the userbase means attracting new users, not placating those who have become entrenched in positions that no longer serve the project.

Be brave. Implement opt-out filtering. Delicious carbuncle 14:17, 16 August 2011 (UTC)[reply]

Personal choice is a good thing, yes. The problem with this proposal is that there are two things which are not personal choice.
  • Categorization of images. The poll asks about the importance of flagging images, talks about 5-10 categories of images. But who decides whether an image is in or out of a given category, and how do they decide? The problem is, you're going to have people arguing about, say, whether a woman breastfeeding a baby, where you see a few square inches of non-nipple breast skin, is indecent or not. When these disputes escalate - or as the offense-takers get bolder and start hunting for scalps - this is going to lead to user blocks and bans. Both the arguments and the editor losses are significant costs to the project.
  • Speaking of costs. We still have a pretty rudimentary diff generator that can't recognize when a section has been added, doesn't mark moved but unaltered text in some other color, etc. We still have no way to search the article history and find out who added or deleted a specific source or phrase, and when. WMF has things to spend its money on that don't require a site wide referendum to see if they're a good idea or not. There are probably hundreds of listed bugs that any idiot can see we should have devs working on right now. Wnt 14:37, 16 August 2011 (UTC)[reply]
Wnt, I know you aren't going to block any images, so why do you care if people choose to spend their time arguing about what is or isn't too much skin? As for cost, you and I both know that this site-wide referendum is a done deal. Image filtering is going to be implemented. I am suggesting what is really a minor change in terms of implementation. I do not pretend to know what the WMF should be working on, but it has asked for opinions in this case, so I have offered mine. Delicious carbuncle 15:41, 16 August 2011 (UTC)[reply]
As I just explained, the reason why I care is that (a) developers are probably going to waste months fiddling around with this image hiding toy, when there are more pressing tasks for them to do, and (b) soon ArbCom cases like the one you're involved in and which I commented about during its initial consideration (w:Wikipedia:Arbitration/Requests/Case/Manipulation of BLPs) will include a whole new field of controversy with people saying that so-and-so has too loose or too tight a standard when tagging images. Especially if there are 10 categories - I can't even imagine what they all would be. So we won't just be ignoring the image blocking, and you know it, because it's going to be this huge new battlefield for inclusion/deletion. This is a big step in the wrong direction, and we already have too much trouble. Wnt 19:48, 16 August 2011 (UTC)[reply]
Oh, yes, the ArbCom case that I'm involved in. Thanks for bringing that up here, even though it isn't related in any way but might make me look bad. Especially if people thought I was the one manipulating those BLPs, which I wasn't, or that my role was anything other than whistleblower and finger-pointer, which it wasn't. I'll be sure to remember your thoughtfulness in this matter. Delicious carbuncle 20:26, 16 August 2011 (UTC)[reply]

@Delicious carbuncle: hardcore "anti-censorship" crowd - do you know what you are writing? Freedom of speech is regarded as one of the human rights and is the reason why a project like Wikipedia is possible. I don't want anyone deciding in the background of the flagging process which images will be seen and which will not. If somebody has a legal problem with certain images, he should come forward and start a deletion discussion. --Eingangskontrolle 14:53, 16 August 2011 (UTC)[reply]

Thank you for demonstrating my point. How is the tagging of an image as offensive (even if incorrectly tagged) in any way interfering with your freedom of speech? Delicious carbuncle 15:21, 16 August 2011 (UTC)[reply]
It's not directly interfering with freedom of speech; it's interfering with knowledge and neutrality. Neutrality is the sum of all knowledge. Selective knowledge is censorship and leads to misunderstanding, prejudice and conflict. Anyone who lives in Germany knows this as a fact, any American who thinks about terrorists should also know it as a fact, and anyone living under a modern dictatorship will know it too, but has no choice other than to stay quiet for his own safety. That's how you loop back to freedom of speech, which will also suffer from blindfolds given to the speaker's audience. --Niabot 16:20, 16 August 2011 (UTC)[reply]
The selective knowledge approach is being used extensively on Wikipedia already. Wikipedia doesn't have a "Teach the Controversy" approach. If Wikipedia taught the controversy, then perhaps evolutionists would stop buying into the idea that Creationists and Intelligent Designers are uneducated hillbillies who rape and murder people in Hollywood horror and post-apocalyptic action films. Wikipedia is already censored, and it's affecting how people think of people they're not truly familiar with. --Michaeldsuarez 16:37, 16 August 2011 (UTC)[reply]
If the situation is already this bad, why are we even thinking about this bullshit to make it even worse? What's next? A wiki for nationalists, a wiki for terrorists, and next a wiki for the guys who don't want to see any of the previous? Technically we create such internal divisions, and we shouldn't wonder if the climate deteriorates further when we draw such borders. A free mind can look at anything without being offended. It stands above that. --Niabot 16:53, 16 August 2011 (UTC)[reply]
A lot of talk about "freedom" on Wikipedia seems to involve depriving others of their ability to choose for themselves. Delicious carbuncle 17:01, 16 August 2011 (UTC)[reply]
If more viewpoints are taught, then people will become less arrogant about what people of other viewpoints think. The more people understand about each other, the better. Wikipedia needs to be more inclusive when considering which viewpoints to include in articles. Wouldn't it be advantageous for us to document how Creationists, nationalists, and even terrorists think and what they believe in separate sections, instead of allowing stereotypes to fill in the blanks? --Michaeldsuarez 17:09, 16 August 2011 (UTC)[reply]
Ignorance is a choice?! In relation to creationism, the "teach the controversy" approach was rejected by many science teachers in schools, on the grounds that there was no scientific evidence upon which to base creationism. I could be wrong (not really), but I believe encyclopaedias rely on verifiable sources and scientific evidence. Creationism does not fit the criteria; it's that simple. Encyclopaedias must settle on what is provable and generally accepted. If there is controversy, for which reliable evidence exists, it should be covered. Wikipedia is not a platform for terrorists, creationists, holocaust deniers, liberals, conservatives or any particular group. It's an encyclopaedia and should remain as neutral as possible. Zachs33 17:25, 16 August 2011 (UTC)[reply]
Why are you arguing about Creationism here? This is a discussion about opt-in versus opt-out for image blocking. Delicious carbuncle 17:53, 16 August 2011 (UTC)[reply]
Sorry, I was just expanding on Niabot's idea of "Selective knowledge is censorship and leads to misunderstanding, prejudice and conflict." --Michaeldsuarez 18:57, 16 August 2011 (UTC)[reply]

I agree with Delicious carbuncle. Opt-in filtering will please no-one, and hardly seems worth the massive effort in creating and maintaining the system. It has to be opt-out, and by virtue of being opt-out, very simple. For example, if the system doesn't know the user's preferences on what to see (eg anonymous user), it doesn't show images classed as problematic (with various categories of problematic so users can fine-tune preferences, but in the first instance, the system needs to hide everything problematic), and provides a placeholder with a textual description instead (probably the alt text, which we should have for visually impaired users anyway), along with a "click here to see the image" button. Simple, and not a trace of censorship, which takes power away from the reader, where this gives power to the reader. Rd232 17:59, 16 August 2011 (UTC)[reply]

That's exactly what I don't want: a Wikipedia censored by default, with an option to enable the uncensored content. This is total bullshit and should be fired up in hell. --Niabot 18:22, 16 August 2011 (UTC)[reply]
Luckily, that's exactly what's... NOT proposed. Don't conflate the two. Philippe (WMF) 18:35, 16 August 2011 (UTC)[reply]
I'm arguing that opt-in is fairly useless, except for the sort of power users who might be involved in this discussion. WMF should commission some research (a poll?) on opt-in/opt-out, because the people involved in these discussions are entirely unrepresentative of those most affected by the choice. Rd232 18:38, 16 August 2011 (UTC)[reply]
See discussion below on what censorship is. Hiding something a click away is not "censorship", and never will be no matter how often the claim is made. Rd232 18:38, 16 August 2011 (UTC)[reply]
In this case it's inconvenience. Making something less accessible than something else is the same as saying: this party can start showing advertisements a year before the election, and that party only one week before it. Of course you could go to cinema XYZ and watch them there all the time. Do you now understand that it would violate any principle of neutrality? --Niabot 18:57, 16 August 2011 (UTC)[reply]
It's every bit as accessible. The feature is off by default... you have to choose to turn it on. Philippe (WMF) 20:12, 16 August 2011 (UTC)[reply]
And I am proposing that it be on by default. 20:19, 16 August 2011 (UTC)
An opt-in system is as much use as a chocolate dildo in the Sahara. John lilburne 20:42, 17 August 2011 (UTC)[reply]
Niabot, your analogy is, unsurprisingly, unrelated nonsense. If an IP user wants to see images that have been blocked, all they need to do is bring up the dialog as illustrated in the mocked-up screenshots the first time they encounter a blocked image. After that, they will see the images as they choose. This is exactly the same amount of inconvenience that users face in the current proposal, except the onus has been switched from users who do not wish to see potentially offensive images to users who do wish to see those images. Delicious carbuncle 20:19, 16 August 2011 (UTC)[reply]
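For what it's worth, the placeholder behaviour Rd232 sketches above (a textual description, probably the alt text, plus a "click here to see the image" button) could look roughly like the following reader-side TypeScript. This is an illustration only: the data-filter-category attribute is a hypothetical marker for images a filter would act on, not actual MediaWiki markup.

 // A minimal sketch of the placeholder idea, under the assumptions stated above.
 function applyOptOutPlaceholders(): void {
   document.querySelectorAll<HTMLImageElement>("img[data-filter-category]").forEach(img => {
     const button = document.createElement("button");
     // Show the alt text so the reader knows what is hidden before choosing to see it.
     button.textContent = (img.alt || "Image") + ": click here to see the image";
     // One click swaps the placeholder back for the original image.
     button.addEventListener("click", () => button.replaceWith(img), { once: true });
     img.replaceWith(button);
   });
 }

Whether such a placeholder appears by default (opt-out) or only after the reader turns a filter on (opt-in) is exactly the policy question being argued in this thread; the mechanics are the same either way.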

I am not strongly opposed to the idea of an opt-in filtering system. (Please note: my views on this issue have not solidified yet.) However, an opt-out filter strikes me as unworkable.

Delicious carbuncle writes: "it is simple to restrict display of all possibly offensive images unless the user takes deliberate action to view them". Yes, but how do we decide what images are "possibly offensive"? Images of gore, explicit sex, and the prophet Muhammad seem to clearly qualify, but beyond that it gets fuzzy. For example, should pictures of spiders be tagged as "potentially offensive" given the existence of arachnophobes?

Of course, you might argue that we can limit the opt-out filter to the "obvious" cases. But that would not work for two reasons:

  1. Some of the opt-in supporters will inevitably complain that such a filter does not allow them to filter everything that they'd like to filter.
  2. No one will agree on exactly where to draw the line with regard to "obvious" cases. Above, I listed images of Muhammad as an obvious example of "possibly offensive" material, and it does seem obvious to me. But, at least in the Western world, there are many on both the right and the non-PC left who will probably object to tagging a mere picture of Muhammad as "possibly offensive" -- not because they don't recognize that such images do in fact offend some people, but because they view the sensitivities of the people in question as an insufficient reason to have the pictures hidden by default. Likewise, many would probably object to tagging images related to evolution as "possibly offensive" -- again, not because they don't recognize that the images do in fact offend some people, but because they view the sensitivities of the people in question as an insufficient reason to have the pictures hidden by default. There are countless other cases where some images do in fact offend a subset of the population but where there will be resistance to having the images hidden by default. There seems to be no way to draw the line.

So, to sum up, I'm not dead-set against any kind of filter. But I object to an opt-out filter on the grounds that it is simply not feasible. --Phatius McBluff 20:48, 16 August 2011 (UTC)[reply]

My statements are based on the assumption that there will be an image filter implemented and there will be tagging of images by some group (as yet unknown) using some schema (as yet unknown). There will, of course, be many arguments caused by this. My suggestion is simply that any such filter be on by default. The points you raise apply equally to an opt-in filter. Delicious carbuncle 20:59, 16 August 2011 (UTC)[reply]
"The pints you raise apply equally to an opt-in filter."
Only somewhat. You're right that there will be a lot of arguments over what images get tagged for what category even if we go with the opt-in suggestion. However, your suggestion, if implemented, would require a whole second round of arguments about which categories (or combinations of categories, or degrees of membership in a category such as nudity, etc.) qualify as "possibly offensive" to, say, a "reasonable person". I submit that this is not the sort of value judgment that Wikimedia projects should be making, given our value of neutrality.
However, since I am unlikely to persuade you on the above points, I will offer the following constructive suggestion: if we do implement an opt-out filter, then we sure as heck better implement an optional opt-in filter as well. Or, to put it differently, users should be able to customize their filters by adding additional categories to be blocked. How else will we avoid charges of favoritism? --Phatius McBluff 21:15, 16 August 2011 (UTC)[reply]
Just to clarify: My understanding of the opt-in proposal was that, under that proposal, we would tag images as belonging to various (relatively value-neutral) categories (e.g. nudity, spiders, firearms, explosives, trees) and allow people to choose which categories to filter. My point is that, if we are going to have some of those categories filtered by default, then we had better allow people the option of adding additional categories to their filters. --Phatius McBluff 21:18, 16 August 2011 (UTC)[reply]

Ambiguous questions

The questions are ambiguous and must be changed; For example "It is important for the Wikimedia projects to offer this feature to readers." should become "It is important that the Wikimedia projects offer this feature to readers.". i.e. "for" has a double meaning and is thus ambiguous. AzaToth 15:18, 16 August 2011 (UTC)[reply]

Thanks AzaToth. I don't quite follow the confusion though -- explain further? -- phoebe | talk 15:49, 16 August 2011 (UTC)[reply]
The ambiguity here is what is important for the projects as whole, i.e. it's self interest, or what is important for the readers. AzaToth 16:04, 16 August 2011 (UTC)[reply]
Got it. Thanks. I'll see if we can change it. (There's a heap of translations that might be affected). -- phoebe | talk 16:13, 16 August 2011 (UTC)[reply]
That's fine, change the question while it is being answered. And then add the option: don't spend any further effort on this idea. --Eingangskontrolle 16:39, 16 August 2011 (UTC)[reply]
For the record, we are not changing the question in mid-stream. :) Philippe (WMF) 18:19, 16 August 2011 (UTC)[reply]

Quit trying to fix something that isn't broke

Wikipedia is one of the most successful sites on the Internet despite not having content restriction features. Let's leave well enough alone and stop tinkering with the open principles that made Wikipedia successful in the first place. Inappropriate images aren't a major issue, but the slippery-slope solution presented has the potential to ruin Wikipedia itself. Jason Quinn 15:24, 16 August 2011 (UTC)[reply]

Approaching Wikipedia with a "don't fix what ain't broke" attitude will stifle its ability to progress. Yes, censorship is a delicate issue, but as long as it is opt-in, we are relatively far from the dangerous slippery slope, imho. I can't imagine any scenario where people would be driven away from a website that allows them to filter things they might not want to see. Nobody complains about Google's safe search feature; if you don't want a "safe" search then you keep it turned off. B Fizz 18:08, 16 August 2011 (UTC)[reply]

Blindfolds and stupidity?

I can't imagine why declaring knowledge, or visualizations of knowledge, non-existent should have any positive effect on mankind. I'm strongly against such a proposal, since selective or preselected knowledge is the worst kind of knowledge. Preselected knowledge is the same as censorship, even if you censor yourself. It allows human beings to build their own reality, which is way different from actual reality. On top of that, we encourage cultural misunderstanding and discord. Frankly, I'm already offended by the fact that we are even talking about such idiocy. -- Niabot 15:52, 16 August 2011 (UTC)[reply]

It's rather sad that you can't accept that other people have opinions different from your own. If some people want to view a page without seeing images that would bother them enough that they might avoid the page with them included, then they should be able to, regardless of the fact that you think they're missing out by filtering the knowledge. Giving them the power to make that decision for themselves is the right thing to do.69.151.12.26 16:15, 16 August 2011 (UTC)[reply]
I accept other opinions, but I don't accept selective knowledge as the basis for such opinions. If you can't bear an image (the question would be: why?), then you can look away. Most likely you are not even interested in such a topic if you are not willing to look at the images, which tell the same story in a different way. That's not the way to give people a will of their own or to provide them with knowledge. Knowledge might hurt you, and it often needs to. --Niabot 16:26, 16 August 2011 (UTC)[reply]
Everyone has the option not to use Wikipedia. --Eingangskontrolle 16:27, 16 August 2011 (UTC)[reply]
Likewise, everyone has the option to not use the self-censorship feature. B Fizz 18:02, 16 August 2011 (UTC)[reply]
" If you can't bear an image (the question would be why?) then you can look away." - I've come across articles that would I have read if I could have hidden the image while reading the text. Yes, I'd have clicked to look; but then I'd have clicked again to hide it while reading. Because the images, whilst informative, were just too unpleasant to be able to concentrate on the text. So I didn't read the article. Rd232 18:09, 16 August 2011 (UTC)[reply]
@B Fizz: That won't be true for long. Sooner or later you will be forced to use some of the restrictions, be it on a library terminal or inside an organization (your workplace). Then you don't have this right anymore and you will have to eat what you get. --Niabot 18:18, 16 August 2011 (UTC)[reply]
@Rd232: I doubt that you wanted to read the content if the images were so disturbing. I wonder why you can read about the same content but otherwise don't like to see it. That's the truth, and it hurts like knowledge does, until you understand it. --Niabot 18:18, 16 August 2011 (UTC)[reply]
Doubt all you like; the failure of your imagination is not my problem. Rd232 18:20, 16 August 2011 (UTC)[reply]
If you are in a library or at your workplace, you are inherently giving up some of your rights for some form of convenience (free stuff at the library, or getting paid at work). Those places could just as easily block Wikipedia entirely if they find the images offensive. It is their realm to decide what and how to censor within their own walls. So your argument is that we are providing a tool that will aid these places in enforcing censorship. That is true, and that argument I can understand. But Wikipedia, in and of itself, with this proposed feature, is not censoring anything. And those places, with or without this feature, will go on censoring as they always have. If they really wanted to censor Wikipedia images, though, there are plenty of other ways to do that. B Fizz 18:32, 16 August 2011 (UTC)[reply]
True. Should we remove categories from images and articles? They can be used by local software to censor too, quite efficiently. So can article titles though (less efficiently). Well I guess we can number the articles, and not categorise them... But then the software could still censor articles by number... so you'd have to constantly rotate the numbers to fool the software. In effect, Wikipedia would be reduced to the "random article" button... It's Wikipedia, Jim, but not as we know it! Rd232 18:46, 16 August 2011 (UTC)[reply]
If they want to censor the workplace, let them do it. But don't give them the tools to do it. In this case it costs them money, and that's usually when someone with an open mind gets involved.
"... the failure of your imagination is not my problem." I don't have a problem with my imagination, nor with the images. That's all part of humanity. Something looks nice, something doesn't. But that is no reason to hide it. You don't like how your neighbor looks? What will you do? Hide him from the public? ;-) --Niabot 19:08, 16 August 2011 (UTC)[reply]
Ah, the non-listening Wiki(p)(m)edians, how I've (not) missed that. "don't give them the tools to do it." is a reply to an argument ad absurdum illustrating that we are already giving them the tools to do that, and that attempting to remove that is impossible. Yes, the proposed filter will make it slightly easier, but that's not our problem; it's just information, and on the Wikimedia end, the only use for that information is to empower the reader to control how they view Wikipedia. You're effectively, if you think about it, arguing for censorship here, by saying that we cannot provide information about the nature of images because that information might be misused by others. Rd232 20:04, 16 August 2011 (UTC)[reply]
I have no particular objection to you trying to make a new category and classify images - but you'll soon find that it's a very big project you've started. Despite the fact that, unlike under this proposal, you can start whatever category you want and use whatever method of classification makes the most sense, rather than trying to form an international consensus about how to define it. It's not done now because it's not doable, and adding a GUI isn't going to make it any more doable. Wnt 20:09, 16 August 2011 (UTC)[reply]
(ec)I think it's important to remember that the en:Children's Internet Protection Act requires U.S. schools and libraries to spend public funds on third party censorware that could block Wikipedia image by image. Decisions on what to block or allow there are made by paid professional content police, not a rough consensus of amateur editors, trolls, and vandals. So there's no special need, at least in the U.S., for us to be doing this stuff. The same is probably true wherever else people are too easily offended. Wnt 20:05, 16 August 2011 (UTC)[reply]
So because some locations censor Wikipedia (remove the ability of the reader to decide what they see), Wikimedia should not empower readers to decide what they see? Really, it's increasingly clear that logic has exited the building long ago. Rd232 20:08, 16 August 2011 (UTC)[reply]
To be clear, I long ago suggested strategy:Proposal:Provide services to facilitate "child-safe" and selective mirror sites. Unfortunately, that's not what we're voting on today. I don't have a problem with providing readers with choice, even to some extent choice they impose on their children, provided that we, here, on Wikipedia, don't have to make and enforce the philosophical decisions. One of a billion examples: is a photo of a Rabbi carrying a Torah scroll into a new synagogue something we should classify as Hate Speech? Does it matter that the Torah calls for violence against homosexuals? Does it matter that modern Judaism interprets it completely differently and denounces such acts? Does it matter if the synagogue is in a West Bank settlement, and the Palestinians see it as a way in which Israelis are staking claim to their land? That's one big dirty diaper of a political battle that we want out of our house. Wnt 20:16, 16 August 2011 (UTC)[reply]

┌─────────────────────────────────┘
If this feature is developed, then I hope that when the time comes for enwiki to adopt a policy to prevent abuse, we should ensure that only extremely graphic images (i.e. anything that could give children nightmares, e.g. severed heads) are added to the filter. The scope of this feature should be kept narrow. Black-and-white images of Holocaust and genocide victims shouldn't be filtered. We need to apply common sense. One of the things I love about MediaWiki software is its transparency. There are logs, history pages, and special pages for users to monitor. I'm assuming that there'll be a log or special page that would enable the community to monitor, scrutinize, and revert abuse. --Michaeldsuarez 23:22, 16 August 2011 (UTC)[reply]

This always leaves us to draw the line between what should be "censored" and what not. Ask me what I would hide and then ask Rd232 what he would hide. I guess we will draw two very different lines. In the end we will have very long discussions about nothing, with different opinions. If we enable this feature as the default, we could/should possibly hide any picture or text/article (if planned) by default. This would reduce it to the simple case: visit Wikipedia if you are willing to look at the truth, hard facts and different opinions, or just leave. But hey, you already have this possibility, even without all this nonsense of extra work. It's definitely the wrong approach for this topic. I can only support Wnt's statements. If we want such filtering, then we should set up (live) mirrors with selected content, rather than destroying the main project. --Niabot 23:52, 16 August 2011 (UTC)[reply]
The conflict between multiple different "lines" can be resolved through compromise. Consensus is already at the core of decision-making at Wikipedia. Abusive censorship proponents will find it difficult to enact their agendas while dealing with an unfamiliar consensus process, Wikipedia's guidelines, and Wikipedia's anti-censorship community. --Michaeldsuarez 00:31, 17 August 2011 (UTC)[reply]
What would a compromise look like in this case? Displaying half of the image, or using a replacement that doesn't get the point across? This is not a discussion about sources or how to express things the right way. It's a simple yes or no. You might see that the room for compromise is very small (non-existent) and decisions in non-trivial cases will always be based on personal opinions, which shouldn't be the case according to basic project rules. Just to put some examples on the table: (a), (b), (c). Offensive or not? --Niabot 00:45, 17 August 2011 (UTC)[reply]
They're not offensive to me. The only sexualized character is File:Fuwafuwa-chan.jpg, and the other two appear to simply allow the audience to admire (semi-)nudity (sort of like Greek or Renaissance sculptures). File:Anime_Girl.png seems to be the only character with passion, but it isn't sexual passion. File:On_the_edge_-_free_world_version.jpg contains too much information and scenery. The details are a distraction. The audience would be confused about what's truly the focus of the image: the immersive scenery or the female character. In my opinion, the image wouldn't fit into an article without being cropped to focus on a single subject, but the image isn't featured in any articles anyway. The main problem I see is all three characters' ages. These characters have an extremely youthful appearance. File:Fuwafuwa-chan.jpg should be filtered for sexualizing a child. The other two shouldn't be categorized under "sexual content", but they should be categorized as "semi-nudity" and filtered by anyone who doesn't wish to view semi-nude images. Amusingly, I had prior experience with critiquing these images and examining whether they should be used in enwiki's "fan service" article: [5], [6]. --Michaeldsuarez 02:06, 17 August 2011 (UTC)[reply]
I'm not going to say that consensus doesn't work, but in recent months Wikipedia has been feeling more and more polarized, not unified. Let's just say that consensus is expensive, in terms not just of time but of discord and disillusionment. Saying that things can be resolved with consensus is like saying that we can fix a problem by spending money. Yeah, it's true, but... Wnt 06:14, 17 August 2011 (UTC)[reply]

1984

welcome to censorpedia Bunnyfrosch 16:53, 16 August 2011 (UTC)[reply]

Welcome to 2011. It's opt in. B Fizz 17:53, 16 August 2011 (UTC)[reply]
Calling this censorship is an insult to the many millions of people who have suffered under actual censorship. Real censorship does not have a "thanks, but I want to see it" button. Rd232 18:05, 16 August 2011 (UTC)[reply]
This censorship is an insult to all contributors who made the Wikimedia projects in their spare time, asking for no money. They believed in principles that will be thrown away once this censorship is introduced. The next step will be censorship of articles, and slowly the project will be taken over by politicians and religious fanatics - Quistnix 18:08, 16 August 2011 (UTC)[reply]
You appear to have a non-standard definition of the word censorship. From Wiktionary: "Censorship: The use of state or group power to control freedom of expression, such as passing laws to prevent media from being published or propagated." How is anyone preventing anything from being published or propagated here? Philippe (WMF) 18:16, 16 August 2011 (UTC)[reply]
So, your response to my saying "calling this censorship is an insult" is to repeat the insult? And then to throw in the slippery slope claim. Except that this is not censorship so there is no slope, slippery or otherwise. No matter how often or how hysterically the claim is made, it will still not be true that this is censorship and it will still not be true that this is a slippery slope to actual censorship. Rd232 18:18, 16 August 2011 (UTC)[reply]
@Philippe We already have the first calls to make the censored version the default and to hide anything that someone could find controversial, until they "opt out" to enable the uncensored view. Even if we make it an "opt-in", it's very likely that it will sooner or later change to "opt-out", which meets the criteria for censorship. It's the drawing of a pink bunny jumping over yellow flowers. After the discussions on EN about what images might be suitable (featurable) for the main page, I'm fully convinced that it would not stop at this point and many authors and their ideals (uncensored, neutral knowledge, accessible for everyone) would be betrayed. --Niabot 19:16, 16 August 2011 (UTC)[reply]
@Rd232:Time will prove you're wrong. One day you'll wake up, and that day will be too late to change things - Quistnix 20:00, 16 August 2011 (UTC)[reply]
@Philippe: Nothing wrong with that definition. We just differ in our interpretation of it - Quistnix 20:04, 16 August 2011 (UTC)[reply]
Your honour, it's not theft, I just disagree with your interpretation of "ownership". Is this what passes for reasoned debate round here? Meh. Count me out. Rd232 20:06, 16 August 2011 (UTC)[reply]

It's the first step towards enabling censorship by third parties: school districts, governments, internet providers, employers... And many fundamentalists will remove tagged images from articles, just because they are tagged. --Bahnmoeller 09:01, 17 August 2011 (UTC)[reply]

For anyone who doubts the slippery-slope argument in this case, there has already been, on this page, a proposal to make the filter on by default - for all values of "offensive". And indeed such a call would be hard to resist - we would be seen to be either assuming that readers are prurient/ungodly/voyeuristic, or deliberately exposing them to "unpleasant" images when we can flip a bit to "protect" them. Rich Farmbrough 14:05 17 August 2011 (GMT).

Other Projects

Commons must be sensitive to the need of every other project in the Wikimedia universe, as well as users around the globe

Can we translate that into: If a project elects or is forced to elect to ban images of ( ), the foundation will be happy to assist? --Eingangskontrolle 17:33, 16 August 2011 (UTC)[reply]

That seems to be a very different statement from what was written. Philippe (WMF) 17:35, 16 August 2011 (UTC)[reply]

Or can some project choose not to use this system at all? --Eingangskontrolle 17:37, 16 August 2011 (UTC)[reply]

No, the Board of Trustees has directed the Foundation to proceed with this. It will be integrated into the software, as I understand it. There will not be an opt-out. Philippe (WMF) 18:17, 16 August 2011 (UTC)[reply]
I'm not clear on why the WMF would ask users to comment on something that they already intend to implement. Can you explain? Delicious carbuncle 20:07, 16 August 2011 (UTC)[reply]
From the Content page: The feature will be developed for, and implemented on, all projects. It will not permanently remove any images: it will only hide them from view on request. For its development, we have created a number of guiding principles, but trade-offs will need to be made throughout the development process. To aid the developers in making those trade-offs, we are asking you to help us assess the importance of each by taking part in this referendum. Philippe (WMF) 20:09, 16 August 2011 (UTC)[reply]
So, to be clear, an image filter is going to be implemented, regardless of the "result" of this "referendum"? Delicious carbuncle 21:02, 16 August 2011 (UTC)[reply]

Why is it called a referendum? Usually that would be a "yes/no" on whether we want this, not a request for comment on what is important and what is not. (Building consensus is important, building a self-censorship infrastructure is not, especially when that could be done by third party add-ons without compromising the integrity of the Foundation's mission of disseminating all human knowledge). Kusma 20:35, 16 August 2011 (UTC)[reply]

As far as I understand, it has been decided that a filter will be implemented, and that it will be opt-in only (= no filtering by default). All other details of its configuration shall be developed as far as possible within the community's ideas and wishes. The referendum tries to find out which filter features are wanted or rejected by the community. --Martina Nolte 22:33, 16 August 2011 (UTC)[reply]
I already find it strange that such a "feature" was decided on without asking the community. On top of that: why do we vote anyway if it's already decided? It's like: "Hey guys, you can censor yourself now. If you don't like it, we will do it for you". --Niabot 23:31, 16 August 2011 (UTC)[reply]

EN Banner message - misunderstood it at first reading

On the EN Wikipedia - at first I thought it referred to the opt-in use of a personal image of an editor. Could we sacrifice concision here and write it as '...gather more input into the development of a feature which will allow readers to voluntarily screen particular types of images strictly for their own accounts.' Novickas 17:47, 16 August 2011 (UTC)[reply]

I got this first impression as well. B Fizz 17:55, 16 August 2011 (UTC)[reply]
We're pretty much at the maximum text for the banner as it is.  :( Philippe (WMF) 18:18, 16 August 2011 (UTC)[reply]

UI comments

The interface uses different phrases for some things, when it's unclear whether they're the same or different. Perhaps some of the terms should be standardized or clarified.

  • display settings, content filter settings, filter settings, content filters, filters, settings

Is display a superset of, or the same as content filtering? Let's use one term if it's the same, or clarify if it's not. Filtering is technical jargon which may confuse readers – let's use concrete terminology like show and hide. Why is this “settings” and not “preferences,” as in Special:Preferences?

  • hide content, show, hidden, denied, allowed (in addition to “filtering”)

Confusing. Are we allowing the filter to block the content, or allowing the content to be displayed (“shown”)? Or are we allowing the reader to view the content? Why add the concept of allowance and denial (of content, of access to it, or of some action?), when we're already clearly talking about showing and hiding content?

  • Content Hidden, This image..., Show Image

Why mix generic “content” and specific “image?” What terminology will be applied to audio, movies, animated gifs, timelines, etc?

The terminology in “Content Filters: Children's Games, Gladiatorial Combat, etc.” is confusing. When presented over the “Hide Content” button, it actually represents categories that the image belongs to – it's certainly not a list of “content filters,” because there are no filters applied to the image in this state. Can this be rewritten for clarity?

What does the gear icon represent? It appears next to “Display Settings” at the top of the page, and next to “Filter Settings” in a hidden-image box. Actually in the hidden-image box it appears next to one of two nearly-identical “filter settings” links – does the gear indicate a difference, or are they the same? Such redundant-but-different controls add clutter and confuse. Let's eliminate the gears and the redundant link.

Why are we introducing a non-standard on-page pop-up box, with a new modality? Most other actions in MediaWiki (like logging in, changing preferences, editing, previewing) are handled in new page loads, while some minor alerts (like “Are you sure you want to leave this page?”) are Javascript dialogues. Better to stick with an existing modality that's familiar in MediaWiki and other websites, than to confuse the reader with a novel one.

Why is there a single tab at the top of the pop-up box? Decoration? Is it draggable by the area behind the tab? As a tab, it implies that other “display” settings may be available in some contexts, or in the future. If it doesn't serve a purpose, let's omit it. If this acts like a modal dialogue box without tabs, let's make it look like a modal dialogue box without tabs.

Why are the filters toggled with iOS-style slider controls? These are unfamiliar, and inconsistent between platforms. In some software they must be dragged left and right, while in others they must be clicked (and dragging doesn't work). Better to use familiar conventional checkboxes to turn list items on and off.

How will the hover content be presented in a touch-based interface? Touch interfaces have no hover state, so this important content either disappears, or surprises and delays the user by turning an action into a mode, and requiring a repeat of the action for success. It would be better to redesign this with all important content visible.

Why does the [Save Settings >] button have an arrow? It looks like it might be a pop-up list, or a combo pop-up/button control. Why is the action “Cancel and Close” a link instead of another button? Does the top-right X do the same thing as “Cancel and Close?”

Capitalization of titles and headings is inconsistent, and contrary to en.Wikipedia's heading capitalization guidelines.

Will every single image sport the strange “Hide Content ⃠” under-tab, or only candidates for filtering/denying/hiding? Let's put this control inside the rectangular image box, instead of cluttering the page with irregular boxes and inconsistent bottom margins.

I also suggest that a hidden-state image retain the full dimensions of the original image, preventing the surrounding text and layout from jumping around when the state is toggled. It may also help hint at how potentially disturbing something might be. Michael Z. 2011-08-16 19:14 z
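
As a rough illustration of the fixed-dimension suggestion just above: the following is only an editor's sketch in TypeScript, not part of the mock-ups or of any WMF design; the element structure, the class name "mw-filter-hidden", and the placeholder wording are assumptions. The idea is simply that a hidden image is swapped for a box of the same rendered size, so the surrounding text does not reflow when the state is toggled.

    // Sketch only: swap a filtered image for a same-sized placeholder so the
    // page layout does not jump when the image is hidden or shown again.
    function hideImageKeepingLayout(img: HTMLImageElement): void {
      const placeholder = document.createElement("div");
      placeholder.className = "mw-filter-hidden";          // hypothetical class name
      placeholder.style.width = `${img.offsetWidth}px`;    // preserve rendered width
      placeholder.style.height = `${img.offsetHeight}px`;  // preserve rendered height
      placeholder.style.display = "inline-block";
      placeholder.textContent = "Content hidden – click to show";
      // Restore the original image on request ("thanks, but I want to see it").
      placeholder.addEventListener("click", () => placeholder.replaceWith(img));
      img.replaceWith(placeholder);
    }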

I suggest to redirect the UI to: /dev/null --Niabot 19:19, 16 August 2011 (UTC)[reply]
Thanks for the detailed feedback. These are just mock-ups though -- the feature won't necessarily look exactly like these. The referendum is about the principles behind the hider, not the actual feature itself. That type of thing should probably be directed to the techies on MediaWiki.org. Cbrown1023 talk 20:28, 16 August 2011 (UTC)[reply]

Javascript

Does this filter system work without Javascript? Will the pictures be filtered or unfiltered if Javascript is not available? --94.134.216.98 20:21, 16 August 2011 (UTC)[reply]

The filter hasn't been created yet, so we have no idea if it works without JavaScript. The images are just mockups of how it could look to help you figure out your stance on the different filter principles. Cbrown1023 talk 20:30, 16 August 2011 (UTC)[reply]
Every other sort of show/hide box on Wikipedia is displayed when Javascript is disabled. But if there were to be any backsliding about making this system opt-in, I'd expect that's where it would begin. Wnt 20:35, 16 August 2011 (UTC)[reply]
As Erik Möller states below, the filtering is apparently dependent on javascript being available. Delicious carbuncle 02:05, 17 August 2011 (UTC)[reply]

"Error fetching your account information from the server."

https://wikimedia.amellus.net/index.php/Special:SecurePoll/login/230?site=wikipedia&lang=de gives me this failure notice. What is to be done? --Martina Nolte 20:51, 16 August 2011 (UTC)[reply]

Yeah, there's something wrong. We took the banners down and the tech team is investigating the issue. Cbrown1023 talk 21:01, 16 August 2011 (UTC)[reply]


  • As usual, some server is broken. All Toolserver tools have been broken since Wikimania. This proposal, how much is the Foundation going to spend on it? Would it not be better to spend the money on more useful things? Or is it expected to increase funding? /Pieter Kuiper 22:10, 16 August 2011 (UTC)[reply]
This was fixed a bit ago (at 22:01), sorry for not leaving a note here. It turned out that there was a networking issue with SPI's datacenter. Everything should be working now. Cbrown1023 talk 23:25, 16 August 2011 (UTC)[reply]
OK, it seems to work now, thanks - i got to the page which offers me a choice of questions and clickable dots, etc. Boud 10:52, 17 August 2011 (UTC)[reply]

Proposal: Filter is enabled by default

I am proposing that, by default, the image filter is enabled (all potentially offensive images are blocked). There are several reasons why this is preferable:

  • It is in keeping with the "principle of least surprise/least astonishment". Users should not have to take action to avoid seeing images which may offend them.
  • Some of Wikipedia's critics have long complained about offensive images. Unless those images are blocked by default, the criticism will continue. Having potentially offensive images blocked by default will silence that criticism.
  • Schools and some libraries are likely to want potentially offensive images blocked by default.
  • Many institutions with shared computers do not save cookies for security reasons, so any setting would be gone the next time the browser is started. This poses a problem for institutions which require potentially offensive images to be blocked, unless the filter is on by default.
  • Wikipedia vandals often place offensive images in templates or articles. While users viewing en:Fisting may not be surprised to see a photograph of a man being anally fisted, they would not expect to see it on an unrelated article, which is what happens with this type of vandalism.
  • The misuse of images (as opposed to the deliberate abuse of images by vandals), coupled with the potential for users, especially younger users, to be unaware of alternate meanings of terms, means that users may be surprised by images that they did not expect to see. Blocking offensive images by default mitigates this possibility.

Therefore, I propose that implementation of image filters be done in such a way so as to enable image filters by default. Unregistered users could turn it off with the dialog shown in the mock-ups. Registered users would be able to change it via a preference that they set once. Note that I am not proposing any change to which images are filtered or how they are classified - those remain issues for the WMF to work out. When I refer to "potentially offensive images" I simply mean images which would be filtered if a user had chosen to use the filter (there is no difference in what is filtered between the opt-in or opt-out implementation). Delicious carbuncle 20:52, 16 August 2011 (UTC)[reply]

  • Support - as proposer. Delicious carbuncle 20:52, 16 August 2011 (UTC)[reply]
  • Support - flickr is probably the largest source of free porn that is hoovered up and stored here, and by using simple opt-in filters they still manage to keep the place relatively porn-free for those that don't want to encounter it in specific circumstances. John lilburne 21:20, 16 August 2011 (UTC)[reply]
  • Oppose - My understanding of the original "opt-in" filtering proposal was that, under that proposal, we would tag images as belonging to various (relatively value-neutral) categories (e.g. nudity, spiders, firearms, explosives, trees), provide each viewer with a filter that doesn't block anything by default, and allow the viewer to choose which categories to block. I'm not entirely happy with that suggestion, but not strongly opposed as of yet. In contrast, Delicious carbuncle's proposal is to have certain "potentially offensive images" blocked by default, while giving viewers the option of un-blocking those images. Unlike the opt-in proposal, this one would require us to make a definite value judgment (i.e. regarding which kinds of images are "potentially offensive" to a reasonable person); there seems to be no way of drawing the line here, unless we're going to have the filter block by default all images that any identifiable demographic finds offensive. Sorry, but I don't think it's feasible to expect people from all over the world to come to an agreement on that issue. --Phatius McBluff 21:33, 16 August 2011 (UTC)[reply]
    • Comment - However, if we do end up having some categories filtered by default, then we had better allow people the option of adding additional categories to their filters. That's only fair, given the extreme subjectiveness of what counts as "potentially offensive". --Phatius McBluff 21:33, 16 August 2011 (UTC)[reply]
      • To be clear, what I am proposing is that any image which would simply be filtered under the opt-out scheme be filtered by default under the opt-in. Whatever system and schema used to classify images for the original proposal would be used in my variation. By "potentially offensive" I am referring to any image tagged as belonging to any of the categories shown in the mock-ups. Nothing is changed except the filter is on by default. Delicious carbuncle 22:32, 16 August 2011 (UTC)[reply]
  • Strong Oppose for all reasons given by me in the sections above this poll. --Niabot 21:48, 16 August 2011 (UTC)[reply]
  • I can't think of a worse way of using Wikimedia resources than to help libraries keep information from their users. Also, it is impossible to implement in a culturally neutral way... Kusma 22:31, 16 August 2011 (UTC)[reply]
    • Again, there is no difference in implementing classification under my proposed variation, simply that filtering is on by default. If your issue is with the idea of a filter or with the difficulties of classifying images, please do not comment here. Delicious carbuncle 22:35, 16 August 2011 (UTC)[reply]
"All potentially offensive images" would include all images of people for a start, and I suspect all mechanically reproduced images (photographs etc.), and quite possibly all images. Rich Farmbrough 22:47 16 August 2011 (GMT).
OK, I stand corrected. There are two culturally neutral ways to filter images: all or none. Kusma 22:48, 16 August 2011 (UTC)[reply]
Once again, I am not proposing any changes to which images are filtered or how they are classified, only that the filters are on by default. If you have concerns about how images will be classified or who will do it, take them up with the WMF. If you have an opinion about whether the filters (which will be implemented) are on or off by default, feel free to comment here. Otherwise, your participation in this discussion is entirely unhelpful. Delicious carbuncle 23:03, 16 August 2011 (UTC)[reply]
I strongly suggest that they be disabled by default, or never implemented at all. There is one big, open and time-consuming question: which image is offensive to whom? "Personally I hate flowers and numbers, please block them all in the default view, because I often change my workplace..." --Niabot 23:36, 16 August 2011 (UTC)[reply]
Let me then rephrase. Oppose because, in the event that filters are implemented, they should, and undoubtedly would, include a filter for the human form ("it is feared by many Muslims that the depiction of the human form is idolatry"), for all human and animal forms - certain hadiths ban all pictures of people or animals - and for all representations of real things: "You shall not make for yourself a graven image, or any likeness of anything that is in heaven above, or that is in the earth beneath, or that is in the water under the earth". Rich Farmbrough 23:31 16 August 2011 (GMT).
What a load of crap historically, and currently. Damn I've not seen such ignorance outside of BNP and EDL lies. John lilburne 06:55, 17 August 2011 (UTC)[reply]
  • Oppose: If you implement the filter, make it off by default; otherwise it would further marginalize images seen as controversial by those in power, something which I'm opposed to on anarchist grounds. Cogiati 01:37, 17 August 2011 (UTC)[reply]
How is that an Anarchist perspective! As an Anarchist myself I don't think ANYONE should be dictating what I MUST see any more than someone should dictate what I MUST NOT see. The solution that allows the most freedom of choice is to allow me to decide whether to view the image or not. John lilburne 11:46, 17 August 2011 (UTC)[reply]
  • Oppose – Forcing users to opt out is far more intrusive than granting users the ability to opt-in. --Michaeldsuarez 02:56, 17 August 2011 (UTC)[reply]
  • Oppose. Delicious carbuncle: You seem to be under the impression that the idea is to compile a finite list of "potentially offensive" subject areas and offer the option of filtering them. This is incorrect. An image "offensive" or "controversial" to one culture/individual is not necessarily viewed as such by another. "All potentially offensive images" = "all images". The "categories shown in the mock-ups" are merely examples.
    For the record, I strongly oppose the plan on philosophical grounds. Your suggested variant, in addition to being even more philosophically objectionable, isn't even viable from a technical standpoint. —David Levy 03:45, 17 August 2011 (UTC)[reply]
    You appear not to have read what I wrote. A filtering system is being implemented. I am simply proposing that the filters are on by default. Any issues with deciding what' to filter exist in the current implementation. Delicious carbuncle 10:59, 17 August 2011 (UTC)[reply]
    You appear not to have read what I wrote (or what Phatius McBluff wrote). The proposed system would not include a single set of filters applicable to all cultures (an impossibility) or to one particular culture (a non-neutral, discriminatory practice). It would, at a bare minimum, encompass numerous categories widely considered controversial/offensive in some cultures and perfectly innocuous in others.
    "The filters are on by default" would mean that everything categorized within the system would be filtered for everyone by default. Some religious adherents object to images of women without veils or women in general, so all images of women would be blocked by default. That's merely a single example.
    For the plan to even conceivably succeed, "deciding what to filter" must be left to the end-user. (Even then, there are significant issues regarding how to determine the options provided.) Your suggestion relies upon the existence of a universal filter set, which is neither planned nor remotely feasible. —David Levy 18:54, 17 August 2011 (UTC)[reply]
  • Oppose. Just because you're playing on a slippery slope over shark-infested waters doesn't mean you have to dive off it the first minute. Look, I'm not completely ignorant of the religion of the surveillance state; the concept is that the human soul is a plastic card issued by a state office, and all of the rights of man emanate from it. For the unidentified reader, who doesn't run Javascript or cookies, to be able to access specialized information (or eventually, any information), let alone post, is a Wikipedia blasphemy against this faith no less extreme than the Muhammad pictures are against Islam. And so first you feel the need to make the image hiding opt out by default; then limit it to the registered user; and eventually only such registered users as are certified by the state, because your god commands it. After that it would be those specialized articles about chemistry, and so on. But Wikipedia has been defying this god since its inception, so I have high hopes it will continue to do so. Wnt 05:45, 17 August 2011 (UTC)[reply]
  • Support, as a parent in particular. Currently I have all of the WMF's websites blocked on my daughter's browser, which is a shame because she loves learning and wikipedia would be a good resource for her if it weren't for problematic images that are generally only a few clicks away. --SB_Johnny talk 12:30, 17 August 2011 (UTC)[reply]
I'm disturbed to read your vote, because if you can block WMF websites on your child's computer you must be running some sort of "nannyware", but I would expect any remotely professional censorship company to be able to block out the sort of images being discussed here while allowing through the bulk of Wikipedia material. What do they do in schools which are required to run such software? Wnt 14:41, 17 August 2011 (UTC)[reply]
Why would you be "disturbed" by a parent using nanny software? Or (say) a preK-8 school, for that matter? --SB_Johnny talk 16:39, 17 August 2011 (UTC)[reply]
There is no software that will reliably detect inappropriate images (whatever the criteria for inappropriate is). Thus it is simpler and less error prone to block the entire site. John lilburne 15:03, 17 August 2011 (UTC)[reply]
Yup. I seriously doubt the sites will come off the red flag lists until at least some effort is made to change the defaults. OTOH, from other comments on this page about "cultural neutrality" and "radical anti-censorship", the categorization and tagging efforts might fail anyway. --SB_Johnny talk 16:39, 17 August 2011 (UTC)[reply]

Proposal to at least put this in as one of the voting options

Currently the poll/referendum/whatev doesn't even ask the question (thus "censoring" the poll) ;-). Since it's already mentioned that people can change their votes anyway, is there any reason it can't be added as an option on the poll/referendum/whatev? --SB_Johnny talk 21:04, 17 August 2011 (UTC)[reply]

My own, belated, dismay

I suppose I should have been paying more attention. I knew a referendum was coming, and I naively expected it to be on the substance of this monstrosity, and not on implementation details. It had not occurred to me that the Foundation would even consider committing a "feature" such as this without overwhelming community support, given that it runs exactly counter to everything it is supposed to stand for.

There is no such thing as "offensive" or "controversial" information, only offended points of view. It's destroying our educative mission for the sake of — well, I'm not even sure what rational objective this is supposed to serve. "Morality" as defined by a minority in some random theocracy? Some vague and ill-defined standard of "respectability" handwaved into being without thought?

Encouraging readers to blind themselves to the parts of reality they dislike is the opposite of our educative goal. It is stealth POV forking, causing people to view different articles that match their preconceptions (or, in the real world, somebody else's preconceptions as imposed on them by social pressure or outright force). There is no real conceptual difference between allowing image POV filtering and allowing text POV filtering. When does the "exclude liberal text" toggle come, then? Or the "I don't want to see 'evolutionist' articles" checkmark?

You may think I'm abusing a slippery slope argument; I don't believe there is a slope: being able to exclude images of "sex" (however you want to define it), or of "the prophet", or whatever, is exactly the same thing as being able to exclude articles on biology, or on religions you disapprove of. The false distinction between images and articles is just a fallacy to rationalize this "feature".

That's not even getting into the problems of how easily such categorization of content into arbitrary pigeonholes of "offensiveness" can be abused by third parties; others have expounded on that aspect elsewhere and in more detail.

Yes, people are allowed to use the blinders the foundation will provide for them; I believe in freedom to self-harm. I have no desire to be a party to it. We should be the beacon of knowledge, allowing people to expand their limited vision — not giving them a comfortable experience of strictly limited points of view. That's TV's job. — Coren (talk) / (en-wiki) 20:57, 16 August 2011 (UTC)[reply]

I couldn't agree more. Kusma 22:32, 16 August 2011 (UTC)[reply]
Wikipedia. The Free Encyclopedia. If those filters come, Wikipedia isn't free anymore. --Matthiasb 23:30, 16 August 2011 (UTC)[reply]
I completely agree with these points. --Niabot 00:28, 17 August 2011 (UTC)[reply]

Next Step: Filters for Objectionable Ideas

At first I thought: Why not? Everybody gets the Wikipedia they want.

Then I thought harder. If you want an article to be the way you want it, write and illustrate your own article, and publish it somewhere. If it makes sense to start tagging images, the next step is to start tagging words, phrases and ideas, so that no one need come to a consensus by participating in those bothersome editing discussions about what is appropriate, notable or reliably sourced. Rather, one can simply avail oneself of a checklist of one's current prejudices (or, to be gentler, beliefs) and let the cadres of censors (or Filter Editors, as they will probably call themselves) for those beliefs protect you from any challenges to your current point of view.

I don't know if the whole idea is insidious or merely ridiculous. It would be insidious if it could actually be implemented without causing a chaos of claims and counterclaims about what interests and prejudices may be covered by tagging, with no "reliable sources" to settle the matter. Just consider for a moment the prospect of removing all articles, images and sentences relating to Black persons, so that White supremacists may get the "culturally neutral" Wikipedia they want for their children. It would make the current Discussion pages on controversial articles look like flower arranging. Given that I think it could never be implemented without destroying what is best about Wikipedia, I judge this proposal to be ridiculous. —Blanchette 21:35, 16 August 2011 (UTC)[reply]

Are you aware that there are already filters in place at en.Wikipedia which disallow certain words and phrases? Many of those filters are secret (i.e., you cannot see which words and phrases are disallowed). Where were the defenders of free speech when those were implemented? Delicious carbuncle 22:38, 16 August 2011 (UTC)[reply]
They do not disallow the words and phrases; they stop them being added in certain circumstances without a little extra effort. And Wikipedia is not about free speech, in that sense - Wikipedia is very censored, but only (or only intentionally) by the removal of un-encyclopaedic or illegal material. Rich Farmbrough 23:45 16 August 2011 (GMT).
I opposed them; I still oppose them. People say "spam" but then that turns out to mean "copyright" and from there it turns into "things they don't like". But this is still a community capable of coming together and sternly rejecting all such things. Wnt 05:49, 17 August 2011 (UTC)[reply]

Liability

[Image: File:Babar.svg, captioned "Elephant in the throne room, yesterday"]

The risk here is that we create liability. Currently you get whatever images are on the page; you take a risk that "any reasonable person" would apprehend - that on a page you might see either A. an apposite but offensive picture, B. an erroneous and offensive picture, or C. a vandalistic and offensive picture. Once we have a filter in place, the user with religious views, a sensitive stomach, PTSD, or an expensive lawyer on retainer may come across images that they believe (rightly or wrongly) the filter should hide. We have in effect given an undertaking to filter, which we are not able to deliver on, for a number of reasons possibly including:

[Image caption: "Is this a horror image?"]
  1. Interpretations of the filter
  2. Error
  3. Backlog
  4. Pure vandalism
  5. Anti-censorship tag-warriors
  6. Perception of the image

And that doesn't touch on the consequences if we hide something we should not.

Rich Farmbrough 23:04 16 August 2011 (GMT).

And yet flickr have filters and aren't sued by expensive lawyers on retainer, when porn images leak past the filters. Why is that? John lilburne 11:56, 17 August 2011 (UTC)[reply]

Implementing the feature is the wrong signal to the world

When Wikipedia came up – remember: Wikipedia. The Free Encyclopedia – it sent a clear signal to tear down walls by allowing anyone access to knowledge. Implementing filters sends a clear signal that tearing down the walls that hinder people from getting free information was wrong. That's not what millions of users are visiting Wikipedia for. That's not why Wikipedia is widely respected. And that's not what hundreds of thousands of editors believe(d) when contributing to the encyclopedia. Implementing a filtering system is just wrong. --Matthiasb 23:43, 16 August 2011 (UTC)[reply]


Practical considerations

The whole process is waaay too complex. I just spent 5 minutes trying to understand what this is about. Wikipedia should NOT invite users on a normal page to join this referendum. It's clearly for specialists. If you want input from the public, this time you failed just miserably!


I am a Wikimedia Foundation developer and I work frequently with Wikimedia Commons administrators. While I have given some input to Robert Harris when he was writing his report, I was disappointed by what was eventually proposed. Note, I am only speaking for myself as someone familiar with how Commons is run. I don't speak for the Foundation, nor do I disrespect the people involved in this proposal, as I think they are making a good faith effort to satisfy serious concerns.

I don't know which users will be satisfied by this. Censorious people won't be happy until the objectionable content cannot be accessed at all. If there are people who in good faith try to use this -- for instance, to have a classroom computer that only accesses a "clean" version of Wikipedia -- the students will get around it in seconds, via the button that we put right on the page. So I don't see who we are trying to satisfy here.

Furthermore, this proposed tool adds to the burden imposed on Commons administrators. (Amended this complaint, as Eloquence makes clear below that they are contemplating a system that doesn't use the usual categorization systems, but something simpler for non-technical users.)

Finally, it would be very difficult to implement this tool as designed. (Amended, since there was a discussion of which I wasn't aware, and the implementation is quite plausible. See Eloquence's statement below.)

I don't agree that our problem is about images that are already correctly placed in the context of a Wikipedia or Wikimedia Commons page. Our problems are more about community members who intentionally misplace content. For instance, people who try to get the most shocking image they can onto the front page of a project. This boils down to a few people who are simply trolls, and a somewhat larger number who I think are misguided about Commons principles.

It is also conceivable that search results might contain unnecessarily disturbing or shocking results. But this is far easier to deal with. This is a case where we are generating results dynamically, and we can easily filter for images that should not appear in a search (or that should have a kind of click-through-to-view interface).

I doubt there will be a need for a multi-dimensional categorization scheme here; we can simply have a single category "not for public areas / search results without confirmation". Reasons to categorize an image this way can be dealt with in the usual fashion, via the edit summaries.

These problems suggest different solutions, particularly a community effort to clearly define what is and isn't appropriate, new tools to boost the effectiveness and vigilance of Commons volunteers, and an effort to increase the number of administrators or recognize the important work they are doing. And such an approach would be necessary anyway, even if the proposal as defined were accepted.

-- NeilK 23:48, 16 August 2011 (UTC)[reply]
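
To make NeilK's single-category suggestion for search results more concrete, here is a minimal sketch (an editor's illustration only, not anything proposed by the WMF or by NeilK; the category name and the shape of a search hit are invented for the example) of splitting image search results so that flagged images sit behind a click-through step instead of appearing inline:

    // Sketch only: separate search hits carrying a single "flag" category so
    // they can be shown behind a click-through-to-view step.
    interface ImageHit {
      title: string;
      thumbUrl: string;
      categories: string[];
    }

    const FLAG_CATEGORY = "Not for public areas";  // hypothetical category name

    function splitSearchResults(hits: ImageHit[]): { shown: ImageHit[]; clickThrough: ImageHit[] } {
      const shown: ImageHit[] = [];
      const clickThrough: ImageHit[] = [];
      for (const hit of hits) {
        (hit.categories.includes(FLAG_CATEGORY) ? clickThrough : shown).push(hit);
      }
      return { shown, clickThrough };
    }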

I think you gave us a nice conclusion. First, it's not an easy task to implement this feature. Second, it's hard to decide which content should be excluded or not. Third, it comes with a lot more effort than most would imagine.
The implementation part is the first issue we would have to address. Given the fact that we already have enough technical issues, this will stall the development of more needed features. But I guess that isn't the biggest issue, since we can "throw cats around" (the stupidest thing/idea that I ever came across).
[Image caption: "Good enough to be a finalist of Picture of the Year on Commons, but too offensive for the English main page."]
The question of which content is objectionable and which is not is a very difficult one. I would never like to put it into the hands of the WMF, since I know how different the viewpoints are. In the end we would have hours/days/months spent on such decisions, which is in my opinion wasted time and effort. I added a simple example to the right. While Commons and other projects never had a problem with the image, the English-speaking Wikipedia rejected it strongly. A picture that can't be featured, because it would be too controversial for the main page...
The third thing is vandalism. The admins already have a lot of trouble and many jobs to do. Now we add an extra job which isn't easy at all (see above). On top of that, an admin would need to decide about a great variety of images. While experts on the topic would not see a problem, an admin might, and the other way around. In the end we could end up holding a poll on every image about whether to include or exclude it.
I doubt that this would be a good approach at all. It's no final solution, it has much room for exploitation, and it is a burden for all authors and/or admins. --Niabot 00:22, 17 August 2011 (UTC)[reply]
Niabot -- I agree with some of your points and you bring up an elephant-in-the-room problem. Commons admins and volunteers are by far the most hostile to anything that looks like censorship. And they have different values even from the English Wikipedia, let alone Arabic. So where are the volunteers going to come from, to categorize and filter images? NeilK 00:43, 17 August 2011 (UTC)[reply]
Hi Neil! I am glad you brought up better tools for Commons; you are someone who is in a great position to help define and build such tools. The last part of our resolution that led to all this in fact says "We urge the Wikimedia Foundation and community to work together in developing and implementing further new tools for using and curating Commons, to make the tasks of reviewing, deleting, categorizing, uploading and using images easier" -- and to my mind that's perhaps the most important bullet! What tools would you imagine? (And if this design for this feature won't work -- ok; it will have to change. I don't think it's gone through full technical review, you're right.) -- phoebe | talk 00:37, 17 August 2011 (UTC)[reply]
Phoebe -- As for categorization tools, I don't have a clear idea about what needs to be done next, but I do know that it's a huge pain point at the moment. NeilK 00:56, 17 August 2011 (UTC)[reply]
A clarification regarding technical assessment. Prior to Brandon's design, Brandon and I talked through the technical requirements of a personal image filter with Roan Kattouw, an experienced WMF engineer. The strategy recommended by Roan was to embed filtering information (such as a small set of relevant filter categories) in the HTML/CSS, allowing them to be cached, and then doing the filtering via JS. This avoids any "phone home" calls on the client; the server-side lookup should be a fairly efficient DB query. The parser would simply add relevant image category annotations to its output (not for all categories; just for ones relevant for filtering). This strategy is similar to the one proposed by Aryeh Gregor (another experienced MediaWiki developer) here mw:User:Simetrical/Censorship.
We discussed the issue of DOM flickering; I believe Roan stated at the time that it would likely be avoidable in most browsers provided the JS is executed at the right time, but this is something we'll need to assess in more detail. For browsers without JS, the filter option would simply not be available (and it could also be turned off in browsers which do not support it well).
I agree that the correct categorization of images on Commons is the main mechanism through which a filter would succeed or fail, and this is both an issue of efficiency of categorization tools and arguments around classification. My own view is that it's likely advisable to look further into simple tools for readers to report potentially objectionable content, which could be used either to suggest classification (through a hopefully simplified UI), or to automatically derive it using a scoring algorithm. That would help to get correct classification in places with high viewership, which is where it arguably matters most.--Eloquence 01:02, 17 August 2011 (UTC)[reply]
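For readers trying to picture the approach Eloquence describes, here is a very rough sketch (an editor's illustration, not the actual MediaWiki implementation; the data attribute, the CSS class, and the storage key are all assumptions) of category annotations emitted into the cached HTML plus a small client-side script that hides matching images for readers who have opted in. With JavaScript unavailable, nothing runs and every image stays visible, matching the stated fallback.

    // Sketch only: hide images whose embedded filter categories match the
    // reader's opt-in choices. Runs entirely on the client against cached HTML.
    function applyPersonalImageFilter(): void {
      // Categories the reader chose to hide, stored client-side after opting in.
      const hidden: string[] = JSON.parse(
        window.localStorage.getItem("imageFilterCategories") ?? "[]"
      );
      if (hidden.length === 0) {
        return; // nothing opted in, nothing to do
      }
      // Hypothetical markup emitted by the parser:
      // <span data-filter-cats="nudity violence">...<img ...>...</span>
      document.querySelectorAll<HTMLElement>("[data-filter-cats]").forEach(wrapper => {
        const cats = (wrapper.dataset.filterCats ?? "").split(/\s+/);
        if (cats.some(c => hidden.includes(c))) {
          wrapper.classList.add("image-filter-hidden"); // CSS collapses the image
        }
      });
    }

    // Run as soon as the DOM is parsed, to limit visible flicker.
    document.addEventListener("DOMContentLoaded", applyPersonalImageFilter);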
Okay I have amended the rant above. NeilK 01:05, 17 August 2011 (UTC)[reply]
I will reiterate the points I made in the collapsed section. Virtually every image is offensive to someone. The Ten Commandments forbid, under certain readings, images of actual things (in the heavens, on the earth, under the earth and in the seas). Various hadiths prohibit illustration of animals and people. Certain Jewish sects avoid picturing women. And that is just views from mainstream religions (albeit not incredibly widely held among those religions). What about Chinese bans on pictures of the Dalai Lama and the Tibetan flag? If we implement this proposal, we do several things:
  1. Create a massive burden where ten million pictures need to be classified in probably hundreds of ways
  2. Create a chilling effect for readers, who will risk being asked "why they didn't filter the image".
  3. Create a chilling effect for contributors who wonder whether their image will be censored.
  4. Create problems for service providers (schools, libraries, cafes, ISPs, universities) who will not know what they should be doing (especially in the US where libraries are both constrained to and prohibited from applying software filtering)
  5. Create potential legal liability for ourselves when the filters fail (as they will, inevitably)
  6. Create the inevitable scenario where (as has already happened) we are asked to make "filter on" the default - and will have trouble refusing.
This is an excellent idea; unfortunately, thinking it through reveals what appear to be insurmountable obstacles. Do we really want to provide a filter that excludes all photographs of women? But, in all conscience, can we not do so, if we are providing filters for those with other beliefs?
I have, on considered reflection, voted "0" for the first question in the referendum "It is important to provide this facility..." and I would urge others to do the same.
Rich Farmbrough 02:33 17 August 2011 (GMT).

Details about the flagging system are needed

I have not found enough information about the image flagging system. We are provided with screenshots showing how the hiding option can be activated and what the resulting Wikipedia page looks like, but I don't see any screenshot of the flagging system. Without further explanation, it seems there is a contradiction between the principle expressed in Image filter referendum/FAQ/en#What are the principles behind its creation? : "It is a personal system. Decisions you make are for yourself alone" and the mere existence of a flagging system. A flagging system means taking decisions for other people. Deciding that image A belongs to category C is a decision that will affect more than one person, if my understanding of a flagging system is correct. Will the flagging system be a wiki? Who will have the power to flag or unflag a given picture? Will there be one flagging system for Wikimedia as a whole, or will there be as many flagging systems as language versions? One flagging system per language? Teofilo 23:59, 16 August 2011 (UTC)[reply]

I think there is a hand-wavy idea that perhaps six or eight categories, relatively self-evident, will do the job. I don't think consideration has been given to the complexities - take "nudity", for example - art is sometimes classified as nude even when some clothing is worn - but even without that type of rider, a typical Western parent might be happy for their child to see the Venus de Milo, but not a "topless model" - other cultures might not draw the distinction at all. Then there are degrees of everything - drug usage, for example, is divided in one of the top parental control filters into, I believe, about five degrees of severity. Again, different cultural viewpoints will place different items in different degrees; some places are very laid back about tobacco, for example. Fundamentally, although the categories are probably seen as "an implementation detail", they need to be thought through in a culturally neutral way before we can understand the scale of the number of categories - this affects the whole technical design, and even more so the sociological design. Rich Farmbrough 02:52 17 August 2011 (GMT).
Honestly, I don't feel like this is supposed to succeed. There is a strange disease in America (I don't know how much further it extends) that all sorts of restrictive security measures are "feel good measures", i.e. that they're not supposed to actually work, but they have to be annoying, so that people feel like they're being protected. That applies for everything from image hiding to airplanes where you can't bring on your own cans of cola but the average agent, when allowed to test the system, can get a grenade through the scanners unchallenged. In this case my assumption is that the categories would be picked ad hoc, with a promise to improve them someday, implemented arbitrarily, and then people would try to do their best to forget about the system. And that may indeed be a pragmatic PR approach, but Wikipedia's core principles are not PR, but an intensely radical agenda to make knowledge freely available to the public without regard to socioeconomic status. Wnt 05:58, 17 August 2011 (UTC)[reply]

Are these "categories" going to be selected using the already implemented [[category:<name>]] system (en:Help:Category) or using an entirely new system with a different database ? Is it going to affect Wikimedia Commons ? If it is, how ? Wikimedia Commons is a great idea where people from all the countries are invited to share their pictures. Wikimedia Commons is something like the United Nations. If things are not thought carefully enough Commons could become the battle field of cultural world war III. My advice is to seriously consider creating the filter system on a different wiki (call it "filterwiki") or on each language version's wiki rather than do anything that would directly affect Wikimedia Commons, in particular Commons's categorizing system (commons:commons:categories). Commons is already a mess with everyone adding his own toy on picture pages (Image annotator, cumbersome banners, the long list of image uses). Please don't raise the weight and complexity of Commons' image description pages. Teofilo 08:28, 17 August 2011 (UTC)[reply]

Stay focused

The wmf:Resolution:Controversial content lists the following reasons why an image might be filtered:

  • Images that are offensive because of their sexual, violent or religious nature.

As we see in the discussion on this talk page, the third point, whether an image relating to religion is offensive, is difficult to judge. It is even more difficult to put pictures into "politically offensive" or "culturally offensive" categories.

So let's please stay focused on the realistic goals: optional filtering of sexual or violent images, similar to Google SafeSearch, which filters sexual content and is activated by default. --Krischan111 00:18, 17 August 2011 (UTC)[reply]

Even sexually related content is already a very hot topic, with very different viewpoints. See two paragraphs above. --Niabot 00:24, 17 August 2011 (UTC)[reply]
They mean "Muhammad images". Probably people could name other examples, but only as cover. Wnt 05:59, 17 August 2011 (UTC)[reply]

understanding the referendum; WMF Board != Mubarak

Since I (and maybe many other people) didn't read this carefully enough the first time, here's some analysis that may help. I'll sign separately so that people can comment separately. Boud 01:36, 17 August 2011 (UTC)[reply]

will or would be implemented?

Let's read the description of the "referendum" carefully - my emphasis throughout:

  • ...feature, which will allow readers...
  • ... Such a feature was requested by the Board of Trustees in June 2011. ...
    • requested = wmf:Resolution:Controversial_content: We ask the Executive Director, in consultation with the community, to develop and implement a personal image hiding feature that will enable
  • The feature will be developed for, and implemented on, all projects.
  • What will the image hider look like?
  • this talk page, above: Hi Danaman5, yes, the Board passed a resolution asking for this kind of feature to be developed; so that's a given. The questions are asking for community input in how it is developed and implemented. -- phoebe | talk 05:00, 15 August 2011 (UTC)
  • this talk page, above: Or can some project choose, not to use this system at all. --Eingangskontrolle 17:37, 16 August 2011 (UTC) / No, the Board of Trustees has directed the Foundation to proceed with this. It will be integrated into the software, as I understand it. There will not be an opt-out. Philippe (WMF) 18:17, 16 August 2011 (UTC)

"Will" is clearly what the Board has decided and requested of the Executive Director. Boud 01:36, 17 August 2011 (UTC)[reply]

why/who?

So this is mostly based on a report by two people, neither of whom is WP-notable; only one of them has Wikimedia project experience, and that experience seems to be mostly limited to the controversial content study on Meta, with no practical experience editing en.Wikipedia articles or Commons content. Maybe Robertmharris edited earlier under another username, but he doesn't choose to tell us that. There's no indication of the degree of Robertmharris' multilingual fluency and worldwide experience of knowledge and censorship questions e.g. in en:Belarus, en:People's Republic of China, en:Saudi Arabia, en:Libya, or en:Egypt.

Shouldn't a minimum criterion for a study that is used as the basis for an irreversible decision (in the sense of "will" above) by the Board be that the study is by someone well recognised (en:WP:RS) as having credentials in the free software/free knowledge/right-to-knowledge community, or as an experienced Wikipedian, or both? Boud 01:36, 17 August 2011 (UTC)[reply]

hypothesis

en:Sunk_costs#Loss_aversion_and_the_sunk_cost_fallacy: Maybe the Board paid Robertmharris for his "consultancy" and so felt that the result had to be followed without giving unpaid Wikipedians the chance to say no. Is this really what the Board is supposed to be about?

en:Sunk_costs#Loss_aversion_and_the_sunk_cost_fallacy argues that this behaviour is irrational. The consultancy was done and a report written. But a decision on implementation should not be hardwired by the previous decision to pay (?) for the consultancy. Whether or not money was paid for the report is only of minor relevance here. The report was written and nothing will change that. The implementation of recommendations 7 and 9 could, in principle, still be prevented or, at least, turned back into a decision-not-yet-made. Boud 01:36, 17 August 2011 (UTC)[reply]

what to do

It seems clear to me that if the community wants to say "no" to this feature, then the community has to somehow convince the Board so that the Board reverts its request to the Executive Director. The referendum is not going to do that (as some people have already said on this talk page).

Maybe one way to say "no", or at least to give the community the possibility to say no, is to write a formal proposal that is likely to get overwhelming consensus from the community and says that the community

  • asks for the so-called referendum to be temporarily withdrawn and replaced by a true referendum which:
    • includes a question like "Do you support the implementation of an opt-in personal image-hiding feature in the WMF projects? scale: 0 = strongly opposed, 10 = strongly in favour".
    • is written using the conditional "would" instead of the definite "will"
  • asks for the Board to withdraw its request to the Executive Director pending community consensus.

The primary decision would then go to the community, which is not presently the case.

Maybe an RFC? I'm not much involved in internal WMF project community organising - other people here almost certainly have more experience in that than I do....

The Board is not Ben Ali or Mubarak (and let's hope it's not Gaddafi)! We can, in principle, still say "No" or "We will decide" to the Board. Boud 01:36, 17 August 2011 (UTC)[reply]

I would hope, and expect (and it seems to be the case), that at least some of the board is following this discussion. The basic premise that "a method for not seeing stuff that is really bad" is a good thing, all else being equal, is sound. Unfortunately careful thought and discussion have demonstrated that all else is not equal, and the game is not worth the candle without a fundamental rethink. I hope, and expect, that the board will share that opinion, having reflected on the difficulties the proposed system would bring. The matter needs to be taken back to the drawing board to find out whether there is a viable alternative. Rich Farmbrough 02:40 17 August 2011 (GMT).

It's enough! We, the ordinary Wikipedians, have to recognize that the Foundation no longer stands on our side (at least it looks like that). They have started to drift and have lost contact with the base. They decide on their own and they don't bother to ask us. The so-called "referendum" is a farce. We can't change anything. The introduction of the censorship filter has already been decided. Alea iacta est. That's not a bit democratic. Such hubris destroys the keystones of this project. But the Foundation must never forget that WE the Wikipedians created the content of Wikipedia. WE made Wikipedia what it is now. I have to warn you: it was never a very splendid idea to ignore the populace... (and you are wondering why we lose contributors) *facepalm* Chaddy 05:37, 17 August 2011 (UTC)[reply]

I have posted a page move proposal. See Image filter referendum and go to Meta:Proposed_page_moves#Image_filter_referendum to support or oppose the move proposal. This alone will not override the Board's decision, but hopefully we should be able to clarify the situation. Boud 10:26, 17 August 2011 (UTC)[reply]
Thank you for the information. Chaddy 15:30, 17 August 2011 (UTC)[reply]

Categories - it doesn't work

"We believe the designation of controversial should be placed on categories of images, not individual images, and the categories deemed controversial would receive this designation when there is verifiable evidence that they have received this designation by the communities the projects serve." - from the report - suggesting how contentious items would be identified.

We categorise pictures based (primarily) on what they are of - or sometimes who they are by. Offensiveness does not follow content category lines. For example, en:File:Dali Temple image.jpg belongs to no categories that would suggest it is problematic. Similarly, I don't believe we have "Category:Images containing XXX" categories where XXX is merely incidental to the image, but not necessarily incidental to the offensiveness. Rich Farmbrough 04:34 17 August 2011 (GMT).

Does this mean that Wikimedia Foundation is taking a position on what constitutes objectionable material?

It seems to me that by creating categories of possibly objectionable material, Wikimedia Foundation would be taking a position on what can be considered objectionable. That would be unfortunate.--Wikimedes 07:15, 17 August 2011 (UTC)[reply]

Impractical and Dangerous

Well, I was going to say "stupid and dangerous", but I decided I'm a nice guy. No, but seriously, it is stup... I mean, impractical, because:

  • It puts strain on the already overstrained servers; this feature is probably very software-intensive. We already have bimonthly panhandling from Jimbo to improve the servers; there is a limit to how much people are willing to donate, and I don't think this limit is significantly increased by an "opt in" image filter.
  • It puts strain on the already overstrained editors: now, on top of adding content, fighting trolls, shills, zealots, cranks, vandals and cud-chewers and trying to enforce and discuss, literally, hundreds of policies, they shall also decide the merits of tags.

It is also, in spite of what optimists, outraged citizens and do-goody-goodies say, dangerous, because:

  • The slippery slope is there. Wikipedia has bent to pressure before, most notably when it implemented a BLP policy that differs from the policy applied to all other articles. There are hundreds of well-argued pages with buzzwords like "responsibility" to justify it, but the bottom line is: living people litigate. Now, pretty much everything on the Internet is subject to the Maude Flanders attack, which consists of screaming inconsistently, loudly and repeatedly "Will anybody think about the children?" and demanding that policies are implemented to protect "them" from the evils of...whatever bigots think is evil. An uncategorized encyclopedia is quite impervious to this, as it is very hard to consider it offensive as a whole. Once parts of it are tagged, it's a breeze to demand that they be censored, "for the children's sake". Or to "safeguard religious sensitivities" other than the single reader's. Or to "comply with a nation's governing law". Or to abide by special copyright laws.
  • There is not enough detail on the technical implementation so far, but it is entirely possible that a firewall could be easily configured to force filtering, or to directly filter. This would mean Wikipedia editors and programmers would be working to make things easier for the Great Firewall of China and the Saudi government.
  • Assuming the above is not true, there are still two possibilities: people will not use the image filter, in which case its implementation is useless, or they will. If they do, censorship will simply consist of mass tagging unwelcome images with unpopular tags, most likely "pedophilia" or "possible copyright violation". Should I remind you that a considerable part of the editors consists of shills of corporate, governmental or religious organisations?

Well, these were my two pennies; in the end, all policies are voted in, as the enthusiastic are, by definition, more active than the cautious. In six months' time, I'd like to point to this discussion and say "I told you so", but the truth is, everybody who'd care will be too disgusted by then. 77.233.239.172 09:07, 17 August 2011 (UTC)[reply]

  • While I don't agree about the parallel with BLP (there is a substantial difference between trying to prevent readers being disturbed by imaginary offense in patterns of colored dots and preventing real harm to third parties), those are all very good reasons why this idea should be scrapped. — Coren (talk) / (en-wiki) 11:08, 17 August 2011 (UTC)[reply]

What will category pages and galleries look like?

Image filter referendum/en#What will the image hider look like? tells how it looks on article pages. How would it look on a category page? For example, how would commons:Category:La Liberté guidant le peuple (a famous painting by en:Eugène Delacroix) look for people who want to remove images of topless women (or commons:Category:Auguste Rodin for people who want to see only the non-nude portraits)? The size of the "show image" button added to the length of the "filter settings" link seems to be too much to fit the tiny space allowed by tiny thumbnails on category pages. By the way, the same size problem might exist on galleries using the <gallery></gallery> tags. Teofilo 09:14, 17 August 2011 (UTC)[reply]

clarification added

I've added the following clarification. AFAIK the Wiki(p|m)edia community generally dislikes ambiguous statements, so hopefully it should not be difficult to reach consensus on exact wording, pending consensus on the title change. I added:

<!-- clarification --> ''This page concerns a consultation about the details of how to implement a decision that was taken by the WMF Board in June 2011. It does not concern the [[:w:referendum|possibility of accepting or rejecting that decision]].''

Hopefully that should clear up the repeated confusion (including my own initial confusion due to not reading carefully enough). Boud 10:49, 17 August 2011 (UTC)[reply]

Some Questions on the "Editors support"

The section "Background" reads:

"We will also make it as easy as possible for editors to support."

1. To start to talk about what the editors will do there, could we call it by another, perhaps more neutral term than "support"? What about "classify" or "categorize"?

There are image files on the local projects and image files on Commons. For some images, there are versions on different projects.

2. Will the filter allow users to see files that are not from Commons?

3. Will there be a classification system only on Commons, or will there be different classification systems on each local project?

4. And if there were different systems, which of them would the filter apply?

--Rosenkohl 10:50, 17 August 2011 (UTC)[reply]

See #Categories - it doesn't work for the report's suggestion and why it solves nothing.
There would need to be a universal classification system on any project (of the 777+) that supports images. Otherwise the system would not be culturally neutral. (This could perhaps be done by using an XML/RDF markup system such as POWDER, with check-sums, attribution etc.) Since the system is to be culturally neutral there will need to be dozens, perhaps hundreds, of such categories, and every image will need to be checked for every category (of course a picture of a naked woman is also a picture of a woman, and a picture of a person, and a picture of a living being, etc.). Quite how these categories would work, since, for example, cultural norms are constantly shifting, and vary from town to town (and even within towns), let alone between countries, is I suppose up for discussion. Perhaps a really detailed ontology and a rules-based system? But I jest - we manifestly do not have the resources (or the will) for the AI that would be required to drive such a system - realistically this is a bad idea masquerading as a great idea. In this case the devil is in the detail; some ways of working around those problems - and other problems that I have outlined - would take us into even murkier territory. Rich Farmbrough 14:27 17 August 2011 (GMT).
Worse yet, there is no method to make such classification — even if it were perfect and culturally neutral — impossible to abuse in order to push an agenda. Want to alter perception of subject x? Simply manage to mark it as offensive according to y. Add a few socks to the mix, position a few community leaders on polarized ends of the debate and voila! instant shitstorm.

As someone stated above, there exist exactly two neutral filtering criteria: filter everything and filter nothing. Everything else is someone making a subjective value call that renders the system useless at best (and actively harmful as social pressure and technical enforcement will be used to turn on that filtering against readers' interests). Think this is hyperbole? Is an image of a topless man "(partial) nudity"? Is one of a woman? What about bare arms? Or hair visible? And we're not even getting into the incredibly complicated terrain of religious imagery. — Coren (talk) / (en-wiki) 14:59, 17 August 2011 (UTC)[reply]
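A purely illustrative Python sketch of the point above, with invented attributes and rule sets (none of this is a proposed taxonomy): the same image is flagged under some culturally specific thresholds and not under others, so whoever picks the threshold is making the value call.

image = {"topless": False, "bare_arms": True, "hair_visible": True}

# Three invented rule sets standing in for different cultural thresholds.
rulesets = {
    "ruleset_A": lambda img: img["topless"],                      # only toplessness counts
    "ruleset_B": lambda img: img["topless"] or img["bare_arms"],  # stricter
    "ruleset_C": lambda img: any(img.values()),                   # stricter still
}

for name, rule in rulesets.items():
    print(name, "flags this image:", rule(image))
# Output: False, True, True - three "neutral-looking" labels, three different answers.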

Please tell me: does it mean that if I create a new picture, I have to tag it according to these categories (violence, sex, medical, etc.)? Does it mean that I can see all pictures while editing, but they will appear filtered once the edited page is saved? JZ 20:54, 17 August 2011 (UTC)[reply]

Way too complicated/burdensome process.

I came to this page expecting to click a button and have my vote count. You want me to do what?? Follow a bunch of steps hidden halfway down the page and then go to another page and copy/paste stuff and THEN follow instructions there? Nuts. This is why Wikipedia editor numbers aren't exploding. You're making this stuff too hard, and too hard to get into. Guess I'm too casual for you hardcore programmer/coder types. Parjlarsson 16:43, 17 August 2011 (UTC)[reply]

Visible to anonymous users who can't vote?

Why can I see the banner when I'm not logged in, when anonymous users aren't eligible to place a vote? Surely this should only be appearing for logged-in users, unless the voting eligibility is changed (as suggested above)? Mike Peel 16:53, 17 August 2011 (UTC)[reply]

Because a lot of people read without logging in. The idea is that it will trigger them to log in and vote. 16:56, 17 August 2011 (UTC)
Yes - but only a tiny fraction of Wikimedia's 400 million readers have user accounts! You can't be serious that displaying this banner to everyone, when less than 0.1% or so of those viewing it will be able to vote (who could most likely be reached by restricting the banner to logged in editors only), is a worthwhile use of readers' screen estate?? Mike Peel 17:10, 17 August 2011 (UTC)[reply]
I actually completely disagree with "who could most likely be reached by restricting the banner to logged in editors only": we have a purposely low bar (I think that's a good idea), and an enormous number of those people are NOT going to be reading logged in and would NOT get caught by restricting it to logged-in editors only. I may be mistaken, but I think the board elections do the same thing (with a much higher voting bar). Jalexander 17:27, 17 August 2011 (UTC)[reply]
This will only cause frustration and confusion for all the unregistered users who want to participate in the referendum. They will follow the banner link only to find out that they are not allowed to vote and that they can't even change that by registering now. Please only show the banner to logged-in editors. --Wkpd 17:30, 17 August 2011 (UTC)[reply]
It will be backed way off at the end of the first week of voting... until then, let's just let it go? We know that the current system is working, based on the extremely high number of votes cast. Philippe (WMF) 17:32, 17 August 2011 (UTC)[reply]
A week? That's about 3 _billion_ banner impressions! [8] Mike Peel 17:35, 17 August 2011 (UTC)[reply]
I argued above that the barrier should be lowered even more than it has been - so every Wikipedia reader can vote - but since that hasn't been done, having this banner visible everywhere really isn't a good thing. I thought the WMF board elections were logged-in only, but I could be wrong. Either way, it would be far better to e.g. email all registered users than to stick a big banner at the top of every single pageview of Wikipedia, by anyone, everywhere. Were we in the situation where 99% of our readers were also our editors, fine - but we're clearly not in that scenario. I'll somewhat cede the point about logged-out editors not noticing it otherwise - but the cost far outweighs the benefits here. Mike Peel 17:34, 17 August 2011 (UTC)[reply]
Out of the universe of people who are eligible to vote, the vast majority of them will not be logging in during the two-week voting period, because the vast majority of people who are eligible are former or infrequent editors. However, I would imagine that a good majority of eligible voters will be browsing Wikipedia during that two-week period, because "everyone" reads Wikipedia to get some knowledge or other. Based on that, I would believe that the best way to reach eligible voters is to display this to users who are not logged in. That's how I found out about it, anyway. I wouldn't have found out about it otherwise. I agree, e-mailing might work too, but the banner just seems easier. JYolkowski 19:15, 17 August 2011 (UTC)[reply]
It indeed seems easier. And it is very effective. However, the same goes for shooting a nuclear missile at a cloud of mosquitoes. Surely many mosquitoes will be hit. However, I have to agree with the overkill argument. We are throwing so many sitenotices at people, and this is typically one that we can do without for a huge group of people. It is not efficient at all. Goodwill is important to Wikimedia - and warming people up with a question that they can give their opinion on and then denying them the chance to actually vote doesn't help with that. Nor does showing them banners every time while we tell them that Wikipedia is so great because we don't have ads. I can imagine that several people would consider messages like this, which are really internal stuff, to be spam. Are we really serving the readership with this? Or free knowledge? Effeietsanders 19:50, 17 August 2011 (UTC)[reply]

Categories as "filter tags"

Will it be possible to use file categories (Wikipedia and Commons categories) to "batch filter" all images in one or more of a set of categories?

If this isn't the right place for this question, please point me in the right direction, thanks! --87.78.139.205 20:01, 17 August 2011 (UTC)[reply]

The idea is that the filter system would fit in directly with our existing category system. So, if I understand your question right, yes, that's the intention. :-) Cbrown1023 talk 20:10, 17 August 2011 (UTC)[reply]
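As a rough illustration of that intention, here is a minimal sketch in Python, assuming (hypothetically) that an image's category list is available at render time; the function and the category names are made up, and a real implementation would also need to walk subcategories rather than match top-level names only.

def hidden_by_category(image_categories, user_hidden_categories):
    # True if any of the image's categories is one the reader has opted to hide.
    return not set(image_categories).isdisjoint(user_hidden_categories)

image_categories = ["La Liberté guidant le peuple", "1830 paintings"]

print(hidden_by_category(image_categories, {"Nude paintings"}))   # False: no overlap, image shown
print(hidden_by_category(image_categories, {"1830 paintings"}))   # True: batch-hidden via its category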

Terrible idea!

This is the first I've heard of this proposal, but as a long-time Wikipedia user, I think it's an absolutely terrible idea. How on earth did this ever get off the starting blocks?

I don't have an account, and no I won't register for one, so therefore I don't have a vote. But like millions of other users, I'll be affected by whatever the outcome is. This is particularly so if the decision is that you have to log in to remove censorship.

Wikipedia was an excellent idea when it started. The encyclopedia that anyone can edit! But no more. I helped out with quite a lot of articles in the early days, but it's been going progressively downhill for the past few years. I really don't know how long it will be until I abandon it. --88.108.222.49 20:02, 17 August 2011 (UTC)[reply]

Hello, and thanks for posting. We wish we could've included anonymous users like you in the voting, but unfortunately that would have been very hard to implement technically, and it would make it easier for people to use sockpuppets to game the system. Just so you know though, this filter would be completely opt-in, both for anonymous users and for logged-in users. You wouldn't have to log in to remove anything, because it would only be there if you wanted it to be there. Both logged-in and logged-out users can choose what is filtered out for themselves. The only difference is that logged-in users will have their preferences saved for a longer period of time, because it's associated with their account, while logged-out users might have to update their settings periodically because it'll most likely be cookie-based. Cbrown1023 talk 20:09, 17 August 2011 (UTC)[reply]
this filter would be completely opt-in : IOW much drama is being expended on something that even if it is implemented will not so fuxored from its inception as to be useless. John lilburne 20:35, 17 August 2011 (UTC)[reply]
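A minimal Python sketch, with invented names, of the persistence difference Cbrown1023 describes above: a logged-in reader's choices live with the account, while a logged-out reader's choices would sit in a browser cookie that can expire or be cleared; this is not actual MediaWiki code.

ACCOUNT_PREFS = {"ExampleUser": {"hidden_labels": {"violence"}}}   # saved with the account

def hidden_labels(username=None, cookies=None):
    # Return the labels this reader opted to hide; an empty set means nothing is hidden.
    if username in ACCOUNT_PREFS:
        return ACCOUNT_PREFS[username]["hidden_labels"]
    if cookies and "imgfilter" in cookies:
        return set(cookies["imgfilter"].split("|"))                # short-lived, cookie-based
    return set()                                                   # opt-in: default is to hide nothing

print(hidden_labels(username="ExampleUser"))                       # {'violence'}
print(hidden_labels(cookies={"imgfilter": "nudity|violence"}))     # {'nudity', 'violence'}
print(hidden_labels())                                             # set()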

If I wanted to design something that worked...

To be clear, I'm against image filtering, I voted against this proposal, and I think the solution is that children and adults should learn not to be offended or overly affected by images. Children around the world have had to become accustomed to living in garbage dumps, being child soldiers, being child prostitutes - I don't think the sheltered child who runs across a naked lady or a dead animal on Wikipedia has much to complain about.

But I recognize that for certain people the risk that they, or more likely one of their children, would run into some images is disturbing, and limits their use of Wikipedia. Obviously I want them to have as much access as possible. Now if they want to cut out the images, clearly a professional service (as mandated in schools and libraries in the U.S. under w:CIPA) should provide a better answer than the general mass of Wikipedia editors who have varying beliefs. However, the basic principle of Wikipedia is that a group of interested amateurs collaborating together willingly should be able to challenge a commercial product. Thus, we could, if desired, work out a viable image hiding scheme - just not this one.

A workable software scheme to accomplish this would be as follows: the first individual user, beginning from scratch, would write down or copy the names of the images that offend him into a text file, preferably with wildcards, perhaps with a text comment field afterwards that would replace the image when displayed.

Lilith (John Collier painting).jpg I'll never get any work done...
Dianne_Feinstein_2010.jpg If I never see her face again it will be too soon
Dianne_Feinstein* That goes for all of them
Dianne*Feinstein* Missed http://en.wikipedia.org/wiki/File:DianneFeinstein.jpg

So User:First puts this in User:First/ImagePrefs, installs/activates the script, and the new program stops these named images and groups of images from appearing.
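As a sketch of what such a script might do (not Wnt's actual code), the following Python matches a file name against ImagePrefs-style entries; because file names can contain spaces, the sketch assumes a tab separates the pattern from the optional comment, which is my own assumption since the format above leaves the delimiter unspecified.

import fnmatch

imageprefs_text = (
    "Lilith (John Collier painting).jpg\tI'll never get any work done...\n"
    "Dianne_Feinstein_2010.jpg\tIf I never see her face again it will be too soon\n"
    "Dianne_Feinstein*\tThat goes for all of them\n"
    "Dianne*Feinstein*\tMissed DianneFeinstein.jpg\n"
)

def parse_prefs(text):
    # Each non-empty line is "<pattern><TAB><optional comment>".
    rules = []
    for line in text.splitlines():
        if line.strip():
            pattern, _, comment = line.partition("\t")
            rules.append((pattern.strip(), comment.strip()))
    return rules

def replacement_for(filename, rules):
    # Return the comment shown in place of a blocked image, or None to display it normally.
    for pattern, comment in rules:
        if fnmatch.fnmatch(filename, pattern):
            return comment or "(image hidden)"
    return None

rules = parse_prefs(imageprefs_text)
print(replacement_for("DianneFeinstein.jpg", rules))   # caught by the Dianne*Feinstein* wildcard
print(replacement_for("Mona_Lisa.jpg", rules))         # None: shown as usual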

Then User:Second decides he agrees with User:First, and has some ideas of his own, so he creates a file of his own at User:Second/ImagePrefs

{{User:First/ImagePrefs}}
John Boehner*
*John Boehner (though performance considerations might prevent initial *?)
Boehnerandstivers.jpg

Now eventually of course you have some editor who wants to provide services for all her co-religionists, so she goes over many files carefully and sets up a huge directory on her page, and several of her friends do likewise. They continuously prowl the earthiest entries and categories looking for new stuff to put in their directory, and all include each other as templates. Then anyone from their church starts an ImagePrefs file consisting solely of:

User:DefenderOfTheFaith/ImagePrefs/WikiPorn

and there they have it.

So, having laid down this idea, what would we need to do things effectively?

  • A good fast search through potentially hundreds of thousands of objected-to image names in a person's ImagePrefs, to find out quickly if the image to be displayed is one of them (a rough sketch of such a merge and lookup follows this list). Question: who takes the performance hit, the user's browser or the Wikipedia server?
  • A new parser function, {{#sort|duplicates=no}}, which can be applied to a whole ImagePrefs page and spit out every single entry in an alphabetized list minus duplicates.
  • A means to suppress display of the circular template errors, while of course not following any circles.
  • A way to tackle searches with wildcards - perhaps they could substitute in the list of hits, then attach a date parameter so they would only search files from newer dates on the next iteration?
  • A forum where Wikipedia editors can find people who block out images reliably according to their particular principles. But that can be provided elsewhere. Everything here is purely personal preferences, no admin intervention of any kind required.
  • If seeking to mandate this on children, a "user protection" feature might be invented. This would be a protection template applied to a page (other than the main user talk page) in user space which allows only a designated user (or admins) to edit the page - working exactly like the typical admin full-protection applied to articles, except that the designated user can still edit the page. Thus a parent who sets up a child's account would add a protection template to the ImagePrefs, allowing only the parent's own Wikipedia account, to which in theory the child doesn't have the password, to change the page. I'm not saying that's a good idea for us to get into though.
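Here is the rough sketch referred to in the first bullet above, under my own assumptions rather than anything in the proposal: it expands {{...}} transclusions of other ImagePrefs pages while silently ignoring circular references, removes duplicate entries by collecting them in a set, and keeps exact names apart from wildcards so the common case is a fast membership test; get_page_text() is a hypothetical stand-in for fetching a page's wikitext.

import fnmatch

def get_page_text(title, pages):
    # Hypothetical stand-in for fetching the wikitext of a user subpage.
    return pages.get(title, "")

def collect_entries(title, pages, seen=None):
    # Gather every entry reachable from one ImagePrefs page, skipping circular transclusions.
    seen = set() if seen is None else seen
    if title in seen:
        return set()
    seen.add(title)
    entries = set()                                   # a set, so duplicates disappear
    for line in get_page_text(title, pages).splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith("{{") and line.endswith("}}"):
            entries |= collect_entries(line[2:-2], pages, seen)
        else:
            entries.add(line.split("\t")[0])          # keep the pattern, drop any comment
    return entries

def is_blocked(filename, entries):
    exact = {e for e in entries if "*" not in e}      # fast membership test for plain names
    wildcards = [e for e in entries if "*" in e]      # the (hopefully few) wildcard patterns
    return filename in exact or any(fnmatch.fnmatch(filename, w) for w in wildcards)

# Tiny worked example with made-up pages, including a circular transclusion:
pages = {
    "User:First/ImagePrefs": "Dianne_Feinstein*\n{{User:Second/ImagePrefs}}",
    "User:Second/ImagePrefs": "Boehnerandstivers.jpg\n{{User:First/ImagePrefs}}",
}
entries = collect_entries("User:Second/ImagePrefs", pages)
print(is_blocked("Boehnerandstivers.jpg", entries))      # True
print(is_blocked("Dianne_Feinstein_2010.jpg", entries))  # True, via the wildcard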

Now I don't think of these things as any kind of priority for Wikimedia - eventually we'll see the poll results and we'll know for sure. But if the voters actually want to go forward with some image scheme, then this is one of many alternate ideas that I believe would work better than the currently proposed "consensus categorization" scheme. I've also proposed a method to work with outside organizations [9] and just a very simple idea to let users specify the image resolution for all thumbnails, overriding the Wiki article's settings, and make it something very small. I continue to urge WMF not to try to have Wikipedia classify images. Wnt 21:04, 17 August 2011 (UTC)[reply]