Wikipedia Censorship News

Wikipedia May Censor Images (171 comments)

Posted by Soulskill
from the good-thing-there's-no-other-nudity-on-the-internet dept.
KiloByte joins the ranks of accepted submitters, writing "To appease 'morality' watchdogs, Wikipedia is contemplating the introduction of a censorship feature, where images would be flagged for containing sexual references, nudity, 'mass graves,' and so on. At least in the initial implementation, it is supposed to be 'opt-in.' However, with such precedents as the UK censoring artistic nudity, Turkey censoring references to the Armenian genocide or China's stance on information about the Tiananmen massacre (note that any sensitive photos, like the Tank Man, are already absent!), I find it quite hard to believe this feature won't be mandatory for some groups of readers — whether it's thanks to an oppressive government, an ISP or a school."
This discussion has been archived. No new comments can be posted.

  • Today it is, anyway.

    Hey, anyone remember when banning users was solely an ISP decision, not a government mandate [arstechnica.com]? Boy, those were the days.

    • I remember working at my university's CS help desk, deleting email after email of Usenet abuse reports. There was some masterful trolling going on in those days and I was happy to have a position not only to be an audience to it, but to help it continue.

      To think future generations will be denied that by oppressive governments makes me sad.

      • So you are the one responsible for the demise of Usenet! :-)

      • "Freedom of speech doesn’t protect speech you like; it protects speech you don’t like." -- Larry Flynt

        "The worst thing about censorship is ###### ####."

        Also, the "junk filter" is as bad as DHTML routines on this page. No, really. It is. Seriously.

    • My main concern with implementing this, even if it always remains "opt-in", is that you're basically creating an API that censorship tools can exploit.

      It would be possible for a proxy or national firewall to redirect all requests for Wikipedia through a particular Wikipedia account where they had set the "hide obscenity" flag. So all users within that country, by default, wouldn't see the "offensive" stuff. The hard work of tagging, categorizing images, and rewriting the HTML would have been done by Wiki
      • Wikipedia's image filter would just hide images by default; you're still able to see them with just one click, at any time.

        This in no way helps oppressive governments. It is about a client-side cookie and that way the client can control everything at all times. (There's not even a way for a school to hide all images, since you can always override your filter settings by clicking on the image placeholder)

        If an evil government tried to filter images, they'd have to prevent pictures from actually being sent over the internet.
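The opt-in mechanism described here, a per-user preference that hides flagged images behind a click-through placeholder, can be sketched in a few lines. This is a minimal illustration under stated assumptions: the category names, filenames, and return values are hypothetical, not Wikipedia's actual design.

```python
# Hypothetical sketch of an opt-in image filter: flagged images are replaced
# by a placeholder that one click can always override. All names are invented.

FLAGGED = {
    "autopsy.jpg": {"medical"},
    "nude_study.jpg": {"nudity"},
    "tank_man.jpg": set(),  # unflagged: never hidden
}

def render_image(filename, hidden_categories, overridden=frozenset()):
    """Return 'image' or 'placeholder' for one image request.

    hidden_categories: categories the user chose to hide (e.g. from a cookie).
    overridden: images the user already clicked through this session.
    """
    tags = FLAGGED.get(filename, set())
    if tags & hidden_categories and filename not in overridden:
        return "placeholder"  # clicking the placeholder reveals the image
    return "image"
```

The property the commenter is pointing at is that the override lives with the client: any hidden image is one click away, regardless of the preference.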

        • I agree. This post seems to be an inflammatory 'chicken little' knee jerk reaction (phrased in keeping with the OP). It isn't censorship if you are doing it to yourself. More like a service for people who want to narrow their view.
          • by KiloByte (825081)

            You assume no one will force that cookie onto you -- or onto Wikipedia itself.

            • Yes, that assumption is common among those who lack the kind of paranoia necessary to turn a simple opt-in filtering system into a plot to forever prevent users from fapping to the fine-quality close-ups of vaginas Jimbo has assembled for our lonely pleasure.

              That this article got posted leads me to make three assumptions.

              1) You trolled Soulskill good. Now, Soulskill being a bit of a plank means that you don't get many points for this.

              2) Soulskill was hired to post paranoid speculative shit as "news". If th

          • by Hadlock (143607)

            On the flip side, this is enabling censorship groups (or whatever you want to label them). On a personal level, humans are generally good, but the whole of humanity can do some pretty horrific things. Censoring these ugly facts about ourselves certainly doesn't do anything to help bridge the wealth/income gap.

        • This in no way helps oppressive governments. It is about a client-side cookie

          The important thing is the database of "bad images" exists and requests for an image can be checked against the DB relatively easily (otherwise the blocking feature wouldn't work). All other details of how the blocking system works for opt-in users are irrelevant to those who plan to use it as a data source for censorship.

          If an evil government tried to filter images, they'd have to prevent pictures from actually being sent over the internet.

          Once the "evil government" has the ability to check an image against the list of "bad images" it is easy for them to build a proxy that blocks any image on the "bad images" list and redire

      • It's better to have a ReallyPedia than one that's censored, or needs opt-in or "I'm 18, show me the image". To get around the age of majority problem, it ought to be vetted for access. Images are, and history is, what it was. Those that fear history don't learn from it.

      • by RockDoctor (15477)

        My main concern with implementing this, even if it always remains "opt-in", is that you're basically creating an API that censorship tools can exploit.

        Pretty much what I was thinking when I read the "please become a beta-tester to help us develop this capability" email.

        Which I haven't answered yet. Because I'm thinking about it.

    • by JMJimmy (2036122)

      All this discussion is for naught.

      http://wikimediafoundation.org/wiki/Resolution:Controversial_content [wikimediafoundation.org]

    • Yeah, opt-in....

      BT, the main ISP in the UK, recently opened everyone's home wireless routers up to allow any BT customer free access to any other BT router. When I queried this I was told we'd "been opted in" to the scheme, but could opt out if we wanted to. So for BT at least, "opt-in" means anyone who hasn't specifically opted out, and you aren't allowed to opt out until you've already been opted in.
  • by Jack Malmostoso (899729) on Friday August 19, 2011 @09:26AM (#37141424)

    The way I understand the problem is that some articles show explicit pictures, which may offend some people. Honestly, I have sometimes run into pictures of illnesses or war crimes that upset me (granted, I have a very low threshold for these things).
    I don't see how it would be bad to hide these pictures by default, with a little button "view" next to the caption.
    Of course, if the goal is to delete these pictures altogether, then I'm all against it.

    • by Kvasio (127200) on Friday August 19, 2011 @09:47AM (#37141726)

      It may end the "Endless (human anus) image contention" dispute.

      Damn, this was the most entertaining section of wikipedia.
      ( http://en.wikipedia.org/wiki/Talk:Human_anus#Endless_image_contention [wikipedia.org] )

      • by Stargoat (658863) *

        Thank you so much. That is the funniest thing I have seen all week, maybe all year. Oh man, that is absolutely hilarious. Brilliant, thank you for sharing.

    • by Idbar (1034346)
      Double standard? OK, yes, I'm old already, so nudity and crime don't do much for me. But boy, one thing not in the summary that you bring up here is medical images! Some of that stuff can't be unseen!

      As much as I think that sex may not be appropriate for kids, I think many natural things are worse than that, like seeing the arm of a fire-ant victim. Disturbing stuff!

      So the thing is... where do you draw the line?
    • by Anonymous Coward

      The way I understand the problem is that some articles show explicit pictures, which may offend some people. Honestly, I have sometimes run into pictures of illnesses or war crimes that upset me (granted, I have a very low threshold for these things).

      Good, anyone who can see pictures like that without getting upset is most likely a psychopath.
      Censoring those pictures is, however, not the answer; stopping the actions depicted in them is.

      No matter how much you censor, bad things will still happen and covering your eyes and ears just because you don't want bad things to happen is irresponsible.

      • by dwillden (521345)
        This isn't just about morality or oversensitivity. Those images can be a problem if they pop up on your screen at work. I have legitimate reason to be utilizing Wikipedia at work, but the last thing I want is some obscene image deciding to display on my screen just as a supervisor or someone with nothing better to do than to complain walks by.

        This is actually a good idea. The image is still there and fully available, but it gives you a one click warning to decide if it's appropriate to have it open a
    • Re: (Score:1, Flamebait)

      by garcia (6573)

      Oh noes, you're offended by reality. Get over it. Just like everyone else. It's supposed to be an encyclopedia of FACTS. Fact is, war, famine, sex, drugs, and vulgarity are part of the life we live in. Should we start adding censors to articles on "Intelligent Design" because it's offensive to the people who know it's not factual?

      Slippery slope and one that we do not need to go down.

      • by jamesh (87723)

        Oh noes, you're offended by reality. Get over it. Just like everyone else. It's supposed to be an encyclopedia of FACTS. Fact is, war, famine, sex, drugs, and vulgarity are part of the life we live in. Should we start adding censors to articles on "Intelligent Design" because it's offensive to the people who know it's not factual?

        Getting upset or offended by stuff is human nature. I bet seeing pictures of the little girl ripped to pieces by the neighbor's pet dog in Melbourne the other day would be upsetting, maybe even more so if you are a parent (or maybe that's just me), and maybe even more so if you've ever been attacked by a dog yourself.

        People aren't robots, and whether rational by your standards or not, some of reality scares people. Don't you wish there was a pixelation mask over goatse with a caption "wash your eyes after cl

    • I really don't know how else to say it, but seriously - real life is horrid sometimes. IMHO, if you want to know about the subject, you should have to see it, unvarnished and as it really is. If you can get a grasp as to how horrible something truly is, maybe it'll motivate you to help try and prevent it from happening - be it a disease or a war crime. I could understand not having it shoved into your face when you're not looking it up, but at the same time... you're looking it up. If you want to know about

    • Life is full of nasty things. We're a coddled civilization, and hence a rather weak and pathetic one.

    • I support it -- as long as a third party does it. It could be done browser-side with GreaseMonkey. Someone can host a mirrored censored Wikipedia. Multiple people can host censored Wikipedias. Different groups can censor whatever they don't want their members to see. For example, a fundamentalist Christian/Muslim Wikipedia could remove all references to biological evolution and astrophysics (the whole big bang thing.)

      A system whose purpose is to curate what amounts to a record of the entire record of kno
  • by tepples (727027) <tepples AT gmail DOT com> on Friday August 19, 2011 @09:29AM (#37141458) Homepage Journal
    Letting people opt-in to censor images in Category:All non-free media [wikipedia.org] has been discussed for a while, if I remember correctly.
  • by drinkypoo (153816) <martin.espinoza@gmail.com> on Friday August 19, 2011 @09:30AM (#37141468) Homepage Journal

    Censorship is teh evil!!!

    But it's better to get SOME content than NO content...

    But censorship, it's evil!

    We go through this with Google and China every so often. What I worry about isn't having stuff blocked, with a nice big notice that something was removed, but about having content replaced, so that when you go look at stuff about Chinese unrest from the USA it says "chinese know all about this stuff" and when you look it up from China it says "everything is wonderful".

    • by mirshafie (1029876) on Friday August 19, 2011 @09:50AM (#37141782)

      Choice is not censorship. As far as I can tell, Wikimedia is considering adding the option for users to block images that have been flagged as potentially offensive. Since Wikipedia covers many aspects of humanity, some of them scary, it makes sense to enable users to filter some of the more graphic aspects of this. Remember, the articles themselves will not be blocked. I think this would make Wikipedia more useful for kids who might not have the tools to deal with looking straight into another person's guts just because they're reading up on surgery.

      Many other sites, such as DeviantArt, block nudity by default, and to view it you must register an account and turn the filter off. Even though this is opt-out and a bit extreme, calling the practice censorship is ridiculous.

      • by RJFerret (1279530)

        Fixed this for ya':

        Choice is not currently censorship. As far as I can tell, Wikimedia is considering adding a future requirement for users to block images that have been flagged as potentially offensive. Since Wikipedia covers many aspects of humanity, some of them scary to the ignorant, it makes sense to enable ignorant users to filter some of the more graphic aspects of this. Remember, the articles themselves will not be blocked for now. I think this would make Wikipedia more useful for adults who might not have the tools to deal with looking straight into another person's guts just because they're reading up on exactly that.

        Many other sites, such as DeviantArt, block nudity by default, and to view it you must register an account and turn the filter off. Even though this is opt-out and a bit extreme, calling the practice censorship is predictive.

        Children usually don't mind seeing new things, until they are taught by adults that the imagery/context is "dirty" or bad from an adult reaction to seeing it.

        Although some are saying let's not call this a "slippery slope" yet, when do you? History has shown as soon as tools such as this are created, a government or organization somewhere requires their application. This is true to such a degree that a local library often has patrons shocked that their internet computers are not filtere

    • by stms (1132653)

      It's okay I bet the Hovering Chinese Guys [blippitt.com] are already working on fixing any fake information.

  • may? (Score:4, Informative)

    by rbrausse (1319883) on Friday August 19, 2011 @09:31AM (#37141496)

    the referendum was not about "should we add a filter" but "how should a filter be implemented".

    the resolution "controversial content" [wikimedia.org] was approved 10:0 in May 2010

    We ask the Executive Director, in consultation with the community, to develop and implement a personal image hiding feature that will enable readers to easily hide images hosted on the projects that they do not wish to view

    The foundation wants a filter, the community has no way to stop such a feature.

    • The Wikipedia foundation is totally corrupted as the Open XML [wikipedia.org] article has demonstrated. Furthermore, the Wikipedia is completely biased towards the United States. When you have an article on a 19th-century parlor song from France, the Wikipedia article will concentrate on the American entertainer who covered it in the '40s. Now the Thai King gets control over our speech, censors pictures for us, and it's so easy for the rogues because you could actually corrupt Wikipedia admins and they would put your enemi
      • Furthermore the Wikipedia is completely biased towards the United States.

        Are you talking about Wikipedia in general, or the English Wikipedia specifically? Wikipedia acknowledges its systemic bias [wikipedia.org] as a problem. The bias you speak of is caused by the tendency of people to contribute to the Wikipedia for their native language and to cite sources written in their native language. And by far, the biggest concentration of English speakers on the Internet is in the United States, and sources written in English tend to cover the views of people in anglophone countries more than others.

    • Actually, nothing will get done unless someone trawls through the images and decides which to filter. This is the community. And there will be fights over borderline cases.

      The "community" is very much in charge, here.

    • by lamber45 (658956)
      Actually, the first question on the referendum is basically "how important would this feature be?" If you are eligible to vote and you are against the feature, you can certainly vote "0" on that item, "?" on the rest, and include any specific comments in the free-text field at the end. Enough "0" votes and the board may decide to postpone or rethink the feature.
  • by TheSpoom (715771) <slashdot@uberm00. n e t> on Friday August 19, 2011 @09:34AM (#37141534) Homepage Journal

    I got the email for the referendum. Let's not say "OMG slippery slope!" quite yet, ok? If this continues to be voluntary, I have absolutely no problem with it. I won't personally turn it on because very little offends me, but if someone else doesn't want to view pictures of genetalia on their respective articles, I can understand that.

    • by KiloByte (825081)

      The problem is that the moment such a filter is implemented, organizations that already use other methods to get rid of images they don't approve of will demand that such a filter be enabled on their whim, without allowing people to opt out.

      • Wikipedia's image filter would just hide images by default; you're still able to see them with just one click, at any time.

        This in no way helps oppressive governments. It is about a client-side cookie and that way the client can control everything at all times. (There's not even a way for a school to hide all images, since you can always override your filter settings by clicking on the image placeholder)

        If an evil government tried to filter images, they'd have to prevent pictures from actually being sent over the internet.

      • And Wikipedia can still tell them to fuck off.

      • by kellyb9 (954229)

        The problem is that the moment such a filter is implemented, organizations that already use other methods to get rid of images they don't approve of will demand that such a filter be enabled on their whim, without allowing people to opt out.

        As long as the organizations you are referring to are not ISPs, then I'm fine with this too. If you're on someone else's equipment (e.g., work), you are subject to their rules.

    • Let's not say "OMG slippery slope!" quite yet, ok?

      What, you want a voluntary filter as well?

    • ...but if someone else doesn't want to view pictures of genetalia on their respective articles, I can understand that.

      Especially the children of Gene Talia.

  • by Sparklepony (1088131) on Friday August 19, 2011 @09:35AM (#37141542)
    Just thought I should point out it's not China that's responsible for the Tank Man photo being missing from the Tiananmen Square massacre article. It's good old western copyright law. The Tank Man photo is copyrighted and not freely licensed so Wikipedia can only include it as fair use. Fair use on Wikipedia is held to very strict standards; fair use images can only be used on articles where the image is otherwise indispensable. So you can find it over at Tank Man [wikipedia.org], which is specifically about the photo.
    • I wish accuracy was held to very strict standards.
      • by tepples (727027)
        Wikipedia is only as accurate as the scholarly and mainstream media sources that each article cites. Feel free to remove anything inaccurate that lacks a citation, as long as you mention in the edit summary that what you remove lacks a citation. If your problem is with inaccuracies in the sources themselves, as I've been told is the case with Wikipedia's article about PSP homebrew, then I guess that's a fundamental problem with Wikipedia policy.
    • by iamhassi (659463)
      I was about to point out that "Tank Man" is supposedly censored due to "China's stance on information" but somehow the execution painting [wikipedia.org] is not censored from the article. [wikipedia.org]

      Yes, I know the artist claims the painting is not a representation of what actually happened at Tiananmen (it's just "inspired" by Tiananmen), but it's pretty clear the men are being executed, I don't care if they're laughing or not. That artist is incredibly brave.
    • by Anonymous Coward

      http://en.wikipedia.org/wiki/Tank_Man

      Hey look...the photo.

  • NSFW (Score:4, Interesting)

    by aahpandasrun (948239) on Friday August 19, 2011 @09:41AM (#37141634)
    Wikipedia definitely needs a NSFW or Not Safe for School feature at least, unless it's hidden somewhere. Certain articles like defecation go a little over the top.
  • Bottom line, folks (Score:1, Insightful)

    by Anonymous Coward

    Censorship needs a foundation of classification and identification.

    Censorship cannot work without differentiation.

    Traditionally, governments have employed armies of censors to root out unapproved media and identify it for further control, whether that be by name, URL, or another identifier.

    Wikipedia's tagging of potentially offensive media is like a crowdsourced censorship bureau.

    Imagine if all images had EXIF fields of "controversial" and "pornographic." Totalitarian regimes would block all image requests

  • This isn't "censorship" or "blocking" of images. If you read the text and look at the mockup, this is an opt-in feature to keep a person from accidentally viewing controversial content when simply clicking around on Wikipedia. There's a "show content" button right where the photo would normally be! Nothing is being kept from anybody. Say you're on break at work and you run across a word you don't know. If you type that word into Wikipedia and it ends up being some sort of genital mutilation or something, y
  • by ediron2 (246908) * on Friday August 19, 2011 @09:47AM (#37141740) Journal

    I took several minutes to read this 2 days ago when I first saw the news (2 days... slashdot, what's happened to you?) and it actually looked damned uncontroversial and careful.

    First, I'd say calling this censorship is a red herring.

    Censorship = removal of information without recourse or alternative.

    Opt-in filtering = giving parents and the squeamish a way to preemptively hide images, with user-controlled overrides.

    The categories sought for filtering are also intended to be peer-managed within Wikipedia, which should prevent this from becoming a tool for governmental / corporate / ISP censorship. IOW, if users guide the categorization of data (tagging images as sexually explicit, violent, etc) then a gov/corp/ISP can't 'sneak in' the censorship of an article on Turkey, Israel, Net Neutrality, Codomo, China-vs-Taiwan, China-vs-Tibet, Egyptian unrest or whatever.

    The call for comments generated by Wiki* also discussed their desire to make whatever they do overridable.

    (disclaimer: I think I've edited wiki* a few dozen times, but doubt it was anything censor-worthy).

    • by KiloByte (825081)

      Even in this very comment you already included an example of someone opting a third party in.

    • by mazarin5 (309432)

      2 days... slashdot, what's happened to you?

      I know! I love how quick they've become too!

  • I'm thinking this should be sort of like the spoiler text used on many image boards, where initially all you see is black, but if you hover over it you can see what it actually is. I think this would work perfectly for Wikipedia; if you didn't want to see it, you didn't have to. This way it means it would be censored for anyone who didn't want to see it, and anyone who does want to see it would just have to hover over it with their mouse and it would become visible.

  • by sqrt(2) (786011) on Friday August 19, 2011 @09:49AM (#37141758) Journal

    Censorship is always more offensive than the material being censored.

    Those who can understand this are holders of a higher ethic, and it is no bad thing to force these standards on those who have yet to be elevated.

    • by bws111 (1216812)

      Do you also oppose adblock and such tools? If not, why? After all, they are 'censoring' in exactly the same way this is.

      • by KiloByte (825081)

        AdBlock is done on the client side, this means the government can't issue a law or make a deal with Wikipedia to force it upon you. They'd have to disable SSL access and run man-in-the-middle proxies on all traffic.

        • by bws111 (1216812)

          So what? All this is, is a tool to allow people to control what they do or do not want to see. That is a perfectly legitimate thing. The fact that maybe some government could abuse the tool is completely immaterial. Ever since man first sharpened a stick every tool created could potentially be abused by someone. If that happens, complain about the abuse and the abusers, not the tool.

  • by iteyoidar (972700) on Friday August 19, 2011 @09:51AM (#37141792)
    I don't want to see hi-res photos of Wikipedia editors' genitalia or nasty skin diseases at the top of an article (when an illustration would suffice) for the same reason I don't want Wikipedia to change over to magenta text on a lime-green background. There's an issue of aesthetics and readability here.
    • Since stylized depictions of the editors' genitalia or skin diseases would not be horrifying, they would also clearly not be accurate. Don't see why Wikipedia should design levels of filters to obscure knowledge when users can just instruct their web browsers not to load images from the site.
  • FFS. (Score:5, Insightful)

    by McDutchie (151611) on Friday August 19, 2011 @10:07AM (#37142008) Homepage
    Cut it out with the reactionary rhetoric already. It's an opt-in filter that allows people who so choose to read about "controversial" subjects without being confronted with graphic images of hardcore blood, gore, pornography, etc. - and there will be categories of filters, so it may even allow Muslims to read about their prophet without having to see depictions of him, without depriving others of access to those images. This seems like a good thing.
    • by djdanlib (732853)

      Good thing, until lawyers take it this far:

      Your Honor, the client's defense crumbles in light of this new evidence. He clearly enjoys seeing photographs of lewd nature. If he did not, why would he have set the cookie on Wikipedia that allowed them to be viewed on his screen?

      (totally ignoring the possibility that maybe the guy blocked images of dismemberment while viewing an article about war one time, and didn't block other things because they weren't onscreen)

      It's a little chilling. I would prefer to be fo

    • I object to using Wikimedia Foundation funds to develop and implement what should be a commercial operation. Sure, no one would object to choice, but also no one should feel entitled to be handed the entire Web nicely categorized into "naughty" and "nice" without compensation. Please feel free to host abridged or censored mirrors of Wikipedia - as long as you don't demand Wikimedia Foundation to fund it.
  • This is not bad, because the system, as proposed, is basically going to be just an improved and integrated reimplementation of an already existing feature.

    You may have heard of it. It's called "AdBlock Plus".

    That's essentially what the Wikipedia community has been telling people to use. Offended by pictures of prophets? Ask your local friendly religious WikiProject if they have a handy ABP list or user CSS file for you to use. Offended by sexual content? Yeah, it's a lot of blocking, but it can be done.

    All

  • About 10 years ago. Good riddance to them and their power mad administrators.
  • If pictures of the topic offended them, why were they on the topic in the first place? If you don't want to see pictures of vaginas, maybe you shouldn't look up vaginas?
    • If pictures of the topic offended them, why were they on the topic in the first place? If you don't want to see pictures of vaginas, maybe you shouldn't look up vaginas?

      Indeed. Because clearly there is no better way of teaching someone biology* than to simply show them a bunch of pictures of different people's vaginas. </sarcasm>

      The main problem is that even though medical illustrations are both prevalent and often better at highlighting small details (artists can control contrast of areas very easily to show off such details), there are many exhibitionists out there who add self-made images just because they can. Most people don't post these images for their encyclo

  • Be really blunt about it to chase off the sensitive fucks who start this shit in the first place. The way to react to PC bullshit is with scorn and open hatred because courtesy is wasted on such people.

    "This site may offend you. If it is possible to offend you then go the fuck away and never come back because no one here needs you as a viewer."

    There really ARE valid reasons for harsh netiquette.

  • The Tank Man is still there on the wiki page linked in the summary. A simple click-through would have revealed that. While I think this would be awful, and make Wikipedia even less desirable a destination, it is rather in line with their "notability" censorship. That said, no need to inflame the issue by claiming the article on the Tiananmen massacre has already had the Tank Man picture removed.
  • While I think Wikipedia has strayed greatly from its original goals and principles, I thought one of the most important ones was maintaining a "neutral" point-of-view in articles. How is marking certain images as "offensive" showing neutrality? If an image is illustrative to the content of an article and it is legal to be used, then whether or not some people find it offensive ought to have no bearing on its inclusion. I think that the inevitable debates over whether or not an image is offensive will ser

  • They are giving you the option to censor images. I think that's a fairly large distinction.
  • How about not show the image until someone clicks on something that makes it visible? This way, if someone goes to the page on "ejaculation", they don't, by default, see an animated gif of some dude ejaculating. But if they REALLY WANT TO, they can click to view it. Either way, they still have access to the text that describes the biology.

    But maybe I'm missing the point.

  • They should have a "notability filter": instead of deleting so-called "non-notable" articles, they get added to the filter so that deletionists can see the nice, clean, austere Wikipedia they've always dreamed of, while the rest of us get the real thing.

"Floggings will continue until morale improves." -- anonymous flyer being distributed at Exxon USA

Working...