Wikipedia May Censor Images

KiloByte joins the ranks of accepted submitters, writing "To appease 'morality' watchdogs, Wikipedia is contemplating the introduction of a censorship feature, where images would be flagged for containing sexual references, nudity, 'mass graves,' and so on. At least in the initial implementation, it is supposed to be 'opt-in.' However, with such precedents as the UK censoring artistic nudity, Turkey censoring references to the Armenian genocide or China's stance on information about the Tiananmen massacre (note that any sensitive photos, like the Tank Man, are already absent!), I find it quite hard to believe this feature won't be mandatory for some groups of readers — whether it's thanks to an oppressive government, an ISP or a school."
  • by Jack Malmostoso ( 899729 ) on Friday August 19, 2011 @09:26AM (#37141424)

    The way I understand the problem, some articles show explicit pictures that may offend some people. Honestly, I have sometimes come across pictures of illnesses or war crimes that upset me (granted, I have a very low threshold for these things).
    I don't see how it would be bad to hide these pictures by default, with a little "view" button next to the caption.
    Of course, if the goal is to delete these pictures altogether, then I'm entirely against it.
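    The opt-in behavior described here (collapse by default, reveal on click, never delete) can be sketched in a few lines. This is a hypothetical illustration; the type names, categories, and fields are invented and are not Wikimedia's actual schema:

    ```typescript
    // Hypothetical sketch: decide whether an image renders inline or is
    // collapsed behind a "view" button, based on opt-in user preferences.

    type Category = "sexual" | "violence" | "medical";

    interface ImageInfo {
      url: string;
      categories: Category[]; // crowd-assigned content tags
    }

    interface UserPrefs {
      hiddenCategories: Set<Category>; // empty unless the user opted in
      revealed: Set<string>;           // images the user clicked "view" on
    }

    // True => render inline; false => collapse behind a "view" button.
    // The image itself is never removed, only hidden until requested.
    function showInline(img: ImageInfo, prefs: UserPrefs): boolean {
      if (prefs.revealed.has(img.url)) return true; // user override wins
      return !img.categories.some(c => prefs.hiddenCategories.has(c));
    }
    ```

    Because `hiddenCategories` starts empty, users who never opt in see no change at all, and a click on "view" always overrides the filter.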

  • Bottom line, folks (Score:1, Insightful)

    by Anonymous Coward on Friday August 19, 2011 @09:43AM (#37141662)

    Censorship needs a foundation of classification and identification.

    Censorship cannot work without differentiation.

    Traditionally, governments have employed armies of censors to root out unapproved media and identify it for further control, whether that be by name, URL, or another identifier.

    Wikipedia's tagging of potentially offensive media is like a crowdsourced censorship bureau.

    Imagine if all images had EXIF fields of "controversial" and "pornographic." Totalitarian regimes would block all image requests so flagged.

    We do not want to crowdsource the work of the censors.
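    The worry above can be made concrete: once content tags exist in machine-readable form, a third party can repurpose them wholesale, with no user override, in essentially one line. A hypothetical sketch (field names invented for illustration):

    ```typescript
    // Hypothetical sketch: a third party (government, ISP, school proxy)
    // reusing crowd-sourced content tags to block images outright.

    interface TaggedImage {
      url: string;
      flags: string[]; // content tags contributed by the crowd
    }

    // An opt-in filter becomes mandatory censorship the moment someone
    // upstream applies it for you: every flagged image is simply dropped.
    function blockFlagged(images: TaggedImage[], banned: Set<string>): TaggedImage[] {
      return images.filter(img => !img.flags.some(f => banned.has(f)));
    }
    ```

    The classification work is the hard part; the blocking itself is trivial once the tags exist.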

  • by Kvasio ( 127200 ) on Friday August 19, 2011 @09:47AM (#37141726)

    It may end the "Endless (human anus) image contention" dispute.

    Damn, this was the most entertaining section of Wikipedia.
    ( http://en.wikipedia.org/wiki/Talk:Human_anus#Endless_image_contention [wikipedia.org] )

  • by ediron2 ( 246908 ) * on Friday August 19, 2011 @09:47AM (#37141740) Journal

    I took several minutes to read this 2 days ago when I first saw the news (2 days... slashdot, what's happened to you?) and it actually looked damned uncontroversial and careful.

    First, I'd say calling this censorship is a red herring.

    Censorship = removal of information without recourse or alternative.

    Opt-in filtering = giving parents and the squeamish a way to preemptively hide images, with user-controlled overrides.

    The categories sought for filtering are also intended to be peer-managed within Wikipedia, which should keep this from becoming a tool for governmental / corporate / ISP censorship. IOW, if users guide the categorization of data (tagging images as sexually explicit, violent, etc.), then a gov/corp/ISP can't 'sneak in' the censorship of an article on Turkey, Israel, Net Neutrality, Codomo, China-vs-Taiwan, China-vs-Tibet, Egyptian unrest or whatever.

    The call for comments generated by Wiki* also discussed their desire to make whatever they do overridable.

    (disclaimer: I think I've edited wiki* a few dozen times, but doubt it was anything censor-worthy).

  • by mirshafie ( 1029876 ) on Friday August 19, 2011 @09:50AM (#37141782)

    Choice is not censorship. As far as I can tell, Wikimedia is considering adding an option for users to block images that have been flagged as potentially offensive. Since Wikipedia covers many aspects of humanity, some of them scary, it makes sense to let users filter some of the more graphic material. Remember, the articles themselves will not be blocked. I think this would make Wikipedia more useful for kids who might not have the tools to deal with looking straight into another person's guts just because they're reading up on surgery.

    Many other sites, such as DeviantArt, block nudity by default, and to view it you must register an account and turn the filter off. Even though this is opt-out and a bit extreme, calling the practice censorship is ridiculous.

  • by iteyoidar ( 972700 ) on Friday August 19, 2011 @09:51AM (#37141792)
    I don't want to see hi-res photos of Wikipedia editors' genitalia or nasty skin diseases at the top of an article (when an illustration would suffice), for the same reason I don't want Wikipedia to change over to magenta text on a lime-green background. There's an issue of aesthetics and readability here.
  • FFS. (Score:5, Insightful)

    by McDutchie ( 151611 ) on Friday August 19, 2011 @10:07AM (#37142008) Homepage
    Cut it out with the reactionary rhetoric already. It's an opt-in filter that allows people who so choose to read about "controversial" subjects without being confronted with graphic images of blood, gore, pornography, etc. And there will be categories of filters, so it may even allow Muslims to read about their prophet without having to see depictions of him, while not depriving others of access to those images. This seems like a good thing.
