Rupert Murdoch Says Google Is Stealing His Content

Hugh Pickens writes: "Weston Kosova writes in Newsweek that Rupert Murdoch gave an impassioned speech to media executives in Beijing decrying that search engines — in particular Google — are stealing from him, because Google links to his stories but doesn't pay News Corp. to do so. 'The aggregators and plagiarists will soon have to pay a price for the co-opting of our content,' Murdoch says. 'But if we do not take advantage of the current movement toward paid content, it will be the content creators — the people in this hall — who will pay the ultimate price and the content kleptomaniacs who triumph.' But if Murdoch really thinks Google is stealing from him, and if he really wants Google to stop driving all those readers to his Web sites at no charge, he can simply stop Google from linking to his news stories by going to his Web sites' robots.txt files and adding 'Disallow.'"
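The opt-out the summary describes really is a two-line file, and compliant crawlers check it before fetching anything. A minimal sketch using Python's stdlib urllib.robotparser (the story path is made up for illustration):

```python
# Sketch of the robots exclusion mechanism the summary refers to:
# a two-line robots.txt is enough to tell Google's crawler to stay out.
import urllib.robotparser

rules = [
    "User-agent: Googlebot",
    "Disallow: /",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# Googlebot is shut out; any other crawler is still welcome.
print(rp.can_fetch("Googlebot", "/news/story.html"))     # False
print(rp.can_fetch("SomeOtherBot", "/news/story.html"))  # True
```

This is exactly the check a well-behaved crawler performs before requesting a page, which is why a single Disallow rule suffices to opt out of Google's index.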
  • Re:Dear Mr Murdoch (Score:5, Informative)

    by SEAL ( 88488 ) on Saturday October 10, 2009 @08:51PM (#29707545)

    Or make your site subscription-based. Of course you might want to talk with the guys over at Slate first to see how well that works out...

  • by Anonymous Coward on Saturday October 10, 2009 @08:53PM (#29707557)

    http://www.newscorp.com/robots.txt:
    User-Agent: *
    Disallow:

    Hmm, so they have heard of robots.txt and already made the decision not to restrict any search engines...

  • by Anonymous Coward on Saturday October 10, 2009 @08:57PM (#29707579)

    User-agent: *
    Disallow: /printer_friendly_story
    Disallow: /projects/livestream
    #
    User-agent: gsa-crawler
    Allow: /printer_friendly_story
    Allow: /google_search_index.xml
    Allow: /google_news_index.xml
    Allow: /*.xml.gz
    #
    Sitemap: http://www.foxnews.com/google_search_index.xml
    Sitemap: http://www.foxnews.com/google_news_index.xml

  • by schon ( 31600 ) on Saturday October 10, 2009 @09:03PM (#29707633)

    Technically he is right.

    No, he isn't.

    And Google really do take without providing anything back.

    Bullshit. As the summary stated: if Newscorp really was the victim here, they'd implement a robots.txt file telling Google to go away.

    The problem is that if Google went away, Newscorp would lose business.

    The rest of your post is even more idiotic than your first two sentences. (Come on, legal theft? If it was theft, it wouldn't be legal, asshat.)

    You have every choice not to deal with them. It's perfectly possible to do without - there are other search engines, other webmail providers, other banner networks. If you have a website, you can even exclude them in your robots.txt if you want.

  • by Chaos Incarnate ( 772793 ) on Saturday October 10, 2009 @09:11PM (#29707683) Homepage
    Not only that, but the one on foxnews.com provides Google sitemaps.
  • Re:Right ... (Score:5, Informative)

    by BKX ( 5066 ) on Saturday October 10, 2009 @09:19PM (#29707737) Journal

    Dude, go back to school. 0.000001 only has one sigfig. 1.000000 has seven. 0.000001000000 has seven also.
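The counting rule being applied here (leading zeros are placeholders and never count; trailing zeros after a decimal point do count) can be sketched in a couple of lines of Python. This toy handles only plain decimal strings like the ones above, not scientific notation or bare integers with trailing zeros:

```python
# Toy significant-figure counter for plain decimal strings.
def sig_figs(s: str) -> int:
    digits = s.replace(".", "")      # remove the decimal point
    return len(digits.lstrip("0"))   # leading zeros are not significant

print(sig_figs("0.000001"))        # 1
print(sig_figs("1.000000"))        # 7
print(sig_figs("0.000001000000"))  # 7
```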

  • Re:Dear Mr Murdoch (Score:2, Informative)

    by stumblingblock ( 409645 ) on Saturday October 10, 2009 @09:34PM (#29707847)

    Quite simply, Mr Murdoch wants some of
    Google's money. His business admirers agree (applause).

  • Re:dear Rupert, (Score:5, Informative)

    by tagno25 ( 1518033 ) on Saturday October 10, 2009 @09:42PM (#29707901)
    And here is foxnews.com's robots.txt:

    User-agent: *
    Disallow: /printer_friendly_story
    Disallow: /projects/livestream
    #
    User-agent: gsa-crawler
    Allow: /printer_friendly_story
    Allow: /google_search_index.xml
    Allow: /google_news_index.xml
    Allow: /*.xml.gz
    #
    Sitemap: http://www.foxnews.com/google_search_index.xml [foxnews.com]
    Sitemap: http://www.foxnews.com/google_news_index.xml [foxnews.com]

    Notice the sitemap section: they are directly telling Google what news they have.

  • by Pax681 ( 1002592 ) on Saturday October 10, 2009 @10:22PM (#29708139)
    Eddie Shah got that ball rolling with the union busting.

    He was the first to invoke Thatcher's anti-union laws, and he also started the modernisation of Fleet Street printing.
    http://en.wikipedia.org/wiki/Eddy_Shah [wikipedia.org]

    http://en.wikipedia.org/wiki/Today_(UK_newspaper) [wikipedia.org]
  • by canajin56 ( 660655 ) on Saturday October 10, 2009 @10:26PM (#29708155)
    One should also note that not only does Newscorp NOT turn away Google spiders with robots.txt, they actually redirect them to Google-specific pages to reduce bandwidth and make it easier to parse.
  • by GaryPatterson ( 852699 ) on Saturday October 10, 2009 @11:14PM (#29708363)

    Murdoch is not an Australian - he gave up his citizenship as soon as it hindered his US interests.

    He's as American as any other immigrant.

    On behalf of Australians everywhere, I'm sorry that he's your problem now.

  • Re:Dear Mr Murdoch (Score:2, Informative)

    by mysidia ( 191772 ) on Saturday October 10, 2009 @11:57PM (#29708539)

    META is not recommended primarily because it's a lost cause/wasted effort, since most robots (even legitimate ones) don't understand it.

    robots.txt is part of the robots exclusion standard, which has been around since 1994 and should be implemented by all legitimate robots.

    Plus, if you want to pick on Google specifically, you can list their user agent in your robots.txt

    User-Agent: GoogleBot
    Disallow: /

    Or if feeling evil

    User-Agent: *
    Disallow: /
    User-Agent: GoogleBot
    Crawl-Delay: 51840000.0
    Disallow:
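If you want to see how the "evil" version reads to a compliant crawler, Python's stdlib urllib.robotparser can parse it directly. (Note: the stdlib parser only honours integer Crawl-Delay values, so this sketch uses 51840000 without the trailing .0.)

```python
# Parse the "evil" robots.txt: everyone is disallowed except GoogleBot,
# who is allowed everything but asked to wait 51,840,000 seconds
# (roughly 600 days) between fetches.
import urllib.robotparser

lines = [
    "User-Agent: *",
    "Disallow: /",
    "User-Agent: GoogleBot",
    "Crawl-Delay: 51840000",
    "Disallow:",   # an empty Disallow means "allow everything"
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(lines)

print(rp.can_fetch("GoogleBot", "/any/page"))  # True
print(rp.can_fetch("OtherBot", "/any/page"))   # False
print(rp.crawl_delay("GoogleBot"))             # 51840000
```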

  • by dbIII ( 701233 ) on Sunday October 11, 2009 @03:41AM (#29709371)
    After reading a bit of it I don't think that would help.
    It's not the paragraphs that are the problem (Melville got away with that); the content screams second language over an iPhone with the "ten random cliches" site open and a large bottle of Irish whiskey.
    After seeing [sic] after one of the few correctly spelt words I was amused, not that spelling really matters much anyway on a forum like this.
    Sorry to be an annoying nitpicking bastard to new500 but the "freetard" insult is as annoying here as driving two herds of pigs into a synagogue and mosque at the same time while doing something unspeakable to a statue of the Virgin Mary.
  • Re:Real problem (Score:4, Informative)

    by SEE ( 7681 ) on Sunday October 11, 2009 @03:55AM (#29709427) Homepage

    No, this isn't anywhere near as simple as just using robots.txt to deter Google from indexing.

    Sure it is. If Google's spider is blocked from indexing "Murdoch" content by robots.txt, it's also blocked from caching any "Murdoch" content, the "Murdoch" headline never shows up on Google News, and there isn't any "Murdoch" text appearing to let me know if I really want to look at the whole article.

    Murdoch has, in fact, deliberately made content available for free by and through Google. Before Murdoch took over the Wall Street Journal, all Wall Street Journal news content could not be accessed by Google News, and could not be obtained by using Google News or a Google cache. You could only get WSJ content by going to the WSJ site. After Murdoch took over the Wall Street Journal, all Wall Street Journal content was made accessible to Google News. Furthermore, the WSJ paywall was deliberately lowered to allow people to read articles on the WSJ site for free if they follow a link to the article from Google News.

    Murdoch isn't letting Google access this content by accident or through ignorance. He has actively chosen to make this content available by and through Google. He can undo that any time he chooses, for any of his sites.

  • Re:Right ... (Score:3, Informative)

    by thetoadwarrior ( 1268702 ) on Sunday October 11, 2009 @06:10AM (#29709893) Homepage
    Yeah because Murdoch knows he panders to poor inbred mouth breathers and there is no money to be made from racist douche bags in a trailer park.

    Perhaps his new companies should try raising the bar on their quality rather than asking Google to fund their half assed "journalism".
  • Re:Dear Mr Murdoch (Score:4, Informative)

    by xaxa ( 988988 ) on Sunday October 11, 2009 @07:08AM (#29710121)

    Precisely because it's expensive to send out correspondents to do real reporting, big media has stopped doing it.

    In the last couple of months hundreds of adverts have appeared in London (mostly on the Underground) for the Times, saying how they have lots of science correspondents. Although, having just searched Google for one to check I remembered it correctly, I'm no longer as impressed [theregister.co.uk].

  • Re:Dear Mr Murdoch (Score:3, Informative)

    by xaxa ( 988988 ) on Sunday October 11, 2009 @07:11AM (#29710133)

    Incidentally, the BBC had a reporter in Iran -- at least until he was expelled [guardian.co.uk], I don't know what they have there now.

  • Re:A simple solution (Score:3, Informative)

    by Wildclaw ( 15718 ) on Sunday October 11, 2009 @07:20AM (#29710163)

    socialist

    I don't think that word means what you think it does. The socialist solution would be to create a public newspaper, or in the case of extreme socialism, to confiscate private newspaper companies. But you rarely hear about socialist solutions nowadays, because there are hardly any real socialists in politics. Instead it is all about the government hiring private contractors, or the government paying money to private companies so they can build infrastructure. Or the government selling its property to private owners. There is nothing socialist at all about it.

    Where is the public or worker ownership? It simply isn't there. The public and working class ownership has been going downhill for 30 years in pretty much all western countries. In fact, governments and worker class people are mostly in debt nowadays. Anyone calling that socialist has been listening too much to Fox News. It is simply modern credit and banana republic capitalism intertwined.

    And beyond ownership, let's look at salaries and taxes. A socialistic system aims to ensure that individuals get compensated roughly based on the amount of labor they put in. The quality of the labor is of secondary consideration, and while it can be used as an incentive it should be kept in control. What that means in practice is a tax system with high top margin taxes, to ensure that a single individual doesn't greedily grab everything. Again, those margin taxes have been completely negated in the last 30 years, in a strong anti-socialist movement.

    Of course, most people are gullible and think social welfare is socialism. It isn't. Social welfare is simply a way to keep a dysfunctional society (a society with huge wage differences and high unemployment) under control. As for wealth redistribution in general, it is not socialism either. It is simply sound economic policy to keep the economy balanced, ensuring that wealth doesn't get over concentrated. It does fit well with a socialistic tax system, but that is because modern capitalistic ideas simply have no idea at all of balance.

  • Re:Dear Mr Murdoch (Score:4, Informative)

    by thejynxed ( 831517 ) on Sunday October 11, 2009 @07:21AM (#29710167)

    Google News is what he's complaining about.

    He doesn't mind the search links, the RSS feed, etc.

    He's complaining that Google News is gathering the content from his News Corp properties using their Googlebot, and taking all of the advertising revenue because Google places their own paid ads on the pages instead of the News Corp ads that would appear from the originating sites.

    This is the same issue/complaint that organizations like the AP and Reuters have with Google.

  • Re:Dear Mr Murdoch (Score:3, Informative)

    by Wowsers ( 1151731 ) on Sunday October 11, 2009 @07:26AM (#29710177) Journal

    A good few years ago now, Rupert had certain sections of The Times and The Sunday Times as subscription pages, certainly the archive section was subscription (basically any story over a week old went into the archive which you had to pay to access). They even had a CD-ROM of The Times archive (I remember using it at university - it only went back to about 1990 articles IIRC). Not enough people paid up to justify running the "archive", so it was removed and now we have the free for all, so long as Rupert allows the sites to be indexed.

    Maybe Rupert forgot that he already tried the pay per view method, and people weren't interested.

  • Re:Right ... (Score:1, Informative)

    by Anonymous Coward on Sunday October 11, 2009 @08:58AM (#29710567)

    Maybe you should work on your reading comprehension instead. Grandparent is referring to magnitude and not precision.

  • On the ground (Score:2, Informative)

    by Strange Attractor ( 18957 ) on Sunday October 11, 2009 @11:24AM (#29711241) Homepage

    I think Roger Cohen of the New York Times was there. The New Yorker also has printed a few unattributed pieces written in Iran recently.

  • Re:Right ... (Score:3, Informative)

    by TrekkieGod ( 627867 ) on Sunday October 11, 2009 @11:44AM (#29711337) Homepage Journal

    Actually I'm pretty sure that with .0000000010000000 the second set of 0's is still significant, since you're indicating a level of precision beyond the 7 significant digits...

    Uh...yeah, the second set of 0's are significant. The first set still aren't.
