YouTube Technology

YouTube Says it Has 'No Obligation' To Host Anyone's Video (theverge.com) 209

Speaking of YouTube and moderation, the Google-owned video service is rolling out updated terms of service on December 10th, and a new line acts as a reminder that the company doesn't have to keep any video up that it doesn't want to. From a report: "YouTube is under no obligation to host or serve content," the new terms of service policy reads. It's another way of saying that just because YouTube is a relatively open platform, it doesn't mean that the company is required to keep videos up. YouTube has faced criticism from all sides over its video removal process. Some critics argue that YouTube could do more to take down videos that butt up against the company's rules but don't outright violate them; others argue that YouTube ought to be a fully open platform and shouldn't control what remains up and what doesn't. Executives have long defended the platform as a champion of free speech, but have started to clamp down on the type of videos allowed to circulate.
This discussion has been archived. No new comments can be posted.

  • True. (Score:4, Insightful)

    by the_skywise ( 189793 ) on Monday November 11, 2019 @01:49PM (#59403674)

    But that makes them a publisher and not a social media "bulletin board," as they're editorializing the content by picking and choosing what they allow, above and beyond legal restrictions.

    • Re:True. (Score:5, Interesting)

      by RightSaidFred99 ( 874576 ) on Monday November 11, 2019 @01:52PM (#59403688)
      You act like this is some profound, dire consequence even if true. Youtube would already be liable for all sorts of shit if they left it up, I don't think this is as big a distinction as you imagine it to be.
      • by MobyDisk ( 75490 )

        I thought the DMCA said that if you don't discriminate, and follow the law, you have safe harbor against liability; but that if you did discriminate, you were liable for what was left. Is that correct or no?

        • by tepples ( 727027 ) <tepples.gmail@com> on Monday November 11, 2019 @02:15PM (#59403794) Homepage Journal

          The relevant copyright statute, 17 USC 512 [cornell.edu], states that a service provider doesn't lose its safe harbor so long as it honors copyright claimants' notices and honors uploaders' counter-notices for works subject to a copyright claim. It doesn't say a service provider is required to host any uploaded work that isn't subject to a copyright claim. In fact, section 512 encourages service providers to act on "facts or circumstances from which infringing activity is apparent," which some analysts have referred to as "red flag knowledge."

          • by Holi ( 250190 )
            Except YouTube actively monitors and automatically removes copyrighted material, no complaint necessary.
            • by tepples ( 727027 )

              Content ID is a red flag knowledge mechanism, which the DMCA encourages but currently does not require service providers to implement.

          • It doesn't say a service provider is required to host any uploaded work that isn't subject to a copyright claim.

            Except all works are subject to a copyright claim. As soon as you record anything you own the copyright to it. Now, that's not quite as formal and binding as registering a copyright with the government, but literally everything recorded and uploaded to youtube is under copyright by the person who recorded it.

            • The DMCA says how those who host content must respond to copyright notices. This isn't about a copyright notice. In this instance, looking at the DMCA is about as useful as looking at the speed limit: it's a law, just not one relevant to the current discussion.

              See also
              https://news.slashdot.org/comm... [slashdot.org]

            • all works are subject to a copyright claim

              But just because you own copyright in this work doesn't mean I have to host it.

              You are using "copyright claim" to refer to copyright ownership. In an article about YouTube, I was using "copyright claim" to mean the same thing that YouTube means by "copyright claim": a notice of claimed infringement pursuant to 17 USC 512(c) or foreign counterparts. Though almost all works of private authorship are copyrighted upon creation, not all are subject to a notice of claimed infringement.

              Let me try to reverse engine

          • by rsilvergun ( 571051 ) on Monday November 11, 2019 @06:24PM (#59404804)
            see here [eff.org] and feel free to read it if you like. And no, it doesn't matter that the bulk of the Communications Decency Act doesn't have anything to do with Safe Harbor or copyright. Laws can serve more purposes than what's in their title.

            Also, contrary to popular opinion (especially on the right) you _want_ this. The only reason _anything_ can be posted online by somebody other than a mega corporation is Section 230 of the CDA (not to be confused with Lemming of the BDA, but you can sing both).

            So go ahead, undermine Section 230. Hell, repeal it. And sit back and watch while the Internet becomes cable television owned by a handful of billionaires and any pretense of free speech dies.
        • by Holi ( 250190 )
          The RIAA and the MPAA have long insisted YouTube moderate their content; the safe harbor ship has long since sailed.
        • by raymorris ( 2726007 ) on Monday November 11, 2019 @03:11PM (#59404084) Journal

          > if you don't discriminate, and follow the law, you have safe harbor against liability; but that if you did discriminate, you were liable for what was left. Is that correct or no?

            That was old law before the internet. Then a couple of important laws were passed in order to make things work with internet forums. Specifically, DMCA and CDA.

          The Digital Millennium Copyright Act (DMCA) is about copyright. This isn't about copyright, so that's not the right law to look at. With the all-important counter-notice provision that nobody seems to know about, DMCA was an attempt to be fair to everyone on copyright issues. Unfortunately nobody knows about counter-notices, and we didn't foresee the volume of garbage notices that people ended up filing. (Remember, at the time, there hadn't been any bad DMCA notices ever sent.) Anyway, that's copyright, a separate issue.

          The relevant law here is the CDA, the Communications Decency Act. It says that those who host content can make a good faith effort to remove obscene or otherwise objectionable content without incurring liability, if they follow the rules. This was needed because open forums would get spammed with horrible stuff. Slashdot doesn't control the content of what we post; yet you wouldn't expect Slashdot to leave up 1,000 links to child porn. There needs to be a middle ground of open discussion with a reasonable policy of taking out the trash.

      • Re:True. (Score:4, Insightful)

        by alvinrod ( 889928 ) on Monday November 11, 2019 @02:15PM (#59403796)

        You act like this is some profound, dire consequence even if true. Youtube would already be liable for all sorts of shit if they left it up, I don't think this is as big a distinction as you imagine it to be.

        There’s a difference between complying with legal court orders to remove something (regardless of whether you agree with them or not) and deciding what you want to host by selectively removing content. So don’t act as though a refusal to engage in screening means a site has to be 4-Chan.

        And I’m even fine with giving YouTube some leeway as long as they’re up front about what laws or other regulations prohibit or if there’s any other content that they personally disallow even if it isn’t illegal. But they don’t have this in place and it’s always a reactionary move on their part where they’re intentionally vague about the rules or the policies.

        And I believe it’s because they don’t have any. There are likely a lot of people making arbitrary personal decisions who aren’t really accountable to anyone. So if your case comes before the wrong moderator or reviewer they’ll just ban or demonitize based on their own whims and unless you’ve got a contact at YouTube with the power to reverse it there’s nothing you can do about it. The upper management is as clueless and disconnected from it all as the users so they just flail around uselessly while trying to figure out how to turn a profit in all the madness while deflecting the old media that would love to find any reason for YouTube to crumble.

        • by meglon ( 1001833 )

          And I'm even fine with giving YouTube some leeway as long as they're up front about what laws or other regulations prohibit or if there's any other content that they personally disallow even if it isn't illegal. But they don't have this in place and it's always a reactionary move on their part where they're intentionally vague about the rules or the policies.

          You are wrong. https://www.youtube.com/about/... [youtube.com]

      • Well, behaving contrary to the implicit agreement with Congress can, and given the bipartisan discussions we've seen very well may, cause the law to be changed. So yes, it is a big deal. Both the Dems and the GOP have reason to dislike the big web firms (Facebook, Google, Twitter, etc.) right now. Acting in a less than gracious manner won't exactly help them.

      • If you host content w/o editorializing, you have safe harbor protections. As long as you take down DMCA violations, there's no issue. If you're editorializing, you can now be liable for everything your submitters say, including slander and libel, even if it's taken down.
    • Re:True. (Score:5, Informative)

      by waspleg ( 316038 ) on Monday November 11, 2019 @01:52PM (#59403694) Journal

      They've been effectively doing this already by demonetizing. Just look up any channel covering the Hong Kong protests and you'll see they've been hit multiple times.

      • Re: (Score:3, Informative)

        by jellomizer ( 103300 )
        There is a big difference between demonetizing and removing a video.

        Monetizing something volatile like the Hong Kong protests, where amateur youtubers will go into a hazardous protest to capture the violence, putting themselves and others in danger (possibly even egging on the violence) for a quick buck, is a big difference from YouTube removing the video so no one can see it, in which case a valid moment in history is being deleted.

        This is a much different problem. Demonetizing is affecting the professional YouTube
        • Re: (Score:2, Insightful)

          Seriously? People are fighting an authoritarian government in the streets, videoing the repression, showing the whole world what's going on at risk to their own lives, and all you can say is that you suspect they're doing it for ad clicks? You fucking commie tool.
          • Re:True. (Score:4, Insightful)

            by Holi ( 250190 ) on Monday November 11, 2019 @02:47PM (#59403958)
            "You fucking commie tool."

            Capitalist tool would be more appropriate.
          • by Junta ( 36770 )

            I interpreted the parent post as saying that demonetizing takes any hypothetical profit motive out of producing/posting that content but leaving the ability to post said content for non-fiscal reasons.

            If the counter argument is that there's no way they are doing it just for clicks, then his perspective in practice would have zero impact.

          • How is this communist? Just because I am pointing out that Capitalism doesn't always do the right thing?

            In peaceful American heartland the "Hold my Beer" videos are very popular, where people risk their lives to do a stupid stunt, just so they can get the clicks and ad-revenue.
            Is it really that far of a stretch to think someone with a lot of action and history going on outside their front door, rather than staying indoors out of trouble, would be out there capturing the violence, and worse, egging people t
    • Re:True. (Score:5, Informative)

      by mysidia ( 191772 ) on Monday November 11, 2019 @02:03PM (#59403730)

      But that makes them a publisher and not a social media "bulletin board"

      Thanks to a special exception created by Section 230 of the Communications Decency Act [cornell.edu];
      they are allowed to do this without being considered a publisher.

      (c) Protection for “Good Samaritan” blocking and screening of offensive material

      "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider" (47 U.S.C. 230).

      (2) Civil liability - No provider or user of an interactive computer service shall be held liable on account of—

      (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

      (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).[1]

      • Re: (Score:3, Informative)

        by sycodon ( 149926 )

        offensive material,obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable

        By stating that they have the right to remove what they want for any reason, without it having to be the above, they lose their 230 exception.

        • It contained the word "the". The word "the" is objectionable. Therefore the content is removed because it is "otherwise objectionable".

          We "object" to content that does not make us money. That content does not make us money. Therefore we are removing it because it is "otherwise objectionable".

          Can you not speak and comprehend the English language?

          • Re: (Score:3, Interesting)

            by bobbied ( 2522392 )

            I'm not so sure the courts would agree with your reading of the law here.

            Clearly "objectionable" means material that offends or slanders somebody or something. "I cannot make money" doesn't really meet that criteria.

            If they don't stop making biased editorial decisions though, I'm guessing they might get themselves sued and the courts will sort all this out.

            • by sycodon ( 149926 )

              That's what really needs to happen. A very large and well funded class action lawsuit.

            • by mysidia ( 191772 )

              Zeran v. AOL, 1997 [techlawjournal.com]

              Holding. Both the District Court and the Court of Appeals ruled that 47 U.S.C. 230, which provides that "No provider or user of an interactive computer service shall be treated as a publisher or speaker of any information provided by another information content provider" immunizes AOL and any interactive computer service from claims based on information posted by a third party. Court lawsuits seeking to hold a service provider liable for its exercise of a publisher's traditional edito

        • Re:True. (Score:5, Informative)

          by tepples ( 727027 ) <tepples.gmail@com> on Monday November 11, 2019 @02:27PM (#59403866) Homepage Journal

          By stating that they have the right to remove what they want for any reason, without it having to be the above, they lose their 230 exception.

          The text of 47 USC 230 [cornell.edu] states the exact opposite:

          No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

          No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.

          (In this statute, "interactive computer service" means a platform, and "information content provider" means an author.)

          Which line of the statute threatens loss of the protection?

          (See also EFF's explanation of 230 [eff.org].)

          • by sycodon ( 149926 )

            A reasonable reading would suggest that the protections only extend to actions restricting access to "material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected."

        • by AmiMoJo ( 196126 )

          "Otherwise objectionable" means they can remove anything over any objection no matter how trivial and still be protected. "I don't like that shirt" is an objection. "It doesn't make me any ad revenue" is an objection.

          Still, I'd like to see the episode of Lawful Masses where they cover this lawsuit. They already did the one alleging 1st Amendment violations.

        • That's the entire point of Section 230 [eff.org]. It exists to allow YouTube to moderate their website without being considered a publisher.

          The law explicitly carved out an exception to the normal rules regarding publishers because it was understood those rules didn't fit with 21st century media and, well, we still had a functioning government in the 90s (before Newt Gingrich got done with it anyway, look up his "Contract with America", he and his ilk started the whole process of grinding the gov't to a halt on a
    • by Misagon ( 1135 )

      Every well-managed bulletin board does have moderators and admins that set rules about proper conduct which they enforce on posts and users.
      The fact that some large "social media" sites do not have working moderation is beside the point, and is only a failing of those sites. It does not prove your point.

      • Re:True. (Score:4, Insightful)

        by Etcetera ( 14711 ) on Monday November 11, 2019 @02:59PM (#59404008) Homepage

        Every well-managed bulletin board does have moderators and admins that set rules about proper conduct which they enforce on posts and users.
        The fact that some large "social media" sites do not have working moderation is beside the point, and is only a failing of those sites. It does not prove your point.

        I don't think this extrapolation works well at the scales of Big Tech. Joe's Transmission phpBB Board is free to moderate without societal effects, but YouTube has Billions of users. It's run by Google, the internet behemoth that has a vertical monopoly in large amounts of internet hosting infrastructure, and also runs the largest online ad network (which is essentially how the news media is surviving). That's a LOT of power, and shrugging off the cultural and social impact of that concentrated power won't lead to anything good.

    • So what if you call them a "publisher" or not? What does that have to do with anything? What "legal restrictions" are you talking about?
      • by tepples ( 727027 )

        There are laws banning publishing obscene works, defamatory works, or works that incite viewers to commit a crime. Section 230 states that a platform is not considered the publisher of such illegal works and thus not liable under those laws.

    • Regardless of precedent and established laws, they are what they are, that's a new thing and that needs appropriate law. Like it or not, through their reach youtube has become in effect a public service. Depriving access for arbitrary reasons significantly deprives people from a huge audience and there isn't a great deal of justification for it. In fact, it's as dangerous as anything they're proposing. What we're talking about is them choosing who gets how many views. In some cases they're deciding whose li
      • by DarkOx ( 621550 )

        The problem is that that law already passed. Wishing it had not, and pretending otherwise, won't change it. I keep seeing people like Steven Crowder crying about YouTube exercising editorial control, saying that's unfair and makes them not a "common carrier," etc. Legally it does not matter! If you doubt that, realize all of these organizations have large captive legal departments that have reached the same conclusion, or they would not be saying and doing the things they are doing. Which is not to say that legal

    • Why are comments like this routinely moderated to +5 when they are absolutely, beyond question factually incorrect, with that always being pointed out in replies? Are the far righties and far lefties with mod points just supporting rewriting the law to make it true (spoiler alert: you won't be happy with how the other party enforces that), or do you all just not fucking get that it's a factual legal question that is 100% settled law and not true?
  • They are slowly inching their way towards charging people who upload more than X GB of videos.
  • by gillbates ( 106458 ) on Monday November 11, 2019 @02:00PM (#59403720) Homepage Journal

    When I was much younger, I used to see signs in restaurants which read,

    We reserve the right to refuse service to anyone, for any reason.

    It seemed odd to me that a business owner would want to refuse service to someone, but I reasoned that the owner had a right to use his property as he saw fit, even if I disagreed.

    Later, I would learn that certain people would interpret those signs in a racist way. Whether or not this was the original intent, I never learned, but it is certainly plausible.

    In much the same way that antidiscrimination law applies to even private businesses, I suspect it's just a matter of time before a court rules that Youtube is a "public accommodation," and subject to California's antidiscrimination laws, which protect, among other things, political opinion.

    • by MobyDisk ( 75490 )

      when i was much younger...

      I saw them too. Where are those signs today?

    • Re: (Score:2, Insightful)

      by Brett Buck ( 811747 )

      In much the same way that antidiscrimination law applies to even private businesses, I suspect it's just a matter of time before a court rules that Youtube is a "public accommodation," and subject to California's antidiscrimination laws, which protect, among other things, political opinion.

      Oh, grand! Of course, it primarily protects the *correct* political opinions, and don't expect a lot of support or legal action if you have the wrong political opinions.

      • by DogDude ( 805747 )
        Well, the rest of the country doesn't have any obligation to abide by antidiscrimination laws that include "political opinion", so it's super easy to fire Trumpers.
      • Of course, it primarily protects the *correct* political opinions

        To which "correct political opinions" do you refer, other than not to incite viewers to commit crimes? For example, the Civil Rights Act and Americans with Disabilities Act as amended make it a crime for a business to discriminate against people based on membership in a marginalized ethnicity, gender, or disability group.

        • He means divisive identity politics. The kind that are killing our ability to relate to one another and stoking political polarisation.
          • by tepples ( 727027 )

            Wikipedia's article "Identity politics" [wikipedia.org] mostly frames it as marginalized groups seeking two things: 1. equal opportunity and 2. some way to diminish heritability of poverty caused by grandparents' lack of equal opportunity. With that definition in mind:

            In some cases, when YouTube has taken down a video out of "identity politics," what actually happened was that YouTube took down a video inciting viewers who operate businesses to commit crimes, particularly civil rights crimes.

    • We reserve the right to refuse service to anyone, for any reason.

      I still see these all the time. My assumption was they're to cover their asses when they need to eject an obnoxious customer. Now that you mention it, I suppose you could find owners who use this in a discriminatory way ("we don't serve yer kind here"). Seems like a poor business decision, but I'm not the owner.

      • by tepples ( 727027 )

        I suppose you could find owners who uses this in a discriminatory way ("we don't server yer kind here"). Seems like a poor business decision but I'm not the owner.

        Consider a hypothetical population consisting of three groups: tolerant people of the privileged class, bigots of the privileged class, and people of a marginalized class. A business can serve either marginalized people or bigots, as bigots will not patronize any business that serves marginalized people. In a situation with no civil rights laws, if there are significantly more bigots or the bigots have significantly greater disposable income, a business may make more revenue by serving bigots than by servin
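        The revenue comparison above can be sketched numerically. All the population and spending figures below are hypothetical, purely to illustrate the tradeoff the comment describes:

```python
# Hypothetical market model: a business must pick one group to serve,
# because bigots boycott any business that serves marginalized people.
# All numbers are made up for illustration.

def revenue(customers, spend_per_customer):
    """Total revenue from a customer pool with uniform average spend."""
    return customers * spend_per_customer

tolerant = 500      # tolerant privileged customers: patronize either choice
bigots = 300        # bigoted privileged customers: boycott inclusive shops
marginalized = 200  # marginalized customers
spend = 10          # average spend per customer, assumed equal

# Serving marginalized people means losing the bigots; serving bigots
# means excluding marginalized people.
serve_marginalized = revenue(tolerant + marginalized, spend)
serve_bigots = revenue(tolerant + bigots, spend)

print(serve_marginalized)  # 7000
print(serve_bigots)        # 8000
```

        With these assumed numbers the bigoted majority is the more profitable market, which is exactly the incentive problem that civil rights laws exist to counteract.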

    • and if you are a member of a protected class YouTube cannot discriminate against you. They cannot, for example, remove TBTV [youtube.com] because Tim's Black (pun intended). They _can_ remove him because they don't like his brand of left wing, anti-establishment politics.

      They also can decide they just plain don't like Tim, and they can show him the door [xkcd.com].

      I keep saying this, but if you want a place to post your stuff where the only restriction is what's legal, there's an easy fix: National Public Access. Talk to yo
  • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Monday November 11, 2019 @02:23PM (#59403846) Homepage Journal

    Executives have long defended the platform as a champion of free speech, but have started to clamp down on the type of videos allowed to circulate.

    Bait and switch. They attracted their user base by pretending to be a carrier of all non-porn videos (except for waxing fetishists, anyway) and now they're showing their true colors. People paid them with attention, so the most reasonable settlement would be to force them to pay attention... to the shittiest videos on Youtube. So much video is uploaded that they could have to watch such garbage for the rest of their lives, which would keep them out of trouble.

    • Wait, wait, wait. You're saying there are waxing videos on YouTube? I'll be hanging a 'Do Not Disturb" sign on my office door for the next couple of hours.
  • by ghoul ( 157158 ) on Monday November 11, 2019 @02:27PM (#59403864)

    If Youtube is editorializing and deciding which video stays up and which does not, then it is no longer protected by the DMCA and can be sued for libelous content.

    Have the execs at Youtube thought this through? Or are they so confident that the legal system is so thoroughly infiltrated by liberals that they can get away with this, as long as they don't editorialize extremist liberal speech and only editorialize (or censor) far-right speech?

    • by tepples ( 727027 )

      If Youtube is editorializing and deciding which video stays up and which does not, then it is no longer protected by the DMCA and can be sued for libelous content.

      Citation needed. As for copyright, as I understand 17 USC 512, a service provider loses its safe harbor only if it fails to act on notices of infringement. Libel is a completely separate issue from copyright and subject to 47 USC 230's "no platform is the speaker" guarantee.

    • Ask for a refund.

  • For political advertisements there are rules, and they can be like "we will only have Trump ads and reject others."

  • by CaptainDork ( 3678879 ) on Monday November 11, 2019 @02:31PM (#59403890)

    The ToS is a legally binding contract that says, in paraphrase, "The only right a member has is to leave."

    Don't like the chickenshit outfit?

    Don't sign up.

  • Time to move to Bitchute.

    Starting Dec 10, 2019. Youtube will require all users agree to new TOS. After that date, Youtube will no longer inform users why they are being cancelled.

    >>
    Youtube is updating their TOS on December 10th: they can terminate anyone’s google account they deem "not commercially viable"

    https://www.reddit.com/r/technology/comments/dud0ww/youtube_is_updating_their_tos_on_december_10th/

    • by tepples ( 727027 )

      Or better yet, move to hosting your own videos on your own website [indieweb.org]. That way, you don't get blocked by overzealous censorware that sees the "female dog" word in Bitchute's name.
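      Self-hosting can be as small as a static page with an HTML5 video element. A minimal sketch (all file names here are made up for illustration):

```python
# Sketch of a minimal self-hosted video page (hypothetical file names).
# Generates index.html pointing at a video.mp4 sitting in the same
# directory; serve the directory with any static file server,
# e.g. `python -m http.server 8000`.

PAGE = """<!DOCTYPE html>
<html>
<body>
  <video src="video.mp4" controls width="640">
    Your browser does not support the video tag.
  </video>
</body>
</html>
"""

with open("index.html", "w") as f:
    f.write(PAGE)
```

      No recommendation algorithm and no monetization, but also no terms of service that can change underneath you on December 10th.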

      • Re: (Score:3, Insightful)

        by Etcetera ( 14711 )

        Or better yet, move to hosting your own videos on your own website [indieweb.org].

        Sadly, the broad social use of the Internet in 2019 is in the form of walled gardens. In as little as 10-12 years, regular users among the general public are FAR less likely to actually "surf the web" and potentially land on self-hosted content any more. And if, for whatever reason, the social media networks decide that they don't like your link, you can't even use that to drive people there. Just ask anyone who's been on the receiving end of a "shadow ban" on Twitter, or had their page suddenly drop off th

        • by DogDude ( 805747 )
          Sadly, the broad social use of the Internet in 2019 is in the form of walled gardens. In as little as 10-12 years, regular users among the general public are FAR less likely to actually "surf the web" and potentially land on self-hosted content any more.

          Yeah, so what? If you're interested in the "popularity" of your content, then you're in business. Pay for advertising (Google, Facebook, whoever). If you're posting stuff for the general good of humankind, then who really cares how many people see it?
  • by DNS-and-BIND ( 461968 ) on Monday November 11, 2019 @02:54PM (#59403982) Homepage

    YouTube did host anyone's content. Hence the name, "You" and tube meaning viewscreen. But now they've arrived. They don't need the mid and small creators any more. Big business likes to do business with big business. Big brands don't want their ads ending up on some deplorable's video about cat litter. They want to support their own kind.

    So all the little people who made YouTube what it is are being marginalized and pushed out. What's new about that? It always happens whenever any upstart successfully attains elite status. It's like that Golden Age episode of The Simpsons: "take 5 minutes to say goodbye to your former friends and report for reassignment to a better life."

    So they built YouTube? So what? Do you let the house builder stay in your house after he built it? Hell no. He got paid, now get the fuck out.

    So the maligned creators must migrate elsewhere. The new alternative for a while was BitChute.com, but apparently YouTube is like the mafia: you can't leave. BitChute had its payment infrastructure attacked and cancelled. It's almost like elites don't want any dissenting thought to appear anywhere where people can see it. Huh. Weird.

  • Comment removed based on user account deletion
  • by Pezbian ( 1641885 ) on Monday November 11, 2019 @04:14PM (#59404330)

    Hosting is easy. Traditional revenue streams like ads are becoming obsolete. We're already in the midst of a paradigm shift toward a la carte entertainment.

    Nobody really needs more than 1080p60 video, especially on a phone.

    Patreon works because people get to vote with their wallet. Livestreams and Superchats work for the same reason.

    YouTube is basically eBay with a different product.

  • "YouTube is under no obligation to host or serve content,"

    That is correct. Google has no contractual obligation to host or serve content. Unless, of course, you have a contract with Google in which Google agrees to an obligation to "host or serve content". Do any of the numb-nuts complaining about this have such a contract with Google? If so, and Google does not live up to the contract terms, the proper remedy is to bring an action for breach of contract.

    If there is no contract, then tough titty on you

    • Google has no contractual obligation to host or serve content. Unless, of course, you have a contract with Google in which Google agrees to an obligation to "host or serve content".

      But by all accounts, it appears the YouTube(Google) does agree to pay money to content creators based upon the advertising and sales revenue that the content generates for Google.

      Google provides a hosting platform free of charge
      Content creator or publisher puts their content on this free platform
      Google agrees to pay money to the creator if the content generates revenue

      That is a contract.

  • Comment removed based on user account deletion
    • by tepples ( 727027 )

      there are, I’m sure, other sites just WAITING to eat their lunch.

      I'm curious as to which sites you recommend that fulfill the same broad user stories as YouTube:

      1. Allows individual users to upload original videos and view others' videos on both strong and weak devices and Internet connections
      2. Recommends other videos that a user appears likely to enjoy, both from the same uploader and from other uploaders
      3. Provides a means for trustworthy uploaders to earn some revenue

  • Without Google subsidizing the hosting and streaming costs, all these people making money on Patreon are SOL. They base their entire business model on the whim of a single "free" service.
    • Without Google subsidizing the hosting and streaming costs

      First let's quantify them.

      Standard definition (480p) video takes about 1 Mbps. Say you have a web series with 65 episodes, each 10 minutes long. This means you need to store 1 Mbps * 60 s/min * 10 min/episode * 65 episodes / 8000 Mbit/GB = 4.875 GB on the server to serve SD viewers. Further, say you currently serve 4,000 watch hours per year, which is the current minimum for YouTube Partner Program eligibility. This means you would have to serve 1 Mbps * 3600 s/hr * 4000 hr/yr / 8000 Mbit/GB = 1,800 GB of d
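      The arithmetic above can be checked with a quick script, using the same figures assumed in the comment (1 Mbps SD bitrate, 65 ten-minute episodes, 4,000 watch hours per year):

```python
# Storage and annual transfer estimate for a small SD (480p) web series,
# using the figures assumed in the parent comment.
BITRATE_MBPS = 1       # ~1 Mbps for 480p video
MBIT_PER_GB = 8000     # 8000 megabits per gigabyte

episodes = 65
minutes_per_episode = 10
watch_hours_per_year = 4000  # YouTube Partner Program minimum

# Storage: bitrate * seconds of footage, converted to GB.
storage_gb = BITRATE_MBPS * 60 * minutes_per_episode * episodes / MBIT_PER_GB

# Transfer: bitrate * seconds actually watched per year, converted to GB.
transfer_gb = BITRATE_MBPS * 3600 * watch_hours_per_year / MBIT_PER_GB

print(storage_gb)   # 4.875
print(transfer_gb)  # 1800.0
```

      At commodity hosting prices, numbers of this size are well within a hobbyist budget, which is the comment's point about quantifying the "subsidy."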
