Is AI Making Silicon Valley Rich on Other People's Work? (mercurynews.com) 111

Slashdot reader rtfa0987 spotted this on the front page of the San Jose Mercury News. "Silicon Valley is poised once again to cash in on other people's products, making a data grab of unprecedented scale that has already spawned lawsuits and congressional hearings. Chatbots and other forms of generative artificial intelligence that burst onto the technology scene in recent months are fed vast amounts of material scraped from the internet — books, screenplays, research papers, news stories, photos, art, music, code and more — to produce answers, imagery or sound in response to user prompts... But a thorny, contentious and highly consequential issue has arisen: A great deal of the bots' fodder is copyrighted property...

The new AI's intellectual-property problem goes beyond art into movies and television, photography, music, news media and computer coding. Critics worry that major players in tech, by inserting themselves between producers and consumers in commercial marketplaces, will suck out the money and remove financial incentives for producing TV scripts, artworks, books, movies, music, photography, news coverage and innovative software. "It could be catastrophic," said Danielle Coffey, CEO of the News/Media Alliance, which represents nearly 2,000 U.S. news publishers, including this news organization. "It could decimate our industry."

The new technology, as happened with other Silicon Valley innovations, including internet-search, social media and food delivery, is catching on among consumers and businesses so quickly that it may become entrenched — and beloved by users — long before regulators and lawmakers gather the knowledge and political will to impose restraints and mitigate harms. "We may need legislation," said Congresswoman Zoe Lofgren, D-San Jose, who as a member of the House Judiciary Committee heard testimony on copyright and generative AI last month. "Content creators have rights and we need to figure out a way how those rights will be respected...."

Furor over the content grabbing is surging. Photo-sales giant Getty is also suing Stability AI. Striking Hollywood screenwriters last month raised concerns that movie studios will start using chatbot-written scripts fed on writers' earlier work. The record industry has lodged a complaint with federal authorities over copyrighted music being used to train AI.

The article includes some unique perspectives:
  • There's a technical solution being proposed by the software engineer-CEO of Dazzle Labs, a startup building a platform for controlling personal data. The Mercury News summarizes it as "content producers could annotate their work with conditions for use that would have to be followed by companies crawling the web for AI fodder."
  • Santa Clara University law school professor Eric Goldman "believes the law favors use of copyrighted material for training generative AI. 'All works build upon precedent works. We are all free to take pieces of precedent works. What generative AI does is accelerate that process, but it's the same process. It's all part of an evolution of our society's storehouse of knowledge...."
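The Dazzle Labs annotation proposal above would presumably work much like robots.txt. Here's a minimal sketch of how a cooperating crawler could honor such a directive; the `AI-Training` field name is purely hypothetical, since no such standard currently exists:

```python
# Sketch of the "conditions for use" annotation idea described above.
# The "AI-Training" directive is hypothetical; this only illustrates how
# a cooperating crawler could check a site's stated conditions.

def ai_training_allowed(robots_txt: str) -> bool:
    """Return False if a (hypothetical) AI-Training directive says 'disallow'."""
    for line in robots_txt.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == "ai-training":
            return value.strip().lower() != "disallow"
    return True  # no directive present: assume allowed (an assumption, not law)

example = """User-agent: *
Disallow: /private/
AI-Training: disallow
"""
print(ai_training_allowed(example))  # False
```

Like robots.txt itself, any such scheme would be purely advisory: it only works if the companies doing the scraping choose to respect it.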


Comments Filter:
  • The only remedy. Alaska has it.

    • You ever walked the streets of Anchorage?

      I recommend the trip, choose a warmer month for the best experience.

    • UBI is universal BASIC income.

      Alaskans got $3200 each in 2022 - but that is not nearly enough for "Basic" income.

      In fact it's revenue sharing from oil sales...

    • The rest of the developed world already has it, it's called welfare. The problem is that govts have been particularly mean towards the unfortunate. The only difference with these UBI pilot projects is that they're more generous.
      • Welfare is usually means tested, ie how much you get depends on your income or wealth. The "U" in UBI means "Universal", ie you get it regardless of income or wealth.

        • Yep, & that's the more generous & arguably more efficient part of it. The means testing bit is resource intensive so a lot of the welfare budget goes on administering that. However you do it, welfare's welfare. If a UBI project shows net benefits then that's a good criticism of the way that the welfare system among that particular study group is being administered.
        • by laird ( 2705 )

          Right, the means-testing has massive overhead, causing the system to cost more and be less effective, because it costs money to administer all the controls, and it causes many people who should receive the benefits not to receive them due to administrative overhead. In many cases, the cost of controls is more than the actual service - for example, telephone calls cost more than 2x as much as they need to, because of the control and billing costs, which are more than the costs of providing the phone calls.

Even in the better systems, though, it's barely liveable and tends to be bled out by the more conservative-leaning politicians who favour a bootstraps mentality. Australia had one of the better systems; in the 90s I survived on it for a bit while doing my uni studies, and while I definitely had precarious "instant ramen" weeks, it wasn't unlivable. However, a few times recently due to health issues I've had to go back on it for a month or so and it was a straight-up non-stop crisis. Even in th

        • Busking pays surprisingly well.
        • I'm right with you. 50yo, and while I can manage people, it's not exactly my favorite thing to do either. I have a small team I manage, but that small team is EXTREMELY self-motivating and self-solving when problems do arise. Anything outside of that and I'd hate the management part of the job. So, yeah, I can code, and I can play a whole bunch of musical instruments well enough to make a little side cash, but it couldn't support me and my wife. If coding goes away, there's gonna be a LOT of people getting

How would you pay for it anyway? To pay out an annual amount to keep all legal adults (18 and older) just above the poverty line (approximately $12k) would cost well over 3 trillion a year.

      For context, the recent debt ceiling deal is estimated to save on 1.5 trillion over the course of a decade. Even if we cut out and/or streamlined social services down to the bare minimum, we still would not have enough to cover that. And good luck getting the boomers to agree on slashing Medicaid and social security to fu
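The "well over 3 trillion" figure above roughly checks out; a back-of-the-envelope calculation, assuming about 258 million U.S. adults (my estimate, not the poster's):

```python
# Back-of-the-envelope check of the UBI cost claim above.
# Assumptions: ~258 million U.S. adults (18+), $12,000/year per person.
adults = 258_000_000
payment = 12_000
total = adults * payment
print(f"${total / 1e12:.1f} trillion per year")  # $3.1 trillion per year
```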

  • by xack ( 5304745 ) on Sunday June 18, 2023 @01:47PM (#63613140)
    ChatGPT even admitted it to me. This is also why Reddit has become so controversial recently, because people shared all that content for free and both AI companies and Reddit inc want to monetize their free work. Don't forget all the public domain work out there in libraries.
    • by ccguy ( 1116865 ) on Sunday June 18, 2023 @02:10PM (#63613212) Homepage
      And the problem is? Wikipedia itself was built using *a lot* of free stuff. Or did they write their own web server or operating system?

Public domain work is, well, public domain. It had a period of profit for the creator and now it's public, as with, say, patents on a bunch of things that anyone can use for free now, which give us some very cheap generic drugs, for example.

      People really get upset talking about their free work without realizing how much they get for free every day.
      • by jonbryce ( 703250 ) on Sunday June 18, 2023 @02:23PM (#63613250) Homepage

        The problem is that Wikipedia content is released under a copyleft licence, so if you make use of it, you have to release your derivative work under the same licence - either CC BY-SA or GFDL.

        • by Fembot ( 442827 ) on Sunday June 18, 2023 @02:33PM (#63613278)

          Which makes most of this "AI" merely copyright laundering as a service.

          • Which makes most of this "AI" merely copyright laundering as a service.

            I'm not sure I understand, what is "copyright laundering?"

            • As I've stated before, and been modded down for, the less euphemistic term would be "automated plagiarism."

        • by ccguy ( 1116865 ) on Sunday June 18, 2023 @02:45PM (#63613306) Homepage

          The problem is that Wikipedia content is released under a copyleft licence, so if you make use of it, you have to release your derivative work under the same licence - either CC BY-SA or GFDL.

And then we get to the "is this a derivative work" issue.

I don't think it is. Unless, say, an essay written by a human using Wikipedia as a primary source is also a derivative work.

And of course, Wikipedia itself is not supposed to be a primary source; by definition Wikipedia is derivative work. So it might be volunteer free work and all you want, but if it's based on someone else's work, then imposing their own license would be at least questionable.

If you take the source code for Microsoft Office and compile it for Linux, the binary is a derivative work, even though it looks very different to the Windows and MacOS binaries they publish.

            With AI systems, I would argue that the training data is the source code, and the "learning" processes is the same as compiling source code.

I don't think it is. Unless, say, an essay written by a human using Wikipedia as a primary source is also a derivative work.

ChatGPT can reproduce copyrighted material. Latest example: it can print valid Windows 11 keys (they used a story about a grandma to jailbreak the LLM, but whatever), which is clearly copyrighted material. You can also simply ask for the lyrics of popular songs. AI devs can't guarantee it won't generate copyrighted material, thus as a user you can infringe copyrights without knowing it. Even diffusion models can generate images from their training data... So it's not so simple and the argument "LLMs do the same as humans" is plain wrong.

            • by WaffleMonster ( 969671 ) on Sunday June 18, 2023 @05:41PM (#63613652)

ChatGPT can reproduce copyrighted material. Latest example: it can print valid Windows 11 keys (they used a story about a grandma to jailbreak the LLM, but whatever), which is clearly copyrighted material. You can also simply ask for the lyrics of popular songs. AI devs can't guarantee it won't generate copyrighted material, thus as a user you can infringe copyrights without knowing it. Even diffusion models can generate images from their training data...

              People can remember song lyrics and quote copyrighted works from memory. They can use a pencil to reproduce copyrighted text and imagery stored in their minds.

              MS paint devs can't guarantee you won't draw copyrighted material and thus infringe copyrights without knowing it.

So it's not so simple and the argument "LLMs do the same as humans" is plain wrong.

              Let's for the sake of argument assume the "AI" of the day operates analogous to a human mind with a reliable "photographic" memory. Would this make any difference v. LLM, XYZ or ABC AI technologies whatsoever? Is that modality at all important?

              If you can ask either AI system the same question and it would respond by spitting out the same copyrighted material does it matter?

              What if someone grew a real working brain in a lab and exposed it to copyrighted materials? Would the knowledge of copyrighted material fixed in the lab brain constitute a material object?

              • People can remember song lyrics and quote copyrighted works from memory. They can use a pencil to reproduce copyrighted text and imagery stored in their minds.

You missed the point. Humans know when they are reproducing copyrighted material, but with AI, and that is the big legal risk for users, you don't know if what it spat out at you is copyrighted or not. I can't generate copyrighted material with MS Paint without knowing it.

                Let's for the sake of argument assume the "AI" of the day operates analogous to a human mind

This predicate means the AI is now human, which means it knows when it is committing plagiarism and is also able to create original content. This is currently SF, but it solves the legal risk of using such a system (ignoring the ethical implications). Apply t

You missed the point. Humans know when they are reproducing copyrighted material, but with AI, and that is the big legal risk for users, you don't know if what it spat out at you is copyrighted or not.

                  They do? Are you sure? I certainly have no idea what is copyrighted or what constitutes fair use. If I go to a clipart website and use clipart explicitly labeled with a permissive grant I have no idea if I will be sued to oblivion for using it. The legal system certainly doesn't give a flying fuck about my ignorance.

                  If I record a video of friends outside a McDonalds while some rando is stopped in the intersection with music blaring... am I infringing on copyrights recording golden arches or the backgrou

            • by ccguy ( 1116865 ) on Monday June 19, 2023 @11:51AM (#63615488) Homepage

ChatGPT can reproduce copyrighted material. Latest example: it can print valid Windows 11 keys (they used a story about a grandma to jailbreak the LLM, but whatever)

              I guess, so can humans?

              which is clearly copyrighted material. You can also simply ask for lyrics of popular songs.

You can find all of this everywhere on the internet.

              • I guess, so can humans?

I still don't understand why people need to compare software using ML tech to a human. It is possible to rewrite Mario Bros from scratch with the same graphics without copying the binary; the end result is similar to copying the ROM, so by that logic USB disks are similar to humans... Both things are currently illegal.

You can find all of this everywhere on the internet.

                Yes and they are fully protected by copyright laws like any other writings.

        • The problem is that Wikipedia content is released under a copyleft licence, so if you make use of it, you have to release your derivative work under the same licence - either CC BY-SA or GFDL.

That's like saying that, since Funk & Wagnall's used a copyright license and I made use of their material when writing my school reports, I have to release them under a copyright license. It betrays a deep misunderstanding of how copyright law works. That's not unusual, because it is complicated, but it does prove that you don't know what you're on about.

          • If you copy their work, you are going to fail your assignment regardless of what copyright licence they use. You need to describe your understanding of it in your own words, and LLMs aren't really capable of doing that, because they don't understand it.

    • by VeryFluffyBunny ( 5037285 ) on Sunday June 18, 2023 @03:58PM (#63613460)
Today's concepts of copyright & patents are a lot closer to rent-seeking (& in many cases actually are rent-seeking) than the original intentions of IP laws, i.e. a limited period of control over something new to compensate for the time, effort, & expertise it took to create it. I don't see any reason not to declare old works public domain so that they become the (creative) commons for everyone to benefit from equally, as originally intended.

That AI models are trained on whatever content is an argument between large corporations' legal teams, & our views are not welcome. It's also a distraction from what these corporations are going to do to us, "the people," with these AI tools now & into the future. Never mind Skynet either; it's more like corporate dystopia, Nineteen Eighty-Four style, here we come.
      • I think if you frame this more from the perspective of online privacy, security, and the right to keep data as private property more people would see the issues with current LLM tech. The copyright issue is more of a symptom at best.

        Really, this is about trying to normalize one of the most aggressive data collection and commercialization efforts we have seen since the birth of the web. Not only was this all data provided by individuals without any payment, but in some cases taken without their consent or aw

      • Disney paid good money for forever copyright! Don't mess with the Mouse!

      • by pjt33 ( 739471 )

        The original intention of IP law was to make some money for the crown by selling monopolies which had the potential to be profitable.

        • Also to restrict patents to soften the impact of changes on the labour force. They were still afraid of the unwashed masses rising up against them in anger because they were starving to death because machines took away their jobs.
  • off your own work. No human being can do that much work. The closest we've come to someone who's offered enough value to get rich off their work alone would be Albert Eisenstein, and he was a patent clerk.

    You get rich by making other people work for you, and work to buy the things you own in order to live. Your own labor can at best give you a comfortable life. But even then you still need other people to do the things you can't. I'd bet Eisenstein was a lousy carpenter.
    • Wrong. This is entirely possible, and many people do it. The difference is the people who become successful don't listen to dolts like you (most likely a Democrat) who tell them they can't.

Sure, you can get pretty rich off your own work, but the idea that you can reach 8+ figures just by yourself is kinda silly when you really think about it. Everyone gets 24 hours in a day, 168 in a week; no getting around it.

Even at 10 million a year and, say, 60 hours a week, that's $3,205/hr. Who produces that much productive value an hour, in your opinion?
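The poster's hourly figure works out as stated, assuming 52 working weeks a year:

```python
# Checking the arithmetic above: $10M/year at 60 hours/week, 52 weeks/year.
income = 10_000_000
hours_per_year = 60 * 52  # 3,120 hours
print(round(income / hours_per_year))  # 3205
```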

The market decides what is value and what is not. I don't have to like that my essential worker job doesn't really produce much (despite the fact that I'm literally helping keep people fed) but some smooth-talking asshat that sells snake oil makes more than me despite actively shitting on society (used car salesman, lawyer, take your pick).

      • I wouldn't say many people, beside content creators: singers, writers, developers and other content creators (ignoring the fact that they need to turn into CEO if they really want to be rich). Also it's funny because that's exactly what AI bros want to automate.
        • by dryeo ( 100693 )

They all build on others' work. Musicians, for example, practice with others' work, which is often copyrighted and unlicensed. It takes a lot of practice to become a good musician, and there are few who do it all alone rather than collaborating with others.

    • The closest we've come to someone who's offered enough value to get rich off their work alone would be Albert Eisenstein

      I'm not sure about that. Thomas Edicine was pretty successful even when he was working alone as was George Eastinghouse. More recently, Bill Gaetz did pretty well for himself out of his garage.

      • by dryeo ( 100693 )

I saw a cartoon today of Edison peeking over Tesla's shoulder and cribbing his notes. At that, Edison was famous for taking credit for others' work, though often employees', so legal according to their contracts, I assume.

I swear I read about this guy just the other day who built up, all alone, his own app that was pulling down millions, which was recently bought by one of the large tech companies for some insane amount, like $25 or $50 million, and they're obviously keeping him on to develop it.

      Try telling him, he didn't build that.

      • Try telling him, he didn't build that.

        If he didn't make the OS and the ecosystem and the app store and the APIs etc etc etc then no, he didn't build that — alone. Again, nobody can do that much work. He profited from the work of others, which is great. That's what enabled him to do what he did, it's wonderful! The bad part is pretending it's not true.

    • by ranton ( 36917 )

      While I agree with most of your post, the absolute statements you make are simply false. There are plenty of professions where your own labor can make you rich. Actors, athletes, singers, investment bankers, lawyers, and many doctor specialties are among these. Depending on your definition of rich, you could add many IT professionals and other high paid individuals. My definition of rich starts at around $10 million net worth by retirement, which somewhere around 0.5-1% of people can attain off their own la

  • Second... ALL money SV has ever made has been off other people's work.

    • Wait a second . . . you mean that the people who control the means of production are profiting off the labor of others? Fuck Joe Biden!
    • by ccguy ( 1116865 )

      Second... ALL money SV has ever made has been off other people's work.

Same as *everybody else*. Or, say, is a plumber fixing a toilet not benefiting from someone else installing that toilet, or the pipes, or building the machinery that allowed him to buy a cheap wrench?

That's the way the economy works.

Same with Stack Overflow. Considering that when I wanted to delete content I posted, they undeleted it and claimed ownership, as far as I'm concerned every AI should crawl it and use it in the future.
    • Such a tired argument. A good idea by itself does nothing. A roomful of engineers won't get anywhere without the supporting apparatus of business management, marketing, sales, etc. You also need good leadership with vision and the ability to execute.

      Engineering is not all the "work". There's a whole lot more to it than that, and it requires different kinds of talent.

      • by Shaitan ( 22585 )

I didn't make an argument, tired or otherwise. I stated a simple fact. The entire concept of the valley is to hire brilliant people and skim virtually all the cream off their ideas.

        "A good idea by itself does nothing. A roomful of engineers won't get anywhere without the supporting apparatus of business management, marketing, sales, etc. You also need good leadership with vision and the ability to execute."

        Yes, but all of those elements are mechanical and replaceable except the ideas and the people

  • And it will solve the climate crisis as well. Just need to wait less than a decade by current estimates. Once the captains of industry realize capitalism will end, they may try to hold back AGI by spreading FUD, even at the risk of missing out on immortality: https://www.genolve.com/design... [genolve.com]
What do we do about ideas inspired by other ideas? This sniffs of a complaint similar to "appropriation!!!1111one". Look, every idea since forever has been a mutation of a previous idea. Should the Simpsons creators get reimbursed if I make a joke that sounds like something Homer would say? Should they get reimbursed if I am a comedian making similar jokes for a living? No. If you don't want the world to take your ideas and run with them then don't put them on the public internet. This is just a mo

    • The entire copyright system, as it exists today, is a money grab. Those developing AI are only the latest in very long list of those catching lawsuits for no good reason.

      As for UPS using the roads, the government is certainly entitled to a cut. They're not free to build. Hopefully they do in fact charge road taxes to UPS, because otherwise it's only individuals who are paying them.

      • Depends if you feel corporations pay taxes or not I suppose. Every tax a corporation pays is really paid for by their customers. Hence raising taxes on corporations can quite literally cause those same corporations to pass those taxes onto the consumer. This is especially true if this corporation has an outsized portion of the market and has no real competition.

        • by dryeo ( 100693 )

OTOH, the workers the corporation employs paying taxes means that the corporation needs to pay higher wages, which has to be passed on to the consumer.
The advantage, for society, is that the corporation only pays taxes on profits, which gives them lots more choices about what to do with their income: invest in the business, for example, important for a startup. Whereas paying the workers enough to cover their taxes on top of money to live on is a fixed cost and needs to be paid even if the company has no prof

Google, Apple, Facebook, Microsoft have been collecting our data for literally decades at this point (Gmail came around in 2004). They have an unbelievable data set against which they'll train or run their ML models, to do whatever the fuck they want, over and over and over.

  • by aaarrrgggh ( 9205 ) on Sunday June 18, 2023 @03:46PM (#63613432)

The chance to make money on "AI" was December through maybe May. Now it is the next sucker's game. They need to combine it with Blockchain and 3D printing for the next round!!

    LLMs and GPT offer some interesting possibilities, but I think the prospects have shifted away from mainstream quite quickly. I can see plenty of industry-specific solutions, but they are far from universal and the training data complexity is going to be fun to watch.

Does Reddit have a legal leg to stand on? You, as a Reddit user, retain the copyright to what you post. You just give Reddit a license to redistribute your copyrighted work. Does Reddit have any rights with regard to ChatGPT's use of your content?

    https://www.redditinc.com/poli... [redditinc.com]
There is no realistic way to get filthy rich on your own work. The only way is to exploit others in some way, often by essentially stealing their work or their wealth. If you provide honest value in services or goods, you are never going to get beyond "well off". Obviously anybody decent will be quite satisfied with "well off", so the converse idea also holds.

    • I suspect many entertainers and athletes would disagree with this perspective.

Many of the most successful entertainers and athletes have entire industries and support infrastructure built around them that most people can only dream of having. It is far from just the talent and work of the individuals in question.

        Tech support, marketers, legal teams (that is a big one in the music industry), and even politicians lend support to them. And many of them also came from well-off families that had the resources to help hone their skills via private trainers, facilities, etc. Schools and

There is no realistic way to get filthy rich on your own work. The only way is to exploit others in some way, often by essentially stealing their work or their wealth. If you provide honest value in services or goods, you are never going to get beyond "well off". Obviously anybody decent will be quite satisfied with "well off", so the converse idea also holds.

There is no realistic way to live off your own work. You wrote the compiler you're using? Built the computer? Educated yourself without anyone's help? Did a farmer make the plow or scythe he's using with his own hands? And, since I guess in your terminology "benefiting from someone's labour, even if you compensate said labour by an amount agreed to by both parties" is "stealing", you too are a fucking thief.

      • by gweihir ( 88907 )

        There is a rather stunningly obvious difference between "getting filthy rich" and "living reasonably". I do understand that you are way too dumb to see that, though.

        • There is a rather stunningly obvious difference between "getting filthy rich" and "living reasonably". I do understand that you are way too dumb to see that, though.

          So you stole less than your average "filthy rich" guy. Of course only because of your rock solid moral compass, not because you never had the chance to join the "filthy rich" circle. So: "I'm not a thief, because look, this other guy stole much more" is your argument now?

  • Make some people rich on other people's work? Isn't that the base of capitalism?
  • How is it different when a human reads Wikipedia, sees a picture, learns to code?

    How am I not just profiting off the work of people who originally thought that stuff up? Do human brains really add so much additional value compared to AI?

  • "A great deal of the bots' fodder is copyrighted property..."

    So, just like every student?
    They read and remember, some even remember every single word, including the page-numbers.

IP is a scam, and they know it.

  • "Silicon Valley is poised once again to cash in on other people's products,"
  • You do the work. The capitalists reap the benefits.
    We are all wage slaves.
    (And then, of course, you overpay for goods and services delivered by capitalist corporations so... you just have to work more.)

  • Let them eat data! (Score:5, Interesting)

    by Zobeid ( 314469 ) on Sunday June 18, 2023 @06:37PM (#63613748)

    This may not be popular, but I think somebody needs to make the argument in favor of scraping the web unhindered.

    1. Copyright law controls the distribution of copyrighted works. AI models are not distributing the works that they’ve accessed, any more than a seller of applesauce is distributing apples. You won’t find copies of those materials anywhere in the database, and whatever the AI coughs up is almost always mangled and mashed up beyond recognition. (And when something comes out that’s recognizable, that’s most often a concern of trademark law rather than copyright. Meaning, if your AI generates pictures of Batman and you use them, then DC may rightfully want a word with you. They won’t care how the model was trained; they’ll only care what you’ve done with it.)

    2. Copyright law is supposed to economically incentivize the production of creative works. It’s hard to come up with how AI scraping undermines that principle. It’s not a direct replacement for any of the works that it was trained on. It’s not breaking the various business models that got the training material posted in the first place. One could argue that generative AI is a supercharger for creative work and lets more of it be produced faster. That’s what we’ve always wanted, isn’t it?

    3. The way generative AI works is analogous to the way that human beings learn and create. We observe and consume creative works, mentally digest them, and then we mix-and-match our favorite parts to create something new. AI is a more automated version of the same general processes. So, why should it be acceptable for every human being to do this “by hand” (as it were) but not employing automation to the same effect?

    Going beyond copyright, I understand there are practical concerns about massive scraping operations and the sheer amount of traffic and hits that they can produce, and real costs that they can incur. So, I'm not addressing that. Those are issues that'll have to be negotiated and worked out between various parties.

    • It's not a direct replacement for any of the works that it was trained on. It's not breaking the various business models that got the training material posted in the first place.

      I'm pretty sure Getty Images would disagree with that, which is why they're suing Stability AI. In the old days (like early 2022), whenever a news site posted a story, they would include an image along with it. If they didn't have a specific photo from the story itself, they would use a stock image licensed from a service like Getty. Licensing those images was a huge business. Today they're likely to generate an image with Midjourney or Stable Diffusion instead. It saves the licensing fees. And guess

  • I would love for one of the places they apply AI to be under "result attribution". Meaning "who" did "what" and "did it matter?"

    What if gig workers were fairly paid? How might labor change if customer results were connected to wages received?

  • I keep waiting to see some actual revenue reports that show AI companies or AI divisions of companies are generating profits from AI-enabled services, whether it be by increasing productivity or decreasing human labor. The titles of articles like this assert that this is happening now using words like "are" and "is", while the contents of the article are speculating with "could" or "will". I feel like the only people getting rich off this right now are those that stand to benefit from increased market cap
  • Of course AI is making money off other people's work. No different than any artist, engineer, tradesman, or anyone who does any work for money. All humans are trained and influenced by other people's work. If you use your skill to read for work, you are utilizing other people's work, If you can speak any human language, you are utilizing other people's work. Eric Goldman mentioned in the article is absolutely correct, AI just does it much more efficiently than any human.
  • It's an open source killer. The main reason people contribute to open source is to get recognition for their name. Either that or they contribute anonymously. But most don't contribute anonymously. If AI absorbs this content and makes it part of its knowledge base, this work will be used without citing contribution because contribution information will be culled. It's not relevant to the functionality of the result after all.
