
Hugging Face, GitHub and More Unite To Defend Open Source in EU AI Legislation (venturebeat.com)

A coalition of a half-dozen open-source AI stakeholders -- Hugging Face, GitHub, EleutherAI, Creative Commons, LAION and Open Future -- are calling on EU policymakers to protect open source innovation as they finalize the EU AI Act, which will be the world's first comprehensive AI law. From a report: In a policy paper released this week, "Supporting Open Source and Open Science in the EU AI Act," the open-source AI leaders offered recommendations "for how to ensure the AI Act works for open source" -- with the "aim to ensure that open AI development practices are not confronted with obligations that are structurally impractical to comply with or that would be otherwise counterproductive."

According to the paper, "overbroad obligations" that favor closed and proprietary AI development -- like models from top AI companies such as OpenAI, Anthropic and Google -- "threaten to disadvantage the open AI ecosystem." The paper was released as the European Commission, Council and Parliament debate the final EU AI Act in what is known as the "trilogue," which began after the European Parliament passed its version of the bill on June 14. The goal is to finish and pass the AI Act by the end of 2023 before the next European Parliament elections.


Comments Filter:
  • While AI is still in beta, it is potentially powerful enough for companies and countries to leave the EU over so they have AI sovereignty. And in the United States, all you have to do is classify AI as a gun and the 2nd Amendment will cover it. Iran, China, Russia and North Korea aren't going to restrict their AIs (although they may superficially censor them to the public), so why should we?
  • Tradeoffs (Score:5, Insightful)

    by WDot ( 1286728 ) on Friday July 28, 2023 @10:55AM (#63721838)
    Something that gets lost in every thread about regulation is the need to weigh tradeoffs. Every time we discuss regulation, there is invariably a post that says “What about this worst case scenario?” as if the worst-case scenario would be the average case scenario in the absence of a given regulation.

    Perhaps we can imagine a worst-case scenario, such as the textbook AI bias case where a loan application model systematically rejects minorities. That’s bad, but a system that produces such results is already covered by existing regulation that does not mention AI whatsoever. You can’t run a business that discriminates against protected minorities, whether there’s AI or not!

    So what is the tradeoff? That anyone who builds an AI model, even an open-source one that they want to release for free to the community because they believe in the GPL “free as in speech” principles, will now be on the hook for these regulatory compliance issues, including issues that may result from someone else commercializing your open source model in a way that you never intended. The immediate result is that the AI open source ecosystem (and AI has been world-historically open source as a scientific field, with open access, open datasets, and open-source implementations of algorithms being standard) would disappear, because nobody wants to carry that much compliance weight for a fun side project. Many startups built on AI would also disappear. You would be left with... a handful of large corporations who have the money to jump through an infinite amount of regulatory hoops and will do so if it kills their competition in utero. The monopolies will be produced *as a result* of regulation, not the other way around.

    I think AI regulation is particularly egregious, because AI is not a specific thing, it is a “kind” of thing, like software, mechanical devices, or chemicals. AI could be applied to any possible problem (whether it will work or not is another story). “Kinds of things” that are applied to dangerous tasks (such as medical devices or automated systems for manufacturing explosive chemicals) are *already* regulated because of the specific risks associated with those tasks. But if we regulate “kinds of things,” why not regulate software in general? Every web app ever written, even one you run on your own private server for 3-4 friends, MUST be submitted to an EU regulatory body and you MUST submit reports with every code change guaranteeing its safety.

    No matter how anti-libertarian you are, and how much you remember horrible things that happened in other fields due to a lack of regulation, is this what you want? Do you want the default response to any human endeavor to be “did you clear it with a regulator first, and do you have a compliance strategy?” Regulation may be necessary, but to give it an unlimited role in every industry is to kill all progress.
    • by JMZero ( 449047 ) on Friday July 28, 2023 @11:27AM (#63721946) Homepage

      I read this reasonable post, and I felt I had to check your comment history.

      You are consistently posting reasonable, interesting arguments on Slashdot. How? Why?

      I have no idea why I still come to Slashdot occasionally, but any impulse I once had to "post actual thoughts" has long been beaten out of me. Very few posters left that actually want to talk about something in a reasonable way. I check comments sometimes, but only looking for dumb jokes or hilariously bad takes.

      Anyway, I don't have anything particular to add to what you've said... just wanted to salute you for fighting this fight, twenty years or so after Slashdot was last relevant, or had reasonable discussion with any frequency.

    • Do not forget the flip side to this: Someone who shares only source code and documentation as-is would not be impacted as they are neither providing a digital product nor a digital service. Regulations up the wazoo might actually help FOSS in the long-term by encouraging organisations to collaborate on joint approaches to limit liability through common codebases. It would also provide fertile ground for source-only distribution of innovative FOSS and for the general public to make matter use of their comput
      • by WDot ( 1286728 )
        For this particular EU regulation, developers who open-source a model “as-is” could be affected, as the Act does not specify that open source AI developers are some special case exempt from the regulations. This is why this particular EU AI regulation is so controversial: without some change to the text, anybody who throws a model onto GitHub absolutely could be responsible for compliance with these EU regulations: https://github.blog/2023-07-26... [github.blog]

        But my point is still wider than this: a gener
        • Yes, the fully-trained model itself would be regulated, as it is a product, but the source code to the algorithm, training data and automation needed to produce the model in the first place would not be. The parts analogous to software source code are not regulated, while the parts analogous to a compiled binary are. If the EU does modify the regulations to allow models themselves to be distributed with fewer restrictions, it should enforce that sources must be distributed with them and that they must be pr
          • by WDot ( 1286728 )
            The point of machine learning is that the source code is oftentimes a small, practically insubstantial part of the method compared to the data and training. Sometimes the training itself is a substantial investment in time and money that someone, with code and data in hand, cannot do themselves. If, again, a hobbyist trains models and opens the weights for free online, and is *punished* for doing so by being forced to spend time and money on satisfying regulators, then this makes the whole field less open s
          • Yes, the fully-trained model itself would be regulated, as it is a product, but the source code to the algorithm, training data and automation needed to produce the model in the first place would not be. The parts analogous to software source code are not regulated, while the parts analogous to a compiled binary are. If the EU does modify the regulations to allow models themselves to be distributed with fewer restrictions, it should enforce that sources must be distributed with them and that they must be provably reproducible based upon the distributed sources.

            All of the above are bad ideas in my view.

            The EU legislation is mostly duplicative and should be opposed in its entirety. It sets out modality-based restrictions on algorithms generally, when there are already domain-specific requirements, including for quality and testing, across the covered domains.

            Anything less, and it creates a gaping loophole for companies like Microsoft to handily exploit.

            Microsoft is the one pushing for regulation in order to keep their proprietary model as a service scheme from being eroded by open collaborative efforts.

            Whatever dangers one believes "AI" represents allowing gian

        • by crbowman ( 7970 )

          How would this be much different from the way crypto used to be? It just means that repos that contain trained weights will be hosted and distributed in locations that don't have the rules, and everything else will be hosted anywhere.

    • The main problem is not AI in general but generative AI. Training AI on scraped web data is a new copyright problem. The same goes for cloning a voice or face: should a company be allowed to clone a voice and use it indefinitely without paying any royalties to the human? Regulation is urgently needed.
    • I think the bigger question should be: what is AI, before you start regulating it?

      Is an expert system an AI? After all, a well-designed expert system will also give you appropriate responses for questions it's designed to answer.

  • by Roger W Moore ( 538166 ) on Friday July 28, 2023 @12:01PM (#63722072) Journal
    Is this some new euphemism for Facebook noting their similarity to Face Huggers [fandom.com] from the Alien films?
    • 1. Hug face.
      2. Paralyze you.
      3. Deposit egg in you.
      4. Burst through your chest after parasitic incubation.
      5. You, the host, die.
      Yes, please?

      I'll never take that org seriously. I don't even know what they offer, and I avoid any articles with the name near them. I don't want to know, because if this is supposed to be funny as a business-y thing, I'm out.

    • Is this some new euphemism for Facebook noting their similarity to Face Huggers from the Alien films?

      It's named after an emoji that is also their logo.
      https://blog.emojipedia.org/em... [emojipedia.org]

  • Game over, man, game over!

  • It probably doesn't matter what laws the EU passes when it comes to open source, because open source ignores the law. The prevailing attitude toward patents: don't read them. When the USA tried to ban the export of some types of encryption, the response was: print the code on a t-shirt and wear it while crossing international borders. The UK's proposed encryption laws: "oh, the UK has some dumb law -- why didn't someone tell me?"

    And so it will be with this. At worst it might scare some open source programmers in th
