Open Source Security

Why Are 'Supply Chain Attacks' on Open Source Libraries Getting Worse? (arstechnica.com) 44

"A rash of supply chain attacks hitting open source software over the past year shows few signs of abating, following the discovery this week of two separate backdoors slipped into a dozen libraries downloaded by hundreds of thousands of server administrators," reports Ars Technica: The compromises of Webmin and the RubyGems libraries are only the latest supply chain attacks to hit open source software. Most people don't think twice about installing software or updates from the official site of a known developer. As developers continue to make software and websites harder to exploit, black hats over the past few years have increasingly exploited this trust to spread malicious wares by poisoning code at its source...

To be fair, closed-source software also falls prey to supply-side attacks -- as evidenced by those that hit computer maker ASUS on two occasions, the malicious update to tax-accounting software M.E.Doc that seeded the NotPetya outbreak of 2017, and another backdoor that infected users of the CCleaner hard drive utility that same year. But the low-hanging fruit for supply chain attacks seems to be open source projects, in part because many don't make multi-factor authentication and code signing mandatory among their large bases of contributors.

"The recent discoveries make it clear that these issues are becoming more frequent and that the security ecosystem around package publication and management isn't improving fast enough," Atredis Partners Vice President of Research and Development HD Moore told Ars. "The scary part is that each of these instances likely resulted in even more developer accounts being compromised (through captured passwords, authorization tokens, API keys, and SSH keys). The attackers likely have enough credentials at hand to do this again, repeatedly, until all credentials are reset and appropriate MFA and signing is put in place."

  • Sure? (Score:4, Insightful)

    by Carewolf ( 581105 ) on Saturday August 24, 2019 @10:43AM (#59120600) Homepage

    Are they getting worse, or are they being discovered?

    Many of these are old.

    • Comment removed based on user account deletion
    • If I may say, "yes". As open source libraries have their source code published in public repositories, they've become more vulnerable. As the raw number of libraries has increased, they've also been built on more platforms around the world. What Eric S. Raymond described as the "bazaar" of open source has become very real. But it's very difficult, indeed, to control all the vermin and contamination in an open market.

      The problem is compounded by the poor security control of the public repositories.

      • What Eric S. Raymond described as the "bazaar" of open source has become very real.

        The bazaar of open source is like the bazaar of Amazon. Sometimes a little parental control is a good thing.

    • Look, if you install a typical modern hype framework like react.js or react native, the number of packages pulled from npm is truly staggering. Most of them end up being just a few lines. The packages things depend on have gone down in size because there's an incentive for people to submit patches and make as many packages as possible, so they can be a package author and claim that on their CV. So what happens is people take an existing package, isolate a few lines of code from it, and turn it into a new package...

  • by Chris Mattern ( 191822 ) on Saturday August 24, 2019 @10:56AM (#59120610)

    Because they're working.

  • I feel like this terminology is being misused; the supply chain is the chain of outside suppliers involved in the production of your product. Of these, the only incidents which qualify IMO are the Ruby libraries.

    Webmin is an end product; it wasn't compromised by its supply chain. They were simply hacked, just as the commercial software products were.

    • by intrico ( 100334 )

      Even if there is no outside chain of suppliers, it is still reasonably accurate to describe the sequence of processes needed to deliver software to the end user (e.g. roughly: build, validate, publish/distribute) as a "supply chain".

      • Not really. The build process is internal. It isn't composed of independent suppliers 'chained' together who supply components to build a product.

        • by intrico ( 100334 )

          Different people, teams and/or systems are often involved, however, which makes the term accurate enough. More so when you consider the other figurative labeling commonly used for security matters. I mean, look at terms like "evil maid attack" and "man-in-the-middle attack". Obviously, it doesn't technically have to be a maid or a man that's responsible for the breach, but figuratively, the terms make sense.

      • by Luthair ( 847766 )
        If that were the case, why would we even bother having a distinct name for these? We'd just call them all security vulnerabilities.
        • by intrico ( 100334 )

          Were the case? This is the case (i.e., categorization and description of attacks with varying degrees of figurative language).

  • by isj ( 453011 ) on Saturday August 24, 2019 @11:09AM (#59120626) Homepage

    The complexity of most software has increased over the past few decades, making it difficult to not have dependencies on 3rd-party libraries.
    The more dependencies a program has, the more supply-chain attacks are potentially possible.

    Then there is how those 3rd-party libraries are managed/developed. I have of course not checked all the attacks mentioned, but it seems to me that most of those libraries were primarily developed/managed on a publicly available github/bitbucket/gitlab repository, so attackers only have to know 3 ecosystems.

    So for attackers it is a no-brainer: Attack the libraries - it potentially affects multiple applications and chances of staying under the radar are higher.

    That, and the widespread habit of relying on bleeding-edge versions of libraries instead of stable, old versions; using the source directly instead of cloning to a private repository; relying on trivially small libraries (lpad) that are hardly worth depending on. It all adds up and leads to the increase in supply-chain attacks.
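
    For what it's worth, here's a toy sketch of checking the pinning point (TypeScript/Node; it assumes a package.json in the current directory and is not any standard tool) that flags dependencies declared as floating ranges instead of exact versions:

        import { readFileSync } from "node:fs";

        // Toy check: list dependencies declared as floating ranges (^, ~, *, x,
        // "latest") rather than exact versions. Assumes ./package.json exists.
        const pkg = JSON.parse(readFileSync("package.json", "utf8"));
        const deps: Record<string, string> = { ...pkg.dependencies, ...pkg.devDependencies };

        for (const [name, range] of Object.entries(deps)) {
          if (/^[~^]|[*x]|latest/.test(range)) {
            console.log(`floating version: ${name}@${range}`);
          }
        }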

    • by Junta ( 36770 ) on Saturday August 24, 2019 @11:50AM (#59120722)

      The complexity of most software has increased over the past few decades, making it difficult to not have dependencies on 3rd-party libraries.

      While there is some truth in this, it exists for some bad reasons, and using those libraries is not nearly as critical as people act like it is.

      In JS land, a lot of those dependencies are due to a very weak set of core implementation features. You want something as simple as basic string formatting? Go fishing for a third-party library. Actually, get ready to fish for a whole bunch, as the libraries on offer are ridiculous: one library for left-pad, another for right-pad; oh, someone made a universal 'pad', but wait, you want some other string formatting, so go looking for more. You spend time searching through them, not knowing which module is the 'most appropriate' for what you want. JavaScript is missing so much basic functionality that any codebase gets built out of duct tape and mindless sets of dependencies.

      Another is an extreme take on 'DRY', or rather 'don't reinvent the wheel'. For example, if your language doesn't have basic string formatting and you need a pad, just write a quick one yourself. It would take about the same time to write a quick function as it would to type a search for an existing library. Sure, don't go overboard and implement a brand new LDAP client library from scratch, but it's ridiculous how quick a developer is to go hunting for a third-party implementation of the simplest tasks.
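
      To make that concrete, here's roughly what such a 'quick function' looks like; a left-pad is a few lines (and modern JavaScript has String.prototype.padStart built in anyway):

          // A minimal left-pad sketch: pad `input` on the left with `fill`
          // until it is `width` characters long. Hardly worth a dependency.
          function leftPad(input: string, width: number, fill = " "): string {
            if (input.length >= width) return input;
            const padding = fill.repeat(Math.ceil((width - input.length) / fill.length));
            return padding.slice(0, width - input.length) + input;
          }

          console.log(leftPad("42", 5, "0")); // "00042"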

      • by isj ( 453011 )

        I think the reliance on trivially small libraries is due to the mindset. I'm not a JS developer, so from my perspective the dependency on small things like leftpad is ridiculous. Maybe JS developers see it differently.

        The reliance on way too many libraries isn't only a JS thing. I have seen it in Java too, and I can sort of see where that is coming from. Java developers were mostly taught that code reuse is good and developing your own is bad. Combined with large frameworks like Swing, EJB, and Spring, I think...

        • I think the reliance on trivially small libraries is due to the mindset. I'm not a JS developer, so from my perspective the dependency on small things like leftpad is ridiculous. Maybe JS developers see it differently.

          Having shared dependencies allows you to keep the code smaller, in a world where that still matters because you're shipping it to the front-end every time a page loads. Do you want each library you use to have its own implementation of left-pad? No, they should all share the same implementation!

          That's the mindset. In practice "make small by default" happens about as often as "make fast by default" happens in C++.

        • For an alarming number of JS "devs", it's not really a matter of mindset, it's more a lack of any real skillset. Go visit a gitter or discord channel where people help each other out. Most people aren't asking for help with coding, they are asking for help using a library, simple or not. Most haven't RTFM, don't want to RTFM, and couldn't care less about TFM. Web developer code camps, free or not, are churning out individuals that have little interest in coding and see it more as a way to "get rich quick"...
    • "relying on bleeding-edge versions of libraries instead of stable, old versions"

      That's 'cuz the old version:

      - Never was stable. No one does "stable" anymore.

      - Depends on and is depended upon by a bunch of other libraries. All of which are expecting the current version, else you're going to have some serious dependency hell on your hands.

      - Is, or is presumed to be, chock full o' bugs & security vulnerabilities. That's why pinning an exact version of a dependency (rather than a compatible range) is a security...

  • by reanjr ( 588767 ) on Saturday August 24, 2019 @11:10AM (#59120630) Homepage

    Debian solved this already by implementing real security around who is authorized to publish. You need to show someone a passport or other form of government ID so everyone knows who's responsible.

    But Debian has been largely superseded by Ubuntu and its ilk precisely because Debian's packages are old and stable and secure, features no one cares about.

    • by HiThere ( 15173 )

      Not totally. The Debian libraries have been infected at least once that I remember...though it was years ago. (And I think that it's happened twice, but I'm not sure.)

    • by lkcl ( 517947 )

      Debian solved this already by implementing real security around who is authorized to publish. You need to show someone a passport or other form of government ID so everyone knows who's responsible.

      But Debian has been largely superseded by Ubuntu and its ilk precisely because Debian's packages are old and stable and secure, features no one cares about.

      indeed. i wrote about this in some detail when the last attack on a completely unsecured system was discovered (npm): https://it.slashdot.org/commen... [slashdot.org]

    • Good thing Slash runs Red Hat and CentOS.

      https://slashdot.org/story/07/... [slashdot.org]

    • by Uecker ( 1842596 )

      This helps a bit, but not too much, because if a developer's machine is compromised, an attacker can insert a backdoor into a Debian package. The main problem is that there is still no way to verify that the distributed binaries correspond to the distributed source, i.e. reproducible builds. I complained about this in 2007 and was told that it was a waste of time. A couple of years ago they finally realized that this is important, and they are now working on it.... The other major problem is that...

      • But Debian is definitely working on getting there: https://wiki.debian.org/Reprod... [debian.org]. Root access can indeed be a problem, but it's hard to install and maintain a distribution that runs on bare metal if it doesn't have root access somehow, and packages do such a variety of things that it is not easy to come up with a solution that works for everything without executing code supplied in a package as root.
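
        The idea itself is easy to sketch (TypeScript/Node; the artifact paths below are hypothetical): if builds are reproducible, an independent rebuild of the same source must be byte-identical to the distributed binary, so comparing hashes settles the question:

            import { createHash } from "node:crypto";
            import { readFileSync } from "node:fs";

            // With reproducible builds, independently rebuilding the same source
            // must yield a byte-identical artifact, so the hashes must match.
            const sha256 = (path: string) =>
              createHash("sha256").update(readFileSync(path)).digest("hex");

            // Hypothetical paths: the distributed binary and a local rebuild.
            const official = sha256("dist/foo_1.0_amd64.deb");
            const rebuilt = sha256("rebuild/foo_1.0_amd64.deb");
            console.log(official === rebuilt ? "reproducible" : "MISMATCH - investigate");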

    • by _merlin ( 160982 )

      Yeah, that's great until those trusted people think they're clever and introduce a vulnerability [madirish.net] that isn't present upstream with their "fix" for a compiler warning. This kind of thing still happens - Debian people seem more interested in having things compile than actually work, and I've had to reject upstream patches from them that break things in the name of fixing warnings.

  • by rsilvergun ( 571051 ) on Saturday August 24, 2019 @11:19AM (#59120648)
    ... engineers in places like the old Soviet Bloc countries and Asia, and they've got internet access, advanced computer science training, and not much else.
  • The strength of open source is that these attacks were discovered at all.
  • To paraphrase Stalin: He who submits the code decides nothing. He who commits the code decides everything.

    Part of the problem with Open Source is not the concept, but the way it is run. Egos, politics, and conflicting interests often dominate open source projects, but the conflict specific to this problem is over the time commitment required to test and audit code changes. Many of these projects are side projects for people and often don't get the commitment they need. But even when the project...
  • Open source is a big target because users of these libraries are usually running powerful servers that are ideal for mining cryptocurrency or running botnets. As long as there is still money to be made, it will continue.
  • by Junta ( 36770 ) on Saturday August 24, 2019 @11:37AM (#59120700)

    In the 90s, the typical Linux system was a distro with a rather sparse package set plus a hodgepodge of manually compiled components. And updating a distro frequently meant having to download a whole new ISO or buying some magazine/book with a copy of the distro on CD.

    In the 2000s, distros and networks improved: distros had much more content and could be trivially updated. The norm was dominated by centralized organizations vetting packages and managing discovered vulnerabilities. They didn't always do the greatest job, but they did it respectably.

    In the 2010s, developers jumped back to terrible impatience, sourcing straight from CPAN, PyPI, gems, npm, crates, pulling directly from GitHub, and Docker (because now it is as easy as using a distro, but you can rest assured knowing you'll never find out that some feature exists that your source doesn't yet provide). Most of those have very poor authentication, and even where there is authentication, you are directly trusting a large number of people with varying objectives, and who knows how careful they are with their own security even if they mean well. Distros have gotten really good at managing security, but their userbase is full of people who find yum/apt/etc an antiquated way of doing things compared to the shiny language-aligned repositories. Or if they use yum/apt, they pile on ppa/copr repos and produce the same experience.

    The ecosystem at large just lacks the patience for curated repositories, but they are a much better strategy for mitigating this phenomenon.
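
    One concrete thing the curated/pinned approach buys you: package managers like npm record an SRI integrity string ("sha512-<base64>") for every artifact in the lockfile, and anything fetched later has to match it. A sketch of that check (TypeScript/Node; the tarball path and pinned value below are hypothetical placeholders):

        import { createHash } from "node:crypto";
        import { readFileSync } from "node:fs";

        // Check a downloaded tarball against the SRI integrity string that a
        // lockfile (e.g. package-lock.json) pins for it, before installing it.
        function matchesIntegrity(tarballPath: string, sri: string): boolean {
          const dash = sri.indexOf("-");
          const algo = sri.slice(0, dash);      // e.g. "sha512"
          const expected = sri.slice(dash + 1); // base64 digest
          const actual = createHash(algo).update(readFileSync(tarballPath)).digest("base64");
          return actual === expected;
        }

        const pinned = "sha512-..."; // value copied from the lockfile (elided here)
        console.log(matchesIntegrity("left-pad-1.3.0.tgz", pinned));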

    • Basically, this. Npm and friends are all raging dumpster fires when it comes to security. Install one package from npm and you're likely pulling in hundreds or thousands of other packages as deps, and because there's no review of any of them, you may as well be going to 1000 random websites and running random executables from them. Most people wouldn't do the latter (I hope...) because you'd have to be outrageously stupid to do so, but the former is now the industry norm and nobody seems to care.

      • Perl used to have this problem, before it fell out of favor. Python and Java and nodejs have it now, because they pull in untested libraries by the dozen, or by the hundred, to assemble their final packages. And many developers install the same module by default for their entire environment, sharing dangerous libraries without review and pulling them into other components.
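
        To see the fan-out for yourself, here's a small sketch (TypeScript/Node; run from a project root with an installed node_modules tree) that counts how many packages actually landed on disk:

            import { readdirSync, existsSync } from "node:fs";
            import { join } from "node:path";

            // Count installed packages: every directory under node_modules with a
            // package.json, including @scope/* packages and nested node_modules.
            function countPackages(dir: string): number {
              if (!existsSync(dir)) return 0;
              let count = 0;
              for (const entry of readdirSync(dir, { withFileTypes: true })) {
                if (!entry.isDirectory()) continue;
                const path = join(dir, entry.name);
                if (entry.name.startsWith("@")) {
                  count += countPackages(path); // scope dir holds packages inside
                } else {
                  if (existsSync(join(path, "package.json"))) count++;
                  count += countPackages(join(path, "node_modules")); // nested deps
                }
              }
              return count;
            }

            console.log(`${countPackages("node_modules")} packages on disk`);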

  • Complexity (Score:3, Interesting)

    by hermi ( 809034 ) on Saturday August 24, 2019 @12:22PM (#59120810)

    Because people pull in more and more libraries from the internet.
    Some for good reasons, like the growing complexity of everything, and some for no good reason (like "let's use a huge JS framework to serve a static website").

  • Secure, Cheap, Efficient. Pick two. (If you are lucky. Often you only get to pick one.)
  • by bradley13 ( 1118935 ) on Saturday August 24, 2019 @03:02PM (#59121174) Homepage

    Part of the problem is simply the use of libraries. Too many developers - needing a simple function they could write themselves - instead bind in an external library containing far more than they need. And that library itself has other libraries as dependencies. Your simple, little program suddenly includes hundreds of thousands of lines of external code that you know nothing about.

    tl;dr: lazy developers get what they deserve

  • Everything has to be cheap and done fast these days. The cost of that is horrendous.

  • Too bad the good guys are getting dumber at twice the rate.

  • by Tom ( 822 ) on Sunday August 25, 2019 @02:24AM (#59122264) Homepage Journal

    The problem is the micro-fracturing that the library space has.

    There used to be libc. It had about 200 major functions. In today's environments, those would be at least 20 different libraries by at least 10 different people. In the Node.js world, it's probably 400 libraries by 600 people.

    When you have libc, sure your "add(a, b)" function will not evolve with full agility to satisfy stakeholder edge case requirements... but it's a ton easier to not make a stupid mistake and to watch what goes into it and to have proper revisions.

    But look at what we have today. Lots and lots of code is pulled straight from github into a production project. That's literally what npm or composer are doing. How in all hells are you even supposed to do any kind of quality control under those conditions? And I didn't even mention dependencies yet.

    The basic idea of having tiny building blocks and being able to assemble anything from it is cute. It works brilliantly in the world of LEGO. But people kinda forgot that all LEGO bricks are manufactured by the same company to the same standards when they took that metaphor and forced it on software.

  • by astrofurter ( 5464356 ) on Sunday August 25, 2019 @03:20AM (#59122318)

    I've said it before: Surveillance Valley has already cooked the goose that laid the golden FOSS eggs. Companies take and take but never give back. FOSS authors are rewarded for their efforts with zero autonomy code monkey jobs. People still release software out of goodwill/vanity. But no one really gives a fuck about maintaining it.

    Look at the dependency chain of the software you work on. Poke around a bit. Look at last commit dates on GitHub. How much appears to be abandoned? A disquietingly big amount?

    The termites have set in. It's only a matter of time until the whole house falls down.

  • ... and the community is not doing anywhere near enough as far as changing its practices to stop them?

    If the community seriously wanted to stop supply chain attacks, it would need to move to a more vetted model of *who* can post to the various package repositories, some kind of peer-ranking system for new contributions, and a much more stringent code review process for contributions.
