Open Source / Security / Programming

Google's New Security Project 'OSS Rebuild' Tackles Package Supply Chain Verification (googleblog.com)

This week Google's Open Source Security Team announced "a new project to strengthen trust in open source package ecosystems" — by reproducing upstream artifacts.

It includes automation to derive declarative build definitions, new "build observability and verification tools" for security teams, and even "infrastructure definitions" to help organizations rebuild, sign, and distribute provenance by running their own OSS Rebuild instances. (And as part of the initiative, the team also published SLSA Provenance attestations "for thousands of packages across our supported ecosystems.") From the announcement:

Our aim with OSS Rebuild is to empower the security community to deeply understand and control their supply chains by making package consumption as transparent as using a source repository. Our rebuild platform unlocks this transparency by utilizing a declarative build process, build instrumentation, and network monitoring capabilities which, within the SLSA Build framework, produce fine-grained, durable, trustworthy security metadata. Building on the hosted infrastructure model that we pioneered with OSS-Fuzz for memory issue detection, OSS Rebuild similarly seeks to use hosted resources to address security challenges in open source, this time aimed at securing the software supply chain... We are committed to bringing supply chain transparency and security to all open source software development. Our initial support for the PyPI (Python), npm (JS/TS), and Crates.io (Rust) package registries — providing rebuild provenance for many of their most popular packages — is just the beginning of our journey...

OSS Rebuild helps detect several classes of supply chain compromise:

- Unsubmitted Source Code: When published packages contain code not present in the public source repository, OSS Rebuild will not attest to the artifact.

- Build Environment Compromise: By creating standardized, minimal build environments with comprehensive monitoring, OSS Rebuild can detect suspicious build activity or avoid exposure to compromised components altogether.

- Stealthy Backdoors: Even sophisticated backdoors like the one planted in xz often exhibit anomalous behavioral patterns during builds. OSS Rebuild's dynamic analysis capabilities can detect unusual execution paths or suspicious operations that are otherwise impractical to identify through manual review. (A sketch of the basic rebuild-and-compare step these detections share follows this list.)
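
The common thread in these three detections is a rebuild-and-compare step: build the package from its public source in a clean, instrumented environment, then check the result against what the registry actually serves. Below is a minimal Go sketch of that core comparison. The file names are hypothetical, and a real pipeline (OSS Rebuild included) also has to account for known-benign differences such as archive timestamps before declaring a mismatch, which this sketch omits.

package main

import (
    "crypto/sha256"
    "encoding/hex"
    "fmt"
    "io"
    "log"
    "os"
)

// fileDigest returns the hex-encoded SHA-256 of a file's contents.
func fileDigest(path string) (string, error) {
    f, err := os.Open(path)
    if err != nil {
        return "", err
    }
    defer f.Close()
    h := sha256.New()
    if _, err := io.Copy(h, f); err != nil {
        return "", err
    }
    return hex.EncodeToString(h.Sum(nil)), nil
}

func main() {
    // Hypothetical paths: the artifact as downloaded from the registry,
    // and the artifact rebuilt from the public source repository.
    published, err := fileDigest("pkg-1.2.3-from-registry.tar.gz")
    if err != nil {
        log.Fatal(err)
    }
    rebuilt, err := fileDigest("pkg-1.2.3-rebuilt.tar.gz")
    if err != nil {
        log.Fatal(err)
    }
    if published == rebuilt {
        fmt.Println("digests match: the published artifact is reproducible from source")
    } else {
        fmt.Printf("MISMATCH: registry=%s rebuilt=%s\n", published, rebuilt)
    }
}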


For enterprises and security professionals, OSS Rebuild can...

- Enhance metadata without changing registries by enriching data for upstream packages. No need to maintain custom registries or migrate to a new package ecosystem.

- Augment SBOMs by adding detailed build observability information to existing Software Bills of Materials, creating a more complete security picture...

- Accelerate vulnerability response by providing a path to vendor, patch, and re-host upstream packages using our verifiable build definitions...


The easiest (but not only!) way to access OSS Rebuild attestations is to use the provided Go-based command-line interface.

"With OSS Rebuild's existing automation for PyPI, npm, and Crates.io, most packages obtain protection effortlessly without user or maintainer intervention."


Comments:
  • by zephvark ( 1812804 ) on Monday July 28, 2025 @08:29AM (#65549802)

    Warning: your prose has been infected by marketing weasels. Stop reading and induce vomiting.

    • Re: (Score:3, Insightful)

      by coofercat ( 719737 )

      Even after the vomiting, I'm still unclear what they're actually doing.

      It's either:
      - A sort of locked-down build area, where they import some code, build it and package it up, and presumably put it into a locked-down repository
      or:
      - A way of adding a cryptographic signature or similar to the metadata in an OSS project's releases

      I'm all fine with it, but I wonder how they keep the bitcoin miners out of the supply chain? Unless you're testing the code in some way to make sure it's at least vaguely doing the th

      • Even after the vomiting, I'm still unclear what they're actually doing.

        It's either:
        - A sort of locked-down build area, where they import some code, build it and package it up, and presumably put it into a locked-down repository
        or:
        - A way of adding a cryptographic signature or similar to the metadata in an OSS project's releases

        It's some of both, and some more stuff.

        It's a combination of a build environment that produces reproducible builds (meaning every time you do a build of a given source you get a bit-identical output -- this is not a property of most build systems[*]), plus signed metadata of source and reproducibly-built binaries, plus hosting of the above so that if you don't want to go to the effort of creating and checking the reproducible builds yourself you can just check against Google's system.
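
        To make "bit-identical" concrete, here's a minimal sketch (assuming a Go toolchain and a Go module in the current directory) that builds the same source twice and compares the outputs. Go's -trimpath strips absolute host paths from the binary, which helps; many other toolchains embed timestamps or build paths and fail this test out of the box.

        package main

        import (
            "crypto/sha256"
            "fmt"
            "log"
            "os"
            "os/exec"
        )

        // build compiles the module in the current directory to the named
        // output file and returns the SHA-256 of the resulting binary.
        func build(out string) [32]byte {
            cmd := exec.Command("go", "build", "-trimpath", "-o", out, ".")
            cmd.Stderr = os.Stderr
            if err := cmd.Run(); err != nil {
                log.Fatal(err)
            }
            data, err := os.ReadFile(out)
            if err != nil {
                log.Fatal(err)
            }
            return sha256.Sum256(data)
        }

        func main() {
            if build("out-a") == build("out-b") {
                fmt.Println("reproducible: both builds are bit-identical")
            } else {
                fmt.Println("not reproducible: the two outputs differ")
            }
        }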

        Note that all of th

      • For example, in Debian currently, and I assume most other distros too, there isn't actually an explicit policy that requires the source package to be capable of building the literal binary package that accompanies it, which can stymie even the most basic auditing procedures. This sounds to me like a way to encourage that by offering a full suite of solutions to automate "reproducible builds" while adding some additional security cross-checks - something which Google didn't invent themselves but now appear to

      • It's either:
        - A sort of locked-down build area, where they import some code, build it and package it up, and presumably put it into a locked-down repository
        or:
        - A way of adding a cryptographic signature or similar to the metadata in an OSS project's releases

        Looks like it's mostly the second one. Pull a source package, build it in a controlled environment, analyze the build process with some AI woo-woo to detect sneaky stuff happening in build scripts, à la the xz situation. Pull the binary package, and see if it's the same as the one built from the source, and if so, issue some sort of certificate attesting to that.

        Seems it's meant as a tool for packagers and distro maintainers, to automatically validate whether the binary packages they offer correspond to the

        • If that is the case, then it can best be summarized as "Nice code you've got there. Would be a shame if someone injected a trojan. For a suitable 'donation' we can make sure that doesn't happen. We're not responsible for anything that happens if you don't pay your dues."

          It's just another walled garden that developers will have to pay to enter, and consumers will have to pay to use. (And I have full confidence that Google will find a way to make both ends pay.)

          • If that is the case, then it can best be summarized as "Nice code you've got there. Would be a shame if someone injected a trojan. For a suitable 'donation' we can make sure that doesn't happen. We're not responsible for anything that happens if you don't pay your dues."

            I don't think Google intends to assume distribution duties, and it'd be quite the PR (and possibly legal) disaster if they got caught tampering with the upstream repos. It looks like they're just going to pull the source, compile it, check the result against the published binaries, watch for any shenanigans during the build, and, if everything meets their standard, issue a certificate saying "this binary matches the source." I think the mobster attack on this one would be limited to "nice project you've got there

      • Internally Google builds everything from source. You can do that with a monorepo. My guess is they said "it's too bad open source doesn't work that way... oh wait, it can, let's do that at scale."
    • Pretty sure they just talked to an LLM to shit out that description and didn't read the output (who wants to?).

      Given my experience in the industry in 2025, they probably would have been reprimanded if someone in management had caught wind that they were making blog entries without an LLM writing them.

  • Detect /. dupes?
  • ...so whatever it is, it'll be rolled out half-baked with great fanfare, get forced onto various users by the sheer weight of Google's not-a-monopoly on just about everything, limp along for a year or two, and then Google will discover that the problem is more difficult than it first appeared, get bored, and kill it off in favor of the next exciting new half-baked thing.

    This could be the greatest idea ever; Google's involvement is the kiss of death for it regardless, until proven otherwise. The odds aren't

"I have more information in one place than anybody in the world." -- Jerry Pournelle, an absurd notion, apparently about the BIX BBS

Working...