In my previous blog you can read about securing the software supply chain for Docker images using GitHub Actions and Sigstore. We saw how to sign our Docker images, as well as how to generate an SBOM and build provenance. Using Sigstore/cosign we attached the signature, the SBOM and the build provenance to the Docker image. Sigstore gives us a really nice integration and developer experience for adding these security features to our Docker image build pipelines.
In this blog I want to show an approach to storing SBOMs and provenance in an OCI registry for other software release assets (e.g. npm, maven, nuget, …). To do so we will again utilize cosign to interact with the registry. The developer experience will not be as nice and integrated as with Docker images, but at least it is a step towards a more ideal situation until package managers offer a more integrated solution.
Using an OCI registry as blob storage, we can distribute and discover attestations for software assets that are not OCI/Docker images. The idea is to come up with a naming convention that relates, for example, our npm package to a blob in the OCI registry. Using this convention we can document and make transparent to consumers of our assets where they can find the SBOM as well as the build provenance.
Assuming we already have build provenance (named provenance.att) and an SBOM (named sbom-spdx.json) for a fictitious npm package, our folder structure might look like this.
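For illustration, with a fictitious package called awesome-node-cli (the name is made up for this example), the workspace could look something like this:

```
awesome-node-cli
├── package.json
├── provenance.att
└── sbom-spdx.json
```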
With the SBOM and provenance generated, we can use an OCI registry to distribute and store them. Below is an example of how to do that using cosign.
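A minimal sketch of those cosign steps, assuming a hypothetical registry path ghcr.io/myorg/attestations and the awesome-node-cli package at version v1.0.0 (the tagging convention is explained right below):

```bash
# Upload the SBOM and provenance as OCI blobs, tagged by convention:
# «oci-registry»:«pkg-name»-«version».«attestation-type»
cosign upload blob -f sbom-spdx.json \
  ghcr.io/myorg/attestations:awesome-node-cli-v1.0.0.sbom
cosign upload blob -f provenance.att \
  ghcr.io/myorg/attestations:awesome-node-cli-v1.0.0.provenance

# Sign both blobs keyless-style with Sigstore's ephemeral keys
COSIGN_EXPERIMENTAL=1 cosign sign \
  ghcr.io/myorg/attestations:awesome-node-cli-v1.0.0.sbom
COSIGN_EXPERIMENTAL=1 cosign sign \
  ghcr.io/myorg/attestations:awesome-node-cli-v1.0.0.provenance
```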
For any other package we can do the same by applying the tagging convention («oci-registry»:«pkg-name»-«version».«attestation-type») we used in the previous example.
With these attestations stored in the OCI registry, consumers of our assets can now easily download the attestations and verify their signatures to guarantee authenticity. We can do this using sget. Once downloaded, we can write an attestation to a file or pipe it to other shell tools to inspect, transform and use it for our use cases. More on consuming the stored attestations later in this blog; first let's have a look at a tool called fatt, a proof of concept (POC) that makes publishing and discovery easier by automating the steps above.
Automating the upload and signing steps
To automate the uploading and signing of the SBOMs and provenance further, we built fatt as a POC. (:warning: NOTE: signing is not yet implemented in fatt and is still a manual step, see fatt#20).
Fatt requires a few command line arguments to apply the convention we performed manually with cosign. To identify the attestation type, fatt requires a URL scheme prefix in front of each attestation that will be uploaded (see the example below).
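A sketch of what such an invocation might look like; the scheme prefixes (provenance://, sbom://) illustrate the URL scheme format, but the flag names here are assumptions, so consult fatt push --help for the real interface:

```bash
# Hypothetical invocation; flag names are assumptions, not fatt's exact CLI.
# The scheme prefix tells fatt the attestation type of each file.
fatt push \
  --repository ghcr.io/myorg/attestations \
  "provenance://provenance.att" \
  "sbom://sbom-spdx.json"
```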
Fatt uses the tagging convention described earlier in this blog with the manual cosign steps. We also added an additional blob, attestations.txt, that we store in the OCI registry. In attestations.txt, fatt stores the locations of the SBOM and provenance in purl format. Purls are commonly used in SBOMs to link to other resources. This attestations.txt is used by fatt's filter capabilities to enable some advanced consumption use cases.
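As an illustration, the contents of attestations.txt could look something like this (the exact purl shape fatt produces may differ):

```
pkg:oci/awesome-node-cli-v1.0.0.sbom?repository_url=ghcr.io/myorg/attestations
pkg:oci/awesome-node-cli-v1.0.0.provenance?repository_url=ghcr.io/myorg/attestations
```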
At this stage we have our attestations published, either via cosign or using fatt as a convenience. Because we used a convention, it should be more recognizable to consumers of your packages where to find the attestations. This enables consumers to automate the consumption of the attestations, for example (a sketch follows the list):
Check SBOMs for license violations (e.g. GPL)
Check SBOMs for vulnerabilities
Check provenance if artifact was produced by a trusted build environment
…
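For instance, a consumer could fetch a dependency's SBOM by convention and scan it for GPL-licensed packages. A rough sketch (registry path and tag are hypothetical):

```bash
# Fetch the SBOM (sget verifies the signature) and flag GPL licenses
COSIGN_EXPERIMENTAL=1 sget \
  ghcr.io/myorg/attestations:awesome-node-cli-v1.0.0.sbom \
  | jq 'select(.spdxVersion != null) | .packages[].licenseDeclared' \
  | grep -i gpl \
  && echo "GPL license found; review required"
```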
Consuming
We can utilize sget to fetch a specific blob from an OCI registry. Sget is part of the Sigstore tools. It downloads the given resource and automatically verifies the signature of the artifact to prove its authenticity.
In my case I used the ephemeral keys from Sigstore (COSIGN_EXPERIMENTAL=1). If you used your own key pair you will have to use the --key flag to provide your public key.
SBOM example
Fetching by tag requires signature verification. In this case you will receive two JSON objects: one containing the signature data and one containing your SBOM. To get the SBOM contents we can use jq.
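For example (registry path is hypothetical; because the output contains two JSON objects, we let jq select the one that is an SPDX document):

```bash
# Fetch by tag: sget verifies the signature before emitting the content
COSIGN_EXPERIMENTAL=1 sget \
  ghcr.io/myorg/attestations:awesome-node-cli-v1.0.0.sbom \
  | jq 'select(.spdxVersion != null)' > sbom-spdx.json
```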
In case you fetch the asset by digest you can omit the signature verification. However, the digest is less user-friendly to remember and type. A small advantage is that you will not receive the signature in the output of the command.
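For example (the digest is a placeholder):

```bash
# Fetch by digest: no signature verification, the output is just the SBOM
sget ghcr.io/myorg/attestations@sha256:<digest> > sbom-spdx.json
```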
Provenance example
Here is an example of getting the provenance predicate that describes how the awesome-node-cli asset was produced.
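A sketch, again against the hypothetical registry path used above:

```bash
# Fetch the provenance attestation and extract the SLSA predicate
COSIGN_EXPERIMENTAL=1 sget \
  ghcr.io/myorg/attestations:awesome-node-cli-v1.0.0.provenance \
  | jq 'select(.predicate != null) | .predicate'
```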
Just like with the SBOM, you could also fetch the provenance by digest, which does not require signature checks and slightly changes how you apply jq in further processing.
Discovery file example
We have seen that fatt also adds the attestations.txt on top of the cosign steps (upload and sign). fatt list is similar to sget when it comes to fetching the ….discovery tag. It will download the blob, verify the signature and add some additional capabilities to make more advanced command line use cases possible, e.g.:
Filtering
Transforming purls to OCI image URLs
Working directly with the OCI blob
Traversing the local file system for multiple attestations.txt files
Here are some examples; a sketch of each follows the list:
Store the attestations in a local file
Recursively search the local file system for attestations.txt files and filter by provenance
Recursively search the local file system for attestations.txt files and filter by SBOM, outputting the OCI blob URLs
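Sketches of what these invocations could look like; the --filter and --output flag names are assumptions, so check fatt list --help for the actual interface:

```bash
# Store the attestations in a local file (registry path is hypothetical)
fatt list ghcr.io/myorg/attestations:awesome-node-cli-v1.0.0.discovery \
  > attestations.txt

# Recursively search the local file system for attestations.txt files
# and filter by provenance (flag names are assumptions)
fatt list --filter provenance .

# Same, but filter by SBOM and output the OCI blob URLs
fatt list --filter sbom --output oci .
```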
This last example in particular is useful for further automation, e.g. automatically fetching all SBOMs for the projects on your local file system (all projects do need an attestations.txt in their workspace and have to share the same root path to allow scanning the file path recursively).
E.g. if you used fatt or sget to download the attestations.txt files and store them on your local disk, you might end up with a folder structure like this example.
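For instance, with two projects sharing the same root path it might look like this:

```
projects
├── awesome-node-cli
│   └── attestations.txt
└── another-package
    └── attestations.txt
```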
GitHub Actions
For a full end-to-end example of a CI pipeline, see the following workflow.
In this workflow we create a release for our software using npm pack. Then we utilize syft to generate an SBOM and use the slsa-provenance-action to generate the provenance for our build. Last but not least we utilize fatt to publish the SBOM and provenance to an OCI registry.
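Condensed into plain shell commands, the release job boils down to something like the following sketch (the provenance step is handled by the slsa-provenance-action in the actual workflow, and the fatt invocation is hypothetical):

```bash
# Pack the npm release artifact
npm pack

# Generate an SPDX SBOM for the project with syft
syft packages dir:. -o spdx-json > sbom-spdx.json

# provenance.att is produced by the slsa-provenance-action in the workflow

# Publish both attestations to the OCI registry with fatt
# (hypothetical invocation; see the fatt README for the exact flags)
fatt push \
  --repository ghcr.io/myorg/attestations \
  "provenance://provenance.att" \
  "sbom://sbom-spdx.json"
```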
The second job in the workflow showcases some ways to consume the attestations. This gives you an impression of the possibilities of using these attestations in your workflows. E.g. you could enumerate your npm dependencies, try to fetch their SBOMs based on the convention, and if an SBOM is provided by a dependency, perform some license checks in your CI pipeline.
Bonus
Using crane you can easily query the OCI registry to see how the attestations are stored for your software assets.
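For example (repository path hypothetical, output abbreviated):

```bash
crane ls ghcr.io/myorg/attestations
# awesome-node-cli-v1.0.0.sbom
# awesome-node-cli-v1.0.0.provenance
# awesome-node-cli-v1.0.0.discovery
# sha256-<digest>.sig    <- cosign stores each signature under a tag like this
```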
As you can see here, we have a signature for each released tag. You can also see that we didn't use the fatt command line option --tag-prefix in an earlier release (v0.1.4-rc2). This option lets you choose between two publishing strategies: tagging the attestations with or without a prefix.
In this blog we stored attestations (SBOM, build provenance) in an OCI registry. We also saw how to automate those steps a little further using fatt. Aside from publishing, we also looked at the consumption use cases. The examples were based on an npm package; however, we can apply the same approach to other package managers like Maven, NuGet, Go modules, etc. Also take some time to explore the GitHub Actions workflow to incorporate this in your own projects.
What do you think about this approach and the proof of concept using fatt? Do you see any other use cases, or do you have suggestions for taking this idea to the next level? Feel free to reach out at the fatt repository, or discuss on Twitter, Reddit or any other social platform. Looking forward to your feedback.