Provide suite of tests for validating Fulcio releases and deployments #612

Open · k4leung4 opened this issue May 24, 2022 · 12 comments
Labels: enhancement (New feature or request)

@k4leung4
Contributor

Description

To better streamline releases and deployments, we need a suite of tests that we can run to validate releases and deployments.

@haydentherapper Can you outline what tests you would like to see, which ones we already have, and which ones we are lacking?

@k4leung4 k4leung4 added the enhancement New feature or request label May 24, 2022
@k4leung4 k4leung4 changed the title Provide suite of tests for validating releases and deployments Provide suite of tests for validating Fulcio releases and deployments May 24, 2022
@k4leung4 k4leung4 removed this from Sigstore GA May 24, 2022
@haydentherapper
Contributor

The gRPC test suite code is fairly thorough. It tests all endpoints with different types of issuers. For each supported issuer for the production environment (email, SPIFFE, GitHub and K8S currently), we should test a successful certificate issuance. We should also test the other GET endpoints.

Tests over HTTP could be considered, but we'd effectively be testing the gRPC-HTTP bridge, which is not necessary. If we did test using HTTP, we should at least test the V1 API.
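
For reference, here's a minimal sketch of what exercising the read-only gRPC endpoints against a live instance could look like in Go. The generated-client import path, constructor, and message names below are assumptions based on Fulcio's v2 protobuf definitions, and the target address would need to point at the environment under test:

```go
package e2e

import (
	"context"
	"crypto/tls"
	"testing"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials"

	// Assumed import path for the generated v2 gRPC client.
	fulciopb "github.com/sigstore/fulcio/pkg/generated/protobuf"
)

// TestGetEndpoints exercises the GET-style gRPC endpoints against a live instance.
func TestGetEndpoints(t *testing.T) {
	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()

	conn, err := grpc.DialContext(ctx, "fulcio.sigstore.dev:443",
		grpc.WithTransportCredentials(credentials.NewTLS(&tls.Config{})))
	if err != nil {
		t.Fatalf("dialing fulcio: %v", err)
	}
	defer conn.Close()

	client := fulciopb.NewCAClient(conn)

	// Assumed message/getter names from the v2 proto (TrustBundle.chains).
	tb, err := client.GetTrustBundle(ctx, &fulciopb.GetTrustBundleRequest{})
	if err != nil {
		t.Fatalf("GetTrustBundle: %v", err)
	}
	if len(tb.GetChains()) == 0 {
		t.Error("expected at least one certificate chain in the trust bundle")
	}

	// Assumed message/getter names from the v2 proto (Configuration.issuers).
	cfg, err := client.GetConfiguration(ctx, &fulciopb.GetConfigurationRequest{})
	if err != nil {
		t.Fatalf("GetConfiguration: %v", err)
	}
	if len(cfg.GetIssuers()) == 0 {
		t.Error("expected at least one configured issuer")
	}
}
```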

@k4leung4
Contributor Author

Are you proposing we should have the equivalent of the gRPC test suite, but pointed at a live instance?

As for supported issuers, do we have documentation on how to test each of them? With the lack of documentation and existing tests, do we actually know whether the SPIFFE issuer works, other than a user telling us it doesn't? I'm worried that we have previously added support for issuers without ever having a way to continuously verify that they work.

Sounds like HTTP testing would be a nice-to-have, but not high on the priority list.

To better prioritize them, are these all tests that we are willing to block our next release/deployment on, or only some?

I'm trying to establish a baseline: what are we comfortable releasing/deploying with, versus what do we want to aim for?

@haydentherapper
Contributor

Are you proposing we should have the equivalent of the gRPC test suite, but pointed at a live instance?

Yeah, I'd like a subset of the test suite pointed at the live instance and exercised before a release:

  • Issuing a certificate with the Dex provider
  • Issuing a certificate with the GitHub provider
  • Issuing a certificate with a K8S provider
  • Checking that api/v2/TrustBundle can be used to verify an issued certificate (sketched below)

Some of those may be harder than others to set up, particularly GitHub and K8S. If we at least cover Dex initially, that'd be good. I'm removing SPIFFE from the list because it's a federated provider currently.
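
As a starting point, here's a rough sketch (not a definitive implementation) of that last item: fetching the trust bundle over HTTP and verifying an already-issued leaf certificate against it. The JSON field names (`chains`, `certificates`) are assumptions based on the v2 TrustBundle message, the root is assumed to be the last certificate in each chain, and `FULCIO_TEST_CERT` is a hypothetical env var holding a freshly issued PEM certificate:

```go
package e2e

import (
	"crypto/x509"
	"encoding/json"
	"encoding/pem"
	"net/http"
	"os"
	"testing"
)

// trustBundle mirrors the assumed shape of the api/v2/trustBundle response.
type trustBundle struct {
	Chains []struct {
		Certificates []string `json:"certificates"`
	} `json:"chains"`
}

func TestTrustBundleVerifiesIssuedCert(t *testing.T) {
	resp, err := http.Get("https://fulcio.sigstore.dev/api/v2/trustBundle")
	if err != nil {
		t.Fatalf("fetching trust bundle: %v", err)
	}
	defer resp.Body.Close()

	var tb trustBundle
	if err := json.NewDecoder(resp.Body).Decode(&tb); err != nil {
		t.Fatalf("decoding trust bundle: %v", err)
	}

	// Build root and intermediate pools from every returned chain.
	roots, intermediates := x509.NewCertPool(), x509.NewCertPool()
	for _, chain := range tb.Chains {
		for i, certPEM := range chain.Certificates {
			if i == len(chain.Certificates)-1 {
				roots.AppendCertsFromPEM([]byte(certPEM))
			} else {
				intermediates.AppendCertsFromPEM([]byte(certPEM))
			}
		}
	}

	// FULCIO_TEST_CERT is a hypothetical env var holding a PEM leaf cert
	// issued earlier in the test run (e.g. via the Dex flow).
	block, _ := pem.Decode([]byte(os.Getenv("FULCIO_TEST_CERT")))
	if block == nil {
		t.Skip("FULCIO_TEST_CERT not set; skipping verification")
	}
	leaf, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		t.Fatalf("parsing leaf certificate: %v", err)
	}

	if _, err := leaf.Verify(x509.VerifyOptions{
		Roots:         roots,
		Intermediates: intermediates,
		// Fulcio issues short-lived code-signing certificates.
		KeyUsages: []x509.ExtKeyUsage{x509.ExtKeyUsageCodeSigning},
	}); err != nil {
		t.Errorf("issued certificate does not verify against trust bundle: %v", err)
	}
}
```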

As for supported issuer, do we have documentation on how to test each of the issuer? with the lack of documentation and existing test, do we actually know if the spiffe actually works, other than a user telling us it doesnt? im worried that we have previously added support for issuers but never had a way to continuously verify it works.

I can add more docs and help write the test suite; I just need to know whether there's a preferred way to write it (Bash script? Go script?).

We don't have tests for each supported OIDC issuer, though I would say it's not Fulcio's responsibility to test any federated providers (the SPIFFE ones primarily). We should make sure the default providers - GitHub, Dex, Google, K8S - are working.

To better prioritize them, are these all tests that we are willing to block our next release/deployment on, or only some?

The list above is what I'd like to have block a release. Not sure when we're planning to cut a new release. If we need a new release soon, we might need to manually test, but I think automation should be in scope for GA. Does that seem fair?

@k4leung4
Contributor Author

I would like to see new tests written in Go, as they are easier to maintain and less likely to grow organically in an unmanageable fashion.

Agreed on verifying the GitHub/Dex/Google/K8s providers.
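
To make that concrete, a minimal sketch of how such Go tests could be organized: a build-tag-gated, table-driven test that iterates over the issuers we care about. `issueCert` is a hypothetical helper (it would wrap key generation, token handling, and the signing-certificate call), and the environment variable names are placeholders, not an agreed-upon convention:

```go
//go:build e2e

package e2e

import (
	"context"
	"os"
	"testing"
	"time"
)

func TestCertificateIssuance(t *testing.T) {
	issuers := []struct {
		name     string
		tokenEnv string // env var holding an OIDC token for this issuer
	}{
		{name: "dex", tokenEnv: "E2E_DEX_TOKEN"},
		{name: "github", tokenEnv: "E2E_GITHUB_TOKEN"},
		{name: "kubernetes", tokenEnv: "E2E_K8S_TOKEN"},
	}

	for _, tc := range issuers {
		t.Run(tc.name, func(t *testing.T) {
			token := os.Getenv(tc.tokenEnv)
			if token == "" {
				t.Skipf("%s not set; skipping %s issuance test", tc.tokenEnv, tc.name)
			}
			ctx, cancel := context.WithTimeout(context.Background(), time.Minute)
			defer cancel()

			// issueCert is a hypothetical helper that generates an ephemeral key,
			// requests a certificate from the instance under test, and returns
			// the parsed *x509.Certificate.
			cert, err := issueCert(ctx, os.Getenv("FULCIO_URL"), token)
			if err != nil {
				t.Fatalf("issuing certificate via %s: %v", tc.name, err)
			}
			if len(cert.EmailAddresses) == 0 && len(cert.URIs) == 0 {
				t.Error("expected a SAN identity in the issued certificate")
			}
		})
	}
}
```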

I think it is fair to have automation in scope for GA.

Moving forward, I would like to push to have e2e integration tests for all new features so we can avoid the situation where we don't feel confident about releases/deployments.

When do you think we will have the tests or docs ready, whether it is for the prod migration or just another general release?

@haydentherapper
Contributor

Note to self: we also need to test:

  • Values of the issued certificate
  • Successful entry in the transparency log, successful verification of the SCT, and verification of the inclusion proof
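
A rough sketch of the first item, checking values of an issued certificate with the standard library, is below. The extension OID (1.3.6.1.4.1.57264.1.1, the Fulcio OIDC issuer) comes from Fulcio's documented OID range, though how its value is encoded may vary across Fulcio versions; the expected identity and issuer are placeholders, and SCT/inclusion-proof verification would be a separate step on top of this:

```go
package e2e

import (
	"crypto/x509"
	"encoding/asn1"
	"encoding/pem"
	"testing"
	"time"
)

// oidIssuer is the Fulcio extension carrying the OIDC issuer URL.
var oidIssuer = asn1.ObjectIdentifier{1, 3, 6, 1, 4, 1, 57264, 1, 1}

func checkCertValues(t *testing.T, certPEM []byte, wantEmail, wantIssuer string) {
	t.Helper()

	block, _ := pem.Decode(certPEM)
	if block == nil {
		t.Fatal("no PEM block found in issued certificate")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		t.Fatalf("parsing certificate: %v", err)
	}

	// The identity should land in the SAN, not the subject.
	if len(cert.EmailAddresses) != 1 || cert.EmailAddresses[0] != wantEmail {
		t.Errorf("SAN emails = %v, want [%s]", cert.EmailAddresses, wantEmail)
	}

	// The OIDC issuer should be recorded in the Fulcio issuer extension.
	// Assumes the raw-string encoding used by the legacy issuer OID.
	var gotIssuer string
	for _, ext := range cert.Extensions {
		if ext.Id.Equal(oidIssuer) {
			gotIssuer = string(ext.Value)
		}
	}
	if gotIssuer != wantIssuer {
		t.Errorf("issuer extension = %q, want %q", gotIssuer, wantIssuer)
	}

	// Fulcio certificates are short-lived; sanity-check the validity window.
	if lifetime := cert.NotAfter.Sub(cert.NotBefore); lifetime > time.Hour {
		t.Errorf("certificate lifetime %v is longer than expected", lifetime)
	}
}
```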

@loosebazooka
Member

Questions so I can frame this from the client perspective and also put manpower behind it:

  1. Should e2e tests live independently of the service (fulcio/rekor) repos? (In a GA repo?)
  2. Will these tests run against prod and staging?
  3. Should/can staging always just be the latest release of each of the service repos? (Is it already?)
    • Alternatively, can there be latest, staging, and prod, where "latest" is the potentially chaotic env?
  4. I would like the test suite to be extensible (maybe via a config file or just code) to execute tests from other repo clients, e.g. (see the runner sketch after this list):
    repo: "github.com/sigstore/sigstore-java"
    tests: "./gradlew e2eTests"
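
For item 4, a small sketch of what that could look like: a config file listing external client repos and their test commands, consumed by a Go runner. The config schema and field names here are illustrative only, and the sketch assumes each repo has already been cloned under ./work/:

```go
package main

import (
	"log"
	"os"
	"os/exec"

	"gopkg.in/yaml.v3"
)

// clientSuite describes one external client's e2e tests.
type clientSuite struct {
	Repo  string `yaml:"repo"`  // e.g. "github.com/sigstore/sigstore-java"
	Tests string `yaml:"tests"` // e.g. "./gradlew e2eTests"
}

func main() {
	raw, err := os.ReadFile("clients.yaml")
	if err != nil {
		log.Fatalf("reading config: %v", err)
	}
	// Assumes the config is a YAML list of {repo, tests} entries.
	var suites []clientSuite
	if err := yaml.Unmarshal(raw, &suites); err != nil {
		log.Fatalf("parsing config: %v", err)
	}

	for _, s := range suites {
		log.Printf("running %q for %s", s.Tests, s.Repo)
		cmd := exec.Command("sh", "-c", s.Tests)
		cmd.Dir = "work/" + s.Repo
		cmd.Stdout, cmd.Stderr = os.Stdout, os.Stderr
		if err := cmd.Run(); err != nil {
			log.Fatalf("tests for %s failed: %v", s.Repo, err)
		}
	}
}
```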
    

@k4leung4
Contributor Author

k4leung4 commented Jun 8, 2022

  1. I think it makes sense for e2e tests to live in a common repo that the service repos can access. Not sure whether sigstore/sigstore would make sense or whether it should have its own repo. If that's a blocker, I would just pick a service repo and we can migrate to another repo at a later point.
  2. The goal is to run them against staging, and as necessary against prod. I think running against prod will be necessary initially, depending on the nature of the tests.
  3. We will definitely have a latest env, and I think some version of it will need to live in each service repo:
    • in cosign: test against [prod|staging] rekor|fulcio at head
    • in rekor: test against [prod|staging] fulcio and prod cosign at head
    • in fulcio: test against [prod|staging] rekor and prod cosign at head
  4. +1

@bobcallaway bobcallaway added the ga-candidate Proposed blocking issue for GA release label Jul 20, 2022
@haydentherapper haydentherapper self-assigned this Aug 8, 2022
@dlorenc
Member

dlorenc commented Aug 13, 2022

IMO the probers we have are sufficient for GA, although they can always be improved.

@trixor
Member

trixor commented Aug 26, 2022

@lukehinds @dlorenc what are your thoughts on whether this needs to be a GA blocker?

@haydentherapper haydentherapper mentioned this issue Aug 26, 2022
@dlorenc
Member

dlorenc commented Sep 20, 2022

Not a GA blocker IMO.

@trevrosen

I'm not seeing this as a GA blocker, but it would be a good fast-follow issue, possibly for my team to tackle.

cc: @codysoyland @kommendorkapten

@haydentherapper
Contributor

Not a blocker sounds good.

@priyawadhwa priyawadhwa removed the ga-candidate Proposed blocking issue for GA release label Sep 22, 2022