integration tests and mocking #12

Open
opened 2024-06-14 14:55:41 +00:00 by twenty-panda · 7 comments
Contributor

Maintaining tests for a growing matrix of forges and versions requires some kind of optimization; otherwise it becomes prohibitively expensive over time.

  • Integration tests must run against live instances (current setup)
  • Integration tests must fall back to mocks when the live instance is not available (to be done)

The general idea is similar to what is implemented in [Forgejo for GitLab](https://codeberg.org/forgejo/forgejo/blame/branch/forgejo/models/unittest/mock_http.go):

  • If the live instance is available, use it and store the HTTP requests & responses as mocks
  • If the live instance is not available, use the mocks

The [Forgejo code](https://codeberg.org/forgejo/forgejo/blame/branch/forgejo/models/unittest/mock_http.go) could be used in gof3 as well, unless there is a package that does the same in a better way?

twenty-panda added the Kind/Testing label 2024-06-14 14:55:48 +00:00
Author
Contributor

### Good fit

  • https://github.com/SpectoLabs/hoverfly - HTTP(S) proxy for recording and simulating REST/SOAP APIs with extensible middleware and easy-to-use CLI.
  • https://github.com/seborama/govcr - HTTP mock for Golang: record and replay HTTP/HTTPS interactions for offline testing

### Not a good fit

Does not allow for capturing & storing requests; mocks have to be programmatically defined:

  • https://github.com/h2non/gock
  • https://github.com/jarcoal/httpmock
Author
Contributor

Hoverfly is mature and well maintained. It runs as an independent server alongside tests.

```sh
docker run -d -p 8888:8888 -p 8500:8500 spectolabs/hoverfly:latest
curl -k --proxy http://0.0.0.0:8500 https://forgejo.org/
xdg-open http://0.0.0.0:8888 # switch to simulate in the web interface
curl -k --proxy http://0.0.0.0:8500 https://forgejo.org/
xdg-open http://0.0.0.0:8888 # see there has been one capture & one simulation
```

![image](/attachments/9d8fc8d2-f3fc-45b7-a271-ab870bd7ce9e)

Author
Contributor

It works nicely, but there is an issue with random name generation (tokens, users, etc.): it changes the URLs and the recorded HTTP requests fail to match. After changing those to non-random values, the recorded HTTP requests are found as if the server was present and the test completes successfully.

```sh
hoverctl start
hoverctl mode capture
GOF3_FORGEJO_PORT=8781 go test -v -run TestF3PullRequest code.forgejo.org/f3/gof3/v3/tree/tests/f3/...
hoverctl mode simulate
GOF3_FORGEJO_PORT=8781 go test -v -run TestF3PullRequest code.forgejo.org/f3/gof3/v3/tree/tests/f3/...
```
Author
Contributor

With a few adjustments (PR incoming) it works. However, running the full test suite in a single stateful capture is brittle, and needlessly so. Instead, each test that relies on a running service should start by initializing a new stateful capture (or loading an existing one) and save the result on completion for later use.

Owner

I think it would be best to have a per-test control of the capture / simulation. Something like:

`defer hoverfly.Run("MyCaptureName")()`

which would:

  • do nothing if the `HOVERFLY` environment variable is not set
  • capture if `HOVERFLY=capture`, export into `$(pwd)/hoverfly/MyCaptureName.json`, and set the `http_proxy` environment variable
  • simulate if `HOVERFLY=simulate` by importing from `$(pwd)/hoverfly/MyCaptureName.json` and setting the `http_proxy` environment variable
Author
Contributor

This is working fine and it is simple 👍 The problem now is that doing this selectively for the compliance suite is currently not possible, because it goes like this:

  • Create a fixture that contains everything
  • Run compliance tests for each type of resource

For instance, if compliance tests for milestones are to be captured while anything involving git HTTP requests is to be excluded, one would have to change the code in two places. It does not look like this two-stage test method is a requirement. It should be possible to create only the minimal set of resources needed to run a given test, instead of having it depend on a large fixture containing resources that play no role in the test itself.

Making each test independent in this way means that the sum of their execution is going to be more costly, because each of them will want to create fixtures in its own way instead of sharing them with other tests. But that also has two benefits:

  • tests are less co-dependent and can be improved or debugged in a standalone way
  • some tests can be captured and replayed which saves a lot of resources
Author
Contributor
See https://code.forgejo.org/f3/gof3/pulls/16/files
twenty-panda removed the Kind/Testing label 2024-06-23 16:37:43 +00:00
Reference: f3/gof3#12