I have a question around testing! As of now, I'm in a unique situation: the software I'm writing sends data out to a third-party vendor, which stores it. In this situation, writing a test to confirm that the data was posted means using the production endpoint... which I doubt is a smart way to write things.
Nobody around me seems to have a solution for ensuring the endpoint works, other than asking the third party for a dummy/test setup. I'm wondering if you have a better idea of how to test this sort of interaction. How do you write an efficient contract test when you don't own that bit of the codebase?
Please and thank ya :D
There are different alternatives to deal with this situation, depending on the level of collaboration you can get from the third-party vendor:
Contract testing:
Pact is a mature framework that allows you to write unit tests which make HTTP requests against a mock of the third-party provider service; those interactions are persisted into a document (the pact file) that can be shared with the vendor. If you use Pact well, the vendor can use the pact file to run tests against their provider service. The consumer (your service) documents which endpoints of the provider (the third-party service) it uses, along with the expected responses, and the provider validates those expectations against itself, ensuring the integration keeps working.
For this approach to work, the third-party vendor has to be open to fetching the pact file and running the consumer contract tests against their service. Contract testing allows your tests to run totally independently of the provider service: your service and the provider's service are never connected while testing.
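To make the idea concrete, here is a minimal hand-rolled sketch of the consumer side of contract testing. It is not Pact's actual API; it just illustrates the mechanics with the standard library: the consumer's expectations live in a dictionary, a stub provider serves exactly those responses, the consumer's client code is tested against the stub, and the expectations are then written out as a shareable file. The `/records` path, the `abc123` id, and the file name are all made-up examples.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

# The "pact": the consumer's expectations of the provider, written down.
# Path, status and body here are invented for the example.
PACT = {
    "consumer": "my-service",
    "provider": "vendor-api",
    "interactions": [
        {
            "description": "posting a data record",
            "request": {"method": "POST", "path": "/records"},
            "response": {"status": 201, "body": {"id": "abc123"}},
        }
    ],
}


class StubProvider(BaseHTTPRequestHandler):
    """A mock provider that serves exactly what the pact describes."""

    def do_POST(self):
        for interaction in PACT["interactions"]:
            req = interaction["request"]
            if req["method"] == "POST" and self.path == req["path"]:
                resp = interaction["response"]
                body = json.dumps(resp["body"]).encode()
                self.send_response(resp["status"])
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)
                return
        self.send_response(404)
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass


def post_record(base_url, payload):
    """The consumer-side client code under test."""
    req = Request(base_url + "/records",
                  data=json.dumps(payload).encode(),
                  headers={"Content-Type": "application/json"},
                  method="POST")
    with urlopen(req) as resp:
        return resp.status, json.loads(resp.read())


server = HTTPServer(("127.0.0.1", 0), StubProvider)
threading.Thread(target=server.serve_forever, daemon=True).start()

status, body = post_record(f"http://127.0.0.1:{server.server_port}", {"x": 1})
assert status == 201 and body == {"id": "abc123"}

# Persist the pact file; the vendor would replay it against their service.
with open("my-service-vendor-api.pact.json", "w") as f:
    json.dump(PACT, f, indent=2)

server.shutdown()
```

In real Pact the mock server and the pact file format are managed for you, and the provider side gets a verifier that replays the recorded interactions; the sketch only shows why both halves can run without ever talking to each other.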
Record and replay:
The idea behind this approach is to write tests that, on their initial run, make requests against the real services and record the responses. Subsequent runs of these tests don't reach the real services; instead they operate against the responses recorded on the first run. VCR is an example of a library that enables this kind of testing.
For this approach you don't need cooperation from the third-party vendor, but you will still make requests to the real service from time to time (to keep the recorded responses fresh), so you remain subject to the availability of the provider's service.
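The mechanism can be sketched in a few lines. This is not VCR's API, just a hand-rolled illustration of the idea: the first call for a given key hits the "live" service and records the response in a cassette file; later calls replay the recording. The live call is stood in for by a fake function so the example is self-contained.

```python
import json
import os


def fetch_with_cassette(key, live_fetch, cassette_path="cassette.json"):
    """On the first run for `key`, call the real service and record the
    response; on later runs, replay the recording instead (the VCR idea)."""
    cassette = {}
    if os.path.exists(cassette_path):
        with open(cassette_path) as f:
            cassette = json.load(f)
    if key not in cassette:
        cassette[key] = live_fetch()          # only reached on first run
        with open(cassette_path, "w") as f:
            json.dump(cassette, f)
    return cassette[key]


# Stand-in for a real HTTP call, so the sketch runs without a network.
calls = {"count": 0}
def fake_live_fetch():
    calls["count"] += 1
    return {"status": "stored"}


first = fetch_with_cassette("POST /records", fake_live_fetch, "demo.json")
second = fetch_with_cassette("POST /records", fake_live_fetch, "demo.json")
assert first == second == {"status": "stored"}
assert calls["count"] == 1   # the real service was reached only once
os.remove("demo.json")
```

Deleting the cassette file is what "refreshing the recordings" amounts to: the next run hits the real service again.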
Test environment:
As you mentioned in your question, asking the provider for a test environment/account is a possibility. With that resource, you can write end-to-end tests that reach a realistic provider service, and if you have access to the environment you can make assertions on its state as part of your tests.
The challenges with this approach are around the maintenance of the test environment: how can you be sure its version is the same one you integrate against in production? Who looks after the environment's availability? Who creates the data in the environment? Is that data realistic and representative of the real world?
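In practice you usually gate such end-to-end tests on the environment being configured, so they run in CI against the vendor sandbox but are skipped everywhere else. A minimal sketch with the standard `unittest` module, where `VENDOR_TEST_ENV_URL` is a made-up variable name for this example:

```python
import os
import unittest

# Made-up environment variable naming the vendor's test environment.
BASE_URL = os.environ.get("VENDOR_TEST_ENV_URL")


class VendorEndToEndTest(unittest.TestCase):
    @unittest.skipUnless(BASE_URL, "no vendor test environment configured")
    def test_record_is_stored(self):
        # Real logic would POST a record to BASE_URL and then assert on
        # the environment's state (e.g. via a query endpoint).
        ...


suite = unittest.defaultTestLoader.loadTestsFromTestCase(VendorEndToEndTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

When the variable is unset the test is reported as skipped rather than failed, so the rest of the suite stays green on machines without sandbox access.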
Semantic monitoring:
A final option is to write a test that performs a sanity check of the integration between your service and the provider's in the production environment. This test could run after every deployment on your end, or on a regular basis outside of deployment windows.
For this approach you don't need collaboration from the third-party vendor, but it doesn't scale if you have a lot of integration use cases: these tests tend to be slow to run, flaky (as they depend on real networks and systems being available), and they pollute real environments. It's better to keep them at the top of the testing pyramid, focused on critical use cases. Additionally, you won't be able to test much beyond happy paths, since you don't have control of the provider service to set it in any specific state beyond the "normal" one.
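Such a check is usually just one critical, happy-path call with a shallow assertion on the answer. Here is a sketch, assuming the provider exposes some cheap read-only endpoint (`/health` is invented for the example); a local stub stands in for the real vendor so the code runs on its own:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen


def check_integration(base_url, timeout=5):
    """Post-deployment sanity check: make one real, happy-path call and
    verify the provider answers sensibly. `/health` is a made-up path."""
    with urlopen(base_url + "/health", timeout=timeout) as resp:
        body = json.loads(resp.read())
        return resp.status == 200 and body.get("status") == "ok"


# Stub provider so the sketch is runnable; in real use base_url would
# point at the production vendor endpoint.
class StubHealth(BaseHTTPRequestHandler):
    def do_GET(self):
        payload = json.dumps({"status": "ok"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        pass


server = HTTPServer(("127.0.0.1", 0), StubHealth)
threading.Thread(target=server.serve_forever, daemon=True).start()
ok = check_integration(f"http://127.0.0.1:{server.server_port}")
assert ok
server.shutdown()
```

Wired into a deployment pipeline (or a scheduler), a failing check alerts you that the integration broke, which is the monitoring half of "semantic monitoring".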