Let’s face facts: when you publish APIs, you need to test them - and that means really test them. APIs with little or no testing are likely to be unreliable, prone to failure, riddled with inaccuracies (where your OpenAPI document and the data the API returns don’t match) and - something so many people fail to appreciate - reflect badly on you and the company you work for. Reputation capital is a hugely important asset for your organization and the products it brings to market, and poorly-performing or malfunctioning APIs have a negative impact on it.
There are, of course, lots of different approaches and a myriad of tools to go with them, and different folks favor different means of getting it right. One tool that always comes high on the recommended list is Postman, probably the foremost API design, development, and test tool in the API universe right now, along with its command-line buddy Newman. Developers love this Swiss Army knife of a platform, which provides the means to design your API, build test suites, and catalog your APIs across your organization.
However, it is possible that you like Postman but it’s not your API design tool of choice. You might like crafting your OpenAPI documents in a traditional IDE or in another product like Stoplight. Hell, you might just love vim and doing it the hard way. Whatever your choice, your OpenAPI document may well not start life in Postman, but you might still like the idea of a Postman Collection at the end of API design - with tests to boot.
This is where Apideck’s tool Portman is your friend. In this post we’ll go into more detail on how to approach API testing and point to where Portman can help in the process.
How should you test?
There are a variety of different ways to test APIs, but for the vast majority of developers and testers the focus will be on three types of test that cover the interface, the implementation of the API in code, and a running instance of your API (using the Interface-Implementation-Instance categorisation discussed in Continuous API Management):
- Unit Testing: Does the implementation of the API in software work as required?
- Integration Testing: When I deploy an instance of the API does it work as expected?
- Contract Testing: Does the interface - the API specification - and the software - the implementation - match up?
You don’t necessarily have to do these in sequence - you might consider doing unit testing together with contract testing, for example - but each has a variety of approaches and things to consider. We’ll therefore take a look at each in turn.
Unit Testing
How you like to unit test your software is largely a matter of preference. Many API developers will opt for testing the functions that make up the implementation, using “traditional” testing methods and tools like (in the JavaScript world) Mocha and Chai.
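As a minimal sketch of that in-code approach - the createPet function, its module path, and its behaviour are all assumptions here - a Mocha/Chai test might look like this:
// test/pets.test.js - unit tests for a hypothetical createPet function
const { expect } = require("chai");
const { createPet } = require("../src/pets"); // hypothetical module under test

describe("createPet", function () {
  it("assigns an id and echoes the supplied name", function () {
    const pet = createPet({ name: "Rex" });
    expect(pet).to.have.property("id").that.is.a("number");
    expect(pet.name).to.equal("Rex");
  });

  it("rejects a pet without a name", function () {
    expect(() => createPet({})).to.throw();
  });
});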
However, there is much to be said for executing unit tests external to the code base, using an HTTP client that can fire tests at a running instance of the implementation. Doing so has the benefit of testing the controller that directs a given API call to a given module of code, and how the resultant response is sent down the wire. You’ll probably want to do this pretty close to the client, with the external services your implementation relies on - databases, etc. - stubbed out.
It’ll also give you a better understanding of whether your API is performing to specification. Are the endpoints correctly addressable? Are the methods correctly supported? This will give you confidence that your implementation does what it says on the tin, and is pertinent when it comes to contract testing.
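To give a flavour of that style - the Express app, its module path, and the /pets route are assumptions, and Supertest simply stands in for whichever HTTP client you prefer - such a test could look like:
// test/pets.http.test.js - drives the API over HTTP instead of calling functions directly
const request = require("supertest");
const { expect } = require("chai");
const app = require("../src/app"); // hypothetical Express app, with external services stubbed out

describe("GET /pets", function () {
  it("responds with 200 and a JSON array", function () {
    return request(app)
      .get("/pets")
      .expect("Content-Type", /json/)
      .expect(200)
      .then((res) => {
        expect(res.body).to.be.an("array");
      });
  });
});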
From the perspective of what Portman can do for you, once you have created your API design (in an OpenAPI document) and an implementation in code, generating unit tests from the OpenAPI document is extremely simple. As we covered in our previous post on Portman, generating a Postman Collection takes just a few commands:
npm install -g @apideck/portman@latest
portman -u https://raw.githubusercontent.com/OAI/OpenAPI-Specification/main/examples/v3.0/petstore.yaml -t false
This will give you a barebones Postman Collection with no built-in tests (we’ll come on to those later), but with the ability to fire requests at your API either interactively or from the command line using Newman. You can then parameterise these requests as you see fit to form the basis of meaningful tests.
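For example, assuming the generated Collection has been written to a file named petstore.postman_collection.json (the file name is an assumption - use whatever output location you configured in Portman) and that the Collection uses a baseUrl variable for the server address, you can fire the requests from the command line with Newman:
npm install -g newman
newman run petstore.postman_collection.json --env-var baseUrl=http://localhost:3000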
Integration Testing
So you’ve unit tested and are happy that your implementation is functioning correctly. The next step is to actually put an instance of your API in the right place with the correct components hooked in. You may have had to do some of this when you unit tested if you could not, for example, stub out a messaging platform.
The reason integration testing is important is pretty simple. You have deployed your application alongside the other components in your stack, and you need to be sure that everything is functioning correctly and as expected.
From the perspective of what Portman can do for you it’s straightforward - you’ve largely already done it:
- Extend your test cases that are already in your Collection.
- Configure your environment to reflect the running instance.
- Use Newman to automate the execution of your tests on each deployment.
By doing this you’ll have confidence that your deployment is successful and your API is functioning correctly.
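As a minimal sketch of that last step - the collection and environment file names, and the script itself, are assumptions - a post-deployment check could use Newman’s Node API from your CI/CD pipeline:
// run-api-tests.js - run the Collection against the freshly deployed instance after each deployment
const newman = require("newman");

newman.run(
  {
    collection: require("./petstore.postman_collection.json"),
    environment: require("./staging.postman_environment.json"), // points Newman at the deployed instance
    reporters: "cli",
  },
  function (err, summary) {
    // Fail the pipeline if Newman itself errored or any request/test assertion failed
    if (err || summary.run.failures.length > 0) {
      process.exit(1);
    }
  }
);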
Contract Testing
With your implementation in place as a running instance, you’ve now got the opportunity to check that your API actually does what it “says on the tin”, and to do that end-to-end. Contract testing is the opportunity to ensure that what you’ve implemented in software and what you’ve published as an API specification match. Your API consumers only “know” the API specification, not the implementation, so as far as they are concerned the API specification is their only source of truth.
You are going to want to test things like:
- Do I consistently get a 400 when I send in invalid parameters?
- If I define a required property, is it always present?
- Does a string defined with a minimum and maximum length always match those constraints?
- Is an unsigned integer always so?
You may be thinking: why do this now? I had my API running at the unit testing stage and was happy with the results. Executing contract tests after integration testing is important due to the manner in which most APIs are layered by design. For example, you may have API management sitting in front of your application, a production data source sitting behind it that is wider and deeper than your development dataset, and any number of HTTP-serving components bridging networks or providing resilience. You absolutely should account for this when performing contract testing, as your API specification may define a 405 when an unsupported HTTP method is sent to your API, but your (badly-configured) API management layer steps in and returns a 500 instead.
Whatever the scenario, you need some proof that what you send back to API consumers is as per the specification, and having a running instance in the right place in your architecture is the best time to gather it. There may also be characteristics of your API management solution - for example - that you cannot mitigate, meaning a design change is required.
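As a minimal sketch of that kind of check - the choice of DELETE on /pets as an unsupported method is an assumption based on the Petstore example - a hand-written Postman test for the gateway scenario above might look like:
// Validate that an unsupported method is rejected as the specification defines (405),
// not masked by a misconfigured gateway or API management layer (5xx)
pm.test("[DELETE]::/pets - Unsupported method returns 405", function () {
  pm.response.to.have.status(405);
});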
Here Portman can again help with your efforts. If you recreate the Collection, this time omitting the -t false flag, you’ll create contract tests for your endpoints:
npm install -g @apideck/portman@latest
portman -u https://raw.githubusercontent.com/OAI/OpenAPI-Specification/main/examples/v3.0/petstore.yaml
In the Tests tab for a given operation in Postman you’ll find JavaScript that tests the responses against the expectations laid out in the OpenAPI document:
// Validate status 2xx
pm.test("[GET]::/pets - Status code is 2xx", function () {
  pm.response.to.be.success;
});

// Validate if response header has matching content-type
pm.test("[GET]::/pets - Content-Type is application/json", function () {
  pm.expect(pm.response.headers.get("Content-Type")).to.include("application/json");
});

// Validate if response has JSON Body
pm.test("[GET]::/pets - Response has JSON Body", function () {
  pm.response.to.have.jsonBody();
});

// Response Validation
const schema = {"type":"array","items":{"type":"object","required":["id","name"],"properties":{"id":{"type":"integer","format":"int64"},"name":{"type":"string"},"tag":{"type":"string"}}}}

// Validate if response matches JSON schema
pm.test("[GET]::/pets - Schema is valid", function () {
  pm.response.to.have.jsonSchema(schema, {unknownFormats: ["int32", "int64", "float", "double"]});
});

// Validate if response header is present
pm.test("[GET]::/pets - Response header x-next is present", function () {
  pm.response.to.have.header("x-next");
});
Rerunning your tests - again with a representative dataset - will help ensure you can deliver your API to your developer community with confidence, and that they are getting the results they expect from your API.
Of course, you should generate your Collection with the contract tests from the word go - doing it later in this post is purely by way of example. The point is that contract testing shouldn’t be a one-off. Whenever an instance is deployed, in either production or an environment that represents production, you should give the tests a spin.
Final Thoughts
In this post we’ve provided some rationale for the different types of testing you can do with your API, and shown how using Portman to generate a Postman Collection can provide you with runnable tests. This post only really scratches the surface: there are other options available in Portman, like variation tests, that will help you test “unhappy” paths, and we haven’t covered important testing practices like penetration testing.
How much or how little you test is a matter of choice, but you should keep in mind that your level of rigor can only benefit the people you really need to care about, namely your developer community. Going to market with a well-functioning API and an accurate API specification earns significant reputation and - most importantly - has the potential to deliver in revenue terms too.