1. Working with ReadyAPI
First of all, thank you for downloading ReadyAPI. Now you are ready to get started with your API testing. This article will help you use the time-saving features of Pro to set up your first API tests, and it also provides guidance on maintaining those tests on an ongoing basis!
If you don't have ReadyAPI yet, you can either download a free 14-day trial or purchase a license today.
We will go over general test creation and maintenance, making full use of ReadyAPI's features:
- Use the Pro Form editors to better manage and interact with our requests and responses
- Use point-and-click testing to quickly create tests and validation rules (assertions)
- Use data-driven testing to take our tests up a notch and vary the requests in our tests
- Use the reporting features to export reports and share test findings with the rest of the organization
- Use Pro features like automated report generation, refactoring, and support resources to maintain your ReadyAPI test cases
Keep in mind that ReadyAPI can do a lot more than just work with SOAP: it can also be used for testing RESTful APIs, JMS, JDBC, and AMF.
To keep things simple, this article focuses on testing a SOAP service.
Let's get started. We will begin by creating a new project in ReadyAPI.
Create a Project
Go to the 'File' menu and choose 'New SOAP Project'. You should see the following popup, where you provide a project name and the WSDL URL:
(The WSDL file we use in this tutorial is: http://wsf.cdyne.com/WeatherWS/Weather.asmx?WSDL)
Click "OK".
This will generate two interfaces, WeatherSoap and WeatherSoap12. The WSDL defines both bindings to support SOAP v1.1 and v1.2, respectively. Your Navigator pane should look like this:
2. Add Functional Tests
Now that ReadyAPI has created the interfaces to the web service, we can use the sample requests to make ad hoc calls, but for serious testing we need test cases. ReadyAPI uses TestCases to organize API calls into logical steps, with the added benefits of pass/fail assertions, automated test runs, report generation, and more.
To add our "GetCityWeatherByZIP" operation to a TestCase, browse in the Navigator window to: Projects > Getting Started > WeatherSoap > GetCityWeatherByZIP > Request 1, right-click that "Request 1", and choose the option to add it to a TestCase.
This project does not yet have a TestSuite to store new TestCases, so we are prompted to create one:
Click OK. You are prompted to create a new TestCase:
Finally, you are prompted to add the request for the "GetCityWeatherByZIP" operation to the newly created test case. Accept the defaults and click OK:
So now we have our very first test case created in our project:
3. Validating with Assertions
Now let's call our web service and see the results. Double-click "GetCityWeatherByZIP - Request 1" to open the request window, populate the ZIP text field with "02111", and press the green play button:
You can see that after we submitted our request, the icon turned green. This is because the "SOAP Response" assertion is applied by default, and it passed once we received a SOAP response from the web service.
Also, keep in mind that the picture above shows the nicely rendered request/response in the Form and Overview editors available in Pro. The form view is where you can do point-and-click testing; more on that below. Behind the scenes, the request and response are still sent and received as XML:
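For reference, the raw SOAP 1.1 request for this operation looks roughly like the following (the element and namespace names come from the Weather.asmx WSDL, but treat this as an illustrative sketch rather than an exact copy of what ReadyAPI sends):

```xml
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:weat="http://ws.cdyne.com/WeatherWS/">
   <soapenv:Header/>
   <soapenv:Body>
      <weat:GetCityWeatherByZIP>
         <!-- the ZIP value we entered in the Form editor -->
         <weat:ZIP>02111</weat:ZIP>
      </weat:GetCityWeatherByZIP>
   </soapenv:Body>
</soapenv:Envelope>
```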
So let's add an assertion that validates that the value of the "Success" XML element in the response is always "true".
At the bottom of the ReadyAPI window, click the "Assertions" tab, then click the icon to add an assertion:
This will bring up a window for adding assertions. Each assertion has a description, and all 22 assertions are categorized. In our case, choose the "Property Content" category and the "XPath Match" assertion:
This will bring up the "XPath Match" assertion window. Click the icon shown below to launch a wizard that will generate an XPath assertion for you.
Click this icon to bring up the Select XPath dialog:
Select the "Success" XML element and click OK. This will now populate the "XPath Match" assertion window with XPATH expression and expected result:
Click the 'Save' button. You have now automated the validation of a successful response. At this point, as an independent exercise, try submitting a ZIP of "11111" in the request and watch the assertion turn red when the response comes back. Read on to learn how to vary the input and expected values with data-driven testing.
4. Add Data-Driven Input
Our test case can now call the web service with a ZIP code and confirm that we get the expected response. But what happens if you want to test your web service with every available ZIP code? Or perhaps your particular web service needs to be exercised with thousands of rows of data. This is where data-driven functionality comes to the rescue!
We need to add a few new steps to the test case, so in the Navigator window, double-click "TestCase 1". In the "TestCase 1" window that opens, right-click, select "Append Step", and then "DataSource".
This will add the DataSource test step. Double-click it to configure it. In the screen that appears, note the following:
- DataSource type dropdown – lets you pick which source to pull data from into your test case:
  - Excel – point to an Excel (xls) file
  - JDBC – connect to a database and pull data with a select statement or a stored procedure
  - File – for CSV or other delimited files
  - Grid – manually define rows of data right in the ReadyAPI project
  - XML – parse values out of an XML file or response
  - Directory – parse file names or file contents into your test
  - Groovy – write custom logic to populate the DataSource step (a minimal sketch follows this list)
  - Data Connection – an alternative to the JDBC type for connecting to databases
- Properties – placeholders for the values the DataSource step retrieves; think of them as columns. For example, if your select statement returns ten values, defining three properties captures the values of the first three columns.
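To give one concrete example of the Groovy type, here is a minimal sketch of a Groovy DataSource script. It assumes the standard `result` map and `context` object that ReadyAPI exposes to Groovy DataSource scripts, and the row data and property names are only placeholders:

```groovy
// Runs once per DataSource iteration; leaving 'result' empty ends the loop.
def rows = [
    ["02111", "Boston"],
    ["92199", "San Diego"]
]
def index = (context.getProperty("rowIndex") ?: 0) as int
if (index < rows.size()) {
    // Property names must match the properties defined on the DataSource step
    result["ZIP"]  = rows[index][0]
    result["City"] = rows[index][1]
    context.setProperty("rowIndex", index + 1)
}
```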
Select Grid from the DataSource dropdown, so that we can enter data right in ReadyAPI. Define two properties, 'ZIP' and 'City', then enter the following values:
ZIP   | City
02111 | Boston
92199 | San Diego
77201 | Houston
11111 | New York
So now your DataSource step configuration should look like the following:
Now go back into the "GetCityWeatherByZIP - Request 1" step by double-clicking it in the Navigator. Let's change the ZIP so that it pulls data from our data source instead of using a hardcoded value. To do that, right-click in the ZIP field and browse to: "Get Data …" > [DataSource] step > Property [ZIP]
To put a little context on what we just did – we used ReadyAPI's point-and-click testing to generate property expansion syntax, so the ZIP field now reads its value from the DataSource step at run time.
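If you switch to the XML view of the request, the ZIP element no longer holds a literal value; assuming the step kept its default name of "DataSource", it now holds a property expansion of the form ${step name#property name}:

```
${DataSource#ZIP}
```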
Now let's edit our assertion to also validate the cities returned by our web service call. Click the "Assertions" tab at the bottom of the screen and double-click the existing "XPath Match" assertion we configured earlier. Click the same icon to regenerate the XPath, then select the "City" element from the response:
Click "OK". Now let's make sure we don't verify the XML value against a static city name like "Boston", but against another value from the DataSource. So right-click again, this time in the "Expected Result" section of the "XPath Match" assertion, and go to: Get Data … > DataSource step > Property [City]
So now we have an assertion that validates against dynamic references on each DataSource iteration:
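Assuming the default step and property names, the assertion now pairs the City XPath with a dynamic expected value; as before, the exact prefix and path depend on the wizard's namespace declarations:

```
declare namespace ns1='http://ws.cdyne.com/WeatherWS/';
//ns1:GetCityWeatherByZIPResponse/ns1:GetCityWeatherByZIPResult/ns1:City

Expected Result: ${DataSource#City}
```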
You can click "Save" to apply the changes.
(Note: at this point the assertion may fail, but it will be re-evaluated when we run the test case.)
So let's finalize our test case:
- Double click on "TestCase 1" in the navigator
- Now drag the "GetCityWeatherByZIP - Request 1" step below "DataSource" for the right order of execution
- Then right-click in the TestSteps tab and append a "DataSource Loop" step:
Click OK in the DataSource Loop name popup. Now double-click the "DataSource Loop" step to configure it, then select the dropdown values as shown below:
Click "OK".
What we have just done is configure our test case to:
- Go to the "DataSource" step and fetch the first row into the DataSource properties
- Execute the "GetCityWeatherByZIP - Request 1" step with the values populated in the DataSource properties
- Reach the "DataSource Loop" step, which sends execution back to the "DataSource" step, fetches the next row, and runs the "GetCityWeatherByZIP - Request 1" request again until the rows are exhausted
So now let's run our test case by clicking the run button, keeping an eye on the "TestCase Log" for all of the steps that are executed.
We can see that the test case looped through different values in steps 2, 4, 6, and 8 of the test case log. Step 8 failed, so let's double-click directly on step 8 and look at the request/response values that were sent and received. We can confirm that the request did use our last ZIP row value of 11111:
But the response did not contain the expected "New York" value in the City tag:
So it looks like our ZIP of "11111" is not for "New York"; in fact, it's not even a valid ZIP code. While the root issue is in our data, I included it in the scenario so that we have a negative test. We can share this info with our developers so they can investigate and resolve the issue. Read on to learn how to generate and export reports.
5. Reporting
Now that we tested our web service and found a few issues, we want to share the results with the team. So let's generate a report.
To do that, go back to the "TestCase 1" window and click the following button to generate a report:
Once you click that icon, you'll see the following popup, which gives you a number of different report options:
- TestCase Report – generates a printable report that can be saved as PDF, HTML, RTF, Excel, etc.
- JUnit-Style HTML Report – generates a JUnit-style report. Useful for teams that already use JUnit, or for passing results back to a Continuous Integration process
- Data Export – exports data in XML or CSV format that you can then use in custom reporting and integrations
In our scenario, select the "TestCase Report" option, leave all checkboxes as default, and click OK. This will generate the report in our report viewer:
You can now save this report as PDF, HTML, Excel, or any of the other supported formats.
6. Test Maintenance
Now that we have gone over test setup, there are a few additional points worth mentioning about the ongoing maintenance of your ReadyAPI test cases:
Now that you have created your ReadyAPI project, you can share it with the team and add it to your source control.
You can also use our command-line utility to run your test cases and, in Pro, to generate your reports. This can be used for scheduled regression testing or as part of a Continuous Integration process; an example invocation is sketched below. And if you are using Maven, don't forget that we have a Maven plugin for ReadyAPI.
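As a rough example, a scheduled or CI job might call the test runner from the ReadyAPI bin folder along these lines. The flags shown here (-s for the TestSuite, -c for the TestCase, -f for the output folder, -R and -F for the Pro report and its format) are commonly used options, but check the runner's usage output or the documentation for your version; the project file name below is just a placeholder:

```
testrunner.bat -s"TestSuite 1" -c"TestCase 1" -f reports -R"TestCase Report" -F PDF my-weather-project.xml
```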
Use test case debugging to set conditional breakpoints and walk through your test case step by step to help you isolate any issues.
Also, as you maintain your tests, your web service may change completely. If your SOAP web service changes and developers suddenly ask you to test against a new WSDL, ReadyAPI has a handy refactoring feature that can preserve your test cases and reuse the logic against the new web service.
In summary, you can see how powerful and useful Pro's features are. The Form editor, point-and-click testing, data-driven testing, and reporting capabilities enable you to configure your test case scenarios quickly and easily. For more information, visit 12 reasons to go Pro.
If you haven't already, you can download a free 14-day trial of ReadyAPI or purchase a license today.