Virtual User Validation¶
The virtual user validation tool helps you ensure that your Virtual User runs correctly before starting a test. It is available in the right-hand menu of the virtual user design view.
The following procedure explains how to run a Virtual User Validation:
- Select the region where to start the Virtual User (you can add your own private injectors in the Private Hosts page).
- Click the Validate button.
- The Validate button is disabled and a progress bar is displayed. The virtual user check can be stopped at any time while it is running.
- Wait a few seconds while OctoPerf runs a single instance of the VU and records the requests sent and the responses received. Several iterations of the same VU can be executed from the Configuration button.
To save time, think times are not replayed during a Virtual User validation.
You can configure how many iterations to run and browser settings by clicking on the Configuration button.
The sanity check is the first step of a Virtual User validation. This process scans your virtual user for anything that could break its execution (e.g. missing files) or make your load testing results hard to analyse (e.g. empty or unnamed containers).
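As an illustration only (not OctoPerf's actual implementation), a sanity check of this kind can be sketched as a recursive walk over the virtual user tree that collects issues by severity; the `Node` structure and message texts below are hypothetical:

```python
import os
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str = ""
    kind: str = "container"       # e.g. "container", "csv_variable", "request"
    file_path: str = ""           # for nodes that reference an external file
    children: list = field(default_factory=list)

def sanity_check(node, issues=None):
    """Recursively collect (level, message) pairs for a virtual-user tree."""
    if issues is None:
        issues = []
    # Breaking error: a referenced file cannot be found.
    if node.file_path and not os.path.exists(node.file_path):
        issues.append(("ERROR", f"Missing file for {node.kind} '{node.name}'"))
    # Non-breaking issues that make results harder to analyse.
    if not node.name:
        issues.append(("INFO", f"Unnamed {node.kind} should have a name"))
    if node.kind == "container" and not node.children:
        issues.append(("INFO", f"Container '{node.name}' is empty"))
    for child in node.children:
        sanity_check(child, issues)
    return issues

vu = Node(name="Login flow", children=[
    Node(name="users", kind="csv_variable", file_path="missing/users.csv"),
    Node(name="", kind="request"),
])
print(sanity_check(vu))
# [('ERROR', "Missing file for csv_variable 'users'"),
#  ('INFO', 'Unnamed request should have a name')]
```

A breaking `ERROR` would cancel the validation, while `INFO` entries only warrant a look.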
It displays a table that lists all errors:

| Column | Description |
|---|---|
| Level | The error level: INFO, WARNING or ERROR, displayed as a colored icon. |
| Error description | A short message to help you identify the issue. |
| Button | A magnifier icon; click it to be redirected to the variable or action that causes the issue. |
If any breaking error is detected (level ERROR, with an orange stop icon), the debugging of the virtual user is canceled. You need to fix it before running the validation again.
| Error | Description |
|---|---|
| A file is missing for CSV variable | In JMeter, the behavior for a missing file would be to stop the test and log a message. Instead we use this opportunity to warn you about the missing file. |
| CSVVariable has conflicting column names | CSV variables using the same column names will be impossible to differentiate at runtime. We recommend using a prefix on column names instead. |
| JSR223Action is empty | An empty JSR223 action can generate a lot of logging for no purpose. It is better to remove unnecessary JSR223 actions. |
| No Server Found | If you remove a server without removing the corresponding request, it is impossible for us to launch the test. Note that this can only happen if you play with the API; otherwise we would remove all requests when removing the server. |
| Cyclic Dependency Detected! | When using fragments, it is possible to configure a cyclic reference by using a fragment within a fragment. This would prevent us from parsing the virtual user before runtime, which is why it is forbidden. |
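The column-name conflict above can be made concrete with a small sketch (hypothetical helper, not OctoPerf code) that flags any column declared by more than one CSV variable, and shows how prefixing resolves it:

```python
from collections import Counter

def find_conflicting_columns(csv_variables):
    """csv_variables maps a variable name to its list of column names.
    Returns the column names declared by more than one CSV variable."""
    counts = Counter(col for cols in csv_variables.values() for col in cols)
    return sorted(col for col, n in counts.items() if n > 1)

variables = {
    "users": ["login", "password"],
    "accounts": ["login", "balance"],   # "login" clashes with the users file
}
print(find_conflicting_columns(variables))   # ['login']

# Prefixing column names, as recommended, removes the ambiguity:
prefixed = {
    "users": ["user_login", "user_password"],
    "accounts": ["account_login", "account_balance"],
}
print(find_conflicting_columns(prefixed))    # []
```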
These messages are meant to draw your attention to potential issues.
| Warning | Description |
|---|---|
| Clear Cookies before recording or manually remove Cookie header | If you haven't properly removed cookies before recording, you may end up reusing invalid session IDs. This can lead to unrealistic tests, so we recommend removing the Cookie header on your first request or making sure to clear cookies before recording. |
| Empty file for CSVVariable | If a CSV variable has an empty file, there will be no values to distribute during the test. This can happen when the file you provided cannot be parsed, for instance if it is not a UTF-8 file. |
| End Of Value Policy is 'Stop VU' | In this situation, the test will stop when out of values. Make sure that's the expected behavior and that the file contains enough values. |
| File is missing for POST request | A multipart POST request needs a file, but we cannot find it. Make sure the file name and the path to the file are correct. |
These messages point out problems that can become inconvenient later but that can usually be disregarded.
| Info | Description |
|---|---|
| Host header and server host are differing | Some web servers will refuse your request if the Host header and the URL have different values. If required, use search and replace to change the Host header of all requests. |
| Using a JMeter generic action | You have imported a generic action from JMeter. That's not an issue in itself, but you may have to check that its configuration is correct in OctoPerf, in particular any path to files. |
| XXX sec thinktime is high | You have configured a very high think time or imported an HAR with a large pause somewhere. This may result in a long period of inactivity during your tests, which is why we prefer to warn you about it. |
| xxxxx should have a name | Unnamed elements will be displayed with a default name, making your reports harder to analyse. It is better to give every element a meaningful name. |
| xxxxx is empty | Empty nodes like controllers or other logic actions will never be executed; it is better to remove them. |
| HTTP Action has empty query parameter | An empty name and value for a parameter means you either misclicked and created one by mistake, or you imported an HAR from a source that tends to add these unnecessary parameters. Either way, some web servers will fail your request in this situation, so we prefer to warn you. |
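The Host header mismatch mentioned above is easy to check programmatically. The sketch below (an illustration, not OctoPerf's implementation) compares the Host header, looked up case-insensitively, against the host part of the request URL:

```python
from urllib.parse import urlsplit

def host_header_mismatch(url, headers):
    """Return True when the Host header differs from the URL's host.
    `headers` is a plain dict; the lookup is case-insensitive."""
    host_header = next((v for k, v in headers.items() if k.lower() == "host"), None)
    if host_header is None:
        return False                     # no explicit Host header: nothing to compare
    return host_header != urlsplit(url).netloc

# Typical recording debris: the Host header still points at another server.
print(host_header_mismatch("https://api.example.com/login",
                           {"Host": "www.example.com"}))   # True
print(host_header_mismatch("https://api.example.com/login",
                           {"Host": "api.example.com"}))   # False
```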
Once the validation has started, a button with a file icon appears to the right of the Validate button: it opens a Logs panel.
By comparing the validation results to the recorded requests and responses, OctoPerf lets you know if anything went wrong. To give you quick access to this information, colored dots are displayed to the left of the VU tree nodes:
There are 3 colors a response can get:
- The response is OK,
- The response is KO,
- At least one occurrence or child is KO.
If a request has been executed several times, a single KO among all the executions is enough to mark it as yellow.
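The roll-up rule described above (a single KO among several executions, or in any child, is enough to flag a node) can be sketched as follows; the function and status names are hypothetical:

```python
def node_status(occurrence_results, child_statuses):
    """occurrence_results: booleans, True = OK, one per execution of this node.
    child_statuses: statuses already computed for the node's children.
    Returns 'OK', 'KO', or 'CHILD_KO' (rendered as the yellow dot)."""
    if occurrence_results and not any(occurrence_results):
        return "KO"                       # every execution failed
    if (occurrence_results and not all(occurrence_results)) \
            or any(s != "OK" for s in child_statuses):
        return "CHILD_KO"                 # at least one occurrence or child KO
    return "OK"

print(node_status([True, False, True], []))   # 'CHILD_KO': one KO out of three
print(node_status([False], []))               # 'KO'
print(node_status([True], ["OK", "KO"]))      # 'CHILD_KO': a child failed
print(node_status([True, True], []))          # 'OK'
```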
The following table indicates the situations considered as KO/Orange during a validation.
Each row represents the validation response for a category of response codes, for instance 200 and similar response codes. Unknown response codes are codes outside the list of standard codes. Each column represents the recorded response code in a similar fashion.
| Validation code | No recorded | Recorded 2XX | Recorded 3XX | Recorded 4XX | Recorded 5XX | Recorded unknown |
|---|---|---|---|---|---|---|
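The categories used by the table can be expressed as a simple mapping from an HTTP status code to its class; this sketch is illustrative, with `None` standing for a request that has no recorded response:

```python
def code_category(code):
    """Map an HTTP status code to the categories used by the validation table."""
    if code is None:
        return "No recorded"
    if 200 <= code < 300:
        return "2XX"
    if 300 <= code < 400:
        return "3XX"
    if 400 <= code < 500:
        return "4XX"
    if 500 <= code < 600:
        return "5XX"
    return "unknown"                      # outside the standard code ranges

print(code_category(204))    # '2XX'
print(code_category(302))    # '3XX'
print(code_category(999))    # 'unknown'
print(code_category(None))   # 'No recorded'
```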
It is possible to use response assertions to ignore an error.
At runtime, responses are evaluated based only on their response code, as if there were no recorded response. Please refer to the first column of the table in that case.
Compare to record¶
You can also compare recorded requests / responses to the Validate VU ones for each HTTP Request Action, in their Check Request / Response tab. The Timing differences menu lets you compare basic metrics between runs:
| Action | Description |
|---|---|
| Save as recorded | This action will overwrite the recorded response with the new one. Be careful, it cannot be undone. |
| Open in a new tab | Sends the response content to a new browser tab to attempt rendering it. |
If you want more details on how to analyze the errors, please check the error table report item page.