Debug Gatling Scripts

Debugging a Gatling script in Kraken starts two Docker containers:

  • One that starts Gatling and generates a simulation.log file,
  • One that parses the generated log file and creates several debug entries.

From the Simulations Tree

To debug a script directly from the simulations tree:

Gatling Script File Menu.

  1. Browse the simulations tree to find the wanted .scala file,
  2. Place your mouse over the file to display the shortcut buttons,
  3. Click on the Debug Icon, a Debug Simulation dialog appears,
  4. Leave the package.Class field as it is,
  5. Update the Run Description to differentiate this run from the others,
  6. Select a Host,
  7. (Optional) Fill in environment variables (see the sketch after this list),
  8. Click on Run.
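
Environment variables set in the dialog are exposed to the Gatling process. The following is a minimal sketch of reading one from a script; the TARGET_URL variable name and its fallback value are illustrative, not part of Kraken:

import io.gatling.core.Predef._
import io.gatling.http.Predef._

class DebugSimulation extends Simulation {

  // TARGET_URL is a hypothetical variable set in the Debug Simulation dialog;
  // fall back to a local URL when it is absent.
  val baseUrl: String = sys.env.getOrElse("TARGET_URL", "http://localhost:8080")

  val httpProtocol = http.baseUrl(baseUrl)

  val scn = scenario("Debug Scenario")
    .exec(http("request_1").get("/"))

  // A single user keeps the number of debug entries small (see the warning below).
  setUp(scn.inject(atOnceUsers(1))).protocols(httpProtocol)
}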

Execution Dialog

Debug Gatling Simulation Dialog

Check the Run Simulation dialog to learn more about the parameters that must be set in the Debug Simulation dialog.

The only difference is that a Debug task can only be executed on a single host.

Warning

You must configure your script to run only one iteration, for example .inject(atOnceUsers(1)), before debugging it!

You might otherwise generate a lot of debug files and crash the system.

A warning message is displayed in the dialog, for example if you configured your script with the following injection profiles:

setUp(
  users.inject(rampUsers(10) during (10 seconds)),
  admins.inject(rampUsers(2) during (10 seconds))
).protocols(httpProtocol)
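
A debug-safe version of the same setUp, as a minimal sketch assuming the same users, admins, and httpProtocol definitions, injects a single user per scenario:

// One user per scenario keeps the number of generated debug entries small.
setUp(
  users.inject(atOnceUsers(1)),
  admins.inject(atOnceUsers(1))
).protocols(httpProtocol)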

Execution Logs

Logs Panel

A new Gatling simulation execution quickly appears in the bottom Logs panel, and a task is added to the Tasks table. The Logs panel displays the logs of:

  • The docker-compose up command if you are using Kraken Docker,
  • The Kubernetes events related to the created task if you are using Kraken K8S.

A new test result is added to the results table, and while the test is running, new debug entries appear in the debug table.

By default, the logback.xml file is configured to display println results in the console. Click on any Logs Icon button in the Containers table to view them.

The following code sample describes how to print the complete session into the command execution logs:

val scn = scenario("Scenario Name")
    .exec(http("request_1")
      .get("/"))
    // Print the whole session object, then return the session unchanged.
    .exec { session => println(session); session }

Note

This information is also available if you open a debug entry.

The following code sample describes how to print the response body into the command execution logs:

val scn = scenario("Scenario Name")
    .exec(http("request_1")
      .get("/")
      // Save the response body in the session under the key "responseBody".
      .check(bodyString.saveAs("responseBody")))
    // Print the saved body, then return the session unchanged.
    .exec { session => println(session("responseBody").as[String]); session }

First, the body is saved into the session with bodyString.saveAs, then printed out with println(session("responseBody").as[String]).

Note

This information is also available if you open a debug entry.
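
To keep the console output small, you can also print the session only when a request fails. A minimal sketch, assuming the same scenario as above (isFailed is provided by Gatling's Session API):

val scn = scenario("Scenario Name")
    .exec(http("request_1")
      .get("/"))
    // Print the session only when a previous request failed.
    .exec { session =>
      if (session.isFailed) println(session)
      session
    }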

From the Scala Editor

To debug a script from the scala editor:

  1. Browse the simulations tree to find the wanted .scala file,
  2. Open it by double-clicking on it, or by using the Edit button (visible on mouse over),
  3. In the file editor, click on the Debug Icon button in the top right corner.

Compare Debug Results

Compare Debug Dialog

To compare debug entries:

  1. In the debug table, click on the Menu Icon button,
  2. A menu appears, click on the Compare With menu item,
  3. The Compare Debug dialog opens,
  4. Select the left debug entry,
  5. Select the right debug entry.

The comparison is displayed below the debug selectors. Highlighted text indicates differences between the selected debug chunks.

Info

You can compare debug entries from different test executions or with HAR imports.

Tip

Green (...) text indicates that the content is the same for both debug entries. Click on it to show the identical content.