End-to-End Tests

Automated End-to-End testing to thoroughly verify the integrity of generated projects.

Given the importance of a reliable testing process for the parts that constitute the XOOM Designer codegen, our team created a small library of classes and resources to facilitate the creation of broader test cases that, alongside the usual testing approaches, readily check the integrity of projects created with XOOM Designer. Thus, the following aspects are validated:

  • Project model processing;

  • Compilation;

  • Initialization;

  • API operations;

  • Event projections;

  • Schema registration / transpilation.

In the following sections, we'll see how to run End-to-End tests and how to implement new ones.

Execution

End-to-End tests are part of the Designer build lifecycle, so you can execute them by calling:

$ mvn verify -P e2e-supporting-services

The sequence of Maven goals performed during the execution of this command is: unit tests, packaging, and End-to-End tests. If you want to focus on the test results and skip the packaging step, add the only-tests profile:

$ mvn verify -P only-tests,e2e-supporting-services

Some test cases depend on external resources (e.g. XOOM Schemata) which must be up and running when the tests are executed. With the e2e-supporting-services profile, these resources are automatically started via Docker. Alternatively, you can run these dependencies manually. Learn more in the Dependencies section.

Implementation

Exploring the XOOM Designer codebase, End-to-End tests are found under src/e2e-test/, and their classes have to be in the package io.vlingo.xoom.designer.codegen.e2e.*, extending JavaBasedProjectGenerationTest. Let's take a look at the basic structure of a test class:

public class BookStoreServiceGenerationTest extends JavaBasedProjectGenerationTest {

  @BeforeAll
  public static void setUp() {
    JavaBasedProjectGenerationTest.init();
  }

  @Test
  public void testThatServiceWithStatefulEntitiesIsWorking() {
    //Load model
    //Generate and run the project
    //Assertions
  }

  @Test
  public void testThatServiceWithSourcedEntitiesIsWorking() {
    //Load model
    //Generate and run the project 
    //Assertions
  }

  @AfterEach
  public void tearDown() throws Exception {
    JavaBasedProjectGenerationTest.clear();
  }
}

JavaBasedProjectGenerationTest.init() prepares the test environment, setting up the internal components required for generating a project, and starts the XOOM Designer Server. clear(), in turn, stops the application started in each test case. As demonstrated above, ensure that both methods are invoked at the right moments of the test lifecycle.

The main test input is a JSON file containing a XOOM Designer model. It should be saved under src/e2e-test/resources/sample-models/[context-name]/. The values of deployment.httpServerPort and projectDirectory have to be replaced by a placeholder token (%s), because these values are dynamically resolved during the tests. Here's an example:

{
  "deployment": {
    "httpServerPort": "%s"
  },
  ...
  "projectDirectory": "%s"
}
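
To illustrate why the tokens are positional, here is a minimal sketch, under the assumption that the template is resolved with a simple String.format call; the port, directory, and file path below are hypothetical, and the actual resolution is handled internally by the test infrastructure when the model is loaded:

// Hypothetical sketch only: the first %s receives the HTTP server port and the
// second the project directory, in the order they appear in the JSON file.
final String template =
    java.nio.file.Files.readString(
        java.nio.file.Paths.get("src/e2e-test/resources/sample-models/book-store-context/book-store-with-stateful-entities.json"));

final String resolvedModel = String.format(template, 19090, "/tmp/generated-projects/book-store");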

In the test method, the model can be loaded by creating an io.vlingo.xoom.designer.codegen.e2e.Project object, which is also used to generate the project.

@Test
public void testThatServiceWithStatefulEntitiesIsWorking() {

  //Loads the model, passing the context directory name and the model name
  final Project projectWithStatefulEntities =
          Project.from("book-store-context", "book-store-with-stateful-entities");

  super.generateAndRun(projectWithStatefulEntities);
  
  ...
 }

The generateAndRun method receives the Project object carrying the model settings, which is submitted to the XOOM Designer API. Additionally, this method internally asserts that the project is properly generated. If the generation succeeds, the project is also compiled and initialized; otherwise, an assertion error message is shown.

Afterward, the test statements validate the generated API. The next code snippet shows how to perform requests and assertions:

@Test
public void testThatServiceWithSourcedEntitiesIsWorking() {
  final Project projectWithSourcedEntities =
      Project.from("book-store-context", "book-store-with-sourced-entities");

  super.generateAndRun(projectWithSourcedEntities);

  final BookData newBook = BookData.sample();

  final Response response =
      super.apiOf(projectWithSourcedEntities).body(newBook).post("/books");

  final BookData responseBody =
      response.then().extract().body().as(BookData.class);

  Assertions.assertEquals(Status.Created.code, response.statusCode(), "Wrong http status while creating book " + projectWithSourcedEntities);
  Assertions.assertEquals(newBook, responseBody, "Wrong response while creating book " + projectWithSourcedEntities);
}
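
Further requests can then exercise the rest of the generated API. As a hedged illustration, the following sketch, appended inside the same test method, assumes that the generated service also exposes a GET /books route returning all created books, that BookData implements equals, and that apiOf supports issuing GET requests the same way it issues POSTs (all assumptions made for this example):

//Illustrative follow-up assertion (assumes a GET /books route is generated)
final Response retrievalResponse =
    super.apiOf(projectWithSourcedEntities).get("/books");

final BookData[] allBooks =
    retrievalResponse.then().extract().body().as(BookData[].class);

Assertions.assertEquals(Status.Ok.code, retrievalResponse.statusCode(), "Wrong http status while retrieving books " + projectWithSourcedEntities);
Assertions.assertEquals(newBook, allBooks[0], "Wrong response while retrieving books " + projectWithSourcedEntities);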

Other assertions of this kind should be added in order to verify the project's consistency meticulously. It's also strongly recommended that each test method be documented with a comment describing the XOOM Designer model details:

/**
 * Test that the service is generated and working with:
 * - Stateful Entities containing only scalar-typed fields
 * - Operation-based projection
 * - Xoom Annotations + Auto-dispatch
 */
@Test
public void testThatServiceWithStatefulEntitiesIsWorking() {

  ...
}

See the full code used in this example here.

Dependencies

Whenever a XOOM Designer model is configured to produce or consume events, the generated project depends on XOOM Schemata and XOOM Lattice/Exchange (RabbitMQ). Therefore, the installation, initialization, and shutdown of these resources are part of the End-to-End testing routine. The quickest way to manage these external resources in a local environment is to use the e2e-supporting-services profile.

$ mvn verify -P e2e-supporting-services

The End-to-End test cases require that XOOM Schemata and RabbitMQ are available on ports 9019 and 5672, respectively. Ensure that these ports are not in use before activating the e2e-supporting-services profile.
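
If you prefer to start these dependencies by hand instead of activating the profile, something along the following lines should work, assuming Docker is available locally; the image names, tags, and port mappings are assumptions to be verified against the RabbitMQ and XOOM Schemata documentation:

# image names, tags, and port mappings below are assumptions; verify before use
$ docker run -d -p 5672:5672 rabbitmq:3
$ docker run -d -p 9019:9019 vlingo/xoom-schemata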

Then, call SupportingServicesManager.run() before the test cases are executed:

public class CargoShippingServicesGenerationTest extends JavaBasedProjectGenerationTest {

  @BeforeAll
  public static void setUp() {
    SupportingServicesManager.run();
    JavaBasedProjectGenerationTest.init();
  }
  ...
}

The first statements of a test method that generates a project with dependencies should assert that the external resources are available:

@Test
public void testThatGeneratedServicesAreWorking() {
  super.assertServiceIsAvailable(SupportingServicesManager.findPortOf(SCHEMATA), "Schemata service is not available");
  super.assertServiceIsAvailable(SupportingServicesManager.findPortOf(RABBIT_MQ), "RabbitMQ service is not available");
  
  //Generate the project
}

The SupportingServicesManager automatically shuts down the managed services when the test execution is done.

Constraints

Following the classical test pyramid concept, which states that "the more high-level you get, the fewer tests you should have", we should not have many End-to-End test cases, since they are slower and harder to maintain. Rather, a small set of test cases covering corner cases and complex scenarios is the goal.

Due to the complexity of covering user-interface functionality in automated testing, the End-to-End tests in XOOM Designer do not validate the XOOM Designer UI. Also, when a XOOM Designer model has an embedded ReactJS app, the generated frontend code is not syntactically checked, since this additional validation would require resolving the framework dependencies, significantly increasing the time consumed by each test.
