
This week we had the pleasure of receiving @agoncal (Java Champion / Devoxx FR organizer) in our offices to work on a PoC using JHipster, Activiti and Keycloak.

 


We already have a JHipster generator, but we went further and analyzed the common problems that people will face when adopting architectures like the one proposed by JHipster. Luckily for us, Activiti is quite aligned with that technology stack, so for the microservice approach most of the challenges are around security mechanisms, forwarding security tokens and connecting systems in a secure way. We are open to doing these kinds of workshops: during this one we managed to identify a couple of issues that we will need to fix to make the developer experience smoother, so people can get started in minutes.

 

 

@willkzhou worked on the ElasticSearch integration and created an abstract for DevCon ‘18.

@daisuke-yoshimoto fixed a critical bug where database operations performed before an error occurred were committed when a java.lang.Error such as StackOverflowError was thrown. He is also preparing for DevCon 2018.

@igdianov is working on defining the Notification component and how it will relate to our GraphQL implementations.

@constantin-ciobotaru finished the basic configuration for the process model service and started implementing some simple process model REST endpoints.

@lucianoprea continued working with Spring REST Docs for the runtime bundles.

@balsarori created the Activiti Form Dependencies BOM and the initial form runtime modules.

@ryandawsonuk started creating an initial version of process definition security restrictions for runtime bundles, before pausing that to upgrade to the latest Keycloak version (including a new build of the Spring Boot 2 adapter) to support the version used by JHipster.

@erdemedeiros was working on IntegrationEvents and how to improve our initial version of the interaction between the engine and external connectors provided in the infrastructure.

@salaboy was working on getting an initial implementation of a Cloud Connector for Twitter and an example that can demonstrate how to dynamically scale the instances of the engine based on demand.

 

This week is all about releasing our Monthly Early Access binaries to Maven Central and Docker Hub. We will do some planning, and I (@salaboy) will be traveling to the USA (New York and Washington) from Monday to Friday, so if you are around and want to hang out and do some open source coding, get in touch via Twitter.


For more info find us in Gitter: https://gitter.im/Activiti/Activiti7

Last week (16/10/17 - 22/10/17) we started several initiatives to define a new set of services to support our new set of tools. We started by creating a (public) specification document to define how the Process Model Service is going to work. We also moved forward some changes to the default behaviors of the process engine, such as using UUIDs for ID generation and the default Service Task behavior when running inside a Runtime Bundle. We managed to merge a PR that is vital for pushing Activiti Cloud Connectors forward, and a new Activiti Cloud Connector Starter was created.

 

 

@willkzhou was experimenting with ElasticSearch and getting used to the new repository structure.

@daisuke-yoshimoto changed the Audit Service with MongoDB to expose a read-only REST API using Spring Data REST. Next, he will work on the issue that provides a DB-based identity service that uses the old tables and is REST-based.

@igdianov started looking at GraphQL subscriptions, and we started discussing how to create a notification mechanism for the entire platform.

@constantin-ciobotaru started looking at Spring Cloud Contracts and an initial version for the Process Model Service.

@lucianoprea started looking at Spring REST Docs for our runtime bundles.

@balsarori worked on modifying the default Activiti Cloud starter configurations (removed unnecessary properties and enabled UUID generation by default). Additionally, he started working on the first version of the Activiti Form Runtime Service.

@ryandawsonuk documented how to secure Activiti endpoints using Keycloak, started planning the implementation of security by process definition, and worked on changes to the release process to support using jgitflow.

@erdemedeiros worked on message queue implementation for service tasks. After a further investigation the correlation id seems no longer necessary and was replaced by the execution id. The related pull request is the end of the review process and should get merged soon.

@salaboy I’ve spent most of the week validating that the repository refactoring didn’t affect any module. I’ve also added new sections to our gitbook about each service and how we are going to adopt GitFlow as our main way of interactions with community and team members.


This week we will be polishing some examples on the cloud connectors side, plus we will be getting ready for the release at the end of the month. If you want to contribute to the project, this is a perfect time to get in touch and help us out. Reviews of the Gitbook are highly appreciated. For more info find us in Gitter: https://gitter.im/Activiti/Activiti7

Over the last weekend we refactored our repository structure to make sure that we can evolve each of the services and cloud starters separately. This was a natural step forward to make sure that our projects are aligned with the frameworks that they depend on.

 

We have moved the activiti-services that were originally created inside the activiti/activiti repository out, to make sure that the repositories for services are independent from each other. Activiti Cloud Services are the lowest level link between Activiti and Spring Cloud, and for that reason all the org.activiti.services (Java) packages have been renamed to org.activiti.cloud.services. The same refactoring applies to the Maven artifacts of the services involved, which are now under the org.activiti.cloud GroupId.

The following diagram shows our current repository structure. It might expand in the future, but the basic structure will probably remain unchanged.

 

 

(Diagram: the new repository structure)

Main (Maven) Artifacts

In the previous diagram there are 4 dashed boxes, each containing several repositories: Activiti, Activiti Cloud, Activiti Cloud Reference and Activiti Cloud Examples. The first layer is Activiti, which includes the Build projects, the Process Engine Runtime and examples. At this level we depend only on Java JDK 8. Inside the Activiti repository we have activiti-spring, which serves as the basic integration against Spring 5 (the version is defined inside the Build parent project). Activiti relies on activiti-parent for 3rd party library dependency management. We make sure that we deal with 3rd party dependencies and their versions in our parent poms, so it is completely forbidden to add version definitions in submodules.

Inside the Activiti Build repository we also provide the BOM (Bill of Materials - activiti-dependencies) that you can import into your projects so that Maven handles all the dependency versions for you. This will enable us to refactor Activiti’s internal modules without affecting applications depending on specific modules, so we recommend using activiti-dependencies whenever possible.

<dependencyManagement>
 <dependencies>
   <dependency>
     <groupId>org.activiti</groupId>
     <artifactId>activiti-dependencies</artifactId>
     <version>${activiti.version}</version>
     <type>pom</type>
     <scope>import</scope>
   </dependency>
 </dependencies>
</dependencyManagement>
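With the BOM imported, individual Activiti modules can then be declared without an explicit version. For example, pulling in the activiti-spring module mentioned above (a minimal sketch):

<dependencies>
  <dependency>
    <groupId>org.activiti</groupId>
    <artifactId>activiti-spring</artifactId>
    <!-- no version needed: it is managed by the imported activiti-dependencies BOM -->
  </dependency>
</dependencies>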

Moving down in the hierarchy we find Activiti Cloud, which represents our Spring Cloud enabled services. These services were designed to run in a Cloud Native way, and for that reason they use the abstraction layers provided by Spring Cloud. All the services are independent from each other, and we will try to keep base dependencies as decoupled as possible, but they do share the same parent, which specifies the Spring Boot and Spring Cloud versions that they all use.

We also have our BOM for Activiti Cloud that you can include in your projects to deal with dependencies of several starters and services in a centralised way:

 

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.activiti.cloud</groupId>
      <artifactId>activiti-cloud-dependencies</artifactId>
      <version>${activiti.cloud.version}</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

 

There are also some cross-cutting concerns that need to be added to each of our individual services, such as security and utilities for testing. For that reason we have created a repository called activiti-cloud-service-common, which contains these shared cross-cutting features that are likely to be adopted by most of our services.

If you take a look at the service repositories (a list that is likely to grow quite a lot), you will find that all of them include two types of projects:

  • Base (Core) Services
  • Spring Starters

The base (core) services provide the business logic for each of these modules, and the Spring Starters let you enable that functionality in your Spring Boot application by adding a simple dependency. We usually also provide autoconfigurations and annotations to make sure that our base core services are easy to bootstrap, as sketched below.
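As a sketch of what this looks like from a user's perspective, enabling a service in a Spring Boot application is intended to be a dependency plus a single annotation (the annotation name below is illustrative, not a confirmed API; each starter ships its own enable-style annotation/autoconfiguration):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
@EnableActivitiRuntimeBundle // hypothetical enable-style annotation provided by a starter
public class MyRuntimeBundleApplication {

    public static void main(String[] args) {
        SpringApplication.run(MyRuntimeBundleApplication.class, args);
    }
}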

We recommend that community members build Services & Starters at this level. If you, for example, want to tap into the events emitted by the process engine, writing a service that consumes those messages and executes some business logic should be trivial using Spring Cloud; a sketch follows.
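For example, a minimal consumer of engine events could be a Spring Cloud Stream sink like the following sketch (the destination that the runtime bundle publishes to is an assumption; check the runtime bundle configuration for the actual name):

import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;

@EnableBinding(Sink.class)
public class EngineEventsConsumer {

    // Bind this input channel to the destination the runtime bundle publishes
    // its events to, e.g. in application.properties:
    // spring.cloud.stream.bindings.input.destination=<engine-events-destination>
    @StreamListener(Sink.INPUT)
    public void onEngineEvent(String eventJson) {
        // execute your business logic here
        System.out.println("Received engine event: " + eventJson);
    }
}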

Everything up to this point (everything inside Activiti and Activiti Cloud) will be released and available in Maven Central as Maven artifacts. We will release Early Access builds every month until we have enough meat for a Beta release. We did our first EA release at the end of August 2017, and we will do our next one by the end of October.

Reference Docker Images & Examples

If you go down another level in the previous diagram, you will see a set of repositories which are built by hub.docker.com using the automated build configuration. These repositories contain reference implementations of each of our services using our Spring Boot starters. By using these images you can get a basic working setup of all the services.

These Docker images are tagged every month when we go through our release process; in our examples we always point to the latest build (which happens after every commit to these repositories).

These Docker Images are used to provide examples about how to deploy all these services using Docker Compose, Kubernetes, and Kubernetes HELM charts. You can find these examples inside the activiti-cloud-examples repository.

You, as a user, are encouraged to generate your own Docker images with your required customizations. We have tried to keep these images as simple as possible.

Deprecated Repositories

We will be deleting for good a repository called activiti-cloud-starters, which we created to initially host all our starter projects. If you are working against this repository, please move your changes to the appropriate *-service repository.

 

Activiti adopts Git Flow

As the team is growing fast, we have adopted Git Flow as our standard flow for working on new features. We are going to use the jgitflow maven plugin to drive the flow.

This means that we are not going to work on the master branch any more; master will be used for releasing and it will always contain a stable state of the project. In other words, master will be updated every month by our monthly release process. All the work will now be based on the develop branch, which you will find in all our repositories.

You can find more information about this plugin here: Atlassian JGITFLOW; the workflow documentation provides a very good and clear explanation of how the workflow works.

In order to contribute and send Pull Requests to any of our repositories you will need to use the jgitflow feature-start goal (the counterpart of the jgitflow:feature-finish goal that we use when merging), making sure that you have an up to date develop branch:
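mvn jgitflow:feature-start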

This will create a feature branch for you to work on; push your commits there and then send a pull request. The Maven plugin will ask you to provide a name for your feature branch, which should follow the format <github username>-<issue number>-<short desc> (e.g. jdoe-42-fix-docs). We will then review your PR and use jgitflow:feature-finish to merge that pull request.

If you have questions about these changes, procedures for contributions drop us a line in our Gitter Channel: https://gitter.im/Activiti/Activiti7

Are you using any version of the #Activiti project? We would like to invite you to submit a paper to the Alfresco DevCon happening in Lisbon, Portugal - 16, 17, 18 January 2018 ( https://www.papercall.io/alfrescodevcon2018 ). If you are using the project or planning to, this is a great opportunity to share your project with a big community of users and to meet the team working on Activiti 7 & Activiti Cloud.

We will be sending some proposals about the new stuff that we are building, so you can expect to see us talking about Activiti Cloud and how we are updating the engine to be Cloud Native. We will be showing examples of our brand new services designed for Kubernetes and Docker, and of how you can leverage the new infrastructure to build scalable solutions that don't clash with your existing infrastructure. We welcome ideas about topics that you consider worth the trip.

Why is this a very good opportunity to share and learn about the future of Activiti? 

There are several reasons why I want to meet with the large Activiti community and invite people using the project to submit an abstract for DevCon, to share with the audience how they are using the project and what they are looking forward to.

The following list depicts what, from my experience, we can all gain from meeting up at a conference like DevCon:

  • We can share our experiences with other people who are already using Activiti
  • We can find out the common and shared pain points that will need to be solved by future versions
  • We can learn from each other how to solve and troubleshoot problems related to implementing BPM solutions in different industries
  • We can define the future roadmap together, and you as an individual or a company can join our open source community and have a say on the future of the project
  • We can use our time together to discuss different implementation techniques and approaches
  • We can have a deep discussion about the Activiti Cloud architectural changes, why modern environments are promoting new architectural patterns and how that affects the BPM implementation and Process Engines. 

I offer my personal time to meet with community members and organizations that are interested in the project, and I also want to offer my assistance in submitting a paper for the conference, where we can present together your use case and your expectations for the future.

We don't have much time left, so if you are interested in this opportunity get in touch as soon as possible. You can find me every day in our Gitter Channel: https://gitter.im/Activiti/Activiti7

Or you can post a comment here and I will try to reply as soon as I can.

See you all in Lisbon! 

Last week the team spent some time working on system to system integrations. You can find more about our Activiti Cloud Connectors strategy here. This new strategy uses some abstraction layers to make sure that when we run inside a Kubernetes enabled environment our connectors can leverage Kubernetes Services. The JHipster integration is also moving forward; we now have clear plans of how that should look and how our users will benefit from it (a blog post about this is coming as well). We are also very excited to see the team expanding with some experienced members from the Activiti 5 and 6 community (@balsarori, @lucianoprea and @constantin-ciobotaru). We held some planning sessions to make sure that we can build all the new generation services in an autonomous way.

 

@willkzhou was looking at the ElasticSearch integration provided by @gmalanga to polish it and merge it into master.

@daisuke-yoshimoto created a Spring Boot application for the Audit Service with MongoDB, plus activiti-cloud-examples for the Audit Service with MongoDB. He also refactored the Audit Service with MongoDB to remove the Controller by using Spring Data REST.

@igdianov completed the GraphQL PoC for the query module and started investigating using it to provide a notification mechanism.

@ryandawsonuk created an example Activiti project using the JHipster generator and added an Activiti Cloud example that uses the Spring Cloud Config Server.

@erdemedeiros worked on the message queue implementation for service tasks. All the necessary information to retrigger the execution is now stored in the engine core database. The executionId is no longer sent to the connector, only the correlation id.

@salaboy worked on finishing the connectors blog post and example on GitHub. We also had very interesting meetings with new/old members of the Activiti community who will be joining us on a daily basis: @balsarori, @lucianoprea and @constantin-ciobotaru.

This week we will be finishing another big repository refactoring to make sure that each of our services and starters is independent from the others, and that we as a team, along with community members, don't block each other. If you experience issues around these changes, or if you were working against an existing repo that will no longer be supported, get in touch with us via Gitter and we will guide you to the new structure. A blog post about the new structure will follow as soon as it is ready.

This blog is a continuation of my first blog around the work we (Ciju Joseph and Francesco Corti) did as part of the Alfresco Global Virtual Hack-a-thon 2017.

 

In this blog I’ll be walking you through the aps-unit-test-example project we created, where I’ll be using the features from the aps-unit-test-utils library which I explained in the first blog.

About the Project

This project contains a lot of examples showing:

  • how to test various components in a BPMN (Business Process Model and Notation) model
  • how to test a DMN (Decision Model and Notation) model
  • how to test custom Java classes that support your process models.

Project Structure

Before we even get to the unit testing part, it is very important to understand the project structure.

As you can see from the above diagram, this is a Maven project. However, if you are a “gradle” person, you should be able to do it the Gradle way too! The various sections of the project are:

  1. Main Java classes - located under src/main/java. This includes all the custom Java code that supports your process/dmn models.
  2. Test classes - located under src/test/java. The tests are grouped into different packages depending on the type of units they test.
    1. Java class tests - This includes test classes for the classes (e.g. Java Delegates, Task Listeners, Event Listeners, Custom Rest Endpoints, Custom Extensions etc.) under src/main/java.
    2. DMN tests - As you can see from the package name (com.alfresco.aps.test.dmn) itself, I’m writing all the DMN tests under this package. The pattern I followed in this example is one test class per DMN file under the directory src/main/resources/app/decision-table-models.
    3. Process (BPMN) tests - Similar to DMN tests, the package com.alfresco.aps.test.process contains all the BPMN test classes. Again, I am following the pattern of one test class per BPMN file under src/main/resources/app/bpmn-models.
  3. App models - All the models (forms, bpmn, dmn, data models, stencils, app.json etc.) that are part of the process application are stored under the directory src/main/resources/app. When using the aps-unit-test-utils which I explained in the previous article, all the models are downloaded into this directory from APS. Once the tests pass successfully, we re-build the deployable process artifacts from this directory.
  4. Test resources - As with any standard Java project, you can keep all your test resources in the directory src/test/resources. I’ll highlight a couple of files that you will find under this directory in the above project structure image:
    1. activiti-resources.properties - This file contains the APS server configurations such as server address, api url, user credentials etc. for downloading the process application into your maven project. Please refer to my previous article for a detailed explanation of this file. You won’t find this file on GitHub under this project; the reason is that it is intended to be developer specific and local to the workspace of a developer. For this reason it is included in the project’s .gitignore file to prevent it from being saved to GitHub.
    2. process-beans-and-mocks.xml - the purpose of this file is to mock any project/process specific classes when you run your process tests. The concept is explained in detail in my previous article, where I covered a similar file called common-beans-and-mocks.xml.
  5. Build output - In the above screenshot you can see that there are two files named aps-unit-test-example-1.0-SNAPSHOT-App.zip and aps-unit-test-example-1.0-SNAPSHOT.jar under the /target directory. This is the build output that gets generated when you package the app using Maven commands such as “mvn clean package”. The “.zip” file is the app package created from the src/main/resources/app directory, which you can version after every build and deploy to higher environments. The “.jar” is the standard jar output including all the classes/resources from your src/main directory.
  6. Maven pom.xml - Since this is a Maven based project, you need a pom.xml under the root of the project. Highlighting some of the dependencies and plugins that are used in this pom.xml:
    • aps-unit-test-utils dependency - the test utils project which I explained in my previous post/blog.
      <dependency>
           <groupId>com.alfresco.aps</groupId>
           <artifactId>aps-unit-test-utils</artifactId>
           <version>[1.0-SNAPSHOT,)</version>
      </dependency>
    • maven-compiler-plugin - a maven plugin that helps compile the sources of the project
      <plugin>
           <artifactId>maven-compiler-plugin</artifactId>
           <version>3.6.2</version>
           <configuration>
                <source>1.8</source>
                <target>1.8</target>
           </configuration>
      </plugin>
    • maven-assembly-plugin - a maven plugin that is used to package the “app.zip” archive from src/main/resources/app
      <plugin>
           <artifactId>maven-assembly-plugin</artifactId>
           <version>3.1.0</version>
           <executions>
                <execution>
                     <configuration>
                          <descriptors>
                               <descriptor>src/main/resources/assembly/assembly.xml</descriptor>
                          </descriptors>
                     </configuration>
                     <id>create-distribution</id>
                     <phase>package</phase>
                     <goals>
                          <goal>single</goal>
                     </goals>
                </execution>
           </executions>
      </plugin>

Unit Test Examples

Now that you have a good understanding of all the project components, let’s take a look at some of the examples available in the project. I have tried my very best to keep the test classes and processes as simple as possible to make it easy for everyone to follow without much explanation.

Process Testing

AbstractBpmnTest.java - This class can be used as a parent class for all the BPMN test classes. To avoid writing the same logic in multiple test classes, I added some common logic to it:

  • Setup of a mock email server
  • Process deployment prior to tests
  • Clean up, such as deleting all deployments after each test
  • Test coverage alerts
/* Including it in the Abstract Class to avoid writing this in all the Tests.
      * Pre-test logic flow -
      * 1) Download from APS if system property -Daps.app.download=true
      * 2) Find all the bpmn20.xml's in {@value BPMN_RESOURCE_PATH} and deploy them to the process engine
      * 3) Find all the elements in the process that is being tested. This set will
      *    be compared with another set that contains the process elements that are
      *    covered in each test (this gets updated after each test).
      */

     @Before
     public void before() throws Exception {

          if (System.getProperty("aps.app.download") != null && System.getProperty("aps.app.download").equals("true")) {
               ActivitiResources.forceGet(appName);
          }

          Iterator<File> it = FileUtils.iterateFiles(new File(BPMN_RESOURCE_PATH), null, false);
          while (it.hasNext()) {
               String bpmnXml = ((File) it.next()).getPath();
               String extension = FilenameUtils.getExtension(bpmnXml);
               if (extension.equals("xml")) {
                    repositoryService.createDeployment().addInputStream(bpmnXml, new FileInputStream(bpmnXml)).deploy();
               }
          }
          processDefinitionId = repositoryService.createProcessDefinitionQuery()
                    .processDefinitionKey(processDefinitionKey).singleResult().getId();
          List<Process> processList = repositoryService.getBpmnModel(processDefinitionId).getProcesses();
          for (Process proc : processList) {
               for (FlowElement flowElement : proc.getFlowElements()) {
                    if (!(flowElement instanceof SequenceFlow)) {
                         flowElementIdSet.add(flowElement.getId());
                    }
               }
          }
     }

     /*
      * Post-test logic flow -
      * 1) Update activityIdSet (the set containing all the elements tested)
      * 2) Delete all deployments
      */

     @After
     public void after() {
          for (HistoricActivityInstance act : historyService.createHistoricActivityInstanceQuery().list()) {
               activityIdSet.add(act.getActivityId());
          }
          List<Deployment> deploymentList = activitiRule.getRepositoryService().createDeploymentQuery().list();
          for (Deployment deployment : deploymentList) {
               activitiRule.getRepositoryService().deleteDeployment(deployment.getId(), true);
          }
     }

     /*
      * Tear down logic - Compare the flowElementIdSet with activityIdSet and
      * alert the developer if some parts are not tested
      */

     @AfterClass
     public static void afterClass() {
          if (!flowElementIdSet.equals(activityIdSet)) {
               System.out.println(
                         "***********PROCESS TEST COVERAGE WARNING: Not all paths are being tested, please review the test cases!***********");
               System.out.println("Steps In Model: " + flowElementIdSet);
               System.out.println("Steps Tested: " + activityIdSet);
          }
     }

Process Example 1

In this example we will test the following process diagram, which is a simple process containing three steps: Start → User Task → End.

 

UserTaskUnitTest.java - the test class associated with this process, which tests the following:

  • A process is started correctly
  • Upon start a user task is created and assigned to the correct user with the correct task due date
  • Upon completion of the user task the process is ended successfully
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "classpath:activiti.cfg.xml", "classpath:common-beans-and-mocks.xml" })
public class UserTaskUnitTest extends AbstractBpmnTest {

     /*
      * Setting the App name to be downloaded if run with -Daps.app.download=true
      * Also set the process definition key of the process that is being tested
      */

     static {
          appName = "Test App";
          processDefinitionKey = "UserTaskProcess";
     }

     @Test
     public void testProcessExecution() throws Exception {
          /*
           * Creating a map and setting a variable called "initiator" when
           * starting the process.
           */

          Map<String, Object> processVars = new HashMap<String, Object>();
          processVars.put("initiator", "$INITIATOR");

          /*
           * Starting the process using processDefinitionKey and process variables
           */

          ProcessInstance processInstance = activitiRule.getRuntimeService()
                    .startProcessInstanceByKey(processDefinitionKey, processVars);

          /*
           * Once started assert that the process instance is not null and
           * successfully started
           */

          assertNotNull(processInstance);

          /*
           * Since the next step after start is a user task, doing a query to find
           * the user task count in the engine. Assert that it is only 1
           */

          assertEquals(1, taskService.createTaskQuery().count());

          /*
           * Get the Task object for further task assertions
           */

          Task task = taskService.createTaskQuery().singleResult();

          /*
           * Asserting the task for things such as assignee, due date etc. Also,
           * at the end of it complete the task Using the custom assertion
           * TaskAssert from the utils project here
           */

          TaskAssert.assertThat(task).hasAssignee("$INITIATOR", false, false).hasDueDate(2, TIME_UNIT_DAY).complete();

          /*
           * Using the custom assertion ProcessInstanceAssert, make sure that the
           * process is now ended.
           */

          ProcessInstanceAssert.assertThat(processInstance).isComplete();
     }

}

Process Example 2

Let’s now look at a process that is a little more complex than the previous one. As you can see from the diagrams below, there are two units that are candidates for unit testing in this model: the process model and the DMN model.

  • DMNProcessUnitTest.java - Similar to the above example, this is the test class associated with this process which tests the following:
    • A process is started correctly
    • Tests all possible paths in the process based on the output of rule step
    • Successful completion of process
    • Mocks the rules step - when it comes to the rules/decision step in the process, we are not invoking the actual DMN file associated with the process. From a process perspective all we care about is that an appropriate variable is set at this step so that the process takes the respective path being tested. Hence the mock.
  • DmnUnitTest.java - This is the test class associated with the DMN file that is invoked from this process. More explanation in next section.

DMN Testing

AbstractDmnTest.java - Similar to the AbstractBpmnTest class I explained above, this class can be used as a parent class for all the DMN test classes. To avoid writing the same logic in multiple test classes, I added some common logic to it:

  • DMN deployment prior to tests
  • Clean up, such as deleting all deployments after each test
/*
      * Including it in the Abstract Class to avoid writing this in all the
      * Tests. Pre-test logic -
      * 1) Download from APS if system property -Daps.app.download=true
      * 2) Find all the dmn files in {@value DMN_RESOURCE_PATH} and deploy them to the dmn engine
      */

     @Before
     public void before() throws Exception {

          if (System.getProperty("aps.app.download") != null && System.getProperty("aps.app.download").equals("true")) {
               ActivitiResources.forceGet(appName);
          }

          // Deploy the dmn files
          Iterator<File> it = FileUtils.iterateFiles(new File(DMN_RESOURCE_PATH), null, false);
          while (it.hasNext()) {
               String dmnXml = it.next().getPath();
               String extension = FilenameUtils.getExtension(dmnXml);
               if (extension.equals("dmn")) {
                    DmnDeployment dmnDeployment = repositoryService.createDeployment()
                              .addInputStream(dmnXml, new FileInputStream(dmnXml)).deploy();
                    deploymentList.add(dmnDeployment.getId());
               }
          }
     }

     /*
      * Post-test logic -
      * 1) Delete all deployments
      */

     @After
     public void after() {
          for (Long deploymentId : deploymentList) {
               repositoryService.deleteDeployment(deploymentId);
          }
          deploymentList.clear();
     }

DMN Example 1

In this example we will test the following DMN model, which is a very simple decision table containing three rows of rules.

  • DmnUnitTest.java - the test class associated with the above DMN model. The test cases in this file test every row in the DMN table and verify that it gets executed as expected. The number of rules in real life can grow over time, hence it is important to have test cases covering all the possible hit and miss scenarios for healthy maintenance of your decision management and business rules.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "classpath:activiti.dmn.cfg.xml" })
public class DmnUnitTest extends AbstractDmnTest {

     static {
          appName = "Test App";
          decisonTableKey = "dmntest";
     }

     /*
      * Test a successful hit using all possible inputs
      */

     @Test
     public void testDMNExecution() throws Exception {
          /*
           * Invoke with input set to xyz and assert output is equal to abc
           */

          Map<String, Object> processVariablesInput = new HashMap<>();
          processVariablesInput.put("input", "xyz");
          RuleEngineExecutionResult result = ruleService.executeDecisionByKey(decisonTableKey, processVariablesInput);
          Assert.assertNotNull(result);
          Assert.assertEquals(1, result.getResultVariables().size());
          Assert.assertSame(result.getResultVariables().get("output").getClass(), String.class);
          Assert.assertEquals(result.getResultVariables().get("output"), "abc");

          /*
           * Invoke with input set to 123 and assert output is equal to abc
           */

          processVariablesInput.put("input", "123");
          result = ruleService.executeDecisionByKey(decisonTableKey, processVariablesInput);
          Assert.assertNotNull(result);
          Assert.assertEquals(1, result.getResultVariables().size());
          Assert.assertSame(result.getResultVariables().get("output").getClass(), String.class);
          Assert.assertEquals(result.getResultVariables().get("output"), "abc");

          /*
           * Invoke with input set to abc and assert output is equal to abc
           */

          processVariablesInput.put("input", "abc");
          result = ruleService.executeDecisionByKey(decisonTableKey, processVariablesInput);
          Assert.assertNotNull(result);
          Assert.assertEquals(1, result.getResultVariables().size());
          Assert.assertSame(result.getResultVariables().get("output").getClass(), String.class);
          Assert.assertEquals(result.getResultVariables().get("output"), "abc");
     }

     /*
      * Test a miss
      */

     @Test
     public void testDMNExecutionNoMatch() throws Exception {
          Map<String, Object> processVariablesInput = new HashMap<>();
          processVariablesInput.put("input", "dfdsf");
          RuleEngineExecutionResult result = ruleService.executeDecisionByKey(decisonTableKey, processVariablesInput);
          Assert.assertEquals(0, result.getResultVariables().size());
     }

}

Custom Java Class Testing

This section is about testing the classes that you may write to support your process models. This includes testing of Java Delegates, Task Listeners, Event Listeners, Custom Rest Endpoints, Custom Extensions etc. which are available under src/main/java. The naming convention I followed for the test classes is “<ClassName>Test.java”, and the package name is the same as that of the class being tested.

 

Let’s now inspect an example: the testing of a task listener named TaskAssignedTaskListener.java.

Example 1

The above task listener is used in a process named CustomListeners in the project. From a process testing perspective, this TaskListener is mocked in the process test class CustomListenersUnitTest.java via process-beans-and-mocks.xml. That leaves the task listener class itself still untested. Let’s inspect its test class TaskAssignedTaskListenerTest.java, which tests it in the following way:

  1. Set up mocks and inject mocks into classes that are being tested
  2. Set up mock answering stubs prior to execution
  3. Execute the test and assert the expected results
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class TaskAssignedTaskListenerTest {

     @Configuration
     static class ContextConfiguration {
          @Bean
          public TaskAssignedTaskListener taskAssignedTaskListener() {
               return new TaskAssignedTaskListener();
          }
     }

     @InjectMocks
     @Spy
     private static TaskAssignedTaskListener taskAssignedTaskListener;

     @Mock
     private DelegateTask task;

     @Before
     public void initMocks() {
          MockitoAnnotations.initMocks(this);
     }

     /*
      * Testing TaskAssignedTaskListener.notify(DelegateTask task) method using a
      * mock DelegateTask created using Mockito library
      */

     @Test
     public void test() throws Exception {

          /*
           * Creating a map which will be used during the
           * DelegateTask.getVariable() & DelegateTask.setVariable() calls from
           * TaskAssignedTaskListener as well as from this test
           */

          Map<String, Object> variableMap = new HashMap<String, Object>();

          /*
           * Stub a DelegateTask.setVariable() call
           */

          doAnswer(new Answer<Void>() {
               @Override
               public Void answer(InvocationOnMock invocation) throws Throwable {
                    Object[] arg = invocation.getArguments();
                    variableMap.put((String) arg[0], arg[1]);
                    return null;
               }
          }).when(task).setVariable(anyString(), any());

          /*
           * Stub a DelegateTask.getVariable() call
           */

          when(task.getVariable(anyString())).thenAnswer(new Answer<String>() {
               public String answer(InvocationOnMock invocation) {
                    return (String) variableMap.get(invocation.getArguments()[0]);
               }
          });
         
          /*
           * Start the test by invoking the method on task listener
           */

          taskAssignedTaskListener.notify(task);
         
          /*
           * sample assertion to make sure that the java code is setting correct
           * value
           */

          assertThat(task.getVariable("oddOrEven")).isNotNull().isIn("ODDDATE", "EVENDATE");
     }

}

 

Check out the whole project on GitHub, where we have created a lot of examples that cover the unit testing of various types of BPMN components and scenarios. We’ll be adding more to this over the long run.

 

 

Hopefully this blog, along with the other two posts, unit-testing-part-1 & aps-ci-cd-example, is of some help in the lifecycle management of applications built using Alfresco Process Services powered by Activiti.

This blog is the first part of a two blog series around the work we (Ciju Joseph & Francesco Corti) did as part of the Alfresco Global Virtual Hack-a-thon 2017.

Hack-a-thon Project Description & Goal

Alfresco Process Services (APS), powered by Activiti, has a standard way to develop custom Java logic/extensions in your IDE. Typically the process models, which often need a lot of collaboration from many members of a team, are developed in the web modeler of the product. From a packaging and versioning perspective, the process application should be managed together with your Java project. Since the role of unit tests is very critical during the lifecycle of these process artifacts, it is important that we have good unit test coverage testing the process models and the custom Java logic/extensions. The goal of this hack-a-thon project was to work on some unit test utilities and samples which can benefit the Alfresco community.

 

As part of this work we created two java projects (maven based) which are:

  1. aps-unit-test-utils - This is a utils project containing:
    1. Logic to automatically download the process app from an APS environment and make it available in your IDE/workspace.
    2. Activiti BPMN and DMN engine configuration xmls with all the necessary spring beans that you would need for your testing
    3. Mocks for OOTB (Out of the Box) APS BPMN stencils such as “Publish To Alfresco”, “REST call task” etc. Not all the components are mocked, but this gives you an idea of how to mock OOTB stencil components!
    4. Helper classes and custom assertions based on the AssertJ library to help you quickly write tests for your processes and decision tables.
  2. aps-unit-test-example - This project contains a lot of examples showing how to test your BPMN/DMN models and also, custom java logic associated with these models.

 

In this blog I’ll be walking you through the utils project and some of the main features available in it.

Project Features

Utility to fetch Process App from APS

One of the main features of this project is that it allows you to (optionally) download the process models that you have modeled in the APS web modeler to your IDE during your local unit testing. Once you are happy with all your changes and unit test results, you can save those downloaded models into the version control repository. The reason why I highlighted the word “optionally” above is that it is important, when you run your unit tests in a proper CI/CD pipeline, to unit test the models that you have in your version control repository and avoid any other external dependencies.

 

The package com.alfresco.aps.testutils.resources in the project contains the classes responsible for downloading the process models from an APS environment into your IDE.

It is the method com.alfresco.aps.testutils.resources.ActivitiResources.get(String appName) which does this magic for you! You can invoke this method from your test classes at testing time to download and test your changes from the web modeler (a minimal usage sketch follows the list below). The method logic is:

  1. Read a property file named “activiti-resources.properties” containing APS environment and api details. Sample property file available at activiti-resources.properties
  2. If the app you are requesting is found on the server and you have permission to access it, it is downloaded to your project under the path src/main/resources/app as a zip and then exploded into this directory. All the existing models will be deleted prior to the download and unzip.
  3. From a unit testing perspective of the models, it is important that we have the BPMN and DMN xmls available in the exploded directory under src/main/resources/app/bpmn-models and src/main/resources/app/decision-table-models respectively. When this method completes successfully you will have those xmls in the respective directories, ready for unit testing.
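For example, a minimal sketch of triggering the download (the app name "My App" is a placeholder for your own APS app):

import com.alfresco.aps.testutils.resources.ActivitiResources;

public class DownloadAppExample {
    public static void main(String[] args) throws Exception {
        // downloads the APS app named "My App" into src/main/resources/app
        ActivitiResources.get("My App");
    }
}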

Configuration XMLs for the BPMN and DMN engines

Since Alfresco Process Services is a Spring based webapp, we wrote all the utils, helper classes, examples etc. using Spring features. In this section, I’ll explain the two main configuration xmls present in the src/main/resources directory that can be used to bootstrap the process engine and the dmn engine for unit testing.

  1. src/main/resources/activiti.cfg.xml: Using this xml configuration a process engine is created with an in-memory h2 database. As you probably know, there are a lot of configuration options available for the Activiti process engine. If your test setup requires advanced configurations you should be able to do everything in this xml.
  2. src/main/resources/activiti.dmn.cfg.xml: This xml can be used to start the Activiti rule engine (DMN engine), again with an in-memory h2 database.

Depending on the model that you are testing (BPMN/DMN), you can use one of the above configuration xmls to bootstrap the engine from your test cases.

Non-Engine Bean Configuration XML

src/main/resources/common-beans-and-mocks.xml: This xml can be used to configure any mock/real beans that are required for your process testing but not really part of the process/rule engine configuration. Those mock and non-mock beans are explained in the following subsections.

Mock Beans for OOTB APS BPMN Stencils

Since I think “mocks” are best explained with an example, please find below a process diagram where I’m using the OOTB APS “REST call task” stencil. I have also highlighted some of the other components in the APS editor that fall into this category of OOTB stencils.

 

In this example, from a unit testing perspective of OOTB components, you need to make sure of a few things, such as:

  1. This step is invoked upon a successful start of the process
  2. The expected response is set in your unit test so that you can continue with the next steps in the process.
  3. The configurations set on the model are successfully transferred to the BPMN XML upon export/deployment

However, there are things that are not in scope for unit testing of OOTB components:

  1. Testing whether the configured REST API is invoked successfully - this is integration testing
  2. Testing of the various configurations available on a REST task - this is the responsibility of the Alfresco engineering team, to make sure the configurations are working as expected

 

This is where we use mocks instead of the real classes that are behind these components. In order to create the mocks for these components, we first need to look at how these tasks appear inside the deployed bpmn.xml. For example, the bpmn equivalent of the above diagram is shown below:

<serviceTask id="sid-AB2E4A5F-4BF6-48BE-8FF1-CDE01687E69A" name="Rest Call" activiti:async="true" activiti:delegateExpression="${activiti_restCallDelegate}">
  <extensionElements>
    <activiti:field name="restUrl">
      <activiti:string><![CDATA[https://api.github.com/]]></activiti:string>
    </activiti:field>
    <activiti:field name="httpMethod">
      <activiti:string><![CDATA[GET]]></activiti:string>
    </activiti:field>
    <modeler:editor-resource-id><![CDATA[sid-8124CB5D-BD47-49CD-B013-F7FFB576DE8D]]></modeler:editor-resource-id>
  </extensionElements>
</serviceTask>

As you can see from the XML, the bean that is responsible for the REST call is activiti_restCallDelegate. This bean also has some fields named “httpMethod”, “restUrl” etc. Let’s now look at the mock class (given below) that I created for this bean. Since it is a mock class, all you need to do is create a Java delegate with the field extensions that are present in the bpmn.xml.

package com.alfresco.aps.mockdelegates;

import org.activiti.engine.delegate.DelegateExecution;
import org.activiti.engine.delegate.JavaDelegate;
import org.activiti.engine.delegate.Expression;

public class RestCallMockClass implements JavaDelegate {
    
     Expression restUrl;
     Expression httpMethod;
     Expression baseEndpoint;
     Expression baseEndpointName;
     Expression requestMappingJSONTemplate;
    
     @Override
     public void execute(DelegateExecution execution) throws Exception {
          // TODO Auto-generated method stub
     }

}

Now that I have the mock class, the next step is to create a mock bean using this class so that it is resolved correctly at unit test time. Given below is the mock bean configuration in src/main/resources/common-beans-and-mocks.xml corresponding to the above mentioned mock class.

<bean id="activiti_restCallDelegate" class="org.mockito.Mockito" factory-method="mock"> 
   <constructor-arg value="com.alfresco.aps.mockdelegates.RestCallMockClass" />
</bean>

You may have noticed that I am using a class called org.mockito.Mockito for the creation of the mocks. This is from the Mockito library, which is a great library for mocking!

I have created a few mock classes in this project, which you can find in the com.alfresco.aps.mockdelegates package. I have included them in the common-beans-and-mocks.xml file too. As you probably know, APS contains a lot of OOTB stencils and this project contains only a very small subset. The point is, you should be able to mock any such OOTB beans using the above approach.

Common Beans

Any common beans (helper classes, utils etc.) that you may require in the context of your testing can be added to common-beans-and-mocks.xml. Technically they could be kept separate from the mock xml, but for the sake of simplicity I kept everything in the same xml.

Custom Assertions & Helper Classes

The classes in the packages com.alfresco.aps.testutils and com.alfresco.aps.testutils.assertions are basically helper classes and assertions which can be re-used across all your process testing in an easy and consistent way. An approach like this will help reduce unit test creation time and also help enforce some process modelling and unit testing best practices. Highlighting some of the key features:

  1. The project contains an AbstractBpmnTest and AbstractDmnTest class which can be used as parent class to test your bpmn.xml and dmn.xml respectively.
  2. Includes a mock email transport which is set up in the AbstractBpmnTest. This can be used to test any email steps you have in the process.
  3. Custom assertions using the AssertJ library on Activiti entities such as Task, ProcessInstance, DelegateExecution etc. Please note, the assertions I have created in this project are definitely not covering all possible assertion scenarios. However, I think I have put a decent mix in there for you to get started, and you can add many more assertion methods depending on your test cases.

 

Checkout the next blog Alfresco Process Services - Unit Testing # II to see the usage of some of these helper classes and assertions.

Conclusion

There are plenty of articles and blogs out there around unit testing best practices, so I’m not going to repeat them here. However, I just want to stress one point with the help of an example: do not mix integration testing with unit testing.

For example, a process containing the steps Start → DMN (Rules) → Service Task → UserTask → End can be tested as three units:

  1. Unit test for the process xml where you mock the “DMN” and “Service Task” steps
  2. Unit test for the DMN xml testing each rules in the DMN
  3. Unit test for the Service Task class

So, what next?

  • If you already have a good unit testing framework around your processes in APS, great, continue that way. Feel free to provide your feedback and contributions either as comments or as blogs here on community.
  • If you don’t have any unit testing around your processes in APS, I hope this article will help you get started. Feel free to fork it and make it your own at your organization. Good unit tests around your processes, rules, code etc. will definitely help you in the long run, especially when doing upgrades, migrations, change requests, bug fixes etc.

 

Happy Unit Testing!


RFC: Activiti Cloud Connectors

Posted by salaboy, Oct 12, 2017

Open Source Java Process Engines have historically provided a way to create system to system integrations. It is something basic that is expected from a BPMS. The problem begins when these mechanisms impose a non-standard way of doing those integrations. By non-standard I mean something that feels awkward coming from outside the BPM world. Nowadays, with the rise of microservices, there are several alternatives for how system to system integrations are designed and implemented. Projects such as Apache Camel and Spring Integration are quite popular, and they solve most of our integration problems for us.

Today’s real life integrations push us to use a lot of pre-baked tools to add fallback mechanisms, circuit breakers, bulkheads, client and cluster side load balancing, content type negotiation, automatic scaling up and down of our services based on demand, etc. Simple REST calls are not enough anymore.

In this blog post I share the approach that we are taking for Activiti Cloud Connectors. This is a request-for-comments blog post, and it describes the underlying technology that we are planning to use. In time, we will simplify the mechanism shown here to make sure that we don’t push our users to add too much boilerplate.

 

 

Java Delegates & Classpath Extensions (The old way)

 

Java Delegates & classpath extensions are quite a powerful tool to extend the behavior of the process engine, and they are the main entry point for system integrations. Until Activiti 6.x, if you wanted to interact with external services (running outside of the process engine) you were responsible for writing a Java class that implements the org.activiti.engine.delegate.JavaDelegate interface, which exposes a single method:

void execute(DelegateExecution execution);
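For illustration, a minimal delegate might look like this (a hypothetical example, not code shipped with the engine):

package com.example.delegates; // hypothetical package

import org.activiti.engine.delegate.DelegateExecution;
import org.activiti.engine.delegate.JavaDelegate;

public class PaymentCallDelegate implements JavaDelegate {

    @Override
    public void execute(DelegateExecution execution) {
        // read process variables, call the external system in-process...
        String orderId = (String) execution.getVariable("orderId");
        // ...and write the outcome back to the process instance
        execution.setVariable("paymentApproved", orderId != null);
    }
}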

As you can imagine, you can do all sorts of things with this. It’s simple and extremely flexible. However, there are downsides to this approach too, which are common to any in-process extension model. For example, the JavaDelegate classes are directly referenced from within the process definitions, and this introduces a tighter coupling than is ideal between definition and implementation. Also, if you make an error when coding your JavaDelegate you run the risk of bringing down the JVM that is also running other processes. One other area that can cause difficulty is managing the Java class path and ensuring that it is replicated consistently across each member of your cluster of process engines and Job Executors.

For these and other reasons we want to introduce the concept of Activiti Cloud Connectors and decouple the responsibility of dealing with system to system interactions from the Process Engine.

Just to be clear, we are not removing JavaDelegates, instead we are providing a new recommended out-of-the-box mechanism that will tackle these problems.

 

Spring Cloud Connectors & Kubernetes

 

The Spring framework provides the concept of Service Connectors and an abstraction layer on top of them, defining how these can be integrated with different cloud platforms. The main concept behind service connectors is that if your service depends on another service, you delegate the lookup of that service to the infrastructure. This is usually referred to as service/resource binding. You know that you want to interact with a very specific type of service but you don’t know where it is or how to locate it, so you delegate that responsibility to another layer that will look up the available types of services, locate them and provide your service a way to interact with them.

Spring Cloud Connectors provides this level of abstraction, and you can find more about it here: http://cloud.spring.io/spring-cloud-connectors/. It provides different implementations for different cloud providers: Cloud Foundry, Heroku and Local (for testing and development).

There is a Kubernetes Cloud Connector (https://github.com/spring-cloud/spring-cloud-kubernetes-connector), the one that I’m using in the example, that we might want to pick up. It looks like a very good start but it’s not being actively maintained, so we will probably take the lead in pushing that project forward. Don’t get me wrong, the project is great, but we will need to get more juice out of it. We also believe that this project could be moved to the incubator project related to Kubernetes, so versions can be aligned on every front.

 

Activiti Runtime Bundle Integration Events

 

As mentioned before, we are pushing the responsibility for executing system to system integrations out of the Process Engine. From the Process Engine point of view, Service Tasks (and other integrations) will be executed in an async fashion. This means that the Process Engine will only emit an IntegrationRequestEvent and wait for an IntegrationResultEvent.

By pushing the responsibility to Connectors, the Process Engine no longer cares about what technology, practices or language we use to perform the system to system integrations. In other words, please welcome “Polyglot Connectors”!


We are aiming to provide an out-of-the-box solution for Service Tasks where you don’t need to specify which connector is going to be in charge of performing the integration. By doing this, we remove the need to reference a Java class inside the Process Definition, completely decoupling the “what needs to be done” from the “how it needs to be done” and following a more declarative approach.
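To make the contrast concrete, here is a sketch at the process definition level. The first form is the classic Activiti approach; the second is an assumption about how the declarative approach could look, since the exact attributes are still being finalized:

[code language="xml"]
<!-- Before: the definition points at a Java class on the engine's classpath -->
<serviceTask id="payment" name="Process Payment"
             activiti:class="org.example.PaymentDelegate"/>

<!-- After (sketch): the definition only declares what needs to be done;
     a connector listening for the matching IntegrationRequestEvent decides how -->
<serviceTask id="payment" name="Process Payment" implementation="payment"/>
[/code]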

Inside Activiti Cloud Connectors

Activiti Cloud Connectors will run outside of our Runtime Bundles, meaning that Runtime Bundles will not fail if a Connector fails. It also means that we can scale them independently.

From a very high-level perspective, Activiti Cloud Connectors will be responsible for:

  • Listening for IntegrationRequestEvents
  • Processing IntegrationRequestEvents (performing the Remote/Local, Sync/Async call)
  • Returning IntegrationResultEvents

Activiti Cloud Connector projects can bundle any number of related system-to-system interactions that can be managed together. This means that if you recognize a set of services that are often updated together, you can bundle all these interactions inside one Activiti Cloud Connector so your business processes can use them.

Activiti Cloud Connectors will be automatically discovered and registered in the service registry provided by the infrastructure. This means that, at runtime, we can ask the service registry about the available connectors and their types. Our modelling tools can also make use of that registry to present the available connectors to the user.
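As a sketch of what a connector could look like using Spring Cloud Stream (the event classes and their shapes here are simplified assumptions; the real ones are still being defined in the engine PR mentioned at the end of this post):

[code language="java"]
import java.util.Map;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Processor;
import org.springframework.messaging.handler.annotation.SendTo;

@SpringBootApplication
@EnableBinding(Processor.class)
public class PaymentConnectorApplication {

    public static void main(String[] args) {
        SpringApplication.run(PaymentConnectorApplication.class, args);
    }

    @StreamListener(Processor.INPUT)
    @SendTo(Processor.OUTPUT)
    public IntegrationResultEvent handle(IntegrationRequestEvent request) {
        // Perform the system-to-system call here (REST, queue, anything)
        Map<String, Object> results = performIntegration(request.getVariables());
        // Correlate the result back to the execution waiting in the engine
        return new IntegrationResultEvent(request.getExecutionId(), results);
    }

    private Map<String, Object> performIntegration(Map<String, Object> variables) {
        // Hypothetical helper: call the external service and collect the results
        return variables;
    }

    // Simplified stand-ins for the engine's integration events (assumptions)
    public static class IntegrationRequestEvent {
        private String executionId;
        private Map<String, Object> variables;
        public String getExecutionId() { return executionId; }
        public Map<String, Object> getVariables() { return variables; }
    }

    public static class IntegrationResultEvent {
        private final String executionId;
        private final Map<String, Object> variables;
        public IntegrationResultEvent(String executionId, Map<String, Object> variables) {
            this.executionId = executionId;
            this.variables = variables;
        }
        public String getExecutionId() { return executionId; }
        public Map<String, Object> getVariables() { return variables; }
    }
}
[/code]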


The Proof of Concept

 

You can find the PoC in my GitHub account here: https://github.com/Salaboy/test-spring-cloud-connectors

This repository contains several pieces that demonstrate how to create connectors. You will find inside the repository:

  • Connector-consumer-app: our connector, which listens for IntegrationEvents and produces IntegrationResults after finishing the external system-to-system integration.
  • Payments-api: from the client (connector) point of view, we just create a definition of the service that we want to connect to. This helps us create different implementations of the same service for different environments.
  • Payments-local-connector: a local connector that uses Spring Connectors with a local service registry.
  • Payments-kube-connector: a Kubernetes service connector that allows you to discover a Kubernetes Service and connect to it.
  • Payments-service: a very simple Payment Service to connect to as an example. This is a Spring Boot application that works in complete isolation from the Connector.

The Connector Consumer App can be packaged with different dependencies and configurations based on the selected profile. We have a “local” and a “kube” profile. By default the “local” profile is active, which means that the local connector will be included.
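As a sketch, the profile switch in the pom.xml could look roughly like this (the artifact coordinates are illustrative, not the exact ones from the repository):

[code language="xml"]
<profiles>
  <profile>
    <id>local</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <dependencies>
      <dependency>
        <groupId>org.salaboy</groupId>
        <artifactId>payments-local-connector</artifactId>
        <version>${project.version}</version>
      </dependency>
    </dependencies>
  </profile>
  <profile>
    <id>kube</id>
    <dependencies>
      <dependency>
        <groupId>org.salaboy</groupId>
        <artifactId>payments-kube-connector</artifactId>
        <version>${project.version}</version>
      </dependency>
    </dependencies>
  </profile>
</profiles>
[/code]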

Even if you are running with the local profile, you need RabbitMQ running in order to work with Spring Cloud Streams; for that reason, the project provides a docker-compose file that will start the Payments Service plus RabbitMQ: https://github.com/Salaboy/test-spring-cloud-connectors/blob/master/docker/docker-compose.yml

Once you have that running, you can start the Connector Consumer App, which also exposes a REST endpoint to trigger IntegrationEvents:

https://github.com/Salaboy/test-spring-cloud-connectors/blob/master/connector-consumer-app/src/main/java/org/salaboy/streams/SampleApplication.java#L57

This is just to make this example simple to test; in reality, the Process Engine will produce an event that will be picked up by the @StreamListener here: https://github.com/Salaboy/test-spring-cloud-connectors/blob/master/connector-consumer-app/src/main/java/org/salaboy/streams/SampleApplication.java#L63

The service connector resolution magic happens here:       

[code language="java"]PaymentService payment = cloud.getServiceConnector("payment", PaymentService.class, null);[/code]

Where cloud is an instance of Cloud created using a CloudFactory:

[code language="java"] private Cloud cloud = new CloudFactory().getCloud(); [/code]

This allows us to detect the environment where we are running and then obtain references to the services that we want to interact with, in a decoupled way.
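Putting those two lines together, a minimal lookup could look like this (PaymentService being the service definition from the payments-api module):

[code language="java"]
import org.springframework.cloud.Cloud;
import org.springframework.cloud.CloudFactory;

public class PaymentServiceLookup {

    // Detects the platform (local, Kubernetes, ...) from the environment
    private final Cloud cloud = new CloudFactory().getCloud();

    public PaymentService lookup() {
        // Resolves the service registered under the id "payment" using
        // whichever (Spring) Service Connector is on the classpath
        return cloud.getServiceConnector("payment", PaymentService.class, null);
    }
}
[/code]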

Depending on which (Spring) Service Connector we have on the classpath, different lookup mechanisms will be used to get a reference to the service. We can even have multiple (Spring) Service Connectors inside our Activiti Cloud Connector project, which will be enabled or disabled based on the Cloud Platform that is detected.

If you jump now to the two connector implementations, you will find that they share a lot of code. The only thing that changes is how the service reference is obtained. One uses a property file as a service registry, and the other uses the Kubernetes Service registry, filtering by labels to get hold of the service reference.

There is no magic inside the connectors (local and kube), just a RestTemplate executing an exchange:

https://github.com/Salaboy/test-spring-cloud-connectors/blob/master/payments-local-connector/src/main/java/org/salaboy/service/local/connectors/PaymentServiceImpl.java#L30
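In essence, the implementation boils down to something like this (simplified from the linked class; the method name and payload types are assumptions):

[code language="java"]
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpMethod;
import org.springframework.web.client.RestTemplate;

public class PaymentServiceImpl implements PaymentService {

    private final RestTemplate restTemplate = new RestTemplate();
    private final String serviceUri; // resolved by the service connector

    public PaymentServiceImpl(String serviceUri) {
        this.serviceUri = serviceUri;
    }

    @Override
    public String pay(String payload) {
        // Plain REST call to the resolved service; no engine involvement at all
        return restTemplate.exchange(serviceUri, HttpMethod.POST,
                new HttpEntity<>(payload), String.class).getBody();
    }
}
[/code]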

Each of these connectors will be registered for the specific cloud provider depending on which ServiceInfoCreator we provide.

[code language="java"] public class PaymentServiceInfoCreator extends LocalConfigServiceInfoCreator [/code]

And

[code language="java"] public class PaymentServiceInfoCreator extends KubernetesServiceInfoCreator [/code]

Spring does the resolution using service loaders, and for that reason you need to register your own implementations by creating a couple of descriptors in META-INF/services/.

You can take a look at these descriptors in the PoC repository.
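As a rough sketch, the registration follows the standard Java ServiceLoader convention: one file per creator base class, containing the fully qualified name of your implementation. The file and package names below follow the Spring Cloud Connectors convention, but treat them as assumptions rather than the exact contents of the PoC:

[code]
# META-INF/services/org.springframework.cloud.localconfig.LocalConfigServiceInfoCreator
org.salaboy.service.local.connectors.PaymentServiceInfoCreator

# META-INF/services/org.springframework.cloud.kubernetes.KubernetesServiceInfoCreator
org.salaboy.service.kube.connectors.PaymentServiceInfoCreator
[/code]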

 

These connectors use different techniques to identify which platform we are running on.

In order to enable cloud connectors you need to set a property in your application.properties file:

[code language="xml"]spring.cloud.appId=myApp[/code]

In the case of Kubernetes, it looks at environment variables that are set when you are running inside Kubernetes. To run the local connector, you need to add a new properties file that contains the service registry (which services are available and where they can be located). For our local payment service we have:

[code language="xml"]spring.cloud.payment=payment://localhost:8081/payment[/code]

If we are running in Kubernetes, the connector will use the Kubernetes Registry to find out which services are available and where they are.

The following diagram shows how connectors are isolated from both the Process Engine (Runtime Bundle instance) and the external service, adding an abstraction layer that allows developers to use the framework/technology of their choice to implement these connectors.

[Diagram: the Activiti Cloud Connector sitting between the Runtime Bundle (Process Engine) and the external service, inside the Activiti Infrastructure]

It is important to note that the Payment Activiti Cloud Connector lives inside the Activiti Infrastructure. This means that it can leverage all the other components available inside the infrastructure: distributed logging, the configuration service, the Service Registry, etc.

To finish this long post I wanted to mention some important points:

  • This PoC represents the most complex scenario that you might find while planning your integrations, so it should not be considered the hello world example.
  • I mentioned this before, but Activiti Cloud Connectors will be the perfect place to bundle multiple calls (integrations) against the same type of services. When using JavaDelegates you had two options: 1) create a specific JavaDelegate for each service call; with this approach you end up with very simple JavaDelegates, but a lot of them. 2) Create a very generic JavaDelegate that can be parameterized to make different calls to the same service; with this you end up with one very complex delegate that becomes complicated to maintain. With this new approach, Activiti Cloud Connectors let you define the granularity and the set of filters that you want to apply to the @StreamListeners to process different types of integrations.
  • If you are only targeting Kubernetes, you can use the Kubernetes APIs directly to resolve Kubernetes-exposed services. The Spring Cloud Kubernetes Connector leverages the Spring Cloud Connectors layer to make sure that our services are not tied to Kubernetes and can run on different infrastructures. This is a common case: you will probably want to run the same application both locally, outside Kubernetes, and in Kubernetes, so the abstraction layer is necessary.
  • Even though I’ve included 5 projects inside the PoC repository, you can end up with only one project plus the service that you are trying to integrate with. I split the different connectors into different modules to make sure that we all understand the boundaries of the services and the requirements from the client perspective.
  • The Activiti Cloud project will suggest how to create these connectors, but you are free to use any technology inside them, at your own risk :)
  • We are currently finishing the internal bits in the process engine. You can follow the PR here ( https://github.com/Activiti/Activiti/pull/1479/files )

As soon as the PR is ready to be merged into the process engine, we will create some cloud connector examples to demonstrate these concepts in action. Stay tuned and get in touch if you want to help or have comments about these topics.

Last week (02/10/17 - 8/10/17) we covered numerous fronts. System-to-system integrations were at the core, but Igor and Ryan made huge progress on integrating with GraphQL and JHipster.

 

@igdianov finalized the GraphQL Pull Requests and improved our existing Query Code.

@ryandawsonuk created the first version of a JHipster generator for Activiti. Currently, this only supports embedding Activiti into an app rather than creating a runtime bundle and Activiti Cloud architecture but we’ve got a plan and have taken initial steps for covering the new Cloud components.

@erdemedeiros worked on message queue implementation for service tasks. The first PoC was successful: the new service task implementation sends a message to a queue and the execution waits for a new message containing the result before continuing. Improvements on the message content and filters are in progress.

@salaboy worked on getting an example with Service Cloud Connectors, Docker and Spring Cloud Streams working in kubernetes using Spring Cloud Connectors Kubernetes.


This week will be all about merging finished pull requests on the GraphQL side, refining the JHipster integration and finalizing the System to System Cloud Connectors (a blog post is coming about this). We are planning to do our last big repository refactoring related to the activiti-services and activiti-build to improve how we build and release all these new artifacts. If you experience any problem during this week, let us know so we can make sure to correct any issue that might arise. We want to welcome back @daisuke-yoshimoto who will be working on finalizing the MongoDB integration for our Audit module. We also want to welcome @willkzhou who just joined our community contributors and will be working on ElasticSearch integration.

The aim of this blog post is to show a working CI/CD example for managing process applications built using Alfresco Process Services (APS) powered by Activiti. Please note that the selection of tools used in this article is my personal choice from an ever-growing list of open source tools in this area! You should be able to swap one or more of these with your preferred tools/technologies. Similarly, things like the release steps, versioning etc. that I used in this article are just one way of doing it. I understand that every organization/team has its own established release processes. Again, the idea is that you should be able to adjust the process to suit your needs!

CI/CD Process Steps

A typical CI/CD process for process applications built using Alfresco Process Services (APS) involves the following steps.

  1. Develop processes, forms, decision tables, data models, stencils, etc. in the APS Web UI (App Designer)
  2. Group everything into an "App" and export the app.
  3. Create a Java project using your IDE and write the custom Java extensions and delegates that are used by the process
  4. Add the exported app package to the Java project
  5. Write unit tests against the BPMN XML(s) available in the app package
  6. Configure the project to build an app.zip and app.jar upon build & package
  7. Add the Java project to a version control repository
  8. Integrate the project in the version control repository with an automation server and continuously build and unit test upon changes
  9. Version and upload the packages (zip and jar) built by the automation server job to an artifact repository.
  10. Download and deploy the versioned components from the artifact repository to higher environments.
  11. Run any automated system integration tests that you may have after deployment.

DevOps Tools Used

The tools used in this example are:

  1. Jenkins - a leading open-source automation server
  2. JFrog Artifactory - open-source artifact repository manager
  3. GitHub - popular web-based Git version control repository
  4. Ansible - open-source deployment automation engine
  5. Apache Maven - open-source software project management tool (used for dependency management & packaging in this demo)

Component Diagram

Configuration Details

Sample Process Project (GitHub)

The first step in the demo is to create a simple process application in Alfresco Process Services. It is assumed that you are familiar with this step; if not, please refer to Get started with APS or the APS User Guide. Once a process and its associated components are built in the APS web modeler, group everything into an "App" and export the “App” archive. As the next step, we will save the process app components to GitHub. Please refer to GitHub: super-cool-process-app for a sample Maven project structure. The process models and associated components we modeled in the web modeler are stored in this directory. The pom.xml of this project is configured to build the following two artifacts upon packaging.

  1. app.jar file - this will include all the custom Java extensions that you may have to support the processes in the app.
  2. app.zip - a zip archive of the app content models, built using the Maven assembly plugin (see the sketch below).
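For reference, the assembly plugin wiring in the pom.xml could look roughly like this (the descriptor path and execution id are illustrative, not the exact ones from the sample project):

[code language="xml"]
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <executions>
    <execution>
      <id>make-app-zip</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
      <configuration>
        <descriptors>
          <!-- assembly descriptor that zips up the exported app content -->
          <descriptor>src/main/assembly/app.xml</descriptor>
        </descriptors>
        <appendAssemblyId>false</appendAssemblyId>
      </configuration>
    </execution>
  </executions>
</plugin>
[/code]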

 

Note: the unit tests and Java classes in this project are just empty placeholder classes, not really related to process testing or process Java delegates! The reason is that I wanted to focus on the whole lifecycle in this article rather than on the technical aspects. Check out the following two blogs, where I have covered unit testing of APS applications in great depth.

JFrog Artifactory Configuration

Artifactory is used for resolving dependencies (including remote artifacts) at build & test time and also to store the output of each build (jar, app.zip and build info). Download and install Artifactory if you don’t already have it running. Once Artifactory is installed and running, we will do the following configuration.

 

In order to build an APS Maven project, you need access to the Alfresco Nexus repo. I like to keep all such remote repository information in one place and use a single URL to download all my dependencies, rather than including various repository entries in pom files, Maven settings files, etc. To do this in Artifactory, we first create a remote repo pointing to the Alfresco Enterprise Repo and then add this remote repo to a virtual repo in Artifactory.

Creation of Remote Repository in Artifactory

  • Repository Key: activiti-enterprise-releases
  • URL: Use the alfresco enterprise repo url
  • Advanced -> Username/Password: use your alfresco nexus repo credentials

Please refer to Artifactory Remote Repositories for more details on remote repositories.

Add Remote Repository to Virtual Repository in Artifactory

Now add the newly created remote repo to the default virtual release repo (named “libs-release”). Please refer to Artifactory Virtual Repositories for more details on virtual repositories.
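With that in place, builds can resolve everything through the single virtual repository. As a sketch, the corresponding Maven settings.xml mirror entry might look like this (the URL is illustrative for a local Artifactory install):

[code language="xml"]
<settings>
  <mirrors>
    <mirror>
      <id>artifactory</id>
      <!-- route all dependency resolution through the virtual repo -->
      <mirrorOf>*</mirrorOf>
      <url>http://localhost:8081/artifactory/libs-release</url>
    </mirror>
  </mirrors>
</settings>
[/code]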

 

Ansible Scripts

Ansible is a very simple but powerful tool that can help you automate your application deployment steps! If you are not familiar with Ansible, I recommend checking it out at https://www.ansible.com/. A typical APS process application deployment involves the deployment of two types of artifacts:

  1. Process application archive (containing process models, form models, decision tables, data models and stencils)
  2. A jar file containing the dependent Java extensions and customizations

 

The Ansible scripts (Ansible Playbooks) that I used to automate the deployment of the above artifacts are:

  1. app-deployment playbook - deploys the process application archive using the deployment REST APIs of APS. Note: since this deployment saves the models to the Process Engine database, there is no need to run this playbook on all nodes in a cluster; you can run it against one node in the cluster or against a load balancer URL.
  2. jar-deployment playbook - deploys/copies the process application jar file to tomcat/webapps/activiti-app/WEB-INF/lib and also deletes any old version of the same jar file. This step needs to be run on all nodes in the APS cluster if you have a clustered deployment. Things that are not covered in the playbook are:
    1. Custom property file deployments - it is also common to have per-environment property files associated with your jar files.
    2. Restarting the app server, waiting for the server to come back up, making an API call to make sure the app is up and running, etc. - after the jar/property file deployment, a restart is often recommended. You should be able to add those steps to your Ansible Playbooks.

Jenkins Configuration

Jenkins can be deployed in a number of ways; please refer to jenkins-download for details. Once it is installed, open the Jenkins Web UI and install the following plugins, which we will be using in this demo (Jenkins -> Manage Jenkins -> Manage Plugins):

  1. Ansible Plugin
  2. Artifactory Plugin
  3. Git Plugin
  4. GitHub Plugin
  5. Pipeline

Please refer to the respective documentation for the configuration of the plugins. Once all the plugins are correctly configured, let’s create the build and deployment jobs in Jenkins that will pull the above-mentioned sample code from GitHub, create deployable artifacts and deploy the code to target environments. We’ll be creating the following two jobs:

  1. Build Job: The Jenkins job responsible for testing, building, publishing the build to Artifactory and triggering the deployment job after a successful build. Instructions on creating this job below:
    1. Jenkins -> New Item -> Select “Pipeline” type and give it a name. E.g. “super-cool-process-app” in this demo
    2. GitHub project -> Project url -> https://github.com/cijujoseph/super-cool-process-app/
    3. You can configure various build triggers. For this example, I elected to poll GitHub every minute. Poll SCM -> Schedule -> * * * * *
    4. Now we will create a pipeline script as shown in pipeline-script-file. The pipeline is split into 5 stages:
      1. Clone - clone the project from GitHub
      2. Test - Maven-based testing, which will execute all the unit tests in the project
      3. Artifactory config - configuration of the Artifactory repo using the Artifactory plugin. Various examples of this can be found at artifactory-project-examples
      4. Package and publish to Artifactory - this step will package and publish the build artifacts to Artifactory
      5. Kick off deployment - this step will kick off the downstream deployment job explained in the next section

   

Screenshots of this configuration are shown below:

 


 

  2. Deploy Job: This job takes care of the deployment of the artifacts built in the previous “Build Job”, using the above-mentioned Ansible Playbooks. Instructions on creating this job below:
    1. Jenkins -> New Item -> Select “FreeStyle” type and give it a name. E.g. super-cool-process-app-deploy (this name is used in the previous pipeline script, in the last stage “Start Deployment Job”)
    2. Check (tick) the “This project is parameterized” checkbox to configure some input parameters for this job. This is because the Ansible Playbooks I wrote are generic and can work for any process application; hence they require a few input variables to correctly identify the artifact being deployed.
      1. APP_NAME: default value set to the name of the process application, in this case “Super Cool App”
      2. ARTIFACT_NAME: default value set to the name of the artifact (Maven project). In this case “super-cool-process-app”
      3. ARTIFACT_VERSION: no default value. The value for this parameter is passed when the job is triggered. For example, if triggered from a build job, the build version is passed; if triggered manually, enter it via the UI.
    3. The next item, “Source Code Management”, is configured to point to the GitHub repository where I saved the Ansible Playbooks. Git -> Repositories -> Repository URL -> https://github.com/cijujoseph/aps-process-app-devops
    4. Build Environment -> Check the box “Delete workspace before build starts”
    5. Now we need to configure two “Invoke Ansible Playbook” build steps in the next section under “Build”.
      1. The first Ansible Playbook will deploy the process app package via the REST APIs of APS. Configuration as below:
        1. Ansible installation -> Select the Ansible configuration available in the dropdown
        2. Playbook path -> ${WORKSPACE}/ansible-playbooks/app-deployment/site.yml (this will resolve to app-deployment-playbook)
      2. The second Ansible Playbook will deploy the process app jar file. Configuration as below:
        1. Ansible installation -> Select the Ansible configuration available in the dropdown
        2. Playbook path -> ${WORKSPACE}/ansible-playbooks/jar-deployment/site.yml (this will resolve to jar-deployment-playbook)

 

Screenshots of this configuration are shown below:

 

Demo Video

A short video of the whole process in action is available here

Things to consider

Since this is not a production-ready sample, here are some of the factors you may want to consider when implementing this in your production environment.

  1. Extend the pipeline to manage the APS software deployment as well - embedded, standalone in an application server, container deployment, etc.
  2. If using dockerized deployments, creating a docker image per build is also an option, instead of copying things like jars, property files, etc. into an existing image.
  3. If there are breaking changes in the Java classes during process modifications, consider a proper deprecation strategy to handle any running process instances.
  4. Securely manage credential storage using features such as "Vault"
  5. Management of environment variables
  6. Manage re-usable Java libraries/stencil libraries etc. separately from process-specific libraries.
  7. The app package can grow quite big if you have a lot of processes in one app, which can make release management and versioning of individual processes difficult. For more granular control over each process, you can create smaller apps containing one or two processes; those applications can then be deployed via their own CI/CD pipelines. However, if you would like to expose these processes to the end user in a single app (especially for user-initiated processes involving start forms), you can lock down the smaller individual apps to a small set of PVT (Production Verification Test) users and the CI/CD user. Once the processes are verified, they can be made available to the end users via a separate public-facing process app.

Conclusion

So, if you don’t already have a CI/CD process around your process applications in APS, go ahead and set one up and I’m sure it will make your application development experience with Alfresco Process Services much easier!

Three months in and we are making huge progress. We have an initial set of services designed with a distributed approach in mind. Powered by Spring Cloud, our services run natively in Kubernetes + Docker, making sure that there is no impedance mismatch between the infrastructure and the way that the services are designed and consumed. After our release to Maven Central and the tagging of our initial release in Docker Hub, you can start consuming our out-of-the-box artefacts or build your own by using our Activiti Cloud Starters.

You can always find the most up to date roadmap here: https://github.com/Activiti/Activiti/wiki/Activiti-7-Roadmap

 

Milestone #0 - July 2017 - Ended

  • Clean up Activiti
  • Domain API + HAL API + Runtime Bundle
  • XML/JSON/SVG for process definitions
  • Audit Service: Event Store for Audit Information
  • Identity Management and SSO (KeyCloak implementation)
  • First Release Process

Milestone #1 - August 2017 - Ended

  • Domain API + HAL API + Runtime Bundle
  • Improvements, refinements and additions
  • Query Service: Event Store for Runtime Information
    • Security Enabled
    • JPA - Reference Implementation
  • Infrastructure Enabled Services
    • Gateway (Zuul)
    • Application Registry (Eureka)
    • SSO & IDM* (Keycloak default implementation)
  • All Services are Docker Enabled
  • All Services can be deployed into Kubernetes
  • Cloud Examples

Milestone #2 - September 2017 - Ended

  • Domain API + HAL API + Runtime Bundle
  • Audit Service Mongo DB alternative
  • GraphQL review in progress
  • Release to Maven Central
  • Infrastructure Enabled Services
    • Helm Charts
    • Cloud Documentation
  • Cloud Examples Improvements
    • Validation Examples
    • AWS
    • Kubernetes / Minikube
    • Docker Compose

Milestone #3 - October 2017 - In Progress

October will be all about making sure that our service endpoints provide all the functionality required to create complex applications.
Cloud Connectors will help to make sure that system-to-system integrations are handled in an asynchronous way and that minimal changes (extensions) are required to run business process models. We are working on mechanisms to reduce our Runtime Bundle classpath extension points in order to simplify upgrades and maintenance tasks.


We started planning our Repository Service that will be used to host different types of models such as Process Definitions, Forms, Data Models, Decision Models, Stencils/Extensions, etc.

In parallel with the Repository Service, we will have a new Form Service in charge of providing all the pieces and glue to render forms using ADF components.

  • Refactor Activiti Cloud Services
  • Process Def & Instance Security model
  • Activiti Cloud Connectors
    • Process Engine refactoring to avoid JavaDelegates and classpath extensions
    • Kubernetes Service ready
  • Model Repository Service (Design and Initial Implementation)
  • Form Service (Initial discussions and planning)

Milestone #4 - November 2017

  • Polyglot Connectors Examples
  • Application Context Service - (Design and Initial Implementation)
    • Publish/Deploy Runtime Bundle Service
  • Process Engine Clean ups and refactoring
    • BPMN2 Extensions
    • History Service
    • Job Executor
    • Timers
    • Emails Service

Milestone #5 - December 2017

  • Deployment Service & Pipeline (design and initial implementation)
  • New Decision Runtime Design and Initial Implementation
  • Polyglot MicroProcessRuntime PoC

As always, if you want to participate in the development of some of these components, get in touch. We are willing to collaborate with the open community and mentor people who want to learn about the project.

We look forward to all community involvement, from comments and concerns to help with documentation, tests and component implementations.

You can always find the most up-to-date Roadmap in our GitHub Wiki.

Remember, we are 24/7 in our Gitter Channel, feel free to join us there.

Stay tuned!

Last week (25/9/17 - 1/10/17) we managed to ship our first Early Access release. We are getting quite close to an initial version of the Activiti Cloud Connectors using Spring Cloud Service Connectors. Also, our GraphQL Query endpoints are looking really good thanks to @igdianov.

@daisuke-yoshimoto is doing great work adding the MongoDB Audit Service alternative to our cloud examples.

This week, we had the pleasure of meeting some other community members who are looking forward to collaborating on and using this new version of the project.

 

@igdianov improved Query Endpoints and Models to support GraphQL endpoints.

@daisuke-yoshimoto started on the next issues: creating a Spring Boot application for the Audit Service with MongoDB and adding activiti-cloud-examples for the Audit Service with MongoDB.

@erdemedeiros is currently investigating how to provide a message-queue-based service task implementation.

@ryandawsonuk further hardened the release process, ensuring it is fully automated, and proved that we could do a hotfix release if we needed/wanted to. Ryan then worked on creating first versions of helm charts for kubernetes deployments.

@salaboy worked on getting an example with Service Cloud Connectors, Docker and Spring Cloud Streams.

This week

Today (2nd October 2017) we are going to do our monthly retrospective to update our roadmap based on our progress. Another blog post is coming about the updated roadmap.

We will be working with our community members to finish the GraphQL endpoints and Cloud Connectors so we can move forward and do some final repository refactorings to make sure that our next release handles the addition of new repositories correctly.