
Alfresco Process Services v1.9, released a few weeks ago, introduced a new authentication module based on an open source IAM project called Keycloak, which provides a wide range of authentication options. Since Keycloak natively supports X.509 client certificate user authentication, the moment APS 1.9 was announced I purchased a Yubikey, a smart card that supports Personal Identity Verification (PIV; FIPS 201, a US government standard), to test the X.509 support of Keycloak. This blog is intended to help Alfresco customers and partners looking to implement PIV authentication on the Alfresco platform. The sections below walk through the steps involved in getting this to work end to end.

Please note that the steps in this blog are based on my experiments on macOS. If you are not a macOS user, you may need to adjust some of the configuration steps to match your OS.

Generate SSL Certificates

The first step is to create the following certificates:

  • Server certificate issued and signed by an “Intermediate CA”, which will be used to secure both the Keycloak and APS apps.
  • Client certificate which will be used to authenticate the user. This client certificate will be loaded onto the PIV smart card.

In production scenarios, it is recommended to use internationally trusted CAs (e.g. VeriSign) to sign your server and client certificates. Every organization has best practices in place around certificate issuing and usage, so if you need SSL certificates to secure your apps, check with your security team first! For the purpose of this blog, I’ll be creating the root & intermediate CAs myself. The intermediate CA will be used to sign both the client and server certificates on behalf of the root CA.

 

Please follow the instructions in my GitHub repo to generate the following certificates, which will be required in the subsequent sections of this blog (a condensed sketch of the client certificate issuance follows the list):

  • Root CA pair
    • Root CA certificate - openssl-cert-gen-template/certs/ca.cert.pem
    • Root CA key - openssl-cert-gen-template/private/ca.key.pem
  • Intermediate CA pair
    • Intermediate CA certificate - openssl-cert-gen-template/intermediate/certs/intermediate.cert.pem
    • Intermediate CA key - openssl-cert-gen-template/intermediate/private/intermediate.key.pem
  • Client & server certificate
    • Client certificate - openssl-cert-gen-template/intermediate/certs/admin.cert.pem
    • Client key - openssl-cert-gen-template/intermediate/private/admin.key.pem
    • Server certificate - openssl-cert-gen-template/intermediate/certs/localhost.cert.pem
    • Server key - openssl-cert-gen-template/intermediate/private/localhost.key.pem
  • Certificate keystore - openssl-cert-gen-template/keystore/keystore.jks
  • CA truststore - openssl-cert-gen-template/truststore/truststore.jks
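If you just want the gist of what the repo scripts do, below is a condensed sketch of issuing the client certificate off the intermediate CA and building the CA truststore. The subject fields are my assumptions, and the repo handles the full CA setup (including the CA basic-constraints extensions) properly:

cd ~/openssl-cert-gen-template

# Client key + CSR; the email should match the Keycloak user created later
openssl req -new -newkey rsa:2048 -nodes \
  -keyout intermediate/private/admin.key.pem -out admin.csr \
  -subj "/CN=admin/emailAddress=admin@app.activiti.com"

# Sign the CSR with the intermediate CA
openssl x509 -req -in admin.csr \
  -CA intermediate/certs/intermediate.cert.pem \
  -CAkey intermediate/private/intermediate.key.pem \
  -CAcreateserial -days 825 -out intermediate/certs/admin.cert.pem

# Truststore with the CA chain, used later by Keycloak to validate client certificates
keytool -import -noprompt -alias root -file certs/ca.cert.pem \
  -keystore truststore/truststore.jks -storepass truststore
keytool -import -noprompt -alias intermediate \
  -file intermediate/certs/intermediate.cert.pem \
  -keystore truststore/truststore.jks -storepass truststore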

Configure Alfresco Process Services

Configure APS for SSL

Enable the HTTPS connector in tomcat/conf/server.xml using the keystore created in the “Generate SSL Certificates” section above. My connector configuration on Tomcat 8.5.11 is shown below:

 

<Connector
           protocol="org.apache.coyote.http11.Http11NioProtocol"
           port="8443" maxThreads="200"
           scheme="https" secure="true" SSLEnabled="true"
           keystoreFile="/Users/cijujoseph/openssl-cert-gen-template/keystore/keystore.jks"
           keystorePass="keystore"
           clientAuth="false" sslProtocol="TLS"/>

 

Configure APS for Keycloak Authentication

APS 1.9 added a new way of authenticating, based on Keycloak 3.4.3. I’m not going to go through the configuration details here, so please refer to the APS identity service documentation & the APS 1.9 blog for more details on this configuration. Once APS is configured with Keycloak, the authentication flow is driven by the configuration you make on the Keycloak server. My activiti-identity-service.properties configuration file in APS is shown below:

 

# --------------------------
# IDENTITY SERVICE
# --------------------------

# set to false to fully disable keycloak
keycloak.enabled=true
keycloak.realm=alfresco
keycloak.auth-server-url=https://localhost:8543/auth
keycloak.ssl-required=external
keycloak.resource=aps
keycloak.principal-attribute=email
keycloak.credentials.secret=5323135f-36bb-46c4-a641-907ad359827a
keycloak.always-refresh-token=true
keycloak.autodetect-bearer-only=true
keycloak.token-store=cookie
keycloak.enable-basic-auth=true
keycloak.use-resource-role-mappings=true
keycloak.public-client=false
keycloak.disable-trust-manager=true

 

Configure Yubikey (PIV/Smart Card)

I’m using a Yubikey Neo as the PIV smart card, onto which I’ll load the client authentication certificate used to log in to APS. The smart card configuration steps are based on the Yubikey documentation, which you can find here.

Install Yubico PIV Tool

The Yubico PIV tool allows you to configure a PIV-enabled YubiKey through a command line interface. Download this tool and use the following commands to load the certificate and key into authentication slot 9a of your smart card. You may need to configure the device and set a management key before running these commands; the device setup instructions can be found here.

Commands to load the client certs into Yubikey

# Set pivtool home and openssl-cert-gen-template directories
pivtool_home=~/MyApps/yubico-piv-tool-1.5.0-mac
cert_dir=/Users/cijujoseph/openssl-cert-gen-template

# Set key to the management key configured during device setup
key=<your-management-key>

# Import Certificate
$pivtool_home/bin/yubico-piv-tool -k $key -a import-certificate -s 9a < $cert_dir/intermediate/certs/admin.cert.pem

# Import Key
$pivtool_home/bin/yubico-piv-tool -k $key -a import-key -s 9a < $cert_dir/intermediate/private/admin.key.pem
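To confirm the import worked, you can print the card status; slot 9a should now list the imported certificate:

# optional sanity check after the import
$pivtool_home/bin/yubico-piv-tool -a status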

Verify certificates using YubiKey PIV Manager

This is an optional step. The YubiKey PIV Manager enables you to configure a PIV-enabled YubiKey through a graphical user interface. Once the certificate and key are imported, you can verify them via this utility. The installer can be found here. When a certificate is successfully loaded into the authentication slot of your Yubikey, the PIV Manager will display it as shown below.

 

Verify certificates using YubiKey PIV Manager

Browser Configuration

Install OpenSC

As you can see from the OpenSC wiki page, this project provides a set of libraries and utilities to work with smart cards. Since I’m using a Mac, I followed the instructions on this page to install it using the DMG file provided by OpenSC.

Configure Browser

Though I am a Chrome user, I used Firefox (version 60.0.2) for testing the smart card authentication into APS.

If you really want to test this on Chrome, you can use the Smart Card Connector Chrome app. Though this app is intended for Chrome on Chrome OS, it worked for me on my Mac too. However, it may prompt you for the Yubikey admin PIN far too often, which is quite annoying!

My recommendation is to install Firefox and configure it with the OpenSC PKCS#11 module as explained below.

Preferences -> Privacy & Security -> Security Devices -> Load

  1. Module name -> “PKCS#11 Module”
  2. Module filename -> “/Library/OpenSC/lib/opensc-pkcs11.so” (installed as part of the OpenSC installation above) 

Import the root and intermediate CA certificates (openssl-cert-gen-template/certs/ca.cert.pem and openssl-cert-gen-template/intermediate/certs/intermediate.cert.pem) via Preferences -> Privacy & Security -> View Certificates -> Authorities -> Import, so that the browser will trust servers configured with certificates issued by these CAs.

Configure Keycloak

The first step is to install Keycloak 3.4.3 as documented here. The Keycloak documentation is quite detailed, hence I’m not going to repeat it here. In the next few sections, I’ll go through the X.509-specific configuration of Keycloak, which is essential to get this working!

Configure Keycloak for two-way/mutual authentication

For more details on X.509 client certificate authentication configuration, please refer to enable-x-509-client-certificate-user-authentication.

  • Copy the openssl-cert-gen-template/keystore/keystore.jks & openssl-cert-gen-template/truststore/truststore.jks files to $KEYCLOAK_HOME/standalone/configuration.
  • Open the standalone.xml file and add the following ssl-realm to the management/security-realms group in the xml.
<security-realm name="ssl-realm">
    <server-identities>
         <ssl>
             <keystore path="keystore.jks" relative-to="jboss.server.config.dir" keystore-password="keystore"/>
         </ssl>
     </server-identities>
    <authentication>
        <truststore path="truststore.jks" relative-to="jboss.server.config.dir" keystore-password="truststore" />
    </authentication>
</security-realm>
  • Add an https-listener to the profile/subsystem[xmlns='urn:jboss:domain:undertow:4.0']/server[name="default-server"] in the standalone.xml
<subsystem xmlns="urn:jboss:domain:undertow:4.0">
    ...
    <server name="default-server">
        ...
             <https-listener name="https" socket-binding="https" security-realm="ssl-realm" verify-client="REQUESTED"/>
          ...
     </server>
    ...
</subsystem>
  • Start Keycloak standalone using the command “$KEYCLOAK_HOME/bin/standalone.sh -Djboss.socket.binding.port-offset=100 -b 0.0.0.0”, which starts the server with the default ports offset by 100. This is helpful to avoid port conflicts on localhost. With this command, the https port becomes 8543 instead of the default 8443.

Configure Keycloak authentication flows

Log in to Keycloak at https://localhost:8543/ (admin/admin are the default admin user credentials). Add a new user with a username that matches the certificate attributes, e.g. username “admin” and email “admin@app.activiti.com”, by going to Keycloak -> <your realm> -> Users -> Add user.
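If you prefer the CLI over the admin console, the same user can also be created with Keycloak’s kcadm.sh; a sketch, assuming the server and credentials above (with self-signed certificates you may also need to point kcadm at a truststore first):

$KEYCLOAK_HOME/bin/kcadm.sh config credentials --server https://localhost:8543/auth \
  --realm master --user admin --password admin
$KEYCLOAK_HOME/bin/kcadm.sh create users -r alfresco \
  -s username=admin -s email=admin@app.activiti.com -s enabled=true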

Configure Direct Grant

Configuring direct grant is the easiest way to verify the configuration. For more details refer to adding-x-509-client-certificate-authentication-to-a-direct-grant-flow. Screenshots below:

Configure Direct Grant Flow

 

Configure Direct Grant

Configure Browser Flow

The following screenshots show how to configure the browser flow to use X.509 authentication. For more details, please refer to adding-x-509-client-certificate-authentication-to-a-browser-flow. Screenshots below:




Configure Authentication Bindings



Demo

Direct Grant

Use the following command to test the direct grant (change the certificate path as per your configuration). For more details refer to adding-x-509-client-certificate-authentication-to-a-direct-grant-flow.

 

curl https://localhost:8543/auth/realms/alfresco/protocol/openid-connect/token \
      --insecure \
      --data "grant_type=password&scope=openid profile&client_id=aps&client_secret=5323135f-36bb-46c4-a641-907ad359827a" \
      -E /Users/cijujoseph/openssl-cert-gen-template/intermediate/certs/admin.cert.pem \
      --key /Users/cijujoseph/openssl-cert-gen-template/intermediate/private/admin.key.pem

 

Browser Auth Demo

Insert the smart card into your computer and test the browser authentication flow as shown in the video below.

 

 

References

Special thanks to the following references!

This blog is a continuation of my first blog around the work we (Ciju Joseph and Francesco Corti) did as part of the Alfresco Global Virtual Hack-a-thon 2017.

 

In this blog I’ll be walking you through the aps-unit-test-example project we created, where I’ll be using features from the aps-unit-test-utils library which I explained in the first blog.

About the Project

This project contains a lot of examples showing:

  • how to test various components in a BPMN (Business Process Model and Notation) model
  • how to test a DMN (Decision Model and Notation) model
  • how to test custom java classes that support your process models.

Project Structure

Before we even get to the unit testing part, it is very important to understand the project structure.

As you can see from the above diagram, this is a maven project. However, if you are a “gradle” person, you should be able to do it the gradle way too! The various sections of the project are:

  1. Main java classes - located under src/main/java. This includes all the custom java code that supports your process/dmn models.
  2. Test classes - located under src/test/java. The tests are grouped into different packages depending on the type of unit they exercise.
    1. Java class tests - This includes test classes for the classes (e.g. Java Delegates, Task Listeners, Event Listeners, custom REST endpoints, custom extensions) under src/main/java.
    2. DMN tests - As you can see from the package name (com.alfresco.aps.test.dmn) itself, I’m writing all the DMN tests under this package. The pattern I followed in this example is one test class per DMN file under the directory src/main/resources/app/decision-table-models.
    3. Process (BPMN) tests - Similarly, the package com.alfresco.aps.test.process contains all the BPMN test classes, following the pattern of one test class per BPMN file under src/main/resources/app/bpmn-models.
  3. App models - All the models (forms, bpmn, dmn, data models, stencils, app.json etc) that are part of the process application are stored under the directory src/main/resources/app. When using the aps-unit-test-utils which I explained in the previous article, all the models are downloaded to this directory from APS. Once the tests pass successfully, we re-build the deployable process artifacts from this directory.
  4. Test resources - As with any standard java project, you can keep all your test resources in the directory src/test/resources. I’ll highlight a couple of files that you will find under this directory in the above project structure image:
    1. activiti-resources.properties - This file contains the APS server configurations such as server address, api url, user credentials etc for downloading the process application into your maven project. Please refer to my previous article for a detailed explanation of this file. You won’t find this file on GitHub under this project; it is intended to be developer specific and local to each developer’s workspace, so it is included in the project’s .gitignore file to prevent it from being committed.
    2. process-beans-and-mocks.xml - the purpose of this file is to mock any project/process specific classes when you run your process tests. The concept is explained in detail in my previous article, where I covered a similar file called common-beans-and-mocks.xml.
  5. Build output - In the above screenshot you can see two files named aps-unit-test-example-1.0-SNAPSHOT-App.zip and aps-unit-test-example-1.0-SNAPSHOT.jar under the /target directory. This is the build output generated when you package the app using maven commands such as “mvn clean package”. The “.zip” file is the app package created from the src/main/resources/app directory, which you can version after every build and deploy to higher environments. The “.jar” is the standard jar output including all the classes/resources from your src/main directory.
  6. Maven pom xml - Since this is a maven based project, you need a pom.xml under the root of the project. Highlighting some of the dependencies and plugins used in this pom.xml:
    • aps-unit-test-utils dependency - the test utils project which I explained in my previous post/blog.
      <dependency>
           <groupId>com.alfresco.aps</groupId>
           <artifactId>aps-unit-test-utils</artifactId>
           <version>[1.0-SNAPSHOT,)</version>
      </dependency>
    • maven-compiler-plugin - a maven plugin that helps compile the sources of the project
      <plugin>
           <artifactId>maven-compiler-plugin</artifactId>
           <version>3.6.2</version>
           <configuration>
                <source>1.8</source>
                <target>1.8</target>
           </configuration>
      </plugin>
    • maven-assembly-plugin - a maven plugin used to package the “app.zip” archive from src/main/resources/app (a sketch of the referenced descriptor follows this list)
      <plugin>
           <artifactId>maven-assembly-plugin</artifactId>
           <version>3.1.0</version>
           <executions>
                <execution>
                     <configuration>
                          <descriptors>
                               <descriptor>src/main/resources/assembly/assembly.xml</descriptor>
                          </descriptors>
                     </configuration>
                     <id>create-distribution</id>
                     <phase>package</phase>
                     <goals>
                          <goal>single</goal>
                     </goals>
                </execution>
           </executions>
      </plugin>
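For reference, a minimal sketch of what the referenced assembly descriptor could look like; the exact content lives in the example project, and the "App" id matches the -App.zip classifier seen in the build output:

<assembly xmlns="http://maven.apache.org/ASSEMBLY/2.0.0">
     <id>App</id>
     <formats>
          <format>zip</format>
     </formats>
     <includeBaseDirectory>false</includeBaseDirectory>
     <fileSets>
          <fileSet>
               <!-- zip up the app models exported from the APS web modeler -->
               <directory>src/main/resources/app</directory>
               <outputDirectory/>
          </fileSet>
     </fileSets>
</assembly>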

Unit Test Examples

Now that you have a good understanding of all the project components, let’s take a look at some of the examples available in the project. I have tried my best to keep the test classes and processes as simple as possible, to make them easy for everyone to follow without much explanation.

Process Testing

AbstractBpmnTest.java - This class can be used as a parent class for all the BPMN test classes. To avoid repeating the same logic in multiple test classes, I moved some common logic here:

  • Setup of a mock email server
  • Process deployment prior to tests
  • Clean up, such as deleting all deployments after each test
  • Test coverage alerts
/*
      * Included in the Abstract Class to avoid writing this in all the Tests.
      * Pre-test logic flow -
      * 1) Download from APS if system property -Daps.app.download=true
      * 2) Find all the bpmn20.xml's in {@value BPMN_RESOURCE_PATH} and deploy
      *    to the process engine
      * 3) Find all the elements in the process that is being tested. This set
      *    will be compared with another set that contains the process elements
      *    covered by each test (updated after each test).
      */

     @Before
     public void before() throws Exception {

          if (System.getProperty("aps.app.download") != null && System.getProperty("aps.app.download").equals("true")) {
               ActivitiResources.forceGet(appName);
          }

          Iterator<File> it = FileUtils.iterateFiles(new File(BPMN_RESOURCE_PATH), null, false);
          while (it.hasNext()) {
               String bpmnXml = ((File) it.next()).getPath();
               String extension = FilenameUtils.getExtension(bpmnXml);
               if (extension.equals("xml")) {
                    repositoryService.createDeployment().addInputStream(bpmnXml, new FileInputStream(bpmnXml)).deploy();
               }
          }
          processDefinitionId = repositoryService.createProcessDefinitionQuery()
                    .processDefinitionKey(processDefinitionKey).singleResult().getId();
          List<Process> processList = repositoryService.getBpmnModel(processDefinitionId).getProcesses();
          for (Process proc : processList) {
               for (FlowElement flowElement : proc.getFlowElements()) {
                    if (!(flowElement instanceof SequenceFlow)) {
                         flowElementIdSet.add(flowElement.getId());
                    }
               }
          }
     }

     /*
      * Post-test logic flow -
      * 1) Update activityIdSet (set containing all the elements tested)
      * 2) Delete all deployments
      */

     @After
     public void after() {
          for (HistoricActivityInstance act : historyService.createHistoricActivityInstanceQuery().list()) {
               activityIdSet.add(act.getActivityId());
          }
          List<Deployment> deploymentList = activitiRule.getRepositoryService().createDeploymentQuery().list();
          for (Deployment deployment : deploymentList) {
               activitiRule.getRepositoryService().deleteDeployment(deployment.getId(), true);
          }
     }

     /*
      * Tear down logic - Compare the flowElementIdSet with activityIdSet and
      * alert the developer if some parts are not tested
      */

     @AfterClass
     public static void afterClass() {
          if (!flowElementIdSet.equals(activityIdSet)) {
               System.out.println(
                         "***********PROCESS TEST COVERAGE WARNING: Not all paths are being tested, please review the test cases!***********");
               System.out.println("Steps In Model: " + flowElementIdSet);
               System.out.println("Steps Tested: " + activityIdSet);
          }
     }

Process Example 1

In this example we will test the following process diagram, which is a simple process containing three steps: Start → User Task → End.

 

UserTaskUnitTest.java - the test class associated with this process, which tests the following:

  • A process is started correctly
  • Upon start a user task is created and assigned to the correct user with the correct task due date
  • Upon completion of the user task the process is ended successfully
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "classpath:activiti.cfg.xml", "classpath:common-beans-and-mocks.xml" })
public class UserTaskUnitTest extends AbstractBpmnTest {

     /*
      * Setting the App name to be downloaded if run with -Daps.app.download=true
      * Also set the process definition key of the process that is being tested
      */

     static {
          appName = "Test App";
          processDefinitionKey = "UserTaskProcess";
     }

     @Test
     public void testProcessExecution() throws Exception {
          /*
           * Creating a map and setting a variable called "initiator" when
           * starting the process.
           */

          Map<String, Object> processVars = new HashMap<String, Object>();
          processVars.put("initiator", "$INITIATOR");

          /*
           * Starting the process using processDefinitionKey and process variables
           */

          ProcessInstance processInstance = activitiRule.getRuntimeService()
                    .startProcessInstanceByKey(processDefinitionKey, processVars);

          /*
           * Once started assert that the process instance is not null and
           * successfully started
           */

          assertNotNull(processInstance);

          /*
           * Since the next step after start is a user task, doing a query to find
           * the user task count in the engine. Assert that it is only 1
           */

          assertEquals(1, taskService.createTaskQuery().count());

          /*
           * Get the Task object for further task assertions
           */

          Task task = taskService.createTaskQuery().singleResult();

          /*
           * Asserting the task for things such as assignee, due date etc. Also,
           * at the end of it complete the task Using the custom assertion
           * TaskAssert from the utils project here
           */

          TaskAssert.assertThat(task).hasAssignee("$INITIATOR", false, false).hasDueDate(2, TIME_UNIT_DAY).complete();

          /*
           * Using the custom assertion ProcessInstanceAssert, make sure that the
           * process is now ended.
           */

          ProcessInstanceAssert.assertThat(processInstance).isComplete();
     }

}

Process Example 2

Let’s now look at a process that is a little more complex than the previous one. As you can see from the diagrams below, there are two units in this model that are candidates for unit tests: the process model & the DMN model.

  • DMNProcessUnitTest.java - Similar to the above example, this is the test class associated with this process, which tests the following:
    • A process is started correctly
    • All possible paths in the process, based on the output of the rule step
    • Successful completion of the process
    • Mocks the rules step - when it comes to the rules/decision step in the process, we are not invoking the actual DMN file associated with the process. From a process perspective, all we care about is that an appropriate variable is set at this step so that the process takes the respective path being tested. Hence the mock.
  • DmnUnitTest.java - This is the test class associated with the DMN file that is invoked from this process. More explanation in the next section.

DMN Testing

AbstractDmnTest.java - Similar to the AbstractBpmnTest class explained above, this class can be used as a parent class for all the DMN test classes. To avoid repeating the same logic in multiple test classes, I moved some common logic here:

  • DMN deployment prior to tests
  • Clean up, such as deleting all deployments after each test
/*
      * Included in the Abstract Class to avoid writing this in all the Tests.
      * Pre-test logic -
      * 1) Download from APS if system property -Daps.app.download=true
      * 2) Find all the dmn files in {@value DMN_RESOURCE_PATH} and deploy to
      *    the dmn engine
      */

     @Before
     public void before() throws Exception {

          if (System.getProperty("aps.app.download") != null && System.getProperty("aps.app.download").equals("true")) {
               ActivitiResources.forceGet(appName);
          }

          // Deploy the dmn files
          Iterator<File> it = FileUtils.iterateFiles(new File(DMN_RESOURCE_PATH), null, false);
          while (it.hasNext()) {
               String dmnXml = ((File) it.next()).getPath();

               String extension = FilenameUtils.getExtension(dmnXml);
               if (extension.equals("dmn")) {
                    DmnDeployment dmnDeployment = repositoryService.createDeployment()
                              .addInputStream(dmnXml, new FileInputStream(dmnXml)).deploy();
                    deploymentList.add(dmnDeployment.getId());
               }
          }
     }

     /*
      * Post-test logic -
      * 1) Delete all deployments
      */

     @After
     public void after() {
          for (Long deploymentId : deploymentList) {
               repositoryService.deleteDeployment(deploymentId);
          }
          deploymentList.clear();
     }

DMN Example 1

In this example we will test the following DMN model, which is a very simple decision table containing three rows of rules.

  • DmnUnitTest.java - the test class associated with the above DMN model. The test cases in this file test every row in the DMN table and verify that it executes as expected. The number of rules can grow over time in real life, so it is important to have test cases covering all the possible hit and miss scenarios for healthy maintenance of your decision management and business rules.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "classpath:activiti.dmn.cfg.xml" })
public class DmnUnitTest extends AbstractDmnTest {

     static {
          appName = "Test App";
          decisonTableKey = "dmntest";
     }

     /*
      * Test a successful hit using all possible inputs
      */

     @Test
     public void testDMNExecution() throws Exception {
          /*
           * Invoke with input set to xyz and assert output is equal to abc
           */

          Map<String, Object> processVariablesInput = new HashMap<>();
          processVariablesInput.put("input", "xyz");
          RuleEngineExecutionResult result = ruleService.executeDecisionByKey(decisonTableKey, processVariablesInput);
          Assert.assertNotNull(result);
          Assert.assertEquals(1, result.getResultVariables().size());
          Assert.assertSame(result.getResultVariables().get("output").getClass(), String.class);
          Assert.assertEquals(result.getResultVariables().get("output"), "abc");

          /*
           * Invoke with input set to 123 and assert output is equal to abc
           */

          processVariablesInput.put("input", "123");
          result = ruleService.executeDecisionByKey(decisonTableKey, processVariablesInput);
          Assert.assertNotNull(result);
          Assert.assertEquals(1, result.getResultVariables().size());
          Assert.assertSame(result.getResultVariables().get("output").getClass(), String.class);
          Assert.assertEquals(result.getResultVariables().get("output"), "abc");

          /*
           * Invoke with input set to abc and assert output is equal to abc
           */

          processVariablesInput.put("input", "abc");
          result = ruleService.executeDecisionByKey(decisonTableKey, processVariablesInput);
          Assert.assertNotNull(result);
          Assert.assertEquals(1, result.getResultVariables().size());
          Assert.assertSame(result.getResultVariables().get("output").getClass(), String.class);
          Assert.assertEquals(result.getResultVariables().get("output"), "abc");
     }

     /*
      * Test a miss
      */

     @Test
     public void testDMNExecutionNoMatch() throws Exception {
          Map<String, Object> processVariablesInput = new HashMap<>();
          processVariablesInput.put("input", "dfdsf");
          RuleEngineExecutionResult result = ruleService.executeDecisionByKey(decisonTableKey, processVariablesInput);
          Assert.assertEquals(0, result.getResultVariables().size());
     }

}

Custom Java Class Testing

This section is about testing the classes that you may write to support your process models. This includes testing of Java Delegates, Task Listeners, Event Listeners, custom REST endpoints, custom extensions etc, which are available under src/main/java. The naming convention I followed for the test classes is “<ClassName>Test.java”, with the package name matching the package of the class under test.

 

Let’s now inspect an example: testing a task listener named TaskAssignedTaskListener.java.

Example 1

The above task listener is used in a process named CustomListeners in the project. From a process testing perspective, this TaskListener is mocked in the process test class CustomListenersUnitTest.java via process-beans-and-mocks.xml. That still leaves the task listener class itself untested. Let’s inspect its test class TaskAssignedTaskListenerTest.java, which tests it the following way:

  1. Set up mocks and inject mocks into classes that are being tested
  2. Set up mock answering stubs prior to execution
  3. Execute the test and assert the expected results
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class TaskAssignedTaskListenerTest {

     @Configuration
     static class ContextConfiguration {
          @Bean
          public TaskAssignedTaskListener taskAssignedTaskListener() {
               return new TaskAssignedTaskListener();
          }
     }

     @InjectMocks
     @Spy
     private static TaskAssignedTaskListener taskAssignedTaskListener;

     @Mock
     private DelegateTask task;

     @Before
     public void initMocks() {
          MockitoAnnotations.initMocks(this);
     }

     /*
      * Testing TaskAssignedTaskListener.notify(DelegateTask task) method using a
      * mock DelegateTask created using Mockito library
      */

     @Test
     public void test() throws Exception {

          /*
           * Creating a map which will be used during the
           * DelegateTask.getVariable() & DelegateTask.setVariable() calls from
           * TaskAssignedTaskListener as well as from this test
           */

          Map<String, Object> variableMap = new HashMap<String, Object>();

          /*
           * Stub a DelegateTask.setVariable() call
           */

          doAnswer(new Answer<Void>() {
               @Override
               public Void answer(InvocationOnMock invocation) throws Throwable {
                    Object[] arg = invocation.getArguments();
                    variableMap.put((String) arg[0], arg[1]);
                    return null;
               }
          }).when(task).setVariable(anyString(), any());

          /*
           * Stub a DelegateTask.getVariable() call
           */

          when(task.getVariable(anyString())).thenAnswer(new Answer<String>() {
               public String answer(InvocationOnMock invocation) {
                    return (String) variableMap.get(invocation.getArguments()[0]);
               }
          });
         
          /*
           * Start the test by invoking the method on task listener
           */

          taskAssignedTaskListener.notify(task);
         
          /*
           * sample assertion to make sure that the java code is setting correct
           * value
           */

          assertThat(task.getVariable("oddOrEven")).isNotNull().isIn("ODDDATE", "EVENDATE");
     }

}

 

Check out the whole project on GitHub, where we have created a lot of examples that cover the unit testing of various types of BPMN components and scenarios. We’ll be adding more to this over the long run.

 

 

Hopefully this blog, along with the other two (unit-testing-part-1 & aps-ci-cd-example), is of some help in the lifecycle management of applications built using Alfresco Process Services powered by Activiti.

This blog is the first part of a two blog series around the work we (Ciju Joseph & Francesco Corti) did as part of the Alfresco Global Virtual Hack-a-thon 2017.

Hack-a-thon Project Description & Goal

Alfresco Process Services (APS) powered by Activiti has a standard way to develop custom java logic/extensions in your IDE. Typically, the process models, which often need a lot of collaboration from many members of a team, are developed in the web modeler of the product. From a packaging and versioning perspective, the process application should be managed together with your java project. Since the role of unit tests is critical during the lifecycle of these process artifacts, it is important that we have good unit test coverage testing the process models and custom java logic/extensions. The goal of this hack-a-thon project was to work on some unit test utilities and samples which can benefit the Alfresco community.

 

As part of this work we created two java projects (maven based) which are:

  1. aps-unit-test-utils - This is a utils project containing:
    1. Logic to automatically download the process app from an APS environment and make it available in your IDE/workspace.
    2. Activiti BPMN and DMN engine configuration xmls with all the necessary spring beans that you would need for your testing
    3. Mocks for OOTB (Out of the Box) APS BPMN stencils such as “Publish To Alfresco”, “REST call task” etc. Not all the components are mocked, but this gives you an idea on how to mock OOTB stencil components!
    4. Helper classes and custom assertions based on the AssertJ library to help you quickly write tests over your processes and decision tables.
  2. aps-unit-test-example - This project contains a lot of examples showing how to test your BPMN/DMN models and also, custom java logic associated with these models.

 

In this blog I’ll be walking you through the utils project and some of its main features.

Project Features

Utility to fetch Process App from APS

One of the main features of this project is that it allows you to (optionally) download the process models that you have modeled in the APS web modeler to your IDE during local unit testing. Once you are happy with all your changes and unit test results, you can save the downloaded models into your version control repository. The reason I highlighted the word “optionally” is that, when you run your unit tests in a proper CI/CD pipeline, it is important that you are unit testing the models you have in your version control repository, avoiding any other external dependencies.

 

The package com.alfresco.aps.testutils.resources in the project contains the classes responsible for downloading the process models from an APS environment into your IDE.

It is the method com.alfresco.aps.testutils.resources.ActivitiResources.get(String appName) which does this magic for you! You can invoke this method from your test classes at testing time to download and test your changes from the web modeler (see the sketch after this list). The method logic is:

  1. Read a property file named “activiti-resources.properties” containing APS environment and api details. Sample property file available at activiti-resources.properties
  2. If the app you are requesting is found on the server and you have permission to access it, it is downloaded to your project under the path src/main/resources/app as a zip and then exploded into this directory. All the existing models are deleted prior to the download and unzip.
  3. From a unit testing perspective of the models, it is important that we have the BPMN and DMN xmls available in the exploded directory under src/main/resources/app/bpmn-models and src/main/resources/app/decision-table-models respectively. When this method completes successfully, you will have those xmls in the respective directories, ready for unit testing.
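For example, a test class could guard the download behind the same system property used elsewhere in these projects, so that CI runs test only what is committed to Git (the app name here is just for illustration):

// refresh the local copy of the app models only when explicitly requested
if ("true".equals(System.getProperty("aps.app.download"))) {
     ActivitiResources.get("Test App");
}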

Configuration XMLs for the BPMN and DMN engines

Since Alfresco Process Services is a spring based webapp, we wrote all the utils, helper classes, examples etc using spring features. In this section, I’ll explain the two main configuration xmls, present in the src/main/resources directory, that can be used to bootstrap the process engine and the dmn engine for unit testing.

  1. src/main/resources/activiti.cfg.xml: Using this xml configuration, a process engine is created with an in-memory h2 database. As you probably know, there are a lot of configuration options available for the Activiti process engine. If your test setup requires advanced configurations, you should be able to do everything in this xml.
  2. src/main/resources/activiti.dmn.cfg.xml: This xml can be used to start the Activiti rule engine (DMN engine), again with an in-memory h2 database.

Depending on the model that you are testing (BPMN/DMN), you can use one of the above configuration xmls to bootstrap the engine from your test cases.
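For example, a process test bootstraps the engine by pointing Spring at these xmls, which is the same pattern used throughout the example project (the class name here is just for illustration):

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "classpath:activiti.cfg.xml", "classpath:common-beans-and-mocks.xml" })
public class MyProcessUnitTest extends AbstractBpmnTest {
     // process test methods go here
}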

Non-Engine Bean Configuration XML

src/main/resources/common-beans-and-mocks.xml: This xml can be used to configure any mock/real beans that are required for your process testing but not really part of the process/rule engine configuration. Those mock and non-mock beans are explained in the following subsections.

Mock Beans for OOTB APS BPMN Stencils

Since I think “mocks” are best explained with an example, please find below a process diagram where I’m using the OOTB APS “REST call task” stencil. I have also highlighted some of the other components in the APS editor that fall into this category of OOTB stencils.

 

In this example, from a unit testing perspective of OOTB components, you need to verify a few things, such as:

  1. This step is invoked upon a successful start of the process
  2. The expected response is set in your unit test so that you can continue with the next steps in the process.
  3. The configurations set on the model are successfully transferred to the BPMN XML upon export/deployment

However, there are things that are not in scope for unit testing of OOTB components:

  1. Test whether the REST API configured is invoked successfully - this is integration testing
  2. Testing of various configurations available on a REST task - it is the responsibility of the Alfresco engineering team to make sure the configurations work as expected

 

This is where we use mocks instead of the real classes behind these components. In order to create the mocks for these components, we first need to look at how these tasks appear inside the deployed bpmn.xml. For example, the bpmn equivalent of the above diagram is shown below:

<serviceTask id="sid-AB2E4A5F-4BF6-48BE-8FF1-CDE01687E69A" name="Rest Call" activiti:async="true" activiti:delegateExpression="${activiti_restCallDelegate}">
  <extensionElements>
    <activiti:field name="restUrl">
      <activiti:string><![CDATA[https://api.github.com/]]></activiti:string>
    </activiti:field>
    <activiti:field name="httpMethod">
      <activiti:string><![CDATA[GET]]></activiti:string>
    </activiti:field>
    <modeler:editor-resource-id><![CDATA[sid-8124CB5D-BD47-49CD-B013-F7FFB576DE8D]]></modeler:editor-resource-id>
  </extensionElements>
</serviceTask>

As you can see from the XML, the bean responsible for the REST call is activiti_restCallDelegate. This bean also has some fields named “httpMethod”, “restUrl” etc. Let’s now look at the mock class (given below) that I created for this bean. Since it is a mock class, all you need to do is create a java delegate with the field extensions that are present in the bpmn.xml.

package com.alfresco.aps.mockdelegates;

import org.activiti.engine.delegate.DelegateExecution;
import org.activiti.engine.delegate.JavaDelegate;
import org.activiti.engine.delegate.Expression;

public class RestCallMockClass implements JavaDelegate {
    
     Expression restUrl;
     Expression httpMethod;
     Expression baseEndpoint;
     Expression baseEndpointName;
     Expression requestMappingJSONTemplate;
    
     @Override
     public void execute(DelegateExecution execution) throws Exception {
          // TODO Auto-generated method stub
     }

}

Now that I have the mock class, the next step is to create a mock bean using this class so that it is resolved correctly at unit test time. Given below is the mock bean configuration in src/main/resources/common-beans-and-mocks.xml corresponding to the above mentioned mock class.

<bean id="activiti_restCallDelegate" class="org.mockito.Mockito" factory-method="mock"> 
   <constructor-arg value="com.alfresco.aps.mockdelegates.RestCallMockClass" />
</bean>

You may have noticed that I am using a class called org.mockito.Mockito for the creation of the mocks. This is from the Mockito library, which is a great library for mocking!

I have created a few mock classes in this project, which you can find in the com.alfresco.aps.mockdelegates package. I have included them in the common-beans-and-mocks.xml file too. As you probably know, APS contains a lot of OOTB stencils and this project contains only a very small subset. The point is, you should be able to mock any such OOTB beans using the above approach (a verification sketch follows).
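Because the bean is a real Mockito mock, process tests can also verify interactions with it. A small sketch, placed inside a process test class wired as above with the usual Mockito static imports (the process key "myProcess" is hypothetical):

@Autowired
@Qualifier("activiti_restCallDelegate")
private JavaDelegate restCallMock;

@Test
public void restStepIsInvoked() throws Exception {
     activitiRule.getRuntimeService().startProcessInstanceByKey("myProcess");
     // the mocked delegate behind the REST call task should run exactly once
     verify(restCallMock, times(1)).execute(any(DelegateExecution.class));
}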

Common Beans

Any common beans (helper classes, utils etc) that you may require in the context of your testing can be added to common-beans-and-mocks.xml. Technically they could be separated from the mock xml, but for the sake of simplicity I kept everything in the same xml.

Custom Assertions & Helper Classes

The classes in the packages com.alfresco.aps.testutils and com.alfresco.aps.testutils.assertions are helper classes and assertions which can be re-used across all your process testing in an easy and consistent way. An approach like this helps reduce unit test creation time and also helps enforce some process modelling and unit test best practices. Highlighting some of the key features:

  1. The project contains an AbstractBpmnTest and AbstractDmnTest class which can be used as parent class to test your bpmn.xml and dmn.xml respectively.
  2. Includes a mock email transport which is set up in the AbstractBpmnTest. This can be used to test any email steps you have in the process.
  3. Custom assertions using the AssertJ library on Activiti entities such as Task, ProcessInstance, DelegateExecution etc. Please note, the assertions I have created in this project are definitely not covering all possible assertion scenarios. However, I think there is a decent mix in there to get you started, and you can add many more assertion methods depending on your test cases.

 

Check out the next blog Alfresco Process Services - Unit Testing # II to see the usage of some of these helper classes and assertions.

Conclusion

There are plenty of articles and blogs out there around unit testing best practices, so I’m not going to repeat that here. However, I just want to stress one point with the help of an example: do not mix integration testing with unit testing.

For example, a process containing the steps Start → DMN (Rules) → Service Task → UserTask → End can be tested as three units:

  1. Unit test for the process xml where you mock the “DMN” and “Service Task” steps
  2. Unit test for the DMN xml testing each rules in the DMN
  3. Unit test for the Service Task class

So, what next?

  • If you already have a good unit testing framework around your processes in APS, great, continue that way. Feel free to provide your feedback and contributions either as comments or as blogs here on community.
  • If you don’t have any unit testing around your processes in APS, I hope this article helps you get started. Feel free to fork it and make it your own at your organization. Good unit tests around your processes, rules, code etc will definitely help you in the long run, especially when doing upgrades, migrations, change requests and bug fixes.

 

Happy Unit Testing!

The aim of this blog post is to show a working CI/CD example for managing process applications built using Alfresco Process Services (APS) powered by Activiti. Please note that the selection of tools used in this article is my personal choice from an ever growing list of open source tools in this area! You should be able to swap one or more of these with your preferred tools/technologies. Similarly, things like the release steps, versioning etc which I used in this article are just one way of doing it. I understand that every organization/team has its own established release processes. Again, the idea is that you should be able to adjust the process to suit your needs!

CI/CD Process Steps

A typical CI/CD process for process applications built using Alfresco Process Services (APS) involves the following steps.

  1. Develop processes, forms, decision tables, data models, stencil etc. in the APS Web UI (App Designer)
  2. Group everything into an "App" and export the app.
  3. Create a java project using your IDE and write custom java extensions and delegates that are used by the process
  4. Add the exported app package to the java project
  5. Write unit tests against the BPMN xml(s) available in the app package
  6. Configure the project to build an app.zip and app.jar upon build & package
  7. Add the java project to a version control repository
  8. Integrate the project in the version control repository with an automation server and continuously build and unit test upon changes
  9. Version and upload the packages (zip and jar) built by the automation server job to an artifact repository.
  10. Download and deploy the versioned components from artifact repository to higher environments.
  11. Run any automated system integration tests that you may have after deployment.

DevOps Tools Used

The tools used in this example are:

  1. Jenkins  - a leading open-source automation server
  2. JFrog Artifactory - open-source artifact repository manager
  3. GitHub - popular web based Git version control repository
  4. Ansible - open-source deployment automation engine
  5. Apache Maven - open-source software project management tool (used for dependency management & packaging in this demo)

Component Diagram

Configuration Details

Sample Process Project (GitHub)

The first step in the demo is to create a simple process application in Alfresco Process Services. It is assumed that you are familiar with this step; if not, please refer to Get started with APS or the APS User Guide. Once a process and associated components are built in the APS web modeler, group everything into an "App" and export the “App” archive. As the next step, we save the process app components to GitHub. Please refer to GitHub: super-cool-process-app for a sample maven project structure. The process models and associated components we modeled in the web modeler are stored in this directory. The pom.xml of this project is configured to build the following two artifacts upon packaging:

  1. app.jar file - this will include all the custom java extensions that you may have to support the processes in the app.
  2. app.zip - this is just a zip archive of the app content models, built using the maven assembly plugin.

 

Note: the unit tests and java classes in this project are just some empty classes, not really related to process testing or process java delegates! The reason is, I wanted to focus on the whole lifecycle in this article rather than on the technical aspects. Check out the following two blogs where I have covered unit testing of APS applications in great depth.

JFrog Artifactory Configuration

Artifactory is used for resolving dependencies (including remote artifacts) at build & test time and also to store the output of each build (jar, app.zip and build info). Download and install Artifactory if you don’t already have it running. Once Artifactory is installed and running, we will do the following configuration.

 

In order to build an APS maven project, you need access to the Alfresco Nexus repo. I like to keep all such remote repository information in one place and use a single url to download all my dependencies, rather than including various repository entries in pom files, maven settings files etc. To do this in Artifactory, we first create a remote repo pointing to the Alfresco Enterprise Repo and then add this remote repo to a virtual repo in Artifactory.

Creation of Remote Repository in Artifactory

  • Repository Key: activiti-enterprise-releases
  • URL: Use the alfresco enterprise repo url
  • Advanced -> Username/Password: use your alfresco nexus repo credentials

Please refer to Artifactory Remote Repositories for more details on remote repositories.

Add Remote Repository to Virtual Repository in Artifactory

Now add the newly created remote repo to the default virtual release repo (named “libs-release”). Please refer to Artifactory Virtual Repositories for more details on virtual repositories.

 

Ansible Scripts

Ansible is a very simple but powerful tool that can help you automate your application deployment steps! If you are not familiar with Ansible, I recommend you check it out at https://www.ansible.com/. A typical APS process application deployment involves the deployment of two types of artifacts:

  1. Process application archive (containing process models, form models, decision tables, data models and stencils)
  2. A jar file containing the dependant java extensions and customizations

 

The Ansible scripts (Ansible Playbook) that I used to automate the deployment of the above artifacts are:

  1. app-deployment playbook - deploys the process application archive using the deployment REST APIs of APS (a curl sketch of this call follows the list). Note: since this deployment saves the models to the Process Engine database, there is no need to run this playbook on all nodes in a cluster; you can run it against just one node in the cluster or against a load balancer url.
  2. jar-deployment playbook - deploys/copies the process application jar file to tomcat/webapps/activiti-app/WEB-INF/lib and also deletes any old version of the same jar file. This step needs to run on all nodes in the APS cluster if you have a clustered deployment. Things that are not covered in the playbook are:
    1. Custom property file deployments - it is also common to have per-environment property files associated with your jar files.
    2. Restart of the app server, waiting for the server to come back up, making an api call to make sure the app is up and running etc - after the jar/property file deployment, a restart is often recommended. You should be able to add those steps to your Ansible Playbooks.
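For reference, the essence of the app-deployment playbook is a single REST call that uploads the app zip. A hedged curl equivalent is below; the import endpoint and credentials are assumptions, so verify them against the APS REST API documentation for your version:

curl -u admin@app.activiti.com:password \
  -F "file=@target/super-cool-process-app-1.0-SNAPSHOT-App.zip" \
  https://aps-host/activiti-app/api/enterprise/app-definitions/import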

Jenkins Configuration

Jenkins can be deployed in a number of ways; please refer to jenkins-download for details. Once it is installed, open the Jenkins Web UI and install the following plugins which we will be using in this demo (Jenkins -> Manage Jenkins -> Manage Plugins):

  1. Ansible Plugin
  2. Artifactory Plugin
  3. Git Plugin
  4. GitHub Plugin
  5. Pipeline

Please refer to the respective documentation for the configuration of the plugins. Once all the plugins are correctly configured, let’s create the build and deployment jobs in Jenkins that will pull the above mentioned sample code from GitHub, create deployable artifacts and deploy the code to target environments. We’ll be creating the following two jobs:

  1. Build Job: The Jenkins job responsible for testing, building, publishing the build to Artifactory and triggering the deployment job after a successful build. Instructions on creating this job below:
    1. Jenkins -> New Item -> Select “Pipeline” type and give it a name. Eg: “super-cool-process-app” in this demo
    2. GitHub project -> Project url -> https://github.com/cijujoseph/super-cool-process-app/
    3. You can configure various build triggers. For this example, I elected to do a poll every minute against GitHub. Poll SCM -> Schedule -> * * * * *
    4. Now we will create a pipeline script as shown in pipeline-script-file. The pipeline is split into 5 stages (a condensed sketch follows the stage list):
      1. Clone - clone the project from GitHub
      2. Test - maven based testing which will execute all the unit tests in the project
      3. Artifactory config - configuration of Artifactory repo using the artifactory plugin. Various examples of this can be found at artifactory-project-examples
      4. Package and publish to Artifactory - this step will package and publish the build artifacts to Artifactory
      5. Kick off deployment - this step will kick off the downstream deployment job explained in next section
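To make the five stages concrete, here is a condensed scripted-pipeline sketch; the Artifactory server id ('artifactory'), the Maven tool name ('M3') and the hard-coded version parameter are assumptions, and the full script is in the pipeline-script-file linked above:

node {
    // server id and Maven tool name below are Jenkins configuration assumptions
    def server = Artifactory.server 'artifactory'
    def rtMaven = Artifactory.newMavenBuild()
    def buildInfo

    stage('Clone') {
        git url: 'https://github.com/cijujoseph/super-cool-process-app/'
    }
    stage('Test') {
        sh 'mvn clean test'
    }
    stage('Artifactory config') {
        rtMaven.tool = 'M3'
        rtMaven.deployer releaseRepo: 'libs-release-local', snapshotRepo: 'libs-snapshot-local', server: server
    }
    stage('Package and publish to Artifactory') {
        buildInfo = rtMaven.run pom: 'pom.xml', goals: 'clean package'
        server.publishBuildInfo buildInfo
    }
    stage('Start Deployment Job') {
        build job: 'super-cool-process-app-deploy',
              parameters: [string(name: 'ARTIFACT_VERSION', value: '1.0-SNAPSHOT')]
    }
}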

   

Screenshots of this config shown below:

 


 

  2. Deploy Job: This job takes care of the deployment of artifacts built in the previous “Build Job” using the above mentioned Ansible Playbooks. Instructions on creating this job below:
    1. Jenkins -> New Item -> Select “FreeStyle” type and give it a name. Eg: super-cool-process-app-deploy (this name is used in the previous pipeline script in the last stage (“Start Deployment Job”)
    2. Check (tick) the “This project is parameterized” checkbox to configure some input parameters for this job. This is because the Ansible Playbooks I wrote are generic and can work for any process application; hence they require a few input variables to correctly identify the artifact being deployed.
      1. APP_NAME: Default value is set to the name of the process application, in this case “Super Cool App”
      2. ARTIFACT_NAME: Default value is set to the name of the artifact (maven project). In this case “super-cool-process-app”
      3. ARTIFACT_VERSION: We don’t set any default value. The value for this parameter is passed when the job is triggered. For example, if triggered from a build job, the build version is passed in; if triggered manually, enter it via the UI.
    3. The next item “Source Code Management” is configured to point to the GitHub repository where I saved the Ansible Playbooks. Git -> Repositories -> Repository URL -> https://github.com/cijujoseph/aps-process-app-devops
    4. Build Environment -> Check the box “Delete workspace before build starts”
    5. Now we need to configure two “Invoke Ansible Playbooks” in the next section under “Build”.
      1. The first Ansible Playbook will deploy the process app package via the REST APIs of APS. Configuration as below:
        1. Ansible installation -> Select the Ansible configuration available in dropdown
        2. Playbook path ->  ${WORKSPACE}/ansible-playbooks/app-deployment/site.yml (this will resolve to app-deployment-playbook)
      2. The second Ansible Playbook will deploy the process app jar file. Configuration as below:
        1. Ansible installation -> Select the Ansible configuration available in dropdown
        2. Playbook path ->  ${WORKSPACE}/ansible-playbooks/jar-deployment/site.yml (this will resolve to jar-deployment-playbook)

 

Screenshots of this config shown below:

 

Demo Video

A short video of the whole process in action is available here.

Things to consider

Since this is not a production ready sample, here are some of the factors you may want to consider when implementing it in your production environment:

  1. Extend the pipeline to manage the APS software deployment as well - embedded, standalone in an application server, container deployment etc
  2. If using dockerized deployments, creating a docker image per build is also an option, instead of copying things like jars and property files to an existing image.
  3. If there are breaking changes in the java classes during process modifications, consider a proper deprecation strategy to handle any running process instances.
  4. Securely manage the credential storage using features such as "Vault"
  5. Management of environment variables
  6. Manage re-usable java libraries/stencil libraries etc separate from processes specific libraries.
  7. The app package can grow quite big if you have a lot of processes in one app and this might make the release management and versioning of individual processes difficult. For a more granular control on each processes, you can create smaller apps containing one or two processes. Then those applications can be deployed via its own CI/CD pipeline. However if you would like to expose these processes in a single app to the end user (especially if the processes are user initiated process involving start forms) you can lock down the smaller individual apps to just a small set set of PVT (Production Verification Test) users and the CI/CD user. Once the processes are verified, those processes can be made available to the end users via a separate public facing process app.

Conclusion

So, if you don’t already have a CI/CD process around your process applications in APS, go ahead and set one up. I’m sure it will make your application development experience with Alfresco Process Services much easier!

Amazon Simple Queue Service (SQS) and Apache ActiveMQ™ are two popular messaging systems/platforms. Alfresco Process Services powered by Activiti (APS) can be integrated with these systems in a few different ways. Some of the available options are:

  • Custom extension projects built using Spring libraries
  • Using Apache Camel Component in APS
  • Using Mule Component in APS

 

To demonstrate the first option mentioned above, I built a couple of very simple Java projects (one for SQS and one for ActiveMQ). The idea of this blog is to point you to those examples. Since the example projects are really simple, I'll keep this blog short.

 

The pattern is pretty much the same in both examples, and is as given below (a hedged code sketch follows the list):

  • Establish a connection with the respective messaging system from Alfresco Process Services
  • Listen to an SQS/MQ queue for new messages and start a process for every new message.
  • Send messages to the SQS/MQ queue during process instance execution.
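
To make the pattern concrete, here is a minimal, hedged sketch of the ActiveMQ flavour using Spring JMS and the Activiti Java API. This is not the code from the linked projects: the class name, queue name and process definition key are illustrative, and it assumes the Activiti engine, spring-jms and the ActiveMQ client libraries are on the classpath.

import java.util.Collections;

import javax.jms.JMSException;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.TextMessage;

import org.activiti.engine.RuntimeService;
import org.apache.activemq.ActiveMQConnectionFactory;
import org.springframework.jms.listener.DefaultMessageListenerContainer;

public class BookingQueueListener implements MessageListener {

    private final RuntimeService runtimeService; // injected Activiti service

    public BookingQueueListener(RuntimeService runtimeService) {
        this.runtimeService = runtimeService;
    }

    // Start a new process instance for every message that arrives on the queue
    @Override
    public void onMessage(Message message) {
        try {
            String payload = ((TextMessage) message).getText();
            runtimeService.startProcessInstanceByKey(
                    "superCoolProcess", // hypothetical process definition key
                    Collections.<String, Object> singletonMap("payload", payload));
        } catch (JMSException e) {
            throw new RuntimeException("Failed to read JMS message", e);
        }
    }

    // Wire the listener to an ActiveMQ queue; in the extension projects this
    // kind of wiring would typically live in Spring configuration
    public static DefaultMessageListenerContainer container(BookingQueueListener listener) {
        DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
        container.setConnectionFactory(new ActiveMQConnectionFactory("tcp://localhost:61616"));
        container.setDestinationName("booking-queue"); // hypothetical queue name
        container.setMessageListener(listener);
        return container;
    }
}

Sending a message back to the queue during execution (the third bullet) can follow the same lines, for example a JavaDelegate on a service task calling Spring's JmsTemplate.convertAndSend(queue, payload). The SQS project follows the same pattern with the AWS SDK in place of JMS.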

 

APS Integration with Amazon SQS

Source: GitHub: aps-aws-sqs-extension 

 

APS Integration with Apache ActiveMQ

Source: GitHub: aps-activemq-extension 


To try these out and for more details, please refer to the README files available in the above-mentioned projects.

This is a continuation of my previous blog post about data models, Business Data Integration made easy with Data Models. In this example I'll be showing the integration of Alfresco Process Services powered by Activiti (APS) with Amazon DynamoDB using Data Models. The steps required to set up this example are:

 

  1. Create Amazon DynamoDB tables
  2. Model the Data Model entities in APS web modeler
  3. Model process components using Data Models
  4. DynamoDB Data Model implementation
  5. App publication and Data Model in action

 

Let’s look at each of these steps in detail. Please note that I’ll be using the acronym APS throughout this post to refer to Alfresco Process Services powered by Activiti. The source code required to follow the next steps can be found at GitHub: aps-dynamodb-data-model 

Create Amazon DynamoDB tables

As a first step to running this sample code, the tables should be created in the Amazon DynamoDB service.

  1. Sign in to AWS Console https://console.aws.amazon.com/
  2. Select "DynamoDB" from AWS Service List
  3. Create table "Policy" (screenshot below)
    1. Table name: Policy
    2. Primary key: policyId
  4. Repeat the same steps to create another table called "Claim"
    1. Table name: Claim
    2. Primary key: claimId

Now you have the Amazon DynamoDB business data tables ready for process integration.
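
If you prefer to create the tables programmatically instead of through the console, a minimal sketch using the AWS SDK for Java (v1) is given below. The throughput values are arbitrary, and credentials are assumed to be resolvable via the SDK's default provider chain.

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.model.AttributeDefinition;
import com.amazonaws.services.dynamodbv2.model.CreateTableRequest;
import com.amazonaws.services.dynamodbv2.model.KeySchemaElement;
import com.amazonaws.services.dynamodbv2.model.KeyType;
import com.amazonaws.services.dynamodbv2.model.ProvisionedThroughput;
import com.amazonaws.services.dynamodbv2.model.ScalarAttributeType;

public class CreateDemoTables {

    public static void main(String[] args) {
        AmazonDynamoDB client = AmazonDynamoDBClientBuilder.defaultClient();
        createTable(client, "Policy", "policyId");
        createTable(client, "Claim", "claimId");
    }

    // Creates a table with a single string hash key, mirroring the console steps above
    private static void createTable(AmazonDynamoDB client, String tableName, String keyAttribute) {
        client.createTable(new CreateTableRequest()
                .withTableName(tableName)
                .withAttributeDefinitions(new AttributeDefinition(keyAttribute, ScalarAttributeType.S))
                .withKeySchema(new KeySchemaElement(keyAttribute, KeyType.HASH))
                .withProvisionedThroughput(new ProvisionedThroughput(5L, 5L)));
    }
}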

Model the Data Model entities in APS web modeler

The next step is to model the business entities in APS Web Modeler. I have already built the data models and they are available in the project. All you have to do is import the "InsuranceDemoApp.zip" app into your APS instance. Please note that this app was built using APS version 1.6.1 and will not import into older versions. If you are using APS version 1.5.x or older, please import the app from the project I used in my previous blog post.

 

Once the app is successfully imported, you should be able to see the data models. A screenshot is given below.

Model process components using Data Models

Now that we have the data models available, we can start using them in processes and associated components such as process conditions, forms, decision tables, etc. If you inspect the two process models imported in the previous step, you will find various usages of the data model entities. Some of those are shown in the screenshots below:

Using Data Model in a process model

Using Data Models in sequence flows

Using Data Model in Forms

Using Data Models in Decision Tables (DMN)

Let’s now move to the next step: the implementation of the custom data model that will handle the communication between the process components and Amazon DynamoDB.

 

DynamoDB Data Model implementation

In this step we create the extension project which does the APS <-> Amazon DynamoDB interactions. You can check out the source code of this implementation at aps-dynamodb-data-model. For step-by-step instructions on implementing custom data models, please refer to Activiti Enterprise Developer Series - Custom Data Models. Since you need a valid licence to access the Alfresco Enterprise repository to build this project, a pre-built library is available in the project for trial users - aps-dynamodb-data-model-1.0.0-SNAPSHOT.jar. The steps required to deploy the jar file are given below, followed by a short hedged illustration of the kind of DynamoDB calls such an implementation makes.

  1. Create a file named "aws-credentials.properties" with the following entries and make it available on the APS classpath:

aws.accessKey=<your aws access key>
aws.secretKey=<your aws secret key>
aws.regionName=<aws region eg:us-east-1>

  2. Deploy the aps-dynamodb-data-model-1.0.0-SNAPSHOT.jar file to activiti-app/WEB-INF/lib
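
Under the hood, a data model implementation like this one translates entity reads and writes from process components into DynamoDB calls. The fragment below is a hedged illustration of that kind of lookup using the AWS SDK Document API; it is not the actual code of the extension project, and the id value is illustrative.

import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.document.DynamoDB;
import com.amazonaws.services.dynamodbv2.document.Item;
import com.amazonaws.services.dynamodbv2.document.Table;

public class PolicyLookup {

    public static void main(String[] args) {
        DynamoDB dynamoDB = new DynamoDB(AmazonDynamoDBClientBuilder.defaultClient());
        Table policyTable = dynamoDB.getTable("Policy");

        // Fetch a Policy entity by its primary key, which is essentially what the
        // data model implementation does when a process component asks for a Policy
        Item policy = policyTable.getItem("policyId", "POL-1001"); // illustrative id
        System.out.println(policy == null ? "not found" : policy.toJSONPretty());
    }
}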

App publication and Data Model in action

This is the last step, where you can see the data model in action. In order to execute the process, you will need to deploy (publish) the imported app first. You can do this by going to APS App UI -> App Designer -> Apps -> InsuranceDemoApp -> Publish

Once the process and process components are deployed, you can either execute the process yourself and see it in action, or refer to the video link in Business Data Integration made easy with Data Models demonstrating the data model.

 

Once you run the processes, log back in to the AWS Console https://console.aws.amazon.com/ and check the data in the respective tables as shown below.

That’s all for now. Again, stay tuned for more data model samples….

“Stencils” provide a very powerful set of capabilities to Alfresco Process Services powered by Activiti (APS) and can be used within the BPMN 2.0 Editor, the Step Editor and the Forms Editor.  Within the context of Forms, Stencils provide the facilities to develop custom form field types. The idea of this blog is to point you to some working form stencil samples.

In recent weeks I had to build a few of these custom form stencils and thought it would be valuable to share them with the community. If you are just starting with stencils, I recommend you first read a 101 (introductory) blog on form stencils, which is available at Form Stencils 101.

In this blog I’ll take you through the following 5 form stencils:

  1. Custom Simple Text Input
  2. Grouping Fields using stencil
  3. Custom Grid/Table
  4. Signature Pad
  5. Rich Text Editor

Now let me try and explain these examples a bit more....

Simple Text Input Stencil

Source: github: simple-text-input-stencil

I built this component to demonstrate the implementation of a very simple custom input field using a stencil. You might be wondering: why would I need to do this? Most often you won't need such a field in real life; the idea of this stencil is to demonstrate the basic building blocks - a “Hello World” of stencil implementation.

Grouping Fields using Stencil

Source: github: group-of-fields-stencil

One of the main use cases for this example is building reusable, domain-specific field controls, e.g. a single form field called “Address” consisting of fields such as Address Line, Town, State, Country, Zip Code, etc. Building reusable form components specific to your business data objects makes form modelling really easy. Given below are screenshots of the design-time and runtime views of my example.

 

Design Time View

At design time, the designer will select the custom stencil and configure it with some process data as shown below

Configuration

Run Time View

At runtime, the stencil is displayed based on your configuration, using your custom HTML, custom CSS, etc.

Custom Grid/Table

Source: github: angular-ui-grid-stencil

I built this component recently for a customer who wanted to display a list of records from a REST API in a tabular format. It took me only 15 minutes to research and implement this feature using the Angular UI Grid component. Since the APS out-of-the-box UI uses this module for its dynamic table component, the Angular modules I needed to implement this feature were already part of the product. That made my life easy! So, if you have complex table requirements that cannot be met by the OOTB table component, I recommend you look at the features available in this library at Angular UI Grid Tutorial; I’m pretty sure a stencil component built using this library can meet most of your needs.

 

Important - Since AngularJS doesn't allow module injection after the application is bootstrapped, you cannot declare an external module as part of the stencil controller code.

One option for including an external AngularJS module in "activitiApp" is to modify "activiti-app/workflow/scripts/<minified script>.js" in the following way.

E.g. to add the ngMap module from ng-map, add "ngMap" to the dependency list of the main module in that js file: activitiApp=angular.module("activitiApp",[/* existing modules */,"ngMap"])

 

Please refer to the comments, where Greg has provided another option to load an external module.

 

Signature Pad

Source: github: signature-pad-stencil

A signature stencil implementation based on Signature Pad.

Thanks to Angular Signature for providing the AngularJS directive which I used in my stencil implementation. My example app demonstrates the following:

  1. Capture a signature
  2. Display of a signature captured in a previous task
  3. Display the signature in a document that is generated in the process.

Rich Text Editor

Source: github: tinymce-rich-text-editor

A rich text editor stencil implementation based on TinyMCE.

Thanks to TinyMCE AngularJS Integration for providing the AngularJS directive which I used in my stencil implementation.

 

General instructions to run the above examples

The zip archive files available in the above-mentioned GitHub source folders are “App” exports. To use them, you will need to import them via App Designer (Kickstart App) -> Apps -> Import App. Once the “App” is successfully imported, the stencils, along with an example process and form, will also be imported, making it easy for you to see these examples in action!

Finally..

Hopefully this is a good set of examples to show the power of the stencil component in Alfresco Process Services. If you happen to build any cool form components using stencils, feel free to share them here in the form of a blog or video...

There are more examples available on the product documentation page - APS Docs: Custom Form Fields & Developer Series - Custom Form Fields 

 

Note: I’m in the process of building Alfresco ADF (Application Development Framework) equivalents of these stencils; as soon as they are ready, I’ll make them available and update this blog with the reference. A 101 blog on implementing form stencil components in ADF can be found here - Custom stencils with Alfresco ADF

 

UPDATE:

Business Process Management (BPM) best practices often suggest that the BPM solution should not be the system of record. In particular, the business data required for a Digital Business Solution should exist in data stores outside of the persistence store used by the BPM engine itself. Business data that is used and/or created during the execution of a business process should exist and be maintained in one or more external data stores (e.g. RDBMS, NoSQL, etc.). Therefore, to simplify and accelerate the development of enterprise-scale Digital Business Solutions, Alfresco Process Services (Alfresco’s Enterprise Edition of the Activiti Community Edition (open source)) provides an important and valuable component called "Data Models", which is the focus of this blog.

Please note that I’ll be using the acronym APS throughout this post to refer to Alfresco Process Services.

Business Data Integration in BPM solutions

If you haven’t read the blog posts Storing data in automated business processes :: AirQuill & More on Orchestration Data :: AirQuill already (a bit old, but still very relevant), you should read them first, especially the first one, before you go into the next section. It is a great post explaining why storing data outside of process engine tables is so important!

Integration options available in APS

The good news is that APS provides a variety of options for business data integration. Listed below are all the data integration options available when implementing business processes using the Activiti engine:

  1. BPMN Service Task Component
  2. Custom Java logic wired into the process using listeners such as Execution Listeners, Event Listeners, Task Listeners, etc. (a hedged sketch of this option is given after this list)
  3. Execute Custom SQL
  4. REST Task Component (APS/Enterprise Only Feature)
  5. Data Model Component (APS/Enterprise Only Feature)
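
To make option 2 above a little more concrete, below is a hedged sketch of an execution listener that pushes business data to an external store when it fires. Only the org.activiti.engine.delegate API is real; the class and the DAO it calls are illustrative stand-ins.

import org.activiti.engine.delegate.DelegateExecution;
import org.activiti.engine.delegate.ExecutionListener;

// Attached to an activity or sequence flow in the process model; illustrative only
public class SavePolicyListener implements ExecutionListener {

    @Override
    public void notify(DelegateExecution execution) {
        // Read process variables and hand them to your own data-access code
        String policyId = (String) execution.getVariable("policyId");
        String holder = (String) execution.getVariable("policyHolder");
        new PolicyDao().save(policyId, holder); // hypothetical DAO for an external store
    }

    // Minimal stand-in so the sketch is self-contained; replace with real persistence logic
    static class PolicyDao {
        void save(String policyId, String holder) {
            System.out.printf("saving policy %s for %s%n", policyId, holder);
        }
    }
}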

 

Since the purpose of this blog is to go through the Data Model component, I’ll only be focussing on the 5th item in the above list. Please check out the Activiti User Guide, the Activiti Custom SQL User Guide, the Alfresco Documentation and the Alfresco Process Services Blogs for more about the other four options.

Advantages of Data Model over other alternatives

Given below are some of the pain points we hear quite often from business process modelers, analysts, developers, etc.

  • Systems in my organization are so difficult to integrate into our processes.
  • Data modeling capabilities in our existing BPM platform are highly technical and have a steep learning curve.
  • As an analyst/modeler, I would love to have features in the product that allow me to model my SoR (System of Record) data model in the process platform.
  • Our organization has very mature and well-defined REST APIs around all our IT systems. However, as a business analyst/process modeler, mapping REST API requests and responses is too technical a job for me!
  • We have well-defined and re-usable web services based on standards like SOAP, POX, etc. in our organization. We wish our BPM system had inbuilt capabilities that allowed us to write re-usable and business-friendly components over these web services.
  • We have been using the Activiti Community version for a long time. We have a lot of reusable Java code that allows us to integrate our processes with our IT systems. To understand those external system integrations we often have to dig into the Java source code associated with the process. When moving from Community to Enterprise, it would be really nice if we could visually represent those data structures in the BPMN modeler and integrate those components directly with other process components such as forms, rules, etc.
  • As a business person, when I review a BPMN diagram I see a lot of service tasks with hidden Java logic in them. Every time I do this, I have to go to a developer to understand the Java components and to find out their input and output fields. This makes me look stupid!
  • We are an organization with a lot of old-school two-tier applications (client->database) with no APIs. We need our business processes to talk directly to our application databases!
  • We use Alfresco Content Services as our System of Record for documents. What are the integration capabilities of APS with Alfresco Content Services?

 

Data Model is the component that can address all the above-mentioned concerns/pain points in an elegant, simple and user-friendly fashion, without the complexities of similar components normally found in other large BPM vendors' products.

 

Enough of overview and description; let’s see it in action.

Data Model Demo

The crux of this blog is in the following video, which I recorded to demonstrate the various capabilities of the Data Model component.

Summary

Let me summarize the post highlighting the key features of Data Models:

  • Data Models allow you to separate data integration from business process modeling. In other words, process modeling is made easy with Data Models because they allow you to hide the implementation complexity from process models.
  • Data Models are integrated with all the other modeling components available in APS, such as forms and decision tables, thereby reducing the time to market of your business process solutions.
  • Data Models allow you to build re-usable domain entity objects/components which in turn can be reused in a uniform way across multiple processes.
  • Data Model is a business-friendly integration alternative available in APS.

 

Hands-on time!

If you are new to APS, this post will help you get started with it - Installing Alfresco Process Services Trial using an Installer

If you are new to the Data Model component, I highly recommend first reading a couple of other posts, given below, before trying out the demo I used in this post.

The complete source code associated with the above video presentation along with a detailed readme file is available at https://github.com/cijujoseph/activiti-examples/tree/master/activiti-custom-data-model-sample

 

I'll be creating more data model examples in the coming months! Stay tuned...

This blog is a short explanation of a voice-enabled business process demo which I built using Alfresco Process Services powered by Activiti and Amazon Alexa. The solution also integrates with a variety of technologies such as Alfresco Content Services, email, Twilio, Decooda, etc. A user can interact with the process over voice using their Amazon Echo and mobile phone! The source code of my demo components is available at https://github.com/cijujoseph/activiti-examples/tree/master/activiti-alexa-demo along with a detailed README of all components.

Demo Use Case

The use case for my demo is a “Vehicle Service Booking” business process, modeled and run using Alfresco Process Services. The process can be started by a user from an Amazon Echo device. Once started, the process schedules an appointment in the system and goes into a wait state until the booking date. While the process is in the wait state, the appointment can be changed or cancelled by the user via their Echo device. On the appointment day, a “User Task” is assigned to a technician/mechanic, who completes the task upon service completion. Once the “User Task” is completed by the technician, a “Service Report” file is generated by the process, emailed to the customer, and saved to Alfresco Content Services for the records. At this stage the user is also notified by a text message and asked to participate in a short survey. The next step in the process collects user feedback through a voice call using the Twilio APIs. The feedback provided by the user is analysed using a modern cognitive analytics platform (Decooda) which accurately measures the customer experience. The results from the analytics platform are passed through a set of business rules (the DMN rule engine available in Alfresco Process Services) and the process is routed accordingly.

To me, the most interesting parts of this whole demo are the following:

  1. Use of voice (Echo) to start and manage the process.
  2. Ease of integration of Alfresco Process Services with external cloud platforms such as Twilio.
  3. The power of platforms such as Decooda for measuring customers’ emotions, and the value that can bring to a business process like this!

 

Please find below the solution diagram, followed by a BPMN diagram (modeled using the BPMN Modeler in Alfresco Process Services), which shows the various steps in my business process and the components involved in each step:

Demo Stack Explained

  • Alfresco Process Services powered by Activiti - The business process which is the core component of this demo is built using Alfresco Process Services powered by Activiti. Please refer to https://www.alfresco.com/platform/process-services-bpm for more details.
  • Alfresco Content Services - The content associated with the process (customer feedback, service report, etc.) is configured to be stored in the Alfresco Content Services platform. For more details on Alfresco Content Services please refer to https://www.alfresco.com/platform/content-services-ecm. Alfresco Process Services has out-of-the-box integration with popular content management repositories such as Alfresco Content Services, Box, Google Drive, etc. Please refer to http://docs.alfresco.com/process-services1.6/topics/integration_with_external_systems.html for configuring Alfresco Process Services with these systems. You can modify the demo process to save the content to Google Drive, Box, etc.
  • Amazon Alexa Integration - A user can schedule a service using their Amazon Echo device. In addition to starting the process using Echo, the user is also able to change the appointment date, cancel the appointment and check the appointment date at any time using their Alexa device (a hedged sketch of how such a skill can start the process over REST is given after this list).
  • Twilio Integration - https://www.twilio.com/ - Voice and messaging integration with Alfresco Process Services is done using the Twilio cloud communication platform in this demo.
    • The user receives a text notification when the servicing is complete and the car is ready to be picked up.
    • The user can call one of the Twilio numbers and provide feedback over the phone.
    • Twilio recording and transcription services are used to record the voice message and subsequently transcribe the voice to text for further analysis!
  • Decooda Integration - https://decooda.com/ - Decooda is a powerful Cognitive Text Mining and Big Data Analytics Platform which is used to analyse the transcribed customer feedback. The Decooda analytics results help you accurately understand the customer, their emotions, service experience, etc. The Decooda results are then used to run some business rules (DMN-based) in the process model and route the process through appropriate paths depending on the customer experience.
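
For reference, the bridge from an Alexa skill (or any external trigger) into APS can be as simple as one authenticated call to the enterprise REST API's process-instances endpoint. Below is a hedged sketch using Java 11's HttpClient; the host, credentials and process definition id are placeholders, and the exact request fields should be checked against the APS REST API documentation for your version.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class StartServiceBooking {

    public static void main(String[] args) throws Exception {
        // Placeholder credentials; encode them for HTTP basic authentication
        String auth = Base64.getEncoder().encodeToString("admin@app.activiti.com:password".getBytes());

        // Placeholder process definition id for the vehicle service booking process
        String body = "{ \"processDefinitionId\": \"vehicleServiceBooking:1:1234\","
                + " \"name\": \"Service booking via Alexa\" }";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/activiti-app/api/enterprise/process-instances"))
                .header("Authorization", "Basic " + auth)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}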

Demo Code

I want to keep it simple, so that’s all I have in this blog! Now go ahead and try this out yourself! Demo code along with instructions is available at https://github.com/cijujoseph/activiti-examples/tree/master/activiti-alexa-demo