
Last week, the whole team spent a lot of time making sure that the code and projects are well organized under a whole new set of repositories. You can find all the new repositories under the Activiti organization on GitHub. A blog post will follow explaining the changes and additions to our projects. I recommend that our community members check the Activiti/activiti-cloud-examples repository, where you can find a Docker-based example of starting the whole infrastructure plus your domain-specific runtime bundles (Spring Boot process-enabled applications).
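As a rough sketch of what trying that example looks like: only the repository name comes from this post; the compose file layout is an assumption, so check the repository's README first.

```shell
# Hypothetical walkthrough of starting the example infrastructure.
# Only the repository name comes from the post; the compose commands
# assume a docker-compose.yml at the repository root.
REPO="https://github.com/Activiti/activiti-cloud-examples.git"

# On a machine with git and Docker installed you would run:
#   git clone "$REPO" && cd activiti-cloud-examples
#   docker-compose up -d     # start the infrastructure in the background
#   docker-compose ps        # every service should report "Up"

echo "example repository: $REPO"
```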


@daisuke-yoshimoto is working on improving how event data is sent to the Audit Service, for better performance and transaction consistency. Next, he is adding MongoDB support to the Audit Service.

@gmalanga is still working on the Elasticsearch repository, replacing the existing Spring Data JPA repository for the Query Service with Elasticsearch.

@abdotalaat working on getting a simple local identity service to support simple scenarios where SSO is not required.

@fcorti continued to work on the Activiti example, this time integrating the 'activiti-cloud-keycloak' container for authentication, 'activiti-cloud-registry' implementing the registry using Spring Eureka, and 'activiti-cloud-starter' to define a real microservice architecture for the departments (in the example). The iteration is planned to be completed next week.

@erdemedeiros merged the first iteration of the query module into the master branch and created the related starter in the activiti-cloud-starters repository. He is currently adding more integration tests to the query starter.

@ryandawsonuk modified some of the key services to separate out a starter and an implementation based on that starter. The implementations are now published to Docker Hub, and the main cloud example is able to use the published images.

@salaboy mostly worked on creating the builds on our internal servers to make sure that we can publish Docker images to Docker Hub, and reviewed the example that we are building in activiti-cloud-examples multiple times.

This week

This week is all about refining our process to produce and publish Docker images to Docker Hub and creating end-user examples against our new services. We will also be looking at creating an initial version of our Kubernetes service descriptors.

Notice: as part of our work to keep the project healthy, we are closing all the issues on GitHub that are inactive. We take reported issues very seriously, and for that reason we want to make sure that all open issues are being worked on. If you are involved in one of these closed issues and you still think that there is some work to be done there, please get in touch, re-open the issue and write a comment. All newly opened issues will be worked on and will have a person assigned to them who will report back on progress.
Join us in Gitter if you know or want to learn about Activiti, Spring Cloud, Docker and Kubernetes.


Original Blog Post: Activiti 7: Last week Dev Logs #7 – Salaboy (Open Source Knowledge) 


Activiti & Activiti Cloud

Posted by salaboy Employee Aug 22, 2017

After a week of moving things around, we are reaching a point where we can share the new structure of our projects and repositories. The changes introduced highlight the different nature of each of these projects and how they are going to be used and consumed. The changes are now considered stable, and unless we find a very good reason for a new re-organization, these repositories will be considered part of the Activiti 7 project's main efforts.


Conceptually, you will find that there are 3 main layers:


The Core layer will always contain Java frameworks; in this case, the Activiti/Activiti repository will host the Process Engine that you can embed in your own applications as any other Java dependency. While this is enough for some cases, it requires you to build a significant layer of integration and make some complicated decisions about your runtime. I’ve seen such implementations fall into two main categories:

  • The engine embedded in the application: this approach has major drawbacks regarding the footprint of your application and memory consumption, and it adds too much responsibility to the application.
  • The engine embedded in a service layer: this approach is better, but it pushes you to define this service layer. Some companies use this layer to make sure that they don’t depend on a process engine, but this is costly and takes a lot of time to get right.

The same applies to the Query and Audit modules, which are now hosted in the same repository, but we might want to move those away in a future refactoring.

The next new layer that we are providing in Activiti is the Service Layer. As mentioned before, we want to avoid you having to implement this Service wrapper on top of the Engine. For that reason, we have created a modern REST HAL API and Message-Based Endpoints that can be easily extended and adapted to your needs. These Services are designed and implemented in isolation to make sure that they follow the single responsibility approach. We now provide the following services:

  • Process Engine
  • Audit Service
  • Query Service

This list will be expanded in the future with more services, and we will make sure that our services don’t overlap with functionality that is already provided by the infrastructure or other popular components that you might already be using. A typical example of avoiding overlap, in this case, is the new SSO/IDM component that we are using. We are not providing any homegrown SSO/IDM mechanism, as most of the BPM engines out there do. Instead, we are delegating that responsibility to a component that has been designed to provide that integration layer with SSO- and IDM-specific implementations.
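To give a feel for the HAL style of the new APIs, here is a sketch of what a request against one of these services could look like. The host, port and resource path below are placeholders, not the project's documented API.

```shell
# Hypothetical HAL request against a runtime bundle's Process Engine
# service; host, port and resource path are illustrative placeholders.
RB_URL="http://localhost:8080"
REQUEST="curl -s -H 'Accept: application/hal+json' $RB_URL/process-definitions"

# Against a running service you would execute it, e.g.:
#   eval "$REQUEST" | python -m json.tool

echo "$REQUEST"
```

A HAL response embeds the resources plus `_links` entries, which is what makes the endpoints easy to navigate and extend.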

Finally, our last layer is the Infrastructure layer. Our new infrastructure layer allows us to provide a simple way to bootstrap all these services in a cloud-oriented way. We recognized that most of our users aim to run these services on existing infrastructure and for us, it is important to make their lives easier. This infrastructure layer is based on Spring Boot, Spring Cloud,  Docker and Kubernetes, relying on and reusing all the services that they provide so our services can scale independently.  Once again, by aligning our services to these technologies we wanted to make sure that we don’t overlap with the features that they provide. We want our users to feel that when they adopt Activiti, they don’t need to change their infrastructure or the way that they do things in their other services.

The Repositories

Under the Activiti organization on GitHub we have created several repositories to represent these layers. These repositories are linked in our CI servers, and the arrows in the following diagram mean a downstream dependency. Every time that we make a change in the core engine, all the services will need to be built and tested again.

Notice also that all the repositories that depend on *-starters are going to generate and publish Docker images to Docker Hub, meaning that you will be able to consume all this infrastructure without the need to compile a single Java class.
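Consuming a published image is then a one-liner; the image name below is a made-up example, so browse the Activiti organization on Docker Hub for the real repository names.

```shell
# Hypothetical: pull and run a published service image without
# compiling anything. The image name is an illustrative placeholder.
IMAGE="activiti/example-runtime-bundle"

# On a Docker host you would simply run:
#   docker pull "$IMAGE"
#   docker run -d -p 8080:8080 "$IMAGE"

echo "would pull: $IMAGE"
```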

Our activiti-cloud-examples repository will provide examples that show how to bootstrap the whole infrastructure in your dev environment and on existing cloud infrastructures.

One more important thing to understand is that we recognise that each implementation will probably want to replace one or more of these components, so you can end up using the example implementation as a reference for implementing your own components. If you think that one of your components might benefit the whole community, then please get in touch; we can help each other with the design and implementation.

Here are some of the links to our new repositories, Docker Hub and our Travis-CI builds:

Core and Core services repository

Docker Hub Activiti organization

Activiti Cloud related repositories

Activiti Cloud Examples

Travis CI public builds

New Approach / New Scenarios

You might have noticed that we now have more services, more things to manage and possibly different types of storage. Clearly, the way of using the Process Engine is changing. This new approach will open the door to new scenarios: scenarios where we don’t work against a single clustered process engine, and where we want different components to emit events that will be aggregated by other components such as the Query and Audit Services. Each of these scenarios might have different requirements, such as the use of NoSQL data stores to support graph-based data or JSON documents for search and indexing. This new approach will also allow us to scale different parts of our infrastructure separately and responsively.

Because of that, all our services are dockerized and will require an orchestration layer to wire them together. The next section provides a quick intro to these Docker images and how to get all the infrastructure up and running by following the activiti-cloud-examples.

A (Docker) image is worth more than 1000 words

We now have 6 (Docker) images published on Docker Hub, and you can get them all up and running in just a couple of minutes. The following diagram shows the set of services started when you follow the README file in the activiti-cloud-examples repository:


Notice that Cloud Connectors are not there yet. The client application represents your other microservices that might share the same infrastructure as Activiti. The databases related to each service are omitted for simplicity, but the Runtime Bundle docker-compose is starting a PostgreSQL DB.

Also notice that the communication between these components is likely to happen in an asynchronous way, and for that reason we are also starting RabbitMQ as our message broker. Because we are relying on Spring Cloud Stream, the provider (binder) can be replaced by other providers such as Kafka and ActiveMQ.

From a client/developer side, you only need Docker to get everything up and running in a couple of minutes. No Java or Maven is required to build your domain-specific runtime bundles.

You can use the Postman (Chrome plugin) collection to test the service endpoints.

We will be working hard to provide tools to package and version runtime bundles, to make sure that the whole process of building and deploying these images is smooth and fast.

More blog posts about Runtime Bundles, a central concept in the new infrastructure, are coming.

If you are interested, have questions or comments, or want to participate in all these changes, please get in touch. You can join us every day in our Gitter channel, where we have open discussions about how each of these components is implemented.

Stay tuned!


Original Blog post:

Big changes are coming to the Activiti 7 project. Last week (7/8/17 - 13/8/17) we started making some big organizational changes in how we structure our projects and our repositories. We have created 3 new repositories: activiti-examples, activiti-compatibility and activiti-spring-boot-starters. Due to the amount of changes, we are trying to minimize the negative impact on our builds and on people working on different branches. All these changes are necessary, and we hope to stabilize the build by the end of next week.


@daisuke-yoshimoto added a new REST API for handling task variables. Next, he is adding MongoDB support to the Audit Service.

@gmalanga is working on using ElasticSearch to provide an alternative implementation of the Query Service.

@abdotalaat created a new Spring JPA project that exposes data (users, IdentityInfo, groups, membership) via REST endpoints. The previously created user/group/membership migrator from Activiti 6 to Keycloak was moved here.

@fcorti developed iteration no. 3 of the Emergency Call Center project as an example of a microservices-based solution. This time the project is composed of an Activiti process (with some Java delegates) and three external REST services running as three independent Spring Boot applications.

@erdemedeiros worked on query module improvements and added support for variable-related events. The Query Service module is now quite close to being merged into the master branch.

@ryandawsonuk integrated Zuul and Eureka into the Docker setup, moved the main Docker example app to PostgreSQL, reduced the size of the Docker images and refactored the Audit Service to split out a starter dependency.

@salaboy was working on refactoring the API modules inside the new Activiti Services modules. We now have the activiti-services-api module, which contains all the contracts (interfaces) that can be shared with other external modules. I also added an initial version of the @ActivitiRuntimeBundle annotation, which will serve as the main entry point to configure your process-enabled Spring Boot applications.

This week in Activiti 7

This week we will finish the repository changes and stabilize all the builds. We are getting closer to starting our journey with Kubernetes, and for that reason we will spend some more time making sure that our Docker images are correctly built and published to Docker Hub. As usual, you can track our progress and check our outstanding issues and planning here: Feel free to get in touch if you see something on our boards that you might want to contribute to the project. We look forward to mentoring people interested in joining the project as community members.

Join us in Gitter if you know or want to learn about Activiti, Spring Cloud, Docker and Kubernetes.

Hi everyone, welcome to the Activiti 7 Dev Logs #1 (3/07/17-9/07/17).

Last week we published the initial short-term roadmap for Activiti 7 and started executing according to that plan. We got a lot of great feedback, and some external community contributors are already working with us. This is a short update about what we did last week.


@daisuke-yoshimoto, our community contributor from Japan, is helping us clean up the parent pom dependencies and the tests that fail when the Locale is set to Japanese. He is also having discussions with @ryandawsonuk about IDM and how that is going to work in the future. You can track Daisuke's work here:

@fcorti and @cijujoseph are working on creating our flagship example process that we will use to demonstrate all the new changes in the project infrastructure.

@balsarori helped us with clean up of the old Form classes contained inside the process engine.

@erdemedeiros is working on the new REST HAL APIs and data types. This is an initial PoC to make sure that we can interact with the process engine via REST, with clear definitions (in JSON format) of the data types accepted and returned by these endpoints. You can track Elias' work here:

@ryandawsonuk is working on integrating Keycloak as our default IDM and SSO provider and removing the IDM module from inside the process engine. This is a valuable contribution to the Keycloak project as well, since it didn't support Spring Boot 2 just yet, so Ryan provided a PR to the Spring Boot Keycloak adapter. You can track Ryan's work here:

@salaboy (myself): I'm working on making sure that the process engine emits events using well-defined data types via messages using Spring Cloud Stream. These messages should use the same JSON data types and payloads that Elias is defining, so I'm currently working on a PoC to demonstrate these event emitters and endpoints in action. You can track my work here:

Consider that all the code references are work in progress, so they will evolve once the basics are done. We aim to improve coverage and code quality with every commit/merge to the master branch.

This Week in Activiti 7

This week we are moving the activiti-rest and activiti-common-rest modules to the activiti-compatibility repository, so we can focus only on the core engine inside the Activiti repository.

We will aim to have a first draft of all the data types, commands and endpoints for the new HAL APIs and message-based communications, so we can share them and ask the community for feedback. We will also try to have a working example with Keycloak to demonstrate all the advantages of having a fully fledged SSO and pluggable IDM for our infrastructure.

NOTE: if you want to send a contribution to the 6.x version of the engine, please send PRs to the 6.x branch. We are going through a clean-up process on GitHub, so if you have an open PR, branch or issue, please get in touch. We will not discard any issue, but we need to make sure that all the PRs, branches and issues are still relevant.

Feel free to join us in the official Gitter channel of the project and join the discussions.

Hi everyone, welcome to a new edition of the Activiti 7 Dev Logs #2 (10/07/17-16/07/17).

We keep crunching code based on our short-term roadmap for Activiti 7. We are starting to see some light at the end of the tunnel, and there will be more blog posts about the new HAL API draft and the IDM integration with Keycloak later this week. This is a short update about what we did last week.



@daisuke-yoshimoto was adding the new REST API that generates JSON/XML/SVG for Process Definitions and SVGs for Process Instances. We are trying to provide process diagrams in a more flexible format (SVG) than a PNG/JPEG image. We wanted to analyze the complexity of providing SVG generation in the backend, in contrast with client-side generation, which should be possible as well. You can track his work here.

@gmalanga (our brand new community contributor) started working on providing a new endpoint for getting meta information about Process Definitions, such as required roles and users, User Tasks, Service Tasks, delegates/connectors required, etc. This will help us to understand more about our runtime bundles for smart deployments in the future.

@abdotalaat (a new community member as well) was working on a PR related to code quality, removing dead code and making sure that our tests improve in quality. You can check his work here.

@erdemedeiros was adding new entry points to the REST HAL API and moving the legacy REST API to the activiti-compatibility repository, and has now moved on to the initial PoC of the new History/Audit module. Salaboy and he are working together on this branch.

@ryandawsonuk was adding a sample Keycloak integration to the new REST API for authentication and for capturing users for actions such as recording process initiators. The integration is almost done, and now we need to figure out the structure of the different modules and how these modules will integrate into end users' applications. You can follow the work and the discussion here.
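As an illustration of what the Keycloak integration gives you, a client obtains a token from Keycloak and passes it to the REST API as a bearer token. The sketch below follows Keycloak's standard OpenID Connect token endpoint layout; the realm, client id, credentials and host are placeholders, not this project's configuration.

```shell
# Hypothetical token request against a Keycloak server. The URL layout
# follows Keycloak's standard OpenID Connect token endpoint; realm,
# client id, credentials and host are placeholders.
KC_BASE="http://localhost:8180/auth"
REALM="springboot"
TOKEN_URL="$KC_BASE/realms/$REALM/protocol/openid-connect/token"

# A password-grant request would look like:
#   curl -s -d 'client_id=activiti' -d 'grant_type=password' \
#        -d 'username=testuser' -d 'password=password' "$TOKEN_URL"
# The returned access_token is then sent as:  Authorization: Bearer <token>

echo "$TOKEN_URL"
```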

@salaboy: I was working on REST API refinements and overall project clean-ups and structure. We are getting quite close to our initial HAL API draft, and we will be sharing more about this soon. I finished my first take on the History/Audit and Query modules, and now Elias is taking that over for pair review and improvements. Some of my work is already in master for the API, and some more clean-ups are being reviewed here.

As a side note, we are now using GitHub Projects to track our own and community contributions, so feel free to get in touch if you find something there that you want to work on:

As usual feel free to join us in our Gitter channel for further discussions and collaborations:

Stay tuned!

Hi everyone, here is another Dev Log from the Activiti 7.x team (17/7/17 - 23/7/17). We are making a lot of progress on making the new services decoupled and independent. We are reaching a point where we can start writing examples that demonstrate clear advantages in comparison with the traditional (monolith) approach.

As usual, you can track the progress via the GitHub project:


Here is the list of people contributing to the project on a daily basis and the work that they performed last week:

@daisuke-yoshimoto was adding a new ProcessDiagramGenerator for the new REST API that generates SVGs for Process Definitions and Process Instances, in contrast with the previous version, which only generated PNG/JPG images that are difficult to manipulate.

@gmalanga was working on a set of metadata endpoints for process definitions. This will enable us to check against different components to see whether all the requirements to run our deployed process definitions are met by the runtime environment. One simple check might be whether the users and groups required by our processes are defined in our IDM provider.

@abdotalaat was working in a PR related to code quality, removing dead code and updating the PR after some changes in master.

@fcorti was working on a simple but concrete example process, defining an Emergency Call Center coordinating requests together with different organizations (Fire Department, Police, Hospitals, etc.). The process definition will run in Activiti 6, but it will be ready to be deployed in a microservices architecture in the future Activiti 7. The project is in the development phase, and you can find the design at this link. Feedback and help are more than welcome.

@erdemedeiros mainly worked on the initial PoC of the new History/Audit module: making the REST API read-only and improving the result presentation. He’ll carry on with that next week.

@ryandawsonuk verified the Keycloak integration with some tests in the sample project and mocking for identity in the Spring Boot starter project. He also set up the Keycloak integration to use a Docker container that loads a sample configuration from a JSON file. He then moved on to doing PoCs to assess options for the new Query Service.

@salaboy was working on adding the initial version of the activiti-services. These modules will contain the core logic to expose our brand new HAL APIs and Message-Based Endpoints. These modules are in the draft stage and will evolve quickly. If you are interested in more details about these services, get in touch.

This Week on Activiti 7

This week we will be working towards polishing all these services and making sure that we have examples to demonstrate how they all work in coordination. We will also publish a couple of RFC (Request For Comments) blog posts about the new HAL APIs, Data Types and Message Based Endpoints. You will also see some changes in how the artifacts are versioned so we can release an Early Access version (EA) every month. 

We are also encouraging the community to join us in the community workshops that we are going to organize once every two weeks to work on the project. If you are in the London area, get in touch, join the Gitter channel and ask us for more details. 

This is a great chance to get involved and shape the future of the Activiti project.

We are closing our 4th week (24/7/17 – 30/7/17) of work on the new Activiti 7.x project, and we are moving fast to make sure that we provide a set of robust services that are ready to be integrated into your existing microservices architecture. We have chosen a technology stack that relies on standard Spring mechanisms for integration with topics such as IDM, security, messaging and SSO.

This new month that is starting tomorrow will be all about introducing two new services called Audit Service & Query Service. These services are completely independent of any process engine instance and will be in charge of aggregating information for different users (applications) to consume. These new services will allow us to remove the responsibility from the process engine to keep track of auditing and to deal with complex queries for different types of clients.



Here is what we were up to this last week:

@daisuke-yoshimoto added a new ProcessDiagramGenerator for the new REST API that generates SVGs for Process Definitions and Process Instances. Next, he is creating a new REST API that provides XML/JSON of Process Definitions.

@gmalanga was fixing some issues related to the process definition metadata endpoint, then moved on to integrating Elasticsearch as a repository for the Query Service. He is still progressing with the Elasticsearch integration.

@abdotalaat was working in a PR related to code quality, removing dead code and making sure that our tests improve their quality.

@fcorti: the first version of the Emergency Call Center process example has been completed. This version of the process runs on Activiti 6, but in the next releases more support for a pure microservices architecture will be added.

@erdemedeiros worked on the Audit module. The existing code was adapted based on the latest changes from the master branch. He is currently adding more integration tests for this module.

@ryandawsonuk worked on the new activiti-services-query component, which will provide a way to query aggregated data consumed via an event stream. A structure has been put in place so that the default repository can be replaced with an alternative (e.g. Elasticsearch). Various approaches for providing a query DSL were evaluated, and QueryDslPredicate was chosen.

@salaboy was working on improving the command-based endpoints for interacting with the process engine, as well as adding versioning to the REST API controllers. With @erdemedeiros and @ryandawsonuk, I started looking at the release process for Maven Central and some estimates of when a first version might be uploaded there. Pull requests are piling up, so I spent some time reviewing and collaborating with @daisuke-yoshimoto and @gmalanga.

We are evaluating the possibility of holding another community workshop on the 21st of August in Central London (near Oxford Circus tube station). You are all invited to come and join us while we work that day. This is a great way of getting introduced to the work that we are doing and joining the community. If you have trouble convincing your boss of how valuable that is, get in touch and we can help you with that as well. :)

I will soon publish an updated roadmap blog post based on the work that we did for Milestone #0.

Last week (31/7/17 - 6/8/17) on the Activiti 7 project we spent a lot of time working on clean-ups and infrastructure, as well as refining and adding new Activiti Services (Gateway and Registry). We are enabling all our services to work with Docker so they can be scaled independently. We hope to start publishing these Docker images soon for those eager to try them.


@igdianov started researching tracing with Zipkin and reviewing our CQRS implementation.

@daisuke-yoshimoto added a new REST API that provides JSON/XML/SVG for Process Definitions and Process Instances. Next, he is creating a new REST API for handling task variables.

@gmalanga was working on the Elasticsearch implementation of the query module.

@abdotalaat was working on a migration application for User, Groups and Memberships from Activiti 5.x/6.x to Keycloak json configuration.

Francesco Corti (@fcorti) this week worked on the second iteration of the Emergency Call Center Activiti example. The enhancement this time is that a real microservices architecture has been developed, using a Spring HATEOAS project to run three different services in three different Apache Tomcat instances. You can check the project here for further details.

@erdemedeiros (Elias Ricken de Medeiros) worked on the Audit module: improved test coverage and launched the Docker container used for tests directly from Maven. The initial implementation is now merged into the master branch. He is moving on to the query module.

@ryandawsonuk (Ryan Dawson) set up the build to create Docker images of the core Activiti services and put together a start script to run the build and start containers from the images using docker-compose.

@salaboy worked on finishing the initial version of the command-based endpoints, as well as how to test these interactions. I’ve also moved the Spring Boot starter sample app to a new repository under Activiti/activiti-examples. We are refining our services to make sure that they all play nicely together; as part of this month's roadmap we have included two new services, the Gateway and the Registry, which will enable us to scale and communicate between our services in a unified way.


This week plan

This week we need to review our SSO security integration with Keycloak now that our services are behind a Gateway; we need to make sure that the Gateway itself is secure and that each service behind the Gateway can consume the signed tokens. We also need to finish the initial Docker configuration of the following services: Gateway, Registry, Runtime Bundles (Process Engine), Audit Service, Query Service, RabbitMQ and all the DataSources, so we can start building examples.

Stay tuned!

See you all in Gitter

On behalf of the team, I am pleased to announce that Alfresco Process Services 1.6.4 has been released and is available now. This release contains some important bug fixes. Here are the notable highlights:



    • Possibility to encrypt sensitive properties (e.g. DB user password, LDAP user password, etc.) used in the Alfresco Process Services properties files. Please check the documentation.
    • Whitelisting is enhanced to cover class whitelisting in JavaScript. Please check the documentation.
    • Improved SQL Injection protection.


For the complete list of improvements in 1.6.x, please check the what’s new page in the Alfresco Process Services documentation.

This blog post is a companion to another blog post that I wrote around the text sentiment analysis topic. This post will focus on the process side and how to move a file automatically into a folder using the metadata info. If you want more details about the sentiment analysis, you can read this article.


For those of you familiar with the BPM world, the diagram below is almost self-explanatory, but let's see in detail what the process that we are going to implement does:


  • The process starts by getting a nodeId as an input.

  • In the second block, all the metadata related to this node is fetched from the content service through the API.

  • If the sentiment is < 0.5, the content will be moved to the "Bad Folder".
  • If the sentiment is >= 0.5, the content will be moved to the "Good Folder".
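The gateway's decision is just a threshold comparison on the sentiment value. A minimal sketch of that logic (the variable names here are illustrative, not the ones used in the process definition):

```shell
# Minimal sketch of the gateway's threshold logic; 'sentiment' stands
# for the metadata value fetched from the content service.
sentiment="0.42"
destination=$(awk -v s="$sentiment" \
  'BEGIN { print ((s < 0.5) ? "Bad Folder" : "Good Folder") }')
echo "$destination"   # prints: Bad Folder
```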

How to import the app

To simplify the execution of this example, I have already created the app that implements the flow above. You can download the sentiment app from the following link.

Once you have downloaded the sentiment app, from the main page of the process service:


1. Go to App Designer.

2. Select the Apps tab and click on Import App.

3. Import the downloaded sentiment-app.


Process service Endpoint configuration

Let's see how to configure the content service endpoint in the process service tenant configuration. From the main page of the process service:


1. Open the process service as admin


2. Identity management -> 


3. Select the Tenants tab and from the Tenant dropdown select the tenant that will run the app.


4. Press the plus button in the Basic Auth configuration, add your credentials for the content service and press Save:



5. Click on the plus button of the endpoints table and add your content service endpoint information as in the screenshot below:




Now that our app is configured in the process service we need to configure the content service.


Content service Metadata configuration


As you can see from the BPM graph, our Process has a gateway:



This gateway decides whether the next step to execute is to move the content file to the bad folder or to the good folder.

How does the process know if it is a good or bad file?

In order to achieve this level of consciousness, I have added a new metadata property inside the content service. The process gateway will analyze the value of this property and, based on it, will execute the corresponding action.

Let's see how to add a new metadata in the content service:


1. Login inside your content service

2. Click on the admin tab

3. From the left Tools menu select Model Manager

4. Import the model downloaded from the GitHub repository.


Now, as the last step of configuration, we need to create our bad folder and good folder in the content repository.


Curl call to create the bad folder (this assumes a standard Alfresco Content Services v1 REST API reachable on localhost:8080; adjust the host and credentials to your environment):

curl -X POST --header 'Content-Type: application/json' --header 'Accept: application/json' --header 'Authorization: Basic YWRtaW46YWRtaW4=' -d '{
  "name": "bad folder",
  "nodeType": "cm:folder"
}' 'http://localhost:8080/alfresco/api/-default-/public/alfresco/versions/1/nodes/-root-/children'

Curl call to create the good folder (same assumptions as above):

curl -X POST --header 'Content-Type: application/json' --header 'Accept: application/json' --header 'Authorization: Basic YWRtaW46YWRtaW4=' -d '{
  "name": "good folder",
  "nodeType": "cm:folder"
}' 'http://localhost:8080/alfresco/api/-default-/public/alfresco/versions/1/nodes/-root-/children'
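As a side note, the Authorization header used in those calls is plain HTTP Basic auth: the value after "Basic" is just the base64 encoding of user:password (YWRtaW46YWRtaW4= decodes to admin:admin, the default credentials). A quick way to generate the header for your own credentials:

```java
import java.util.Base64;

public class BasicAuthHeader {
    public static void main(String[] args) {
        // Replace with your content service user:password
        String credentials = "admin:admin";
        String header = "Basic " + Base64.getEncoder()
                .encodeToString(credentials.getBytes());
        System.out.println(header); // prints: Basic YWRtaW46YWRtaW4=
    }
}
```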


Get the two nodeIds returned by the calls above and use them to configure the Move Bad Folder and Move Good Folder steps of the process.

Click on the Move Good Folder and Move Bad Folder steps to replace your folder ids in the request calls, as in the gif below:


At this point the process is fully configured; all you need to do is set the value of the metadata property used for the choice and pass the nodeId to check as input to the process.

To execute those steps in a nice visual way, I suggest you take a look at this other blog post; otherwise, you can populate the metadata using the REST API of the Content service and start the process using the REST API of the process service.

On behalf of the team, I am pleased to announce that Alfresco Process Services 1.6.3 has been released and is available now. This release contains a few important bug fixes as well as a couple of improvements. Here are the notable highlights:

  • Kerberos SSO support

Organizations using a Kerberos AD infrastructure can now quickly set up Windows-based SSO to allow secure and seamless access to the Alfresco Process Services application without an explicit login. Please check the dedicated documentation page for the Kerberos configuration settings.

  • Whitelisting

As of version 1.6.3, it is no longer required to whitelist specific beans and classes in order to use the Alfresco, Box and Google Drive out-of-the-box publish tasks. They now work by default. Please check the dedicated documentation page.

  • Supported platform

Red Hat Enterprise Linux version 7.3 is now supported.

  • Getting started with Alfresco Process Services

Unfamiliar with Digital Process Automation and Business Process Management (BPM)? Try out our getting started tutorial and build your first app in 3 steps.


Getting started with Alfresco Process Services

For the complete list of improvements, please check the what’s new page in the Alfresco Process Services documentation.


Activiti 7 Kick Off Roadmap

Posted by salaboy Employee Jul 5, 2017

If you were looking at the Activiti/Activiti repository, you might have noticed that we are restructuring the project. Activiti is taking a new direction towards microservices architectures and we are planning to make big design upgrades.


This new direction will give users the flexibility to open up the architecture, replace components as needed and to scale the engine and your applications independently. In combination with containerization (Docker), Microservice architectures with event-driven endpoints provide a cloud-native approach to deploy and interact with the engine in a distributed fashion.


However, we are well aware that users are currently embedding Activiti 5.x and 6.x in their applications. As we move toward the final release of the next major version of Activiti, we will provide a set of compatibility and migration tools. Embedding of the Activiti engine will continue to be an important feature for the project. You won’t need to re-architect your application to use this version of Activiti. However, new microservices architectures implementations will be able to take advantage of this new approach.


We understand that not everyone is using microservices yet, but for those that are (and those that are going to be) we need to make a real upgrade of the technology behind the engine and of how the internal components integrate with each other. The Activiti Process Engine was designed several years ago, which means that there are some components, extension points, and mechanisms that have a high impedance mismatch with modern architectures and environments.


We recognize that most open source process engines out there claim to follow microservices practices, but we haven’t seen the major refactoring of how the process engine works, or of how it integrates with cloud providers and other microservices, that would be required for a truly microservice architecture. Most of these other projects treat other microservices as REST endpoints that you can interact with; by doing so, they ignore the infrastructure that you are running on, and we want to leverage that.

We also recognize the need to play nicely with containers and other technology stacks. For that reason, we need to upgrade how the engine and other services deal with versioning and immutability. With microservice architectures, if you want to integrate nicely in fast-evolving environments, you need to make sure that other technology stacks integrate well with your APIs and data types. For that reason, we are designing our new APIs to provide consistent data types and HAL (Hypertext Application Language) APIs to help you build modern applications with cutting-edge technologies.


We have created a short term roadmap for this initial refactoring, which will provide the stepping stones for more advanced services that will come towards the end of the year such as:

  • Distributed Process Execution and Coordination
  • Contextual Event Driven Case Management  
  • Decision Management and Inference Service
  • Blockchain integrated Audit Logs
  • Polyglot Process & Decision Runtime targeting IoT and mobile platforms


In the short term we want to focus on the process engine APIs and data types, as well as on our deployment model using containers and orchestrators such as Docker and Kubernetes.


Key points


Below is a list of key points that we are going to cover in the short term:


  • Code Quality, Coding Standards, Code Maturity and Code Modularity
  • Infrastructural changes and tools that we are going to use to keep the project healthy
  • New Rest HAL APIs, new Messaging endpoints with JSON payloads, new Event Listeners with JSON payloads
  • We will decouple all the components that are not part of the core process engine into new repositories and well-focused services
  • We will provide alignment with the Spring Community (Spring Boot 2 / Spring Cloud), targeting AWS + Kubernetes + Docker as our main deployment strategy
  • We will reuse as much of Spring Cloud as possible to make sure that the Process Engine doesn’t overlap with services provided by the infrastructure


By covering all these key points we plan to provide a set of services that you will be able to use as building blocks for your implementations, making sure that the process engine is only responsible for automating your business processes and that the impedance mismatch with your infrastructure is minimal. The following is the proposed roadmap for the next three months. We aim to test our release process by the end of July 2017, so the first milestone might be delayed a little bit.


Milestone #0 - 31 July 2017


  • Clean up Activiti
    • Repository cleanup & restructuring
    • Dependency upgrades/Alignment with Spring 5 & Spring Boot 2
    • Infrastructure
      • GIT / Travis / Bamboo / Maven Central
      • Daily Snapshots
    • Test Coverage review
    • Testing frameworks review
    • Logging Frameworks review
  • Domain API + HAL API + Runtime Bundle 
    • Process Definitions
    • Process Runtime
    • Task Runtime
    • Source/Producer of Audit Events
  • Event Store for Audit Information - Initial Implementation
    • We should be able to query for all the events generated by multiple process engines
  • Identity Management and SSO (KeyCloak research in progress) - initial integration
  • First Release Process


Milestone #1 - 28th August 2017


  • Domain API + HAL API + Runtime Bundle
    • Source/Producer of Integration Events
    • (Source/Producer of Async Job Executions)
  • Event Store for Runtime Information - initial implementation
  • Gateway (Zuul research in progress) - Initial configuration and infrastructure
  • Application & Service Registry (Eureka research in progress) - Initial configuration and infrastructure
  • Tracer (Zipkin research in progress) - Initial configuration and infrastructure
  • AWS example demonstrating all these services in action


Milestone #2 - 29th September 2017


  • Event Store for Runtime Information - initial version
    • Decoupled query module based on events sourcing and rolling snapshots
    • We should be able to query the current state of processes and tasks without talking with the engine
  • Application Service / Case Management Features - initial version
    • Provide basic case management constructs
    • Provide coordination between different process runtime bundles
    • Deal with versioning, upgrades and case management like behaviors


Feedback is welcome


We encourage all users to start trying our milestones as soon as possible so you can get a sense of how these services are going to fit into your implementations.


We are looking forward to comments, questions, concerns, and collaborations. We will be working in a truly open source way, meaning that we want you to participate in the development of this and future roadmaps. We are looking for contributors of all levels, so if you feel drawn to one of these topics, want to learn new things, or want to participate in the conversation, please get in touch.


Where to find more resources


We are using a Gitter channel here to discuss the roadmap, plan and collaborate every day, feel free to join:


We also have set up the following public services that you might find of interest:

Travis CI for continuous integration:

Codecov for code coverage reports:

Codacy for code style, security, duplication, complexity, and coverage:


These services will keep track of our code quality progress and release cycle.


There will be more announcements and blog posts about the topics mentioned here. We will be sharing progress and engaging community members regularly, stay tuned!


Original blog post: Activiti 7 Kick Off Roadmap – Salaboy (Open Source Knowledge) 

Activiti 6 is here and it brings a number of significant updates to its core functionality.

The main highlights are:


  • Pluggable Persistence

Previously, the persistence logic was spread across different parts of the code. This made it hard to maintain and impossible to customise. Now the persistence logic has been centralised, and a new set of DataManager interfaces (for low-level CRUD operations) has been introduced, in addition to refactoring all entity classes and entity manager interfaces. This provides a high level of abstraction that makes it easy to customize the persistence logic or swap in a totally different implementation: custom implementations can use different ORM libraries, multiple databases, NoSQL stores, or relational and non-relational databases concurrently, and more!
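To make the idea concrete, here is a minimal sketch of what such a low-level CRUD abstraction looks like, with an in-memory implementation swapped in. The interface and names are illustrative only, not the actual Activiti DataManager API:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative low-level CRUD abstraction, in the spirit of Activiti 6's
// DataManager interfaces (names simplified, not the real API).
interface DataManager<T> {
    T findById(String id);
    void insert(String id, T entity);
    void update(String id, T entity);
    void delete(String id);
}

// Because persistence sits behind the interface, this in-memory implementation
// could be swapped for a JPA, MyBatis, or NoSQL-backed one without touching
// the code that uses it.
class InMemoryDataManager<T> implements DataManager<T> {
    private final Map<String, T> store = new HashMap<>();
    public T findById(String id) { return store.get(id); }
    public void insert(String id, T entity) { store.put(id, entity); }
    public void update(String id, T entity) { store.put(id, entity); }
    public void delete(String id) { store.remove(id); }
}

public class DataManagerSketch {
    public static void main(String[] args) {
        DataManager<String> policies = new InMemoryDataManager<>();
        policies.insert("p1", "policy #1");
        System.out.println(policies.findById("p1")); // prints: policy #1
        policies.delete("p1");
        System.out.println(policies.findById("p1")); // prints: null
    }
}
```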


  • No PVM, just BPMN

Activiti 5 uses a PVM (Process Virtual Machine) intermediate layer between the core engine and the BPMN model. The Activiti 6 core engine now works directly with the BPMN model. All classes in the org.activiti.engine.impl.pvm package (and subpackages) have been removed since they are no longer needed. This results in reduced complexity and increased performance, and also makes it possible to do things that were not previously possible in Activiti 5.


For example:

Running the process on Activiti 5 will result in a StackOverflowError. In Activiti 6 the process will run without any issues. Check it yourself by running this test, available on GitHub. The test class also contains other tests, like the inclusive gateway and concurrency after a boundary event, that were failing in Activiti 5 but are now working fine.


  • Support for dynamic processes & ad-hoc sub processes

Consider a process definition designed to include a service task that communicates with an external system. After deploying and running some process instances, some issues occurred in the external system and modifications were needed to that system. The modifications aren’t backward compatible; as a result, an updated service task implementation that’s aware of those changes is required. The process definition can be updated and redeployed, but what happens to in-flight process instances? Isn’t there a way to fix them? It’s possible to update the process definition manually in the database, but that’s not a clean way to fix it. That’s where the DynamicBpmnService is very useful. DynamicBpmnService enables changing properties/attributes of a process definition (e.g. task assignee, task priority, service task class, script task script) without the need to redeploy. Some examples of using DynamicBpmnService can be found on github here.


Ad-hoc sub processes provide the possibility to dynamically add sequences of work on the fly. As part of the dynamic support introduced in the engine, an ad-hoc sub process allows defining tasks without a predefined sequence order; the order can be determined at runtime. There could also be some tasks that need to be executed in a fixed order alongside other tasks whose order is left to runtime. The following example shows two sets of tasks: one set (A task, Next Task, and Final Task) that should be executed in a pre-defined order, and another set whose order can be determined at runtime. The following code shows how to get and execute those tasks.

// Find the execution currently waiting in the ad-hoc sub process
Execution execution = runtimeService.createExecutionQuery().activityId("adhocSubProcess").singleResult();

// List the activities that can be started inside the ad-hoc sub process
List<FlowNode> enabledActivities = runtimeService.getEnabledActivitiesFromAdhocSubProcess(execution.getId());
assertEquals(4, enabledActivities.size());

// Execute one of the enabled activities
runtimeService.executeActivityInAdhocSubProcess(execution.getId(), enabledActivities.get(0).getId());


  • DMN Engine & Designer

The DMN Engine and Designer allow you to create decision tables and get an outcome based on the input values and the rules defined in the decision table. These decision tables can be invoked from a decision rule task in a BPMN definition, but also independently of a process instance.


  • Form Designer & Engine

The Form Designer is a web-based visual environment to quickly design web forms without requiring special technical skills. It empowers business users to be part of the design phase. The Form Engine centralizes the form logic that was introduced with the new Form Designer. Forms can be deployed with BPMN process definitions but also deployed separately. Forms can be either referenced in a start event or a user task.


  • Activiti 6 App UI

The activiti-app is a new web-based environment composed of 3 Apps:  Kickstart App, Task App and Identity Management app.

  • The Kickstart App includes a BPMN 2 Editor to design process models, a Form Editor to visually design forms, a decision table editor to create DMN decision tables and an App Editor to create and publish process apps bundling all the models in one single package.
  • The Task App allows you to start new processes and tasks and access tasks assigned to you from any process apps.
  • The Identity Management app gives the admins the capability to create and manage users and groups. 


The Activiti 6 app UI home page.


Activiti 6 Task App showing the process list view.




Activiti 6 Identity management app showing the user list view.



  • Activiti 6 Admin App UI

The activiti-admin app is an administration console for administrators to monitor running tasks, process instances and jobs. Admins can perform various actions such as assigning/claiming/delegating tasks, terminating/deleting process instances, updating/adding/deleting variables, and executing/deleting jobs.


Activiti 6 Admin console showing a running instance detailed view.



  • Migration guide

Ready to move to Activiti 6? We have created a migration guide that outlines various aspects to consider and what is needed when migrating from Activiti 5.x to Activiti 6. Here is the download page to get started with Activiti 6!

Amazon Simple Queue Service (SQS) and Apache ActiveMQ ™ are two popular messaging systems/platforms out there. Alfresco Process Services powered by Activiti (APS) can be integrated with these systems in a few different ways. Some of the available options are:

  • Custom extension projects built using Spring libraries
  • Using Apache Camel Component in APS
  • Using Mule Component in APS


To demonstrate the first option mentioned above, I built a couple of very simple Java projects (one for SQS and one for ActiveMQ). The idea of this blog is to point you to those examples. Since these example projects are really simple, I'll keep this blog really short.


The pattern is pretty much the same in both examples:

  • Establish a connection with the respective messaging system from Alfresco Process Services
  • Listen to an SQS/MQ queue for new messages and start a process for every new message.
  • Send messages to SQS/MQ Queue during the process instance execution.
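The listen-and-start part of this pattern can be sketched independently of any particular broker. In the sketch below, a LinkedBlockingQueue stands in for the SQS/ActiveMQ client, and startProcess() is a stub for the call an APS extension would make to the Activiti RuntimeService; both substitutions are assumptions for illustration, not code from the example projects:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class QueueToProcessBridge {
    // Stand-in for the SQS/ActiveMQ consumer connection
    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();
    // Records started "process instances" for this sketch
    private final List<String> startedProcesses = new ArrayList<>();

    // In a real extension this would call something like
    // runtimeService.startProcessInstanceByKey(...) with the message payload.
    void startProcess(String message) {
        startedProcesses.add("started for: " + message);
    }

    // Poll the queue and start one process instance per message.
    public void drain() {
        try {
            String message;
            while ((message = queue.poll(100, TimeUnit.MILLISECONDS)) != null) {
                startProcess(message);
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    public void send(String message) { queue.add(message); }
    public List<String> started() { return startedProcesses; }

    public static void main(String[] args) {
        QueueToProcessBridge bridge = new QueueToProcessBridge();
        bridge.send("new-claim-message");
        bridge.drain();
        System.out.println(bridge.started()); // one process started per message
    }
}
```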


APS Integration with Amazon SQS

Source: GitHub: aps-aws-sqs-extension 


APS Integration with Apache ActiveMQ

Source: GitHub: aps-activemq-extension 

To try this out and for more details, please refer to the README file available in the above-mentioned projects.

This is a continuation of my previous blog post about data models, Business Data Integration made easy with Data Models. In this example I'll be showing the integration of Alfresco Process Services powered by Activiti (APS) with Amazon DynamoDB using Data Models. The steps required to set up this example are:


  1. Create Amazon DynamoDB tables
  2. Model the Data Model entities in APS web modeler
  3. Model process components using Data Models
  4. DynamoDB Data Model implementation
  5. App publication and Data Model in action


Let’s look at each of these steps in detail. Please note that I’ll be using the acronym APS throughout this post to refer to Alfresco Process Services powered by Activiti. The source code required to follow the next steps can be found at GitHub: aps-dynamodb-data-model 

Create Amazon DynamoDB tables

As a first step to run this sample code, the tables should be created in Amazon DynamoDB service.

  1. Sign in to AWS Console
  2. Select "DynamoDB" from AWS Service List
  3. Create table "Policy" (screenshot below)
    1. Table name : Policy
    2. Primary key : policyId
  4. Repeat the same steps to create another table called "Claim"
    1. Table name : Claim
    2. Primary key : claimId

Now you have the Amazon DynamoDB business data tables ready for process integration.

Model the Data Model entities in APS web modeler

The next step is to model the business entities in APS Web Modeler. I have already built the data models and they are available in the project. All you have to do is import the app into your APS instance. Please note that this app is built using APS version 1.6.1 and will not import into older versions. If you are using APS version 1.5.x or older, please import the app from the project I used in my previous blog post.


Once the app is successfully imported, you should be able to see the data models. A screenshot is given below.

Model process components using Data Models

Now that we have the data models available, we can start using them in processes and associated components such as process conditions, forms, decision tables, etc. If you inspect the two process models that got imported in the previous step, you will find various usages of the data model entities. Some of those are shown below:



Using Data Model in a process model

Using Data Models in sequence flows


Using Data Model in Forms


Using Data Models in Decision Tables (DMN)



Let’s now go to the next step, which is the implementation of the custom data model that handles the communication between process components and Amazon DynamoDB.


DynamoDB Data Model implementation

In this step we will be creating an extension project which will eventually do the APS <--> Amazon DynamoDB interactions. You can check out the source code of this implementation at aps-dynamodb-data-model. For step-by-step instructions on implementing custom data models, please refer to Activiti Enterprise Developer Series - Custom Data Models. Since you need a valid licence to access the Alfresco Enterprise repository to build this project, a pre-built library is available in the project for trial users - aps-dynamodb-data-model-1.0.0-SNAPSHOT.jar. Given below are the steps required to deploy the jar file.

  1. Create a properties file with the following entries and make it available on the APS classpath

aws.accessKey=<your aws access key>
aws.secretKey=<your aws secret key>
aws.regionName=<aws region eg: us-east-1>

  2. Deploy the aps-dynamodb-data-model-1.0.0-SNAPSHOT.jar file to activiti-app/WEB-INF/lib

App publication and Data Model in action

This is the last step in the process, where you can see the data model in action. In order to execute the process, you will need to deploy (publish) the imported app first. You can do it by going to APS App UI -> App Designer -> Apps -> InsuranceDemoApp -> Publish

Once the process and process components are deployed, you can either execute the process yourself and see it in action, or refer to the video link in Business Data Integration made easy with Data Models demonstrating the data model.


Once you run the processes, log back into the AWS Console and check the data in the respective tables as shown below.



That’s all for now. Again, stay tuned for more data model samples….