
Newbie question: Activiti's usefulness for CI environments

Question asked by abrindeyev on Jun 13, 2013
Latest reply on Jun 19, 2013 by jbarrez
Hi, everybody!

I'm new to Activiti and wondering whether it can help me build the core of my company's continuous integration system.

We're using Jenkins as the core of our CI system. It's a familiar tool for both developers and QA engineers. However, from a CI pipeline construction standpoint it's a poor fit:

1. You need to introduce a set of purely technical jobs just to join several parallel executions into one and to control shared resources between jobs.
2. You have to fight incorrect build numbers: if builds are queued for some particular job, say FUNC_TESTS, then the next several parent builds will all print "Triggering a new build of FUNC_TESTS #234" in their console logs, but in reality that build will get a different (greater) number later.
3. You can't reliably keep state inside Jenkins. If your pipeline includes a set of resources (such as VM pool nodes or DB pool databases) that could and should be reused across several of the pipeline's jobs for better efficiency, we end up with resources still marked as occupied while in reality the pipeline finished a while ago.
4. You need to distinguish failures caused by technical (infrastructure) issues from failures where the test suite's cases failed due to recently committed changes. Right now we're using 'UNSTABLE' builds for the latter scenario and 'FAILED' builds for infrastructure issues (like being unable to get a VM from the cloud provider, or a failed deployment with Chef cookbooks).
5. Sometimes you need to restart a specific job inside the pipeline to test a specific build of a Java artifact: the normal build of that job failed due to infrastructure issues, a human fixed the issue, and now we need to restart the test (or part of the pipeline from a specific point) because other QA teams have already certified that specific build. We can't just make another commit and run the whole pipeline from the start.

I've come to the conclusion that one needs to orchestrate (one or more) Jenkins instances externally with a third-party service.

The ideal solution to me right now is the following:
1. A build job in Jenkins watches the VCS (SVN/Git repositories in our case) and builds a Java artifact once a change in the repository is detected.
2. Upon successful artifact construction, the build job asks the external service to start a named pipeline to test that build (passing the artifact's URL to the process as an input parameter).
3. The external service runs a process flow for that build, firing several Jenkins jobs via the Remote API as necessary. Note: there will be no inter-job links inside Jenkins; from Jenkins' point of view, all jobs are unrelated to each other.
4. Launched jobs can ask the external service for resources (cloud pool nodes, databases and so on).
5. QA engineers should be able to get the status of a particular process (CI pipeline) from ANY participating build in Jenkins: when that pipeline was started (build job name/build #), whether it is still running, a tree of the jobs currently running, when the pipeline finished, and so on. Developers and QAs currently use the imperfect Downstream view Jenkins plugin for that purpose.
6. Once the pipeline is finished (with success or any kind of failure), the external process needs to free any allocated resources (VMs and databases) back to their providers.
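For step 3, the external service would only need plain HTTP calls: Jenkins' remote API starts a parameterized build via a POST to `JENKINS_URL/job/NAME/buildWithParameters`. As a minimal sketch (class and parameter names are mine, and a real call would also need authentication and a CSRF crumb), the trigger URL could be built like this:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.stream.Collectors;

public class JenkinsTrigger {
    /**
     * Builds the Jenkins remote-API URL that starts a parameterized build.
     * POSTing to the returned URL (with proper credentials) queues the job,
     * so the orchestrating service never needs inter-job links inside Jenkins.
     */
    public static String buildTriggerUrl(String jenkinsBase, String jobName,
                                         Map<String, String> params) {
        String query = params.entrySet().stream()
                .map(e -> e.getKey() + "="
                        + URLEncoder.encode(e.getValue(), StandardCharsets.UTF_8))
                .collect(Collectors.joining("&"));
        return jenkinsBase + "/job/" + jobName + "/buildWithParameters?" + query;
    }
}
```

For example, the pipeline's build step could pass the artifact's URL as the `ARTIFACT_URL` parameter to each downstream test job.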

I know that there is a Jenkow plugin. It's in early development and still has a major design flaw: see #3 in the issues list above. We can't afford to keep state in Jenkins, as we've seen too many failures in the past.

List of questions:
1. Could such an 'external service' be backed by Activiti?
2. I'm new to BPM. How can I implement a simple workflow: a build job which produces an artifact, then several test jobs which need to run in parallel, and (after successful execution of ALL test jobs) an upload to Nexus?
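My rough understanding so far is that question 2 maps to a BPMN 2.0 parallel gateway fork/join, something like the sketch below (this fragment would sit inside a `<definitions>` element with the `activiti` namespace declared; all ids and delegate class names are placeholders I made up, so corrections are welcome):

```xml
<process id="ciPipeline" name="Hypothetical CI pipeline">
  <startEvent id="start"/>
  <sequenceFlow id="f1" sourceRef="start" targetRef="build"/>
  <serviceTask id="build" name="Build artifact"
               activiti:class="com.example.ci.BuildJobDelegate"/>
  <sequenceFlow id="f2" sourceRef="build" targetRef="fork"/>
  <parallelGateway id="fork"/>
  <sequenceFlow id="f3" sourceRef="fork" targetRef="funcTests"/>
  <sequenceFlow id="f4" sourceRef="fork" targetRef="integTests"/>
  <serviceTask id="funcTests" name="Functional tests"
               activiti:class="com.example.ci.TestJobDelegate"/>
  <serviceTask id="integTests" name="Integration tests"
               activiti:class="com.example.ci.TestJobDelegate"/>
  <sequenceFlow id="f5" sourceRef="funcTests" targetRef="join"/>
  <sequenceFlow id="f6" sourceRef="integTests" targetRef="join"/>
  <parallelGateway id="join"/>
  <sequenceFlow id="f7" sourceRef="join" targetRef="upload"/>
  <serviceTask id="upload" name="Upload to Nexus"
               activiti:class="com.example.ci.NexusUploadDelegate"/>
  <sequenceFlow id="f8" sourceRef="upload" targetRef="end"/>
  <endEvent id="end"/>
</process>
```

The joining gateway would wait until both test branches complete before the Nexus upload runs, which is exactly the "after ALL test jobs" semantics I'm after. Is that the right approach?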

Thanks to everybody who was able to read this manuscript all the way to this point :-)