
Deploying React pages to Share

Posted by ddraper Jan 18, 2017


I've previously experimented with developing Alfresco clients using a number of different application frameworks. In this blog post I'm going to step through the process of creating a page using React and deploying it into Share.

Although I'm using React in this example I could have just as easily used Vue.js, Aurelia, Angular, Ember or any framework that provides a CLI or project template for developing and packaging a Single Page Application.

Basic Setup

  1. Install the create-react-app CLI following the instructions on the linked page
  2. Create a new application:
    create-react-app MyApp
  3. Change to the project directory:
    cd MyApp
  4. Eject!
    npm run eject
  5. Edit the "scripts/start.js" file and add the following at the end of the "addMiddleware" function:
    let alfrescoProxy = httpProxyMiddleware('/share/proxy/alfresco-api', {
       target: 'http://localhost:8080/alfresco',
       changeOrigin: true,
       pathRewrite: {
          '^/share/proxy/alfresco-api': 'api'
       }
    });
    // Register the proxy with the development server
    devServer.use(alfrescoProxy);
  6. Install alfresco-js-utils (the version used at the time of writing is specified; newer versions may be available at the time of reading!):
    npm install alfresco-js-utils@0.0.7 --save-dev
  7. Start the application:
    npm start


Why Eject?

Strictly speaking it is not necessary to eject (and it would be preferable not to). However, in this example I'm re-using my own NPM package called alfresco-js-utils, which provides some re-usable JavaScript functions. One of these functions is a service for retrieving Nodes from the Alfresco Repository using a V1 REST API, and the URL that this function calls starts with /share/proxy/alfresco-api because it is intended to ultimately be run on Share. It is possible to configure an API proxy for the create-react-app CLI, but it does not support URL rewriting. Therefore we need to eject in order to gain access to the start.js script, which we edit to add custom HTTP proxy middleware.
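The effect of the pathRewrite configuration can be sketched as a standalone function (purely illustrative — the real rewriting happens inside http-proxy-middleware):

```javascript
// Illustrative sketch of the proxy's path rewriting: a request made by the
// browser to the Share-style URL is forwarded to the Repository's v1 REST API.
// The prefix and target below mirror the middleware configuration shown above.
function rewriteProxyPath(requestPath) {
  const prefix = '/share/proxy/alfresco-api';
  const target = 'http://localhost:8080/alfresco';
  if (!requestPath.startsWith(prefix)) {
    return null; // not handled by this proxy rule
  }
  // '^/share/proxy/alfresco-api' -> 'api', prepended with the proxy target
  return target + '/api' + requestPath.slice(prefix.length);
}

console.log(rewriteProxyPath('/share/proxy/alfresco-api/-default-/public/alfresco/versions/1/nodes/-root-'));
// -> http://localhost:8080/alfresco/api/-default-/public/alfresco/versions/1/nodes/-root-
```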


Create Your Page

You should now have an application running locally with a great development experience (hot-reloading, etc.) provided by create-react-app. If you make a REST API call (using a URL starting with /share/proxy/alfresco-api) then you should get a basic authentication challenge, and once you've provided some valid credentials you'll be able to access data from the Alfresco Repository (don't worry, when you deploy into Share the authentication will be handled for you by Share/Surf).
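As a sketch (the helper name is hypothetical and not part of alfresco-js-utils), a proxied call to the standard v1 "list node children" endpoint might be built and used like this:

```javascript
// Hypothetical helper that builds the proxied URL for the v1 REST API
// "list node children" endpoint. The '-root-' alias refers to Company Home
// in the Alfresco Repository.
function nodeChildrenUrl(nodeId) {
  return '/share/proxy/alfresco-api/-default-/public/alfresco/versions/1/nodes/' +
         encodeURIComponent(nodeId) + '/children';
}

// In a React component you might then call (sketch):
// fetch(nodeChildrenUrl('-root-'), { credentials: 'include' })
//    .then(response => response.json())
//    .then(json => this.setState({ entries: json.list.entries }));
```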


I've built a very simple client for browsing company home. You can view/copy the code from this GitHub repository (check out this tag for the code at the time of writing this post). The client contains 4 components:

  • List
  • ListView
  • Breadcrumb
  • Toolbar

...that provide a simple interface for browsing.


Build and Deploy

Once you're happy with your page then run:

npm run build

This will populate the build folder with the resources that you want to deploy into Share. 


  1. Copy the build folder to a new location
  2. Rename the static folder to be "META-INF"
  3. Create the following files:
    1. alfresco/site-data/pages/react.xml
    2. alfresco/site-data/template-instances/react.xml
    3. alfresco/templates/react.ftl


Your folder should now look like this:


Now update the files as follows:



<?xml version='1.0' encoding='UTF-8'?>
<page>
   <template-instance>react</template-instance>
   <authentication>user</authentication>
</page>

This defines the new Surf page that will contain our React code.



<?xml version='1.0' encoding='UTF-8'?>
<template-instance>
   <template-type>react</template-type>
</template-instance>

This is referenced from the page and creates a mapping to a template.


Copy the contents of the index.html file into the alfresco/templates/react.ftl file but replace all occurrences of "/static" with "/share/res".

If you change the contents of any JS or CSS and rebuild, it will be necessary to copy and update the contents of index.html again because the resources are named with a checksum (just like Surf/Aikau does!)
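The core of that copy step is a simple string rewrite, which could be scripted along these lines (a sketch; the file locations follow the layout described above):

```javascript
// Sketch of the copy step's core transformation: Share serves the JAR's
// META-INF resources from /share/res, so the CRA build's "/static" URLs in
// index.html need to be rewritten when producing alfresco/templates/react.ftl.
function rewriteResourceUrls(html) {
  return html.split('/static').join('/share/res');
}

// Usage (Node.js, hypothetical file locations):
// const fs = require('fs');
// fs.writeFileSync('alfresco/templates/react.ftl',
//    rewriteResourceUrls(fs.readFileSync('build/index.html', 'utf8')));
```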

Now bundle up the contents of the copied build folder (not including the build folder itself) as a JAR file and copy it into the WEB-INF/lib directory of Share and restart the server that it is running on.


Once the server has restarted you'll be able to log in to Share and access your page at /share/page/react.



Include the Header and Footer

If you want your users to access this new page within the context of Share then it makes sense to include the standard header and footer.


Update the react.ftl file initially so that it looks like this:

<#include "/org/alfresco/include/alfresco-template.ftl" />
<@templateHeader>
   <!-- Insert CSS link here -->
</@templateHeader>
<@templateBody>
   <div id="alf-hd">
      <@region scope="global" id="share-header" chromeless="true"/>
   </div>
   <div id="bd">
      <!-- Insert contents of body element here -->
   </div>
</@templateBody>
<@templateFooter>
   <div id="alf-ft">
      <@region id="footer" scope="global" />
   </div>
</@templateFooter>


...and then add the CSS <link> element and the contents of the <body> element from index.html into the commented sections.


Update your JAR file with the changes, redeploy and restart the server and when you access the same URL you'll see the header and footer.


Add a Link to the Header

Obviously it's not ideal for users to have to enter a URL into the browser to get to your new page so let's add a link into the header bar.


This Stack Overflow question and answer provides a good overview on doing this properly (go on, give it an up vote - you know you want to!) but the basic steps in this case are as follows:


  1. Create an alfresco/site-data/extensions/react-extension.xml file containing:
             <extension>
                <modules>
                   <module>
                      <id>React Extension</id>
                      <auto-deploy>true</auto-deploy>
                      <customizations>
                         <customization>
                            <targetPackageRoot>org.alfresco.share.header</targetPackageRoot>
                            <sourcePackageRoot>org.alfresco.share.pages.customizations.share.header</sourcePackageRoot>
                         </customization>
                      </customizations>
                   </module>
                </modules>
             </extension>

  2. Create an alfresco/site-webscripts/org/alfresco/share/pages/customizations/share/header/share-header.get.js file containing:
    var headerMenuBar = widgetUtils.findObject(model.jsonModel.widgets, "id", "HEADER_APP_MENU_BAR");
    if (headerMenuBar && headerMenuBar.config && headerMenuBar.config.widgets)
    {
       headerMenuBar.config.widgets.push({
          id: "HEADER_REACT_PAGE",
          name: "alfresco/menus/AlfMenuBarItem",
          config: {
             label: "React page",
             targetUrl: "react"
          }
       });
    }
  3. Repack your JAR file, deploy it to /share/WEB-INF/lib and restart the server


Now on each page in Share you'll see a link to your new page:



It's very easy to take advantage of the development environments provided by modern web application frameworks and deploy the output into Share. Some of the techniques shown in this post may be old, but they are still extremely effective, which shows the value of a properly architected framework.


Posted by ddraper Jan 16, 2017


If you've been following my personal blogs then you'll know that I've been experimenting with some of the currently popular web development frameworks (Vue.js, React, Aurelia and Angular). Each comes with their own advantages and disadvantages but my personal preference (and everybody is entitled to their own preference!) was for Vue.js.


There are lots of reasons to like Vue.js (many of which are summarized here) but one of the things that I found really useful was that you could very easily write it in plain ol' ES5 JavaScript.


An Argument Against Transpilation

Now don't get me wrong... I love writing in ES6 and I get positively giddy when writing in TypeScript (although that might just be down to my flagrant abuse of the "any" keyword), but ultimately the code you write in ES6 and TypeScript is going to get transpiled down into ES5. That can mean the code ends up larger and, in some cases, actually less efficient, which can result in poor performance.


Bundling and Tree Shaking

The other interesting thing about these new frameworks is that they all offer lovely development environments with hot-reloading, which can really enhance development speed... however, when you come to deploy your code you still need to bundle it up, and you will want to tree-shake and minify (or uglify) the code - and you have to do this each time you want to deploy your code.


The idea of tree-shaking is that you remove all the code that isn't required on the page so that the page loads faster. Surf has actually been providing this capability via Aikau since March 2013. The fundamental differences are that it is done in Java (not via Node.js) and that it is done efficiently at production time through aggressive caching.


  • Yes, you can load System.js into the browser and transpile and asynchronously load from the client, but the performance will be poor.
  • Yes, it is true that there is some initial overhead to Surf/Aikau performing dependency analysis on the first page load, but this then disappears for all subsequent page loads through aggressive caching.
  • Yes, it's also true that Node.js will do the necessary string processing required for dependency analysis much faster than Java, but there isn't (to my knowledge at least) any Node.js middleware offering this capability.



The other aspect of web application development that has always interested me is in re-use. It's really easy to write some JavaScript component that is re-usable as a "leaf node" and you can probably provide some configuration options to customize its appearance or behaviour.


We should all be implementing components following the "single responsibility principle" that will typically result in a nested component hierarchy. This is perfectly fine where you own that entire hierarchy because you control all the components and are the only one using them. But what if you want to share your components for others to use? And what if they want to customize those components?


Let's say you provide component A which contains component B which contains component C. Someone else wants to replace C with D, so they have to customize B to create D instead of C (which results in component E) and then customize component A to instantiate E instead of B (which results in component F). So the end goal was A contains B contains D, but the result was F contains E contains D.


That's 3 extra components that you now have to create and maintain instead of 1.


Transclusion, et al

Some of the frameworks provide a solution to this that is variously called "transclusion" (Angular), "content projection" (Aurelia) and "content distribution" (Vue.js) but I've found that these only really work to a single level of nesting and aren't the easiest things to implement. 


The reason that this is a hard problem boils down to dependency management. A component will need to declare the sub-components that it can contain in order to be able to render them.


Aikau has always provided a solution to this problem by allowing developers to compose pages of widgets in a JSON model that is analysed by Surf to establish all of the dependencies prior to rendering. Widget references in the model can easily be swapped out, removed or reconfigured without needing to update any other widgets in the model.


Unfortunately one of the main criticisms levelled at Aikau was that it's hard to find people with the skills despite the fact that all you really need is some basic web development skills and an understanding of 3 very simple concepts (as described here).


So this got me to thinking...


An Idea...

We already have a great tree-shaking and module loading solution in Surf (albeit one that is desperately uncool) that provides an excellent basis for providing re-usable components (as ably demonstrated by Aikau for over 100 releases).... what if we could marry that existing infrastructure with some cool new JS framework?


So that's what I've done.


I've added a couple of widgets into Aikau that allow you to easily build Vue.js components that can be composed together in traditional Aikau page models. This means that if you can get your head around some trivial boilerplate code then you can create re-usable Vue.js components.


The Solution...

So the boilerplate code looks like this:

define(["dojo/_base/declare",
        "aikau/vue/Base"],
        function(declare, Base) {
   return declare([Base], {

      getComponentElement: function() {
         return "";
      },

      getComponent: function getComponent() {
         return {};
      }
   });
});

There are two functions that need to be implemented:

  • getComponentElement: returns a string with the custom component element name
  • getComponent: returns a Vue.js component object


For example, I implemented the same simple application as I had previously done in standalone Vue.js (and React, Aurelia and Angular!) which resulted in 4 components including a toolbar component containing forward and back pagination buttons that looks like this: 

define(["dojo/_base/declare",
        "aikau/vue/Base",
        "dojo/text!./templates/Toolbar.html"], // template path indicative
        function(declare, Base, template) {
   return declare([Base], {

      getComponentElement: function() {
         return "toolbar";
      },

      getComponent: function aikau_vue_Toolbar__getComponent() {
         return {

            template: template,

            props: ["list"],

            methods: {
               back: function() {
                  var changeEvent = new CustomEvent("pageBack", {
                     bubbles: true
                  });
                  this.$el.dispatchEvent(changeEvent);
               },

               forward: function() {
                  var changeEvent = new CustomEvent("pageForward", {
                     bubbles: true
                  });
                  this.$el.dispatchEvent(changeEvent);
               }
            }
         };
      }
   });
});

These components can be composed in an Aikau page model like this:

model.jsonModel = {
   widgets: [
      {
         name: "aikau/vue/Bootstrap",
         config: {
            widgets: [
               {
                  name: "aikau/vue/List",
                  config: {
                     widgets: [
                        {
                           name: "aikau/vue/Breadcrumb",
                           config: {
                              "v-bind:relativePath": "relativePath"
                           }
                        },
                        {
                           name: "aikau/vue/Toolbar",
                           config: {
                              "v-bind:list": "list"
                           }
                        },
                        {
                           name: "aikau/vue/ListView",
                           config: {
                              "v-bind:list": "list"
                           }
                        }
                     ]
                  }
               }
            ]
         }
      }
   ]
};


How it Works

The solution is pretty straightforward. The "aikau/vue/Base" module (that all the Vue.js component widgets must extend) iterates over any child widgets and registers a local component for them. The child component element (the value returned from getComponentElement) is written into the template and the properties defined in the widget config are output as attributes of that component.


If you're familiar with Vue.js then you'll note the use of the "v-bind:" prefix that is used in this case to create a dynamic binding between the "list" value in the List widget and the Toolbar and ListView widgets. This means that when "list" is updated in the List component the Toolbar and ListView components can reactively update to reflect the changes.


The List component template contains a single placeholder, "${widgets_slot}", that is swapped out with the locally registered child components, so that...

<div @navigate="navigate" ...>
   ${widgets_slot}
</div>

...becomes...

<div @navigate="navigate" ...>
   <breadcrumb :relativePath="relativePath"></breadcrumb>
   <toolbar :list="list"></toolbar>
   <list-view :list="list"></list-view>
</div>

...when the child components are rendered.
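A simplified sketch of how that substitution could work (this is not the actual aikau/vue/Base implementation, and the widget "element" key is illustrative):

```javascript
// Simplified sketch (not the actual aikau/vue/Base code) of how child widget
// configs could be turned into component tags that replace ${widgets_slot}.
function renderWidgetsSlot(template, childWidgets) {
  var tags = childWidgets.map(function(widget) {
    var attrs = Object.keys(widget.config || {})
      .map(function(key) {
        // "v-bind:list" is written out using the ":list" shorthand attribute
        return key.replace(/^v-bind:/, ':') + '="' + widget.config[key] + '"';
      })
      .join(' ');
    var element = widget.element; // the value returned from getComponentElement()
    return '<' + element + (attrs ? ' ' + attrs : '') + '></' + element + '>';
  });
  return template.replace('${widgets_slot}', tags.join('\n'));
}
```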



The only limitation that I've found with this approach is that I wasn't able to take advantage of the Vue.js event model and needed to fall back to native custom events. The only downside I could see to this was that you wouldn't be able to benefit from the excellent debugging provided by the Vue.js devtools extension for Chrome.



So if you're creating a standalone Vue.js application (or indeed an application using any other framework) then you probably won't want to jump through these hoops. But if you want to provide an application that you want people to be able to customize (like Alfresco Share) then you need to allow developers to make small (or large) changes to deeply nested components without copy/pasting code or extending every component in the hierarchy above the one they actually need to change.


Even in this very simple 4-component example it means that you could swap out the ListView component without needing to change the parent component. The ListView component itself could be re-written to be composed from a library of metadata rendering components that could be selected from to render views specific to the data displayed.


This approach also allows composition of components that haven't been written with respect to each other - once again something that we have been able to successfully benefit from using Aikau at Alfresco.


What Next?

I strongly doubt that this implementation is going to be used anywhere unless there is any kind of major demand for it from customers, our partners or the community. It currently exists on an unmerged branch of Aikau but could easily be integrated into the main code base and released for use.


Could This Approach be Used with Framework {X}?

The main reason why I picked Vue.js for this experiment takes us back to the original point way back at the beginning of this post. Vue.js can very happily be run without transpilation in ES5 code. This same approach could theoretically be applied to React but to get the best out of it you'd want to use JSX and that would require a transpilation step that Surf currently doesn't provide. 


It would obviously be possible to transpile the component code as part of the build process for Aikau, but that would introduce its own set of problems, as Surf/Aikau doesn't provide source maps, which would create a barrier to effective debugging.



If you've reached the end of this very long post then thanks for your perseverance! I'd really appreciate any comments and thoughts that you might have on this - especially on whether you think the use case being addressed is even a valid one.

With Alfresco 5.2 we are introducing Alfresco Search Services and Solr 6.3, read more about Solr 6.3 and the new search features here.


In this post we'll look in more depth at using SSL with Solr 6. If you haven't already, see this post for more info on installing Solr 6 without SSL.


Introduction to SSL

HTTPS provides over-the-wire encryption and a means to secure access to Alfresco Search Services: only those clients and applications that have an appropriate certificate can get access. HTTPS may use SSL or its successor TLS (and SSL is sometimes used as a synonym for HTTPS).


You may choose to secure Alfresco Search Services in other ways. This post will guide you through setting up Alfresco One 5.2 with "SSL" enabled. Access to the Alfresco APIs by which Solr builds its index will be secured (URLs like repo/api/solr/*), access to the Solr 4 web application will be secured (URLs like solr4/*), and access to the Solr 6 application (URLs like localhost:8983/solr/*) will be secured.


In addition, Solr 6 supports sharded indexes with "SSL" (which was not possible with Solr 4). Once secured, you'll need to install a certificate in your browser to gain access to the protected URLs. You should generate your own unique certificates; in this post, however, we focus on what to do after you have generated your own keys, using the well-known example certificate that comes with the installer. Using the "default" certificate will provide encryption but not authentication.


The steps below describe how to install Alfresco Search Services over the HTTPS protocol.


Install and prepare your Alfresco One 5.2 installation

You can use your existing Alfresco One 5.2 installation or start a new one from scratch (see this post for more info on installing Solr 6). In this section we'll see how to prepare your Alfresco installation.



Install Alfresco Search Services

Now that Alfresco One 5.2 is correctly installed and prepared, we're going to install Alfresco Search Services 1.0.


  • Download Alfresco Solr 6 distribution from here, unpack it, and move it to your preferred location.
  • Prepare the keystore by creating the folder <solr6>/solrHome/keystore.
  • Into this new folder copy the ssl.repo.client.keystore and ssl.repo.client.truststore files from <alfresco-one-5-2>\solr4\templates\rerank\conf.
  • Update the SSL properties in <solr6>/solr/bin/ (if you're on a Linux based platform) as described below.
    If you are using a Windows based platform the file will be called <solr6>\solr\bin\solr.in.cmd and the content should be updated as described below.
    set SOLR_SSL_KEY_STORE=<solr>\keystore\ssl.repo.client.keystore
    set SOLR_SSL_KEY_STORE_PASSWORD=kT9X6oe68t
    set SOLR_SSL_TRUST_STORE=<solr>\keystore\ssl.repo.client.truststore
    set SOLR_SSL_NEED_CLIENT_AUTH=true
  • Update <solr6>/solrHome/conf/ as described below.
    # Enable the suggester, as with Solr 4 (suggestion is disabled by default for Solr 6).{http://www.alfresco.org/model/content/1.0}name,{http://www.alfresco.org/model/content/1.0}title,{http://www.alfresco.org/model/content/1.0}description,{http://www.alfresco.org/model/content/1.0}content

    # Enable camelCaseSearch support in all fields, as with Solr 4.

    If you are installing Solr on the same host then the default, solr.port and solr.baseurl values will be correct.


Starting Alfresco Search Services

In this section we are going to see how to set up and start the installed Alfresco Search Services 1.0.


  • Start Solr in foreground using the following options.
    solr/bin/solr start -f -a " -Dsolr.ssl.checkPeerName=false"
  • Enable dynamic sharding using the Alfresco administration page, available at the link below.

See the screenshot below for how the administration page looks.



  • Click Manage to create an unsharded Archive index.
  • Enter the details for your Solr 6 server (https://<solr_ip>:8983/solr) in the New Index Server box and click Add.
  • Fill in the details as shown in the following screenshot. Click Create Shard Group when you're done.



  • Once the submission has been completed, the index will appear in the shard view as it starts to track.
  • Click Manage to create a sharded Alfresco index.
  • Enter the details for your Solr 6 server (https://<solr_ip>:8983/solr) in the New Index Server box and click Add. If you created the archive index above the index server may already be listed.
  • Fill in the details as shown in the following screenshot. Click Create Shard Group when you're done.



    • Once the submission has been completed, the index will appear in the shard view as it starts to track.
    • Check in the Solr administration console that both the indexes are correctly listed. See the following screenshot for how the console should look.



    Validate Search over HTTPS

    Now that the Alfresco Search Services is up and running with the correct settings, let's see how to validate the searches over HTTPS.



    • You can validate search by executing one (or more) searches over Alfresco using Alfresco Share.
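Alternatively (a sketch, assuming the 5.2 Search public REST API is enabled), you can query the index directly over HTTPS by POSTing an AFTS query to /alfresco/api/-default-/public/search/versions/1/search; this helper builds the JSON request body:

```javascript
// Sketch: builds the JSON body for a simple AFTS term query against the
// Alfresco 5.2 Search public REST API. POST it (with your client certificate
// configured) to:
// https://<alfresco_host>:8443/alfresco/api/-default-/public/search/versions/1/search
function buildSearchBody(term) {
  return JSON.stringify({
    query: {
      language: 'afts',
      query: term
    },
    paging: { maxItems: 10, skipCount: 0 }
  });
}
```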




    Please let us know how you get on - leave a comment or email us.

    Read more about the changes and new features introduced with Solr 6 here.


    In this post we will share more information about setting up an Alfresco Enterprise/Solr sharded search index. If you haven't already, see this post for more info on installing Solr 6.


    When an index grows too large to be stored on a single search server it can be distributed across multiple search servers. This is known as sharding. The distributed/sharded index can then be searched using Alfresco/Solr's distributed search capabilities. Alfresco/Solr has several different methods to choose from for routing documents and ACLs to shards.


    In this post we will focus on the out-of-the-box approach which is sharding by Node ID. In Alfresco/Solr's configuration this is referred to as DBID sharding, as the DBID field is used to hold the Node ID in the Solr index. With DBID sharding, documents are routed to shards based on a hash of the Node ID of the document. Hashing on the Node ID is a simple approach that relies on randomness to evenly distribute documents across the shards.
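The routing idea can be illustrated with a toy hash (this is NOT Alfresco's actual hash function, just a sketch of hash-then-modulo routing):

```javascript
// Illustrative sketch of hash-based routing: each node's DBID is hashed and
// the result taken modulo the number of shards, so documents spread roughly
// evenly across shards. The mixing function below is a stand-in for the real
// hash implementation used by Alfresco/Solr.
function shardForDbid(dbid, numShards) {
  // 32-bit multiplicative mix (Knuth's constant), deterministic per DBID
  var h = Math.imul(dbid ^ (dbid >>> 16), 2654435761) >>> 0;
  return h % numShards;
}
```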


    When using the DBID sharding approach, all ACLs are indexed on each shard. This ensures that the ACL for each node is co-located on the same shard. This is required for proper access control enforcement. The DBID sharding method is ideal for use cases where there are a large number of nodes, but a smaller number of ACLs.


    Follow the steps described below to complete the sharding setup and test.


    Switch the Search Services to Solr 6

    1. Go to the Alfresco Admin Console.
    2. Go to the Search Service Console.
    3. Select Solr 6 as the search service.
    4. Save the Search Service settings.


    Turn on Dynamic Sharding

    1. Go to the Index Service Sharding page as described in the screenshot below.



    1. Check the Dynamic Shard Instance Registration checkbox.
    2. Save the Index Service Sharding settings.


    Install Solr 6

    1. Solr 6 is not installed with the Alfresco Installer, so you'll need the Alfresco/Solr 6 zip file - download it here.
    2. Create a directory for each shard in the distributed index. This can be on the same server or different servers.
    3. Unzip the Solr 6 zip file in each directory.


    Edit the file

    1. Inside each of the Solr 6 directories there is a directory called solrhome.
    2. Edit the solrhome/conf/ file.
    3. Change the solr.port property to be the port you want to start Solr on.
    4. Change the host property to the host that Solr is running on.


    Start each Solr instance

    For each Solr install:

    1. At the same level as the solrhome directory there will be a solr directory.

    2. Enter the solr directory and enter the following command:

      ./bin/solr start

      This will start Solr on the default port (8983).

      To start Solr on a different port enter the command:

      ./bin/solr start -p PORT_NUMBER

      Replace PORT_NUMBER with the port you will be starting Solr on.

    3. Open a browser and go to the solr admin screen:


    You will see a Solr 6 admin screen without any cores created.


    Add Index Servers

    1. Go to the Index Server Sharding Page on the Alfresco Admin Console.
    2. Choose Manage.
    3. A window will display where you can add new index servers.
    4. Add the base URL for each Solr server using the form, for example: http://hostname:port/solr.
    5. Stay on the Manage screen for the next step.


    Add a Shard Group

    1. In the Manage screen add a Shard Group.
    2. Specify the number of shards, shard instances (replicas of shards), and the core name.
    3. Select Create Shards.


    This will create Solr cores for each shard and shard replica on the index servers that have been registered. The cluster is now created and will begin tracking the Alfresco repository, indexing documents and ACLs across the sharded index.



    Please let us know how you get on - leave a comment or email us.

    With Alfresco 5.2 we are introducing Alfresco Search Services and Solr 6.3, read more about Solr 6.3 and the new features here.


    With Solr 6 there is a significant change - it's now a standalone application powered by a Jetty server. In this post we will cover the additional steps required to configure and set up Alfresco and Solr 6.






    Enterprise with clustering license

    NB if using SSL please also refer to this post

    1. Download Alfresco Solr 6 distribution from here, unpack and move to a preferred location.

    2. Start Solr 6 using the command below.

      $SOLR_HOME/solr/bin/solr start
    3. Open a browser (assuming you're on the same server where Solr 6 is installed) and navigate to:

    4. You should see Solr's admin screen - in here you can see the Solr version in use is 6.x.

    5. Edit the file as shown below to disable SSL (solr.secureComms=none), then restart Alfresco.

    6. Login as admin and navigate to the Admin Console. Use the following link (assuming you have Alfresco installed into localhost at port 8080).
    7. Upload and apply the appropriate license.

    8. Restart Alfresco, and once it's restarted return to the Admin Console.

    9. Select "Search Service" from the navigation.

    10. Select Solr 6 from the "Search Service In Use" dropdown and review the details.

    11. Click on Save to set Solr 6 as the default Search Engine.

    12. To create the new cores, select "Index Server Sharding" from the navigation.

    13. Select the "Dynamic Shard Instance Registration" check box and click 'Save'. 

    14. Click on Manage, which will launch the Index Server Shard Management popup.

    15. Enter the Solr 6 IP address, port and context, and click Add.

      You can also configure and set up Solr shards at this stage; this will be covered fully in another blog post.

    16. Scroll down to Manage Default Indexes and click on Create Alfresco to create Alfresco Core.

    17. Click on Create Archive to create the Archive Core.

    18. Scroll down to the Report section to see a response.

      Congratulations! You've now configured Alfresco to work with Solr 6.

    19. Now it's time to verify that it has created an Alfresco and an Archive core. Open a browser (assuming you're on the same server where Solr 6 is installed) and navigate to:

    20. In the Solr Admin UI select the core selector dropdown and verify that both Alfresco and Archive are present in the dropdown.

    21. Allow a few minutes for the content to be searchable on Solr 6.



    1. Download Alfresco Community.
      When installing, make sure you do not launch Alfresco, by ticking the appropriate box.
    2. Configure Alfresco to use Solr 6 as the default search subsystem
      by editing the file (ALFRESCO_HOME/tomcat/shared/classes/
      ### Solr indexing ###
      index.subsystem.name=solr6
      solr.port=8983
      solr.secureComms=none
    3. Download Alfresco Solr 6 distribution, unpack and copy the Solr 6 folder to $ALFRESCO_HOME.
    4. Solr 6 does not come with Alfresco workspace and archive stores by default so you'll need to create them. To do so start Solr and create the cores with the following command:
      ALFRESCO_HOME/solr6/solr/bin/solr start -a "-Dcreate.alfresco.defaults=alfresco,archive"
    5. Open a browser and navigate to http://localhost:8983/solr/ to verify that Solr has started. You should see Solr's admin page.
    6. Select the core selector dropdown and verify that both Alfresco and Archive are present in the dropdown.
      This signifies that both cores have been created.
    7. Start Alfresco and allow a couple of minutes to index the content.
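Once Solr is up, the cores can also be checked programmatically: Solr's CoreAdmin STATUS endpoint (http://localhost:8983/solr/admin/cores?action=STATUS&wt=json) returns the registered cores as keys of its "status" object. A small sketch for extracting them:

```javascript
// Sketch: extracts the core names from the JSON returned by Solr's
// CoreAdmin STATUS endpoint, so the alfresco and archive cores can be
// verified without opening the admin UI.
function coreNames(statusResponse) {
  return Object.keys(statusResponse.status || {}).sort();
}

// e.g. coreNames(JSON.parse(responseBody)) should include 'alfresco' and 'archive'
```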

    For your interest, we would like to bring to your attention another tutorial, from Axel Faust, about using Solr 6 in Alfresco Community Edition 5.2 (201612 GA).
