Following Steve Reiner's Twitter post last week, I was inspired over the weekend to add a similar Google Maps-based dashlet to share-extras, to show the locations of geotagged content items on a map view.

Since the repository has full support for extracting geographic data using Tika in version 3.4, all I needed to do to assemble some test content was upload a few photos taken on my phone into the site Document Library.

If you look at the Document Details page of a geotagged photo, you'll see that it displays a latitude and longitude value at the end of the item's properties list. These are part of a new aspect named Geographic.

[Image: Latitude and longitude on the Document Details page]

Using Firebug's Net console, I noticed that the JSON data for the document list view makes these values available on a new geographic property placed on each list item.

[Image: The document list JSON shown in Firebug's Net console]

So to keep things simple, the initial version of the dashlet simply re-used the doclist web script to grab a list of all content items in the root of the Document Library, but the final version now on share-extras comes bundled with a dedicated web script that lists all items in the site with the Geographic aspect applied.

Using this data, the dashlet displays a marker for each geotagged item, auto-centering itself on the centre point of all the items.
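Under the hood, the auto-centring amounts to taking the midpoint of the bounding box of the items' coordinates, which the Google Maps API will do for you via LatLngBounds and fitBounds. The function below is an illustrative sketch of that calculation, not the dashlet's actual code:

```javascript
// Compute the centre of a set of geotagged items by taking the
// midpoint of their bounding box. This mimics what the dashlet
// delegates to the Google Maps API (LatLngBounds + fitBounds).
function centreOf(items) {
  if (items.length === 0) {
    return null; // nothing to centre on
  }
  var minLat = Infinity, maxLat = -Infinity;
  var minLng = Infinity, maxLng = -Infinity;
  items.forEach(function (item) {
    minLat = Math.min(minLat, item.lat);
    maxLat = Math.max(maxLat, item.lat);
    minLng = Math.min(minLng, item.lng);
    maxLng = Math.max(maxLng, item.lng);
  });
  return {
    lat: (minLat + maxLat) / 2,
    lng: (minLng + maxLng) / 2
  };
}
```

Note that a naive bounding box like this misbehaves for points spanning the antimeridian, which is another reason to let the Maps API do the work in practice.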

Clicking on a marker takes you to the Document Details page for that item. In the next update I'll look at displaying a snippet of information for the item, which the Google Maps API makes pretty easy.
On Friday I was asked to present at the OpenDoc Society's ODF plugfest in Maidenhead. The plugfest seemed to be the presentation equivalent of speed dating, with each of the attendees given 10 minutes to cover an aspect of ODF.

There were a number of presentations on the different office suites built to support ODF, including LibreOffice (a fork of OpenOffice.org), EuroOffice (which started life as a Hungarian version of OpenOffice.org but is now available in other languages), Lotus Symphony and LotusLive Symphony (IBM's versions for the desktop and the cloud), and of course Oracle Open Office and Oracle Cloud Office (Oracle's versions for the desktop and the cloud).

Michiel Leenaars, of nlnet, talked about ODF Recipes. ODF recipes are examples that show how to programmatically drive the creation of ODF documents. The demo was very impressive. You can find out more at the Software Recipes web site.

Robin La Fontaine showed a great solution for tracking changes in ODF documents. It goes way beyond the normal track-changes feature provided by typical office suites, and would be useful for teams simultaneously working on large documents.

Possibly the most interesting part of the meeting was the speeches from Adam Afriyie (Conservative MP for Windsor) and from local councillors. Adam talked about his support for open standards and open source software, and he is keen to see increased usage within government. But then the councillors shared their real-life difficulty of adopting open standards within local authorities. It seems that although MPs talk about adopting open standards, the reality is that bureaucracy gets in the way: a requirement to support Microsoft formats when communicating with central government means that the local authorities cannot drop MS Office. This frustrates the local councillors and costs the local authorities millions.

At the end of the day I had to question why I use MS Office…

  • Is it better quality than the open source alternatives?

  • Does it have features that the others do not have?

  • Do I use it so that I can be compatible with my work colleagues?

  • …Or do I use it because I always have done and nothing has made me change?

Perhaps it is time I moved to ODF. I think I will download one of the ODF office suites and give it a go; I will let you know how I get on.
One of the new features delivered in Alfresco Enterprise 3.4 is the ability to replicate content between servers. Introducing this new feature has made me remember a project I worked on back in 1995. At the time I was at Documentum, working with a large global petroleum company. They were using Documentum to manage the creation and approval of Standard Operating Procedures (SOPs), but they needed to make these available to remote drilling stations, often in distant parts of the world. Of course, network reliability and bandwidth stopped them providing direct online access. Content replication would have been ideal, but was not available.

This is exactly the type of problem that the Alfresco content replication service is designed to solve. Content can now be replicated between servers, providing fast local access to key information. Replication can be scheduled to take place at regular intervals, run manually, or triggered by an event (e.g. when new content is approved).

In the diagram below, the SOPs are replicated between the head office in Texas and the remote drilling stations. Having local copies means that the remote workers are not affected should something happen to the network or the source server.

I have presented this solution a number of times and two questions always come up:

  • Is replication the same as clustering? No. Clustering supports large-scale deployments by spreading the application over multiple systems, improving performance and reliability; but even though the application is spread over many servers, it is still a single instance of Alfresco. With Alfresco replication you are running multiple, separate instances of Alfresco and replicating a subset of the content between these servers.

  • Can’t I do the same thing with database replication? Some vendors use database replication, but this is more complex and not as flexible as true content replication. First, the content needs to be stored in the database (as BLOBs), or you need to synchronize both the database and the file system. Second, the whole database is typically replicated, which is overkill if I only need to share a few files with the remote site. With Alfresco, users have the flexibility to select a set of files and have these, and only these, replicated to a number of different servers.

The introduction of content replication in Alfresco Enterprise is a great new feature… I just wish it had been available 15 years ago!
Since Alfresco 3.2r the ProxyPortlet support in Alfresco Share has allowed developers to easily embed specific bits of Share functionality into a portal such as Liferay, using only a small amount of XML configuration to wire in existing web scripts.

This support has been substantially improved in version 3.4 of Alfresco, in order to allow the entire Share Document Library page to be embedded within a portal. Unfortunately the changes mean that the steps in Luis's original tutorial no longer work in the latest version.

As one of the features we demonstrated today at our Madrid event was the Doclib portlet, I managed to get five minutes to get the original web script-backed method working too.

Since the CMIS repository browsing web scripts used in Luis's example are no longer shipped with Share, I used my own Hello World dashlet from share-extras as a starting point instead. The web script is basic, but demonstrates displaying a simple greeting to the user including their user name.

Hello World Portlet

The following steps should work using a recent version of Alfresco 3.4 and Liferay 5.2.3 running as the portal, provided that the two components are first set up as per the Installing and Configuring instructions for the Doclib Portlet.

Once you've set everything up, the first thing to do is to add the web script files to the instance of Share that you have already deployed to Liferay. Since you should already have created some directories in Liferay's tomcat-x.x.x/shared/classes directory to define your share-config-custom.xml, the easiest thing is to create a new directory named site-webscripts within the existing tomcat-x.x.x/shared/classes/alfresco/web-extension directory and place the following files in it.



   <shortname>Hello World</shortname>
   <description>Displays Hello World text to the user</description>

header=Hello World!
label.hello=Hello {0}
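Alongside the properties bundle above, a minimal dashlet web script of this kind needs a descriptor and a FreeMarker template. The file names, URL and template markup below are illustrative, not necessarily those used by the share-extras dashlet:

```xml
<!-- hello-world.get.desc.xml: web script descriptor (name and URL illustrative) -->
<webscript>
   <shortname>Hello World</shortname>
   <description>Displays Hello World text to the user</description>
   <url>/sample/hello-world</url>
   <family>dashlet</family>
</webscript>

<!-- hello-world.get.html.ftl: template rendering the localised greeting -->
<div class="dashlet">
   <div class="title">${msg("header")}</div>
   <div class="body">${msg("label.hello", user.name)}</div>
</div>
```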

With those files added, you've successfully defined the web script that we'll wire into Share in the next section.

Now although the web script itself will be automatically picked up by Share at load-time, some additional config is also needed in web.xml for the ProxyPortlet to work in version 3.4.

The following lines, which define a custom servlet and servlet mapping that will be invoked by the ProxyPortlet, should be placed in the web.xml file belonging to the Share instance deployed in Liferay. You should find the path to this is something like <LIFERAY_HOME>/tomcat-6.0.18/webapps/share/WEB-INF/web.xml (if you have not already started Liferay, you will need to do so to force it to deploy share.war and create this structure).
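As a sketch, assuming a portlet named HelloWorld (the names below are illustrative, and the servlet-class value should be copied from the stock Share servlet entries rather than from here):

```xml
<!-- Illustrative servlet definition for the HelloWorld portlet.
     Copy the servlet-class value from the existing Share <servlet> entries. -->
<servlet>
   <servlet-name>HelloWorld</servlet-name>
   <servlet-class><!-- same class as the stock Share servlet definitions --></servlet-class>
</servlet>

<servlet-mapping>
   <servlet-name>HelloWorld</servlet-name>
   <url-pattern>/HelloWorld/*</url-pattern>
</servlet-mapping>
```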

You can place these two definitions anywhere within the top-level <web-app> element, but for consistency I always try to add them next to the existing <servlet> and <servlet-mapping> definitions.

Now that you've done all you need to do to configure the scripts, we can move on to configuring the matching portlet definition, which will be picked up by Liferay.

In the same WEB-INF directory where you modified web.xml you should find a file named portlet.xml, to which we add our new definition.


      <description>Hello World</description>

         <title>Hello World</title>
         <short-title>Hello World</short-title>
Add this right after the existing <portlet> definitions (which, as you can see if you look at them, define the three Doclib portlets) and save your changes.

Getting these details right is crucial if you're deploying your own web scripts, so a couple of notes on this are probably useful.

  1. The contents of the <portlet-name> element must match the name of the servlet mapping you have defined in web.xml.

  2. The viewScriptUrl parameter must match the URL of your web script, with a prefix of /page added to the beginning (note that in 3.2r the web app context path was also required in the URL, but this now causes an error if supplied).
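Putting those two rules together, a matching definition for the Hello World web script might look like the following. The portlet-class shown here is the Surf ProxyPortlet, and both that class path and the script URL are assumptions you should check against the existing Doclib portlet entries:

```xml
<portlet>
   <portlet-name>HelloWorld</portlet-name>
   <display-name>Hello World</display-name>
   <portlet-class>org.springframework.extensions.webscripts.portlet.ProxyPortlet</portlet-class>
   <init-param>
      <name>viewScriptUrl</name>
      <!-- /page plus the URL declared in your web script descriptor -->
      <value>/page/sample/hello-world</value>
   </init-param>
   <supports>
      <mime-type>text/html</mime-type>
      <portlet-mode>VIEW</portlet-mode>
   </supports>
   <portlet-info>
      <title>Hello World</title>
      <short-title>Hello World</short-title>
   </portlet-info>
</portlet>
```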

Lastly, you should add the portlet to the two Liferay-specific configuration files in WEB-INF, to ensure that authentication is handled correctly and that the portlet appears in the correct category in Liferay's Applications menu.

In liferay-portlet.xml, add the following definition after the existing <portlet> elements:
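A minimal entry might look like the following; the instanceable flag is an assumption, and you should mirror whatever the existing Doclib portlet entries declare:

```xml
<portlet>
   <portlet-name>HelloWorld</portlet-name>
   <instanceable>true</instanceable>
</portlet>
```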

In liferay-display.xml, add the following within the existing <category> element - it should be obvious that this is adding your portlet to the 'Alfresco' category.

<portlet id='HelloWorld'></portlet>

You'll need to restart Liferay to get it to pick up the new portlet, and for Share in turn to load the additional web script. Once it's finished loading you should be able to follow the configuration steps in the Doclib Portlet guide to walk you through adding it to a page.

The example is basic, but it shows how you can add a web script as a portlet, with a small amount of personalisation based on the identity of the user.

It's possible to add more complex web scripts, for example to load data from the Alfresco repository or other back-end data sources, but as Luis points out you should be careful how you render any hyperlinks within your scripts, to ensure that they are portal-safe.

To make sure that your URLs are correctly generated, use the scripturl() function in your FreeMarker templates to wrap them:

<a href="${scripturl(url.serviceContext + '/sample/cmis/repo', false)}">CMIS Repository</a>

You can download the web script files used in this example in JAR format, which you can extract using any unzip program into the directories specified above (or even easier - simply drop the JAR file itself into Liferay's tomcat-x.x.x/shared/lib folder).
The v1.1.1 release of Share Import-Export has been up for a few days now, but I wanted to summarise some of the changes in the new version.

The most significant addition is support for importing and exporting security groups in JSON format, via a new pair of import and export scripts.

As well as this, the user import script has been made slightly more flexible, with the addition of a --users argument that allows you to import just a few users from a larger set. Since the sample data contains a large number of users that are used across all the different sample sites, you can now import just the users you need for a particular site.

As well as these functional improvements, I've started cleaning up the code internally, an area I intend to focus a little more on over the next few weeks. For now I've just cleaned up the docstrings within each script and updated the --help flags to re-use the usage information in there.

Last but not least, I should thank Dick from Formtek for reporting an issue with the user export script, which was causing some exported profile images to become corrupted when saved.

Beyond a few more tidy-ups the code is almost where I want it to be within the current constraints of the repository. But there have been a couple of ideas suggested for future uses of the scripts, so if there's a particular purpose you think the scripts could have or you just want to share your experiences, please leave a comment below.

What's in a date?

Posted by andy1 Feb 1, 2011
A quick tour of Alfresco support to query date and date time properties.

By default, Alfresco treats date and datetime properties the same: both are indexed and queried to a resolution of one day. The index actually stores the date string as yyyy-MM-dd, for example 2011-02-01, to support both querying and ordering down to the day. This approach has several limitations. While the extra time resolution must currently be included at query time, it is ignored. Date properties can be ordered at query time, but datetime properties are ordered after query execution in Java, requiring a database property fetch to retrieve the missing time.

However, datetime properties can be configured to use the alternative DateTimeAnalyser, which supports full time resolution down to milliseconds. This configuration also supports variable resolution of dates: if you include just the year in a query against a datetime type, only the year will be considered in the match. For example, the following Alfresco FTS queries will match with increasing date resolution when using the DateTimeAnalyser.
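A set of queries of this shape against the cm:created property, each adding one level of resolution (the values are illustrative):

```
@cm:created:'2011'
@cm:created:'2011-02'
@cm:created:'2011-02-01'
@cm:created:'2011-02-01T11'
@cm:created:'2011-02-01T11:03'
@cm:created:'2011-02-01T11:03:01'
```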

Similarly, if only years are used in a datetime range, the lower-resolution fields are ignored. (Take care to make sure the resolutions of the start and end dates match, as mixed resolutions are not currently supported.)

@cm:created:['2010' TO '2011']

@cm:created:['2011-02-01T11:03' TO '2011-02-01T11:04']

The DateTimeAnalyser is not used by default, as it would require all Alfresco users to rebuild their Lucene indexes. It can be configured (after stopping Alfresco) by either:

1) changing the datetime analyser setting in the properties file under alfresco/model/,

or, 2) copying the properties file under alfresco/model/ and related files to a new location and changing the definition of the bean that loads this file ('dictionaryBootstrap', currently defined in core-services-context.xml).
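With either approach, the line being changed selects the analyser class for the d:datetime data type. Assuming the standard property key and class path (both worth verifying against your own installation), it looks something like:

```
d_dictionary.datatype.d_datetime.analyzer=org.alfresco.repo.search.impl.lucene.analysis.DateTimeAnalyser
```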

Once configured to use the DateTimeAnalyser, delete your existing indexes and restart Alfresco with the index.recovery.mode property set to FULL.


The datetime tokeniser stores the dates in parts, as a crude trie of year, month, day, hour (24h), minutes, seconds and milliseconds. As this representation only supports querying, there is an additional field in the index to support ordering.

Varying resolution date time queries and range queries are supported in Alfresco 3.4.0E and later. The DateTimeAnalyser has been around since Alfresco 2.1.
