We’re always excited about a new edition of Liferay DevCon. It fills the day with interesting talks, plenty of visitors at our booth, and good conversations. We learn a lot, meet new people, and get to play the old-school game consoles Liferay arranged (we want one for the office!). Freark, who doesn’t really need an introduction, was even mentioned by Liferay’s Olaf Kock as “the guy who knows everything”. Ha! (Hands off, he’s our DevOps engineer.) Read on for Freark’s and Simon’s summaries of a number of today’s presentations.
Equipped for Today – Prepared for Tomorrow (opening keynote)
The main point I took from this was Liferay 7.1 GA2 starting in 30 seconds. I needed to confirm that before I’d believe it works with a normal Liferay setup. After downloading 7.1 GA2, a quick test on my laptop with just the default bundle gave me a startup time of 49,834 ms, which is still a huge improvement over 7.0 or 6.2.
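For reference, that number comes straight from the bundled Tomcat’s log. A quick sketch (my own helper, not anything Liferay ships) for pulling it out of catalina.out:

```python
import re

# Depending on the Tomcat version, the final boot line reads
# "Server startup in [49834] milliseconds" or "Server startup in 49834 ms";
# this regex accepts both forms.
STARTUP_RE = re.compile(r"Server startup in \[?(\d+)\]? (?:ms|milliseconds)")

def startup_millis(log_text):
    """Return the reported startup time in ms, or None if not found."""
    match = STARTUP_RE.search(log_text)
    return int(match.group(1)) if match else None

print(startup_millis("INFO [main] ... Server startup in [49834] milliseconds"))
# prints 49834
```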
10 pleasant surprises you will find during the process to upgrade to DXP 7.1
After seeing at the Unconference how many people had issues with migrating Liferay, it is very good to see that Liferay is investing time in improving the upgrade process. The points mentioned in this talk should speed up the upgrade process as well as make debugging a lot easier.
Apart from the upgrade process improvements, I’m also very happy with the fact that they’ve started removing XML from the database and storing the data in a more sensible way.
A quick list of the improvements mentioned:
1) audit logging for easier debugging
2) scripts for upgrading
4) a new modular framework for the Liferay core
5) upgrades can be restarted from checkpoints
6) index update automation
7) extra tables for localisation
8) images are only stored in one repository
9) dev tool improvements
10) performance improvements
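The restartable upgrades are the point I’m most curious about. As a rough illustration of the idea (my own sketch, not Liferay’s actual upgrade framework): each completed step is recorded, so a failed run can resume where it left off instead of starting from scratch.

```python
import json
import os

# Hypothetical checkpoint file name -- purely for illustration.
CHECKPOINT_FILE = "upgrade-checkpoint.json"

def load_done(path=CHECKPOINT_FILE):
    """Read the set of step names completed in a previous run."""
    if os.path.exists(path):
        with open(path) as f:
            return set(json.load(f))
    return set()

def run_upgrade(steps, path=CHECKPOINT_FILE):
    """steps: list of (name, callable). Skips steps already checkpointed."""
    done = load_done(path)
    for name, step in steps:
        if name in done:
            continue  # already completed in a previous run
        step()        # may raise; the checkpoint keeps earlier progress
        done.add(name)
        with open(path, "w") as f:
            json.dump(sorted(done), f)
```

If a step fails halfway through, rerunning `run_upgrade` executes only the steps missing from the checkpoint file.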
Taking Liferay to the next level with Artificial Intelligence and Machine Learning
In this session, José and Oriol show off their project: a Liferay module that suggests tags for images in the Liferay admin, based on machine learning with TensorFlow. They also built functionality that applies the same concept to document tagging and to enriching search queries with appropriate synonyms.
To achieve this they use a gRPC Java client communicating with a TensorFlow backend service, which performs the actual classification. The backend service can use existing pre-trained models for image classification or custom-trained models. Retraining is done offline, with scripting external to Liferay.
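The last step on the Liferay side is conceptually simple: the model returns a raw score per label, and only the most confident labels are worth suggesting as tags. A toy version of that selection (my own illustration, not José and Oriol’s code; the threshold and top-k values are made up):

```python
import math

def suggest_tags(scores, top_k=3, min_prob=0.10):
    """scores: {label: raw model score}. Returns confident labels, best first.

    Raw scores are turned into probabilities with a softmax, then the top_k
    labels above the min_prob cutoff are kept as tag suggestions.
    """
    exps = {label: math.exp(s) for label, s in scores.items()}
    total = sum(exps.values())
    probs = {label: e / total for label, e in exps.items()}
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    return [label for label, p in ranked[:top_k] if p >= min_prob]

print(suggest_tags({"cat": 4.0, "dog": 2.0, "car": 0.1}))
# prints ['cat', 'dog']  -- 'car' falls below the probability cutoff
```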
Liferay Analytics Cloud: Understanding User Interactions with Digital Assets and Pages
Charles Pinon takes us through the features of Liferay’s new analytics product. In a nutshell, it’s a mini big-data system: by feeding it information and events from your Liferay deployment, you can research your users’ behaviour. Because it uses more of the information gleaned from Liferay data and events, it can potentially reveal better patterns than page hits alone. It runs as a SaaS hosted by Liferay, but judging from the audience there is high demand for an on-premise solution, which might be much more GDPR-friendly.
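The advantage of richer events over plain page hits is easy to see in miniature. A toy illustration (not the Analytics Cloud API; event type names are invented): with download and form-submit events in the stream, you can segment engaged users in a way page views alone can’t.

```python
from collections import Counter

def segment_users(events, engaged_types=("download", "form-submit")):
    """events: list of (user, event_type). Returns users with an engaged event."""
    engaged = Counter(user for user, etype in events if etype in engaged_types)
    return set(engaged)

events = [
    ("alice", "page-view"),
    ("alice", "download"),
    ("bob", "page-view"),
    ("carol", "form-submit"),
]
print(sorted(segment_users(events)))
# prints ['alice', 'carol']  -- bob only viewed pages
```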
Liferay in the Cloud
Red Hat’s Akram and Liferay’s Achraf investigated running Liferay in containers on Red Hat’s OpenShift. They take us through the architecture, where they split a Liferay environment into logically separated containers for Elasticsearch, PostgreSQL and Tomcat. They mention a few pitfalls they encountered setting this experiment up, among them passing the correct memory parameters to the JVM.
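That JVM-memory pitfall is a common one in containers: the JVM sizes its heap from the host’s RAM rather than the container’s cgroup limit. A generic fix (the flags are standard JDK options, but the values here are my assumptions, not the configuration from the talk) is to let the JVM size itself relative to the container limit:

```
# Illustrative Dockerfile/env fragment for the Tomcat container.
# UseContainerSupport is on by default in recent JDKs; MaxRAMPercentage
# caps the heap at a fraction of the container's memory limit.
ENV JAVA_OPTS="-XX:+UseContainerSupport -XX:MaxRAMPercentage=75.0"
```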