Lunifera MQTT Xmas-Tree is online

December 1, 2014

With December just begun, we are happy to announce that the 2014 incarnation of our MQTT Xmas Tree is now online. Feel free to try it out here: The tree is standing in our office, and everybody can change the lights on it, move the Xmas star and have a tiny angel fly around 😉

How did we do it? Well, it consists of three main parts – not counting the tree itself 😉

  • a Raspberry Pi that controls the LED band on the tree and the movement of the star and the Xmas angel. On this Raspberry Pi, the Mihini framework runs an MQTT client that ties together hardware (GPIO pins) and software (MQTT messages). To receive MQTT messages, the client uses the Lua implementation of Eclipse Paho. Messages containing valid Xmas Tree commands are then translated into the appropriate GPIO actions: controlling the LED band via an IR diode, powering the motor for the Xmas angel via a transistor, and triggering an Arduino Uno that generates a PPM signal for the servo motor that moves the star.
  • a second Raspberry Pi with a webcam attached that serves a video stream via motion and apache2 (we roughly followed this great tutorial to get this running). With DDNS, the stream can be reached from the outside world.
  • a Vaadin web UI featuring buttons that send MQTT messages with commands for our tree to our MQTT broker (to be picked up by the first Raspberry Pi) and displaying the video stream, so users can watch the effects of their actions.
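The command handling on the first Raspberry Pi can be sketched roughly as follows. The command names (led, star, angel) and the helper functions are invented for illustration; the real client is written in Lua on top of Mihini and Eclipse Paho, and drives actual GPIO pins instead of returning strings.

```python
def set_led_color(color):
    # in reality this would drive the IR diode for the LED band
    return f"LED band -> {color}"

def move_star(position):
    # in reality this would trigger the Arduino that generates the PPM signal
    return f"star servo -> {position}"

def fly_angel(seconds):
    # in reality this would power the angel motor via a transistor
    return f"angel motor on for {seconds}s"

def handle_message(payload):
    """Translate a tree command such as 'led:red' into a GPIO action."""
    command, _, argument = payload.partition(":")
    if command == "led":
        return set_led_color(argument)
    if command == "star":
        return move_star(int(argument))
    if command == "angel":
        return fly_angel(int(argument))
    return None  # ignore everything that is not a valid tree command

print(handle_message("led:red"))   # LED band -> red
print(handle_message("star:45"))   # star servo -> 45
```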

Getting this contraption to work was great fun, a nice way to spend one’s spare time, a fine team-building activity, and a perfect counterweight to tedious debugging sessions ;-)

Of course, we are going to open-source the tree command software on GitHub. By the way, here is an overview picture of the hardware we used for controlling the tree:



Happy treeing and have a joyous holiday season!

The Lunifera Crew from Vienna

Vaadin 7.3 – Valo, OSGi and e4

September 2, 2014

I got the chance to see a preview of Vaadin 7.3 a few days ago, and I am really impressed by the new features it brings.

Until now, I have worked with the Vaadin Reindeer theme and tried to customize it. But since I am a Java developer, I do not have particularly deep knowledge of CSS3 and had a hard time with it. That’s why I am really looking forward to Vaadin 7.3 and am going to upgrade my customer projects in the coming days. The new Valo theme is exactly what I have been trying to build myself: a responsive and highly customizable theme. There are many different styles, and most of them meet my objectives without my having to change anything in the CSS.

And the best thing about Vaadin 7.3 is that it comes with a high-end Sass compiler. Over the last few days I have been reading a lot about Sass, and it is a perfect match for Java developers. Using this very intuitive styling language, Vaadin 7.3 compiles the style information into proper CSS3. Really crazy… For me, Sass is something like a DSL for CSS3. Thus, I do not have to schedule my CSS training anymore: I just have to use Sass 😀

OSGi and Vaaclipse

During the next days, I will “Run a first Vaadin 7.3 OSGi application”. And I am already sure: it will be a perfect match.

Running a Vaadin 7.3 OSGi application is also the basis for migrating the Vaaclipse project to Valo. The Vaaclipse project is a rendering layer that renders the e4 workspace with Vaadin.

For details about Vaadin 7.3 just click here.

I also added two screenshots of the new theme:






Going to keep you informed…

Best, Florian

Last Sharky talk

September 1, 2014

We gave our first Sharky talk in Darmstadt, Germany, one year ago. Now, after nine further talks, we have decided to wrap up the project. We showed our Sharky in many different cities: Darmstadt, Vienna, San Francisco, Ludwigsburg, Mainz, Zurich and Munich.

Now we are on the lookout for new project ideas, hopefully as good as the Sharky project was.


In this video you can see our last Sharky presentation, at the IoT Meetup in Vienna. (The video is in German.)

See you soon,

Sharky team…

Sharky at EclipseCon Europe

November 2, 2013

Klemens and I were at EclipseCon Europe in Ludwigsburg and got the chance to demo Sharky there. When we arrived and saw the big room our talk was assigned to, I was speechless. It wasn’t a room, but rather a hall. Really impressive. I had never talked on such a big stage.

The talk was really nice; in fact, it was the funniest talk I have ever given. You already know that Sharky is a big wild one with a mind of his own. So this nasty fish refused to follow some of our commands. For instance, Sharky decided to fly higher and higher without any interest in coming back down to us. (Well, it was not Sharky’s fault: a loose cable blocked the diving mode.) So I sent Klemens out to catch Sharky again 😀 But how do you catch a Sharky that is flying at a height of 10 meters? Well, I still don’t know, but ask Klemens, because he managed it.

In the end, we could demo everything we had planned. And it seemed that the attendees really loved Sharky and his little accidental misbehaviour.

For me it was one of those talks I will never forget. It was sooo much fun, and a lot of things happened to laugh about. Two people made a movie of the talk, and I am looking forward to its release…

Here you can see a little movie by Benjamin Cabé. Jonas Helming volunteered to remote-control Sharky via a 3D sensor.

And an image taken during the preparation of Sharky before the talk, by my friend @ekkescorner:


Thanks a lot to Jelena from the Eclipse Foundation. She helped a lot with preparing things…

Sharky – Jnect on BeagleBone with Eclipse Paho

October 14, 2013

Over the weekend, Klemens and I worked hard on a Jnect-M2M integration.

If you look at the image from “Sharky – his evolution progress“, you can see that OpenNI and NiTE are running on Ubuntu hardware and not on a BeagleBone Black. The problem is that NiTE does not support ARM processors for now.

Ubuntu running OpenNI and NiTE

So we got the idea to track the coordinates of parts of the human body (joints) on the Ubuntu hardware using OpenNI and to send the coordinates to an external M2M server running on a BeagleBone.

The information sent by the OpenNI Java wrapper to the M2M server (topic=skeleton) looks like this:

skeleton {
   joint = Left-Hand
   x = 123.45
   y = 211.77
   z = 86.78
}

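A minimal parser for messages in the format above might look like this (Python for illustration; the real glue code on the BeagleBone side is written in Java):

```python
def parse_skeleton(message):
    """Parse a 'key = value' style skeleton message into a dict.

    The format follows the example above; the parser itself is our sketch.
    """
    joint = {}
    for line in message.splitlines():
        line = line.strip()
        if "=" not in line:
            continue  # skip the 'skeleton {' header and the closing brace
        key, _, value = line.partition("=")
        key, value = key.strip(), value.strip()
        # coordinates become floats, everything else stays a string
        joint[key] = float(value) if key in ("x", "y", "z") else value
    return joint

msg = """skeleton {
   joint = Left-Hand
   x = 123.45
   y = 211.77
   z = 86.78
}"""
print(parse_skeleton(msg))
```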
Jnect with M2M Client

We used the Jnect project and added M2M support to it. So Jnect no longer depends solely on the Microsoft Kinect library, but may also use an M2M connector to get skeleton information from the M2M server.

Jnect subscribes to the skeleton topic on the M2M server and receives the coordinates of the joints tracked by OpenNI. With some glue code, it was simple to build the EMF body model defined by Jnect. Since Jnect also provides an API to register GestureDetectors, we could use it to add a LeftHandUpGestureDetector.
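The core condition behind a LeftHandUpGestureDetector can be sketched like this. The coordinate convention (larger y means higher) and the margin value are assumptions on our part; Jnect’s real detector API is Java and works on the EMF body model.

```python
def left_hand_up(body, margin=50.0):
    """Return True if the left hand is raised clearly above the head.

    `body` is a plain dict standing in for the EMF body model; the margin
    keeps small jitters in the tracked coordinates from firing the gesture.
    """
    return body["left_hand"]["y"] > body["head"]["y"] + margin

# hand well above the head -> gesture detected
body = {"head": {"y": 1500.0}, "left_hand": {"y": 1620.0}}
print(left_hand_up(body))  # True
```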

Finally, we installed the Jnect bundles on an Equinox OSGi runtime running on a BeagleBone Black.

What happens in detail

The ASUS 3D sensor is connected to the Ubuntu hardware. Via a Java wrapper we implemented, the sensor sends images to OpenNI, and OpenNI passes skeleton information back to the wrapper. We put the resulting coordinates into a data structure and send them to the M2M server using Eclipse Paho.

The M2M server receives the messages and passes them on to the Jnect M2M client running on the BeagleBone Black.

Jnect parses that information and adjusts the EMF body model. Changes to the body model invoke the LeftHandUpGestureDetector. If the gesture matches the changes in the coordinates sent by OpenNI, a message is printed to the console via System.out.

See details here

Sharky – his evolution progress

October 12, 2013

We have already demonstrated that Sharky can be controlled properly by a Vaadin web UI. Now we are going to help Sharky along in his natural evolution.

The main idea

We would like to use a natural interface to remote-control two sharkies at the same time. A 3D sensor observes the movements of the left and the right hand. Gestures with the left hand remote-control sharky-1, and gestures with the right hand control sharky-2.

The technical solution should look like the image below.


The sensor

So we bought an Xbox Kinect. The problem was that the Kinect SDK only supports Windows, and OpenNI dropped Linux support for license reasons. We explored the web again and found the Asus Xtion Pro Live: a 3D sensor aimed at developers, with native OpenNI support. The sensor captures 3D images and sends them to OpenNI. NiTE, an OpenNI plugin, provides a Java API for skeleton and hand tracking. Our first idea was to install OpenNI and NiTE on a BeagleBone, but NiTE does not support ARM processors for now. So we adjusted our architecture again and installed OpenNI and NiTE on an x86 Ubuntu machine. Some Java glue code allows us to track the position of a hand in three dimensions. Since we are addicted to M2M technologies, we do not process that information further on the Ubuntu device, but send it via Eclipse Paho to an M2M server using the MQTT protocol.


The M2M server (a Mosquitto server) runs on a BeagleBone Black and acts as a publish/subscribe server: clients subscribe to topics and receive the messages sent to them. The Ubuntu device sends all messages to the “handinfo” topic on the M2M server.

Jnect Bodymodel

A very nice project called Jnect, provided by Jonas Helming and Maximilian Kögel (EclipseSource Munich), implements a body model based on EMF. It also supports gesture recognition, and custom gesture handlers can be registered using extension points. So the idea is to install Eclipse Equinox on an additional BeagleBone. Using Paho, this BeagleBone connects to the M2M server and subscribes to the “handinfo” topic, so every change of the human hands in any of the three dimensions reaches it. With some glue code, the EMF-based body model is prepared.

In a next step, we have to add GestureHandlers. These are notified about changes in the body model and have to calculate whether a gesture was detected, for instance “left hand up”, “right hand down”, “hands clapped”, … The detected gestures are sent to the M2M server on the “gestures” topic.

Sharky controller

These gestures are the base information for the Sharky controller. The Sharky controller is also installed on a BeagleBone Black and is based on Mihini. Using Lua, it connects to the M2M server and subscribes to the “gestures” topic.

So if the user raises their right hand, the Sharky controller receives the information “right hand up” and uses it to calculate the required GPIO outputs. The GPIOs are connected to the remote control, and Sharky follows the commands given by the hand movements.

Planned commands – the left hand controls sharky-1 and the right hand controls sharky-2:

  • hand left -> “sharky turn left”
  • hand right -> “sharky turn right”
  • hand up -> “sharky raise”
  • hand down -> “sharky dive”
  • hand towards the sensor -> “sharky get faster”
  • hand away from sensor -> “sharky get slower”
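The planned mapping can be sketched as a small lookup. The gesture and command names follow the list above; the function and dictionary names are our own invention, and the real controller is Lua code on Mihini driving GPIOs rather than returning strings.

```python
# Gesture -> command vocabulary, taken from the planned-commands list above.
COMMANDS = {
    "left": "turn left",
    "right": "turn right",
    "up": "raise",
    "down": "dive",
    "towards": "get faster",
    "away": "get slower",
}

def command_for(hand, gesture):
    """Map a (hand, gesture) pair to a command for the matching sharky.

    Left-hand gestures steer sharky-1, right-hand gestures steer sharky-2.
    Unknown gestures are ignored (None).
    """
    shark = "sharky-1" if hand == "left" else "sharky-2"
    action = COMMANDS.get(gesture)
    return f"{shark} {action}" if action else None

print(command_for("left", "up"))        # sharky-1 raise
print(command_for("right", "towards"))  # sharky-2 get faster
```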

So we have a lot of work to do to implement things properly. Let’s see what happens…

How to install the Mosquitto-M2M Server on a BeagleBoneBlack

October 3, 2013

As you might know, we are going to present our flying shark at this year’s EclipseCon Europe. Since we won’t have much time there for setting up our equipment (and clearing the stage after the show), we are currently working on scaling things down – literally: Instead of a laptop running the M2M/MQTT server, we are going to make use of a second BeagleBoneBlack (BBB). Of course, it would be possible to have the M2M server running on the same BBB as the Mihini framework that commands our sharky, but we decided to keep it on dedicated hardware. After all, the M2M server in a real-world application might sit on another continent, and it is our goal to demonstrate M2M communication over the net and not on localhost.

Long story short: today we set up our second BBB with the Mosquitto M2M server running on top of an embedded Linux. Since we had positive experiences with the armhf flavour of Debian on our first BBB (the one with the Mihini framework), we decided to use it on our second BBB as well. After all, Mosquitto is provided as a Debian package …

The installation process was pretty straightforward – an excellent how-to can be found here. After writing the live image to a microSD card, pressing the USER/BOOT button on the BBB while disconnecting and reconnecting power causes the BBB to boot from the microSD card. Note that this only worked when we powered the BBB via USB (and not via the power adapter).

After we connected the BBB to our router, it obtained an IP address and offered a rudimentary browser terminal:



As described in the how-to, the user was “debian” with the password “temppwd”.

Once logged in, we downloaded the most recent debian-armhf image from here to the /tmp folder:


The next step was to flash this image to the internal eMMC storage of our BBB:


This took some five minutes. After successfully flashing the image, we shut down the BBB, removed the microSD card and restarted it. Voilà, it came up running an embedded Debian. The only noteworthy thing was the password for the default user: in contrast to the live image on the microSD, this system uses “debian”.

What remained to be done was the installation of our M2M server. Since the Mosquitto broker is provided as a Debian package, this was as simple as

sudo apt-get install mosquitto

And voilà – Mosquitto is installed and running as a service, listening on port 1883 (the standard MQTT port) of our BBB:
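A quick way to confirm from another machine that the broker is reachable is to check whether its TCP port accepts connections. This is just a connectivity sketch in Python, not an MQTT handshake; subscribing with a real client (e.g. the mosquitto_sub command-line tool) would be the proper test.

```python
import socket

def broker_reachable(host, port=1883, timeout=2.0):
    """Return True if a TCP service (e.g. Mosquitto) accepts connections."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. broker_reachable("192.168.0.42") once the BBB's IP address is known
```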


Our next step will be to make the first BBB communicate with the second one and to subscribe to MQTT topics (and to publish, of course). And after that we will set up a third BBB and use it as an additional source of commands (other than the web interface) … cool stuff coming up!

Sharky – at EclipseTestingDay

September 26, 2013

Today I returned from the Eclipse Testing Day in Darmstadt, organized by BREDEX. In conclusion, I have to say that the Testing Day is one of my favorite conferences. I enjoyed the day a lot.

The BREDEX team is very nice, the conference was well organized, and the speakers were real experts in their areas.

Before going to the Testing Day I did not have any idea about mobile testing, since I am not involved in mobile development. I thought it might be very similar to common testing. But there are so many aspects to testing mobile devices. The talk about “energy testing”, rating apps by their energy consumption, was really enlightening. The insights into mobile fragmentation, the problems it causes, and how to target mobile testing at various devices are also something I will carefully remember for future projects.

Next year I am looking forward to going to the Testing Day again.

As I mentioned in one of my last blog posts, we (the Sharky team) got the chance to give a keynote at the Eclipse Testing Day. It was my first keynote, so I was really excited. And it was a lot of fun to demo Sharky and to share our visions about M2M with other people. The following image shows Sharky flying during the keynote.


So if you are looking for a really informative and fun testing conference, visit the Eclipse Testing Day!

Sharky escaped

September 22, 2013

Attention, Vienna: Sharky escaped!

This morning Sharky seized his chance and escaped. He followed the air flow and slipped very silently through the door to the balcony. So nobody has noticed his escape until now.

If you see Sharky, please keep calm. The best way to tame him is to use a proper RC controller…

The good side of the story: We have a second sharky 😉 and a nice story to tell at the conferences…


The suspected escape route of Sharky. But I am wondering why my two budgies did not raise the alarm…

Sharky – well earned break after a hard day

September 20, 2013

After a hard day, sharky rests in our living room.

Today we wired things together properly, using a Vaadin UI and the Mosquitto M2M server to transfer RC commands to Sharky. And Sharky followed them properly.

We also implemented a JUnit test that observes the ultrasonic sensors. Sharky was controlled by the integration test and had to come close to the ultrasonic sensors. Then a shark alarm was sent by the BeagleBone, and the integration test went green.

Sleeping sharky


Vaadin Web UI

For now, the Vaadin web UI is very simple. You can specify a value for left, right, dive and rise. The slider on the right side controls the speed, and an emergency stop is mapped to “ESC”.

The alarm fence text field specifies the minimum distance Sharky may come to the ultrasonic sensors. When the distance falls below that value, a “SHARK ALARM” is triggered.
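The fence check itself boils down to a simple comparison over the sensor readings. Units and the reading format here are assumptions for illustration; the real check runs inside the JUnit test against the ultrasonic sensors on the BeagleBone.

```python
def first_alarm(readings_cm, fence_cm):
    """Return the index of the first reading below the alarm fence, or None.

    `readings_cm` is a sequence of ultrasonic distance readings; as soon as
    one falls below the fence distance, the SHARK ALARM would be triggered.
    """
    for i, distance in enumerate(readings_cm):
        if distance < fence_cm:
            return i  # SHARK ALARM at this reading
    return None

print(first_alarm([120, 90, 60, 40], 50))  # 3
```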

The log table at the bottom shows all commands that have been sent to the M2M server. They are translated into GPIO outputs by the BeagleBone Black, which uses the Eclipse Mihini project and a Lua MQTT client for this purpose.


Next steps

On Monday we have the last test flight in a hall, before we leave for our keynote on Tuesday morning.