Sharky at EclipseCon Europe

November 2, 2013

Klemens and I were at EclipseCon Europe in Ludwigsburg and got the chance to demo sharky there. When we arrived and saw the big room our talk was assigned to, I was speechless. It wasn’t a room, but rather a hall. Really impressive. I had never talked on such a big stage.

The talk was really nice; in fact it was the funniest talk I have ever given. You already know that sharky is a big wild one with a mind of his own. So this nasty fish refused to follow some of our commands. For instance, sharky decided to fly higher and higher without any interest in coming back down to us. (Well, it was not sharky’s fault – a loose cable blocked the diving mode.) So I sent out Klemens to catch sharky again :D But how do you catch a sharky flying at a height of 10 meters? Well, I still don’t know, but ask Klemens, because he managed it.

In the end, we could demo everything we had planned to do. And it seemed that the attendees really loved sharky and his little accidental misbehaviour.

For me it was one of the talks I will never forget. It was sooo much fun, and a lot of things happened to laugh about. Two people made a movie of the talk and I am looking forward to its release…

Here you can see a little movie by Benjamin Cabe. Jonas Helming volunteered to remote-control sharky using a 3D sensor.

And here is an image of the preparation of sharky before the talk, taken by my friend @ekkescorner:

Ece2013_SharkyPrepare

Thanks a lot to Jelena from the Eclipse Foundation. She helped a lot preparing things…

Sharky – Jnect on BeagleBone with Eclipse Paho

October 14, 2013

Over the weekend, Klemens and I worked hard on a Jnect-M2M integration.

If you look at the image from “Sharky – his evolution progress“, you can see that OpenNI and NiTE are running on Ubuntu hardware and not on a BeagleBone Black. The problem is that NiTE does not support ARM processors for now.

Ubuntu running OpenNI and NiTE

So we got the idea to track the coordinates of parts of the human body (joints) on the Ubuntu hardware using OpenNI, and to send the coordinates to an external M2M server running on a BeagleBone.

The information sent by the OpenNI Java wrapper to the M2M server (topic=skeleton) looks like this:

skeleton {
   joint = Left-Hand
   x = 123.45
   y = 211.77
   z = 86.78
}
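
Just to illustrate: publishing such a message with the Paho Java client could look like the following sketch (broker URL and client id are made up):

import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttException;
import org.eclipse.paho.client.mqttv3.MqttMessage;

public class SkeletonPublisher {

    public static void main(String[] args) throws MqttException {
        // broker URL and client id are made up for this sketch
        MqttClient client = new MqttClient(
                "tcp://beaglebone.local:1883", "openni-wrapper");
        client.connect();

        // one tracked joint, formatted as shown above
        String payload = "skeleton {\n"
                + "   joint = Left-Hand\n"
                + "   x = 123.45\n"
                + "   y = 211.77\n"
                + "   z = 86.78\n"
                + "}";

        MqttMessage message = new MqttMessage(payload.getBytes());
        message.setQos(0); // fire-and-forget is fine for a coordinate stream
        client.publish("skeleton", message);

        client.disconnect();
    }
}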

Jnect with M2M Client

We used the Jnect project and added M2M support to it. So Jnect no longer depends solely on the Microsoft Kinect library, but may also use an M2M connector to get skeleton information from the M2M server.

Jnect subscribes to the topic=skeleton on the M2M server and gets the coordinates of the joints tracked by OpenNI. With some glue code it was simple to build the EMF body model defined by Jnect. And since Jnect provides an API to register GestureDetectors, we could use it to add a LeftHandUpGestureDetector.
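
The detection itself boils down to a simple check. Here is a small sketch of the idea – note that the class below is made up for illustration and is not the real Jnect API, which works on the EMF body model:

/**
 * Sketch of the left-hand-up check. The real detector is registered
 * with Jnect and works on the EMF body model; class and method names
 * here are made up for illustration.
 */
public class LeftHandUpGestureDetector {

    // how far (in mm) the hand has to be above the head; the value is a guess
    private static final float THRESHOLD = 100f;

    /** Returns true if the left hand is raised above the head (y axis up). */
    public boolean isLeftHandUp(float leftHandY, float headY) {
        return leftHandY > headY + THRESHOLD;
    }
}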

Finally, we installed the Jnect bundles in an Equinox OSGi runtime running on a BeagleBone Black.

What happens in detail

The ASUS 3D sensor is connected to the Ubuntu hardware. With our Java wrapper running, the sensor sends pictures to OpenNI, and OpenNI passes skeleton information back to the Java wrapper. We put the given coordinates into a data structure and pass them to the M2M server using Eclipse Paho.

The M2M server receives the messages and passes them on to the Jnect M2M client running on the BeagleBone Black.

Jnect parses that information and adjusts the EMF body model. Changes to the body model invoke the LeftHandUpGestureDetector. If the gesture matches the changes of the coordinates sent by OpenNI, a message is printed to the console.
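
The receiving side can again use Paho. A minimal sketch of the subscription – the parsing and the body model update are just stubs here:

import org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;
import org.eclipse.paho.client.mqttv3.MqttCallback;
import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttException;
import org.eclipse.paho.client.mqttv3.MqttMessage;

public class SkeletonSubscriber {

    public static void main(String[] args) throws MqttException {
        // broker URL and client id are made up for this sketch
        MqttClient client = new MqttClient(
                "tcp://localhost:1883", "jnect-m2m-client");
        client.setCallback(new MqttCallback() {

            public void messageArrived(String topic, MqttMessage message) {
                // here the skeleton text would be parsed and the
                // EMF body model updated
                System.out.println(topic + " -> " + new String(message.getPayload()));
            }

            public void connectionLost(Throwable cause) {
                // reconnect logic would go here
            }

            public void deliveryComplete(IMqttDeliveryToken token) {
                // not used, we only subscribe
            }
        });
        client.connect();
        client.subscribe("skeleton");
    }
}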

See details here

Sharky – his evolution progress

October 12, 2013

We have already demonstrated that sharky can be controlled properly by a Vaadin web UI. Now we are going to help sharky with his natural evolution.

The main idea

We would like to use a natural interface to remote-control two sharkies at the same time. A 3D sensor observes the movements of the left and the right hand. Gestures by the left hand remote-control sharky-1, and gestures by the right hand control sharky-2.

The technical solution should look like the image below.

SharkyEvolutes

The sensor

So we bought an Xbox Kinect. The problem was that the Kinect SDK only supports Windows, and OpenNI dropped Linux support for license reasons. Again we explored the web and found the Asus Xtion Pro Live – a 3D sensor made for developers, with native OpenNI support. The sensor captures 3D images and sends them to OpenNI. NiTE – an OpenNI plugin – provides a Java API for skeleton and hand tracking. Our first idea was to install OpenNI and NiTE on a BeagleBone, but NiTE does not support ARM processors for now. So we adjusted our architecture again and installed OpenNI and NiTE on an x86 Ubuntu machine. Some Java glue code allows us to track the position of a hand in three dimensions. Since we are addicted to M2M technologies, we do not process that information further on the Ubuntu device, but send it via Eclipse Paho to an M2M server using the MQTT protocol.

M2M-Server

The M2M server (a Mosquitto server) is running on a BeagleBone Black. It acts as a publish/subscribe server: clients can subscribe to topics and receive the messages sent to those topics. The Ubuntu device sends all its messages to the “handinfo” topic on the M2M server.

Jnect Bodymodel

A very nice project called Jnect, provided by Jonas Helming and Maximilian Kögel (EclipseSource Munich), implements a body model based on EMF. It also supports gesture recognition: your own gesture handlers may be registered using extension points. So the idea is that Eclipse Equinox gets installed on an additional BeagleBone. Using Paho, this BeagleBone connects to the M2M server and subscribes to the topic “handinfo”. So every change of the human hands in any of the three dimensions is sent to this BeagleBone. With some glue code, the EMF-based body model is prepared.

As a next step, we have to add GestureHandlers. These are notified about changes in the body model and have to calculate whether a gesture was detected – for instance “left hand up”, “right hand down”, “hands clapped”, … The detected gestures are sent to the M2M server on the topic “gestures”.

Sharky controller

These gestures are the base information for the sharky controller. The sharky controller is also installed on a BeagleBone Black and is based on Mihini. Using Lua, it connects to the M2M server and subscribes to the topic “gestures”.

So if the human raises their right hand, the SharkyController gets the information “right hand up”. The sharky controller uses that information to calculate the required GPIO outputs. The GPIOs are connected to the remote controller, and sharky follows the commands given by the hand movements.

Planned commands – the left hand controls sharky-1 and the right hand controls sharky-2 (a sketch of the mapping follows the list):

  • hand left -> “sharky turn left”
  • hand right -> “sharky turn right”
  • hand up -> “sharky raise”
  • hand down -> “sharky dive”
  • hand towards the sensor -> “sharky get faster”
  • hand away from the sensor -> “sharky get slower”
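
The real controller is written in Lua on top of Mihini. But just to illustrate the mapping idea in Java: the BBB exposes its GPIOs via sysfs, so a gesture-to-pin mapping could look roughly like this (pin numbers and the mapping itself are made up):

import java.io.FileWriter;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

public class GestureToGpioMapper {

    // gesture -> GPIO pin; the pin numbers are made up, and each pin
    // has to be exported via /sys/class/gpio/export beforehand
    private static final Map<String, Integer> PINS = new HashMap<String, Integer>();
    static {
        PINS.put("left hand up", 60);     // sharky-1 raise
        PINS.put("left hand down", 48);   // sharky-1 dive
        PINS.put("right hand up", 49);    // sharky-2 raise
        PINS.put("right hand down", 115); // sharky-2 dive
    }

    /** Pulls the GPIO pin mapped to the given gesture high or low. */
    public void apply(String gesture, boolean high) throws IOException {
        Integer pin = PINS.get(gesture);
        if (pin == null) {
            return; // unknown gesture, nothing to do
        }
        FileWriter writer = new FileWriter("/sys/class/gpio/gpio" + pin + "/value");
        try {
            writer.write(high ? "1" : "0");
        } finally {
            writer.close();
        }
    }
}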

So we have a lot of work to do to implement things properly. Let’s see what happens…

How to install the Mosquitto-M2M Server on a BeagleBoneBlack

October 3, 2013

As you might know, we are going to present our flying shark at this year’s EclipseCon Europe. Since we won’t have much time there for setting up our equipment (and clearing the stage after the show), we are currently working on scaling things down – literally: Instead of a laptop running the M2M/MQTT server, we are going to make use of a second BeagleBoneBlack (BBB). Of course, it would be possible to have the M2M server running on the same BBB as the Mihini framework that commands our sharky, but we decided to keep it on dedicated hardware. After all, the M2M server in a real-world application might sit on another continent, and it is our goal to demonstrate M2M communication over the net and not on localhost.

Long story short, today we have set up our second BBB with the Mosquitto M2M server running on top of an embedded Linux. Since we had positive experience with the armhf flavour of Debian on our first BBB (with the Mihini framework), we decided to use it on our second BBB as well. After all, Mosquitto is provided as a Debian package …

The installation process was pretty straightforward – an excellent how-to can be found here. After writing the live image to a microSD card, pressing the USER/BOOT button on the BBB while disconnecting and reconnecting power causes the BBB to boot from the microSD card. Note that this only worked when we powered the BBB via USB (and not via the power adapter).

After we connected the BBB to our router, it obtained an IP address and offered a rudimentary browser terminal:

connection

login

As described in the how-to, the user was “debian” with the password “temppwd”.

When logged in, we downloaded the most recent debian-armhf image from here to the /tmp folder:

wgetting

The next step was to flash this image to the internal eMMC storage of our BBB:

flashing

This took some five minutes. After successfully flashing the image, we shut down the BBB, removed the microSD card and restarted it. Voilà, it came up running an embedded Debian. The only noteworthy thing was the password for the default user – in contrast to the live image on the microSD, this system uses “debian” as the password.

What remained to be done was the installation of our M2M server. Since the Mosquitto broker is provided as a Debian package, this was as simple as

sudo apt-get install mosquitto

And voilà – Mosquitto is installed and running as a service on our BBB, listening on port 1883, the port dedicated to MQTT:

portopen
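
To double-check the broker from another machine on the network, a tiny Paho test can publish a message (replace the IP with the address your BBB obtained):

import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttException;
import org.eclipse.paho.client.mqttv3.MqttMessage;

public class BrokerSmokeTest {

    public static void main(String[] args) throws MqttException {
        // replace the IP with the address your BBB obtained from the router
        MqttClient client = new MqttClient("tcp://192.168.0.42:1883", "smoke-test");
        client.connect();
        client.publish("test", new MqttMessage("hello BBB".getBytes()));
        client.disconnect();
        System.out.println("broker is reachable");
    }
}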

Our next step will be to make the first BBB communicate with the second one and to subscribe to MQTT topics (and to publish of course). And after that we will set up a third BBB and use it as an additional source of commands (other than the web interface) … cool stuff coming up!

Sharky – at EclipseTestingDay

September 26, 2013

Today I arrived back from the Eclipse Testing Day in Darmstadt, organized by BREDEX. In conclusion I have to say that the Testing Day is one of my favorite conferences. I enjoyed the day a lot.

The BREDEX team is very nice, the conference was organized properly, and the speakers were real experts in their areas.

Before going to the Testing Day I did not have any idea about mobile testing, since I am not involved in mobile development. I thought it might be very similar to the common testing stuff, but there are so many aspects to testing mobile devices. The talk about “energy testing” – rating apps by their energy consumption – was really enlightening. Also the insight into mobile fragmentation, the problems it causes and how to approach mobile testing for the various devices is information I will carefully remember for future projects.

Next year I am looking forward to going to the Testing Day again.

As I mentioned in one of my last blog posts, we (the sharky team) got the chance to give a keynote at the Eclipse Testing Day. It was my first keynote, so I was really excited. And it was a lot of fun to demo sharky and to share our visions about M2M with other people. The following image shows sharky flying during the keynote.

eclipseTestingDay_shark

So if you are looking for a really informative and fun testing conference, visit the Eclipse Testing Day!

Sharky escaped

September 22, 2013

Attention Vienna: sharky escaped!

Early this morning, sharky used his chance and escaped. He followed the air flow and passed through the door to the balcony very silently. So nobody has noticed his escape until now.

If you see sharky, please keep calm. The best way to tame him is to use a proper RC controller…

The good side of the story: we have a second sharky ;-) and a nice story to tell at the conferences…

Sharky_Escaped_Web

The suspected escape route of sharky. But I am wondering why my two budgies did not give the alarm?

Sharky – well earned break after a hard day

September 20, 2013

After a hard day, sharky rests in our living room.

Today we wired things together properly, using a Vaadin UI and the Mosquitto M2M server to transfer RC commands to sharky. And sharky followed them properly.

We also implemented a JUnit test that observes the ultrasonic sensors. Sharky was controlled by the integration test and had to come close to the ultrasonic sensors. Then a shark alarm was sent by the BeagleBone, and the integration test turned green.
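
The test itself is not shown here, but the idea can be sketched with JUnit 4 and Paho (broker URL and topic names are made up):

import static org.junit.Assert.assertTrue;

import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

import org.eclipse.paho.client.mqttv3.IMqttDeliveryToken;
import org.eclipse.paho.client.mqttv3.MqttCallback;
import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttMessage;
import org.junit.Test;

public class SharkAlarmTest {

    @Test
    public void sharkyTriggersAlarm() throws Exception {
        final CountDownLatch alarm = new CountDownLatch(1);

        MqttClient client = new MqttClient("tcp://localhost:1883", "it-test");
        client.setCallback(new MqttCallback() {
            public void messageArrived(String topic, MqttMessage message) {
                if ("SHARK ALARM".equals(new String(message.getPayload()))) {
                    alarm.countDown();
                }
            }
            public void connectionLost(Throwable cause) { }
            public void deliveryComplete(IMqttDeliveryToken token) { }
        });
        client.connect();
        client.subscribe("alarm");

        // steer sharky towards the ultrasonic sensors
        client.publish("rc", new MqttMessage("dive".getBytes()));

        // the shark alarm has to arrive within 30 seconds
        assertTrue(alarm.await(30, TimeUnit.SECONDS));
        client.disconnect();
    }
}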

Sleeping sharky

Image

Vaadin Web UI

For now, the Vaadin web UI is very simple. You can specify a value for left, right, dive and rise. The slider at the right side controls the speed. And an emergency stop is mapped to “ESC”.

The alarm fence TextField specifies the minimum distance sharky may come to the ultrasonic sensors. When the distance falls below that value, a “SHARK ALARM” is triggered.

The log table at the bottom shows all commands that have been sent to the M2M server. They are translated into hardware GPIO outputs by the BeagleBone Black, which uses the Eclipse Mihini project and a Lua MQTT client for this.
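
A stripped-down sketch of such a command panel in Vaadin 7 could look like this (the publish helper and the topic name are made up):

import com.vaadin.ui.Button;
import com.vaadin.ui.HorizontalLayout;

public class CommandPanel extends HorizontalLayout {

    public CommandPanel() {
        addComponent(createCommandButton("Left", "left"));
        addComponent(createCommandButton("Right", "right"));
        addComponent(createCommandButton("Dive", "dive"));
        addComponent(createCommandButton("Rise", "rise"));
    }

    private Button createCommandButton(String caption, final String command) {
        Button button = new Button(caption);
        button.addClickListener(new Button.ClickListener() {
            public void buttonClick(Button.ClickEvent event) {
                publish("rc", command); // made-up helper, see below
            }
        });
        return button;
    }

    private void publish(String topic, String command) {
        // send the command to the M2M server via Eclipse Paho,
        // as sketched in the other posts
    }
}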

Image

Next steps

On Monday we have the last test flight in a hall, before we leave for our keynote on Tuesday morning.

Sharky – first flight

September 14, 2013

After weeks of hard work we had our first flight with sharky. It was really impressive how a little BeagleBone Black and an Arduino can control a flying shark.

During the coming week we will wire things together with the Virgo-ActiveMQ-MQTT server. Mihini, running on the BeagleBone Black, will implement an MQTT client, interpret the commands given by the M2M server and pass them to the Arduino. And the Arduino will send RC commands to the shark.

A Vaadin UI will be used to implement the controller dashboard. And the Vaadin UI will use Eclipse Paho to send MQTT messages to the M2M server.

On Tuesday, 24th September, in the early morning, we are going to Darmstadt to hold a keynote at the Eclipse Testing Day (25th September). I am so excited about that.

See flying sharks at:

Some impressions about the first flight

The team

Petra Bierleutgeb and Klemens Edler

Team-PetraAndKlemens

Team-KlemensAndFlo

Florian Pirchner and Klemens Edler

Sharky filled with helium

Helium_filled_web

Some electronic parts

Curcuit

You can see the Arduino board on the left, with the BeagleBone Black right beside it.

The upper board is the level shifter, and the lower board is used for the RC-controller jailbreak.

Sharky flies

Vaadin 7 Cookbook (PACKT publishing) – A review

July 1, 2013

Indeed, the authors of the book have very detailed knowledge of Vaadin 7. I have been working with Vaadin for many years now, and I think I have good experience using it. But the Vaadin 7 Cookbook is really one of the greatest books I have ever owned. It contains so many different examples for all the different layers of Vaadin 7 – from client-side widgets to databinding.

The book isn’t only useful for people with good knowledge, but also for beginners. It covers the main concepts of Vaadin and provides tons of examples of how to implement things properly. Extended examples are available too. For instance, “Binding tabs with a hard URL” shows how URI fragments may be used to control tab sheets: if a tab is selected, the URI fragment of the browser URL is changed, and a change of the fragment triggers a tab switch. Nice idea – I never thought about implementing it, but I will keep it in mind.
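
The mechanism from that recipe can be sketched in a few lines of Vaadin 7 code (the fragment-to-tab lookup below is simplified):

import com.vaadin.server.Page;
import com.vaadin.server.Page.UriFragmentChangedEvent;
import com.vaadin.server.Page.UriFragmentChangedListener;
import com.vaadin.ui.Component;
import com.vaadin.ui.TabSheet;

public class FragmentBoundTabs {

    /** Switches the tab whenever the URI fragment changes. */
    public void bind(final TabSheet tabSheet) {
        Page.getCurrent().addUriFragmentChangedListener(
                new UriFragmentChangedListener() {
                    public void uriFragmentChanged(UriFragmentChangedEvent event) {
                        Component tab = findTabByCaption(tabSheet, event.getUriFragment());
                        if (tab != null) {
                            tabSheet.setSelectedTab(tab);
                        }
                    }
                });
    }

    private Component findTabByCaption(TabSheet tabSheet, String fragment) {
        // look up the tab whose caption matches the fragment (simplified)
        for (int i = 0; i < tabSheet.getComponentCount(); i++) {
            if (tabSheet.getTab(i).getCaption().equals(fragment)) {
                return tabSheet.getTab(i).getComponent();
            }
        }
        return null;
    }
}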

Another very useful example is the use of drag and drop. Vaadin comes with some built-in features, and the cookbook shows their use in great detail. Even drag and drop from the desktop is covered. The charts recipe demos how JavaScript can be embedded into a widget using the @JavaScript annotation. I think there are a lot of very handy use cases for defining the required JavaScript together with the widget.

Extending client-side widgets may sometimes become a challenge. They are implemented in Java and have to be compiled to widget sets (JavaScript). Since Vaadin is based on GWT, native JavaScript code is also supported in your Java widget. The TriStateCheckbox demos how to inject snippets of JS code into your client side. If you want to get started with extending or writing client-side widgets, the book is a perfect way to learn how. Another very useful feature covered by the book are Vaadin forms. They were enhanced in Vaadin 7 and are much more powerful than the Vaadin 6 forms.
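
Injecting JS into a client-side widget boils down to GWT’s JSNI. Just as an illustration of the technique (not the code from the book) – a tri-state checkbox typically uses the DOM “indeterminate” flag:

import com.google.gwt.dom.client.Element;

public class TriStateHelper {

    /**
     * Native JSNI method: sets the DOM "indeterminate" flag of a checkbox
     * input element, which browsers render as the third state.
     */
    public static native void setIndeterminate(Element checkbox,
            boolean indeterminate) /*-{
        checkbox.indeterminate = indeterminate;
    }-*/;
}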

Last but not least, Grails and Spring integration are discussed in detail. Since I am using OSGi in my applications, these topics weren’t of main interest for me. But they are detailed, and I could learn some additional things about the JEE world.

There is only one chapter I really missed – OSGi integration. But even if you are implementing Vaadin applications based on OSGi, you will need all the other information.

In summary, I can really recommend this book to everyone who wants to start programming with Vaadin or already does.

redVoodo – a good idea to use eclipse databinding

June 29, 2011

Currently I am writing the UI controllers which connect the redVoodo UI model with the UI parts that are responsible for visualizing the model. After a sleepless night I decided to refactor the whole controller code. The problem was that I had to implement tons of observing code which observes a few nodes of the UI model and causes the UI parts to reflect changes immediately. Of course, this solution would have worked really well, but I did not like the way it works.

In this post I am going to describe why I decided to use Eclipse databinding, and show some code.

The UI model

RedVoodo offers a functionality called scoped actions and menu items; see the post redvoodo-scoped-actions-and-menu-items.

Based on this feature, actions and menu items are hidden if they are not in the scope of the active framework or sub application. For instance, an action called “delete item” will only be visible in the “Item subapplication” but will be hidden for the “Customer subapplication”. These actions and menu items have “subapplication scope”, which means that they are only visible if a defined sub application is active. But there are also “framework scoped” actions and menu items; their visibility depends on the active framework, like Sales, Inventory, …

If you want to write an application which tracks the currently active subapplication or framework, you may have to write tons of code if the underlying UI model has a deep structure.

This class diagram shows a simplified version of the redVoodo UI model. The root element of the model is called YUiModel. It can contain exactly one YApplication, which is an abstraction of the Vaadin application. The YApplication contains many YFrameworks (like Sales, Inventory, …), but only one YFramework can be active at a time. Based on the active framework, different sub applications are shown. So each framework has its own set of YSubapplications. For instance, the Sales framework could contain sub applications like Customers, SalesCampaigns, …

The application, each framework and each sub application can contain their own set of actions. Depending on the active framework and the active sub application, different actions should be visible in the UI.

This image shows what I am talking about:

If the Sales framework or a different sub application were selected, completely different actions and menu items would be shown.

The model shown above is an EMF model. But since redVoodo should be a very extensible framework, EMF is only the default implementation. The core of redVoodo does not have any dependency on EMF. Instead, it depends on model controllers. These controllers are called edit parts. UI parts like the “framework switcher” or the “sub application tree” use these edit parts to visualize the model, but do not have a dependency on the underlying model. So everybody can easily change the default implementation and provide edit parts that are based on a JPA implementation, or edit parts that are even hardcoded.

The problem

Yesterday I started to write a lot of code to observe the UI model. I had to fetch the RootModelEditPart, which represents the YUiModel.

  • I registered a listener at the RootModelEditPart which is notified if the active application changes.
  • Then I implemented the same for the active application and added a listener which is notified if the active framework changes.
  • Then I added a listener to the active framework which is notified if the active subapplication changes.
  • Then I added a listener that is notified if actions are added or removed.
  • Then I started to handle many different constellations based on parts becoming active and inactive:
    • Remove the listener from the lower containment structure
    • Switch the parent
    • Add the listener again to the lower containment structure
  • Then I noticed that each UI part behaves differently towards the observables. The SubApplicationTabBar has to observe the OpenSubApplications reference of YFramework too.
  • Then, at 10 p.m., I thought -> Hmmmmmmmmmm, what a boring job!

After a sleepless night, I took all the controller implementations and put them in the trash, since Eclipse databinding can do all that stuff for me. I can heavily reduce the amount of required code and save a lot of time.

The (really easy) solution

// 1. define a nested property
String nestedProperties =
 "application.activeFramework.activeSubapplication.actions";

// 2. create the observable model value
IObservableValue modelOV = BeansObservables.observeDetailValue(
	new WritableValue(rootModel, IRootEditPart.class),
	nestedProperties, ISubapplicationEditPart.class);

// 3. create the observable target value
IObservableValue targetOV = PojoObservables.observeValue(
	ActionbarPart.this, "subapplicationActions");

// 4. bind model to target
subapplicationActionsBinding = dbContext.bindValue(targetOV,
  modelOV, new UpdateValueStrategy(
	UpdateValueStrategy.POLICY_NEVER),
  new UpdateValueStrategy(UpdateValueStrategy.POLICY_UPDATE)
	.setConverter(new NotNullConverter()));

1. define a nested property

This nested property is split at the “.”, and the reference path starting at the IRootEditPart is followed for each property.

// "application.activeFramework.activeSubapplication.actions"
// "application.activeFramework.activeSubapplication.actions"
// can be visualized as:
rootModel
	.getApplication()
	.getActiveFramework()
	.getActiveSubapplication()
	.getActions()

The property “actions” inside the “activeSubapplication” is observed. Every time it changes – no matter why it changes (due to a change of the active framework or something else) – we are notified.

So we do not have to implement the boring listener hierarchy described under “the problem”.

2. create the observable model value

This creates an IObservableValue which represents the “actions” property inside the “activeSubapplication” inside the “activeFramework” inside the “application”.

3. create the observable target value

This addresses the property “subapplicationActions” of this instance (ActionbarPart.this). Every time the “actions” of the “activeSubapplication” change, the setter setSubapplicationActions(List) should be invoked.

4. bind model to target

For this, the DataBindingContext is used to bind the two observable values (model and target) to each other.
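
The NotNullConverter referenced above is not shown in this post. Assuming it simply shields the target from null values, a minimal version could look like this:

import java.util.Collections;

import org.eclipse.core.databinding.conversion.Converter;

/**
 * Sketch: replaces a null actions list by an empty list, so the
 * target side never has to deal with null (an assumption).
 */
public class NotNullConverter extends Converter {

	public NotNullConverter() {
		super(Object.class, Object.class);
	}

	public Object convert(Object fromObject) {
		return fromObject != null ? fromObject : Collections.EMPTY_LIST;
	}
}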

And here you can see the implementation of the setSubapplicationActions method. It stores the subapplicationActions and puts them to the UI.

/**
 * Sets the list of subapplication actions.
 *
 * Called by databinding.
 *
 * @param subapplicationActions
 *            the subapplicationActions to set
 */
public void setSubapplicationActions(
		List subapplicationActions) {
	internalSetSubapplicationActions(subapplicationActions);

	toUI_subapplications();
}

That’s all I had to write to observe a nested property inside my model tree.

I hope you can see the big advantage of Eclipse databinding too, and I hope you give it a try.

Additional information

Eclipse databinding
redVoodo.org
vaadin.com

