Saturday, October 29, 2011

How to Fix Eclipse BIRT Designer 3.7.1 Crash when Previewing Reports in Linux 64-bit

When previewing a report or running it in the embedded web browser, Eclipse Reporting / BIRT 3.7.1 running on 64-bit Linux crashes with only the following messages:

No bp log location saved, using default.
[000:000] Browser XEmbed support present: 1
[000:000] Browser toolkit is Gtk2.
[000:000] Using Gtk2 toolkit
[000:000] Warning( Load: Could not open file
[000:000] No bp log location saved, using default.
[000:000] Browser XEmbed support present: 1
[000:000] Browser toolkit is Gtk2.
[000:000] Using Gtk2 toolkit
[000:273] Warning( Load: Could not open file
[000:273] No bp log location saved, using default.
[000:273] Browser XEmbed support present: 1
[000:273] Browser toolkit is Gtk2.
[000:273] Using Gtk2 toolkit
Opened debug file '/home/ceefour/tmp/mozdebug'

It happens with all reports, even with a blank one.

To reproduce:
  1. Launch Eclipse for Java & Report Developers 3.7.1 64-bit in Ubuntu 11.04 64-bit
  2. Create a new report
  3. On the blank report, click Preview tab ...... crash
It is reported as Eclipse Bugzilla Bug 362416.

Here's a workaround to fix it:

./eclipse -vmargs -Dorg.eclipse.swt.browser.DefaultType=mozilla

This works around the crashing problem entirely. Note that it's not a fix but a temporary workaround until the BIRT developers solve this problem permanently.
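If you'd rather not type the argument on every launch, the same system property can be made persistent by adding it to the eclipse.ini file in your Eclipse installation directory. A sketch (the property must come after the -vmargs line; the rest of your eclipse.ini stays as-is):

```
-vmargs
-Dorg.eclipse.swt.browser.DefaultType=mozilla
```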

The (upstream) Eclipse bug was marked fixed as of 3.7.1; however, it still requires the manual command-line argument to avoid the crash in Eclipse Reporting/BIRT 3.7.1.

My system info:

Linux annafi 2.6.38-12-generic #51+kamal3~mjgbacklight5-Ubuntu SMP Wed Oct 5
20:13:06 UTC 2011 x86_64 x86_64 x86_64 GNU/Linux
java version "1.6.0_26"
Java™ SE Runtime Environment (build 1.6.0_26-b03)
Java HotSpot™ 64-Bit Server VM (build 20.1-b02, mixed mode) 

I first reported this bug in BIRT Exchange forum thread here.

Want to create complex business reports easily? Get BIRT: A Field Guide (3rd Edition) for an excellent starter resource.
To integrate BIRT reports with your application, check out Integrating and Extending BIRT (3rd Edition).

Wednesday, October 5, 2011

Generation Gap Pattern vs Protected Regions in Xtext MDD

During the development of the awesome Xtext Protected Regions Support created by Daniel Dietrich, Daniel asked a really interesting question:

Just interested in your opinion / your 2 cents:

It's best practice to separate generated from non-generated code (when it comes to put it in a versioning system).
Therefore people use the generation gap pattern - the generated class is a base class (or similar), the manually written / generated-once classes implement the generated classes.

When using protected regions, all files (including the generated) have to be checked in. Some say, this ends up in a versioning disaster.

I prefer the protected regions approach. the base classes of the ggp nearly double the count of classes. this is a technical vehicle which makes no sense for the application. but how could be avoided, that generated classes are checked in?

my only answer is, that the generated code has to be cut of the files. empty files will disappear, protected regions are preserved. when checking out files, the generator has to be started. before checking in files, the generated code has to be stripped (into a special dir which will be checked in?).

this sounds like a technical vehicle, too. would that make sense?

To which I replied: (note that these are my opinions, and I don't claim to be an expert, so feel free to challenge my assumptions and prove me wrong!) :-)

I have nothing against GAP. It definitely has its valid uses. So for certain problems, GAP is a valid solution, and is even preferable to regions.

About versioning, I'm not sure it's a "disaster". Redundant maybe, but dangerous? I don't think so. Avoiding checking in generated files may be preferable where the source DSL files are "authoritative" and the target files are "final".

Let me illustrate: wsdlimport. The WSDL file is an authoritative source. The generator is stable. The generated Java proxy classes are "dumb" target files. You can regenerate them anytime, no need for customization. And the generator never changes, so you can be sure the generated files will actually compile and work. No need to check in the Java targets.

Xtext can be used for projects of that kind, and it's very easy to do so.

There are also projects that fall in the middle: need some customization. So you do GAP: generate the base class and the customizable subclass. This technique is excellent for some cases, but fails when:

  1. You're restricted by the class hierarchy in some way.
  2. Target language doesn't support subclassing, say: XML. Or my case, this would be yet another custom DSL.
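To make the pattern concrete, here's a minimal sketch in Java (all names are illustrative, not from any real generator): the generator owns the base class and may overwrite it at any time, while the subclass is written once by hand and checked in.

```java
// Generation Gap Pattern sketch: the generated base class carries the
// structure; the hand-written subclass fills the "gap".
abstract class AbstractGreeter {            // regenerated on every run
    String greet(String name) {
        return prefix() + name;
    }
    abstract String prefix();               // the gap, filled manually
}

class Greeter extends AbstractGreeter {     // hand-written, checked in
    @Override
    String prefix() {
        return "Hello, ";
    }
}
```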

There are also cases where GAP technically works, but the generated class structure is complex enough that separating the base class from the actual class makes it "unnatural". I'm sure you've seen stuff like that. Oh, let me give a concrete example from one of my prototypes:

def StringConcatenation genMainFile(String fileName) {
    val result = new StringConcatenation()
    result.append( augmenter.beforeMainFile(fileName) )
    result.append( augmenter.aroundMainFile(fileName, [fn1 | {
        doMainFile(fn1, [fn2 |
            augmenter.innerMainFile(fn2, [fn3 | {
                genImportBlock()
                genClassBlock()
                null
            }])
        ])
    }]) )
    result.append( augmenter.afterMainFile(fileName) )
    result
}

Mind you the above code actually works, but it's damn ugly! At least with protected regions there will only be (marker) comments. And with most editors, comments can be folded and not so distracting, so it's much more bearable.
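For contrast, here is a sketch of what a generated file with protected regions looks like (the class is illustrative; the marker syntax follows the Xtext protected-regions convention): everything outside the markers may be overwritten, while the code between START and END survives regeneration.

```java
// Generated file: only the protected region below is preserved
// across generator runs.
class GreetingView {
    public String title() {
        // PROTECTED REGION ID(GreetingView.title) ENABLED START
        return "Welcome!"; // hand-written customization
        // PROTECTED REGION END
    }
}
```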

Some may say "it's only the base class, nobody will touch it". On the contrary, you'll see those structural methods on your debugging stack traces, just "perfect" at the time when you're in deep need of clear & cohesive program structure, but oh... you're buried in nested method calls. :-( AspectJ/AOP/weaving also has this problem, though it's manageable to some extent. They only fall apart when pushed too far, so moderate use is OK.

The third class is projects that require extensive customization. The source DSL comprises only ~20% of the target, providing structure or a supporting form, and leaves places to fill, which are filled by the programmer or some other DSL.

In my (currently hypothetical, but hopefully soon to be realized) project, the Entity->UI generator will not generate JSF/GWT forms/pages directly, but will generate to an intermediate UI DSL. (You can argue this is model-to-model, not M2T, but hey, textual models are much easier to inspect/hack than something buried in XML!) The UI DSL can then be processed to generate JSF or GWT.

I'm not really interested in supporting class inheritance in a UI DSL. In fact the "class" concept itself may not exist in a UI domain; it only matters to the OO world. So protected regions are the only option, and thankfully Xtext supports comments by default.

With that, it's possible to customize the generated UI DSL files right in the places where they're needed.

Another use case that I'm exploring is two or more generators (which may or may not source the same model) generating to the same file, but into different regions. And the FSA should automagically merge them.

And then, add to the mix that the generators themselves are in constant development. That means the same source DSL when processed, may yield target files that are broken, uncompilable, buggy, etc. And the target files are needed because they form the foundation of yet another project, so unless the previous "working" target files can be recovered, the development of the derived project effectively stops.

For those use cases, would I check in the target files? Of course.

With all of the above said, I have nothing against GAP or leaving generated files out of version control. GAP and not checking in generated files may be common, but I believe there are classes of problems where they're inappropriate.

To learn Modeling with Eclipse Modeling Framework (EMF), I highly recommend the book EMF: Eclipse Modeling Framework.

Wednesday, August 3, 2011

How to Fix Ant Build Error: "Could not load a dependent class com/jcraft/jsch/Logger. It is not enough to have Ant's optional JARs"

If you get an error message like this while running a problematic Ant build script:


/home/ceefour/git/magento-id/build.xml:13: The following error occurred while executing this line:
/home/ceefour/git/magento-id/build.xml:22: Problem: failed to create task or type sshexec
Cause: Could not load a dependent class com/jcraft/jsch/Logger
       It is not enough to have Ant's optional JARs
       you need the JAR files that the optional tasks depend upon.
       Ant's optional task dependencies are listed in the manual.
Action: Determine what extra JAR files are needed, and place them in one of:
        -a directory added on the command line with the -lib argument

Do not panic, this is a common problem.
The commonest cause is a missing JAR.

This is not a bug; it is a configuration problem

It means your Ant build script uses optional Ant libraries, and you need to tell Ant where to find them.

In my case, it needs jsch.jar, aka the libjsch-java package in Debian/Ubuntu Linux.
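For context, a minimal build.xml using the optional sshexec task might look like this sketch (host, username, password, and command are placeholders):

```xml
<project name="example" default="deploy">
  <target name="deploy">
    <!-- sshexec is an optional Ant task; it needs jsch.jar on Ant's library path -->
    <sshexec host="example.com" username="user" password="secret"
             command="ls -l" trust="true"/>
  </target>
</project>
```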

First, you need to install the ant-optional package for Ant optional libraries support:
sudo apt-get install ant-optional

Then the libjsch-java Debian/Ubuntu package:
sudo apt-get install libjsch-java

Then symlink it into the right directories so that Ant can find it:

sudo ln -s /usr/share/java/jsch.jar /usr/share/ant/lib/
mkdir -vp ~/.ant/lib
ln -s /usr/share/java/jsch.jar ~/.ant/lib/

Ubuntu's Ant looks for libraries in the /usr/share/ant/lib folder, while Eclipse IDE's Ant looks for optional Ant libraries in the $HOME/.ant/lib folder.

Note that for Eclipse IDE, you may need to refresh Eclipse Ant Plug-in's Runtime Classpath, by going to Window > Preferences > Ant > Runtime, and clicking "Restore Defaults".
Make sure that the required libraries are now listed under "Global Entries".

Ant build system is frequently used in typical Java EE 6 enterprise application development. I highly recommend The Java EE 6 Tutorial: Basic Concepts (4th Edition) for a practical guide to the Java EE 6 technology.

Saturday, July 16, 2011

How to Create A Software Site from Plug-in Bundles for Eclipse Tycho Maven

As of Eclipse Tycho Maven plugin 0.12.0, it can only use the "InstallableUnit" type (aka Software Site sources) from a target platform definition file. So you cannot use Features, Installation, or Directory sources in your target definition file.

If all the plug-ins your Eclipse Platform application requires are already in a Software Site/Update Site, you're lucky. But if not, don't despair. Gather all the plug-ins in a directory and...
(note: you may need to OSGify some files beforehand using Bnd or Eclipse PDE)

The lightning way:
If it already contains feature(s) that list the available plug-ins, just create a .target definition file, and in Eclipse right-click the .target and export it. Done!
If not, continue below...

The quick way:

  1. Create a Feature project, initial plug-in list is empty.
  2. Create a .target platform definition file in PDE, inside the Feature Project. Add the bundles. Activate it.
  3. Open the feature.xml, Plug-ins tab. Select all plug-ins (use "*.*" filter)
  4. Right click the Feature project, Export... as a deployable feature :-) There, you get your Software Site neatly prepared and ready for some serious Tycho action! :D

The hard way:

To learn more about Eclipse platform programming, I highly recommend Eclipse Rich Client Platform (2nd Edition).

Configuring Eclipse Tycho Maven Plugin to use Target Definition File and Publishing Target Platform Bundles to Update Site

I answered an Eclipse Tycho Maven Plugin question on StackOverflow and I think I should repeat it here (for my own purposes, hehe.. I am forgetful :-)

I should expand this to cover the Tycho workflow better, but I'll probably get lazy, so here it is pretty much verbatim.

Create a Target Definition file (.target) and put it inside a Maven project; see here for an example target:
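A minimal .target definition using a Software Site source could look roughly like this sketch (the repository URL and installable unit are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?pde version="3.6"?>
<target name="example-target">
  <locations>
    <location includeAllPlatforms="false" includeMode="planner" type="InstallableUnit">
      <repository location="http://download.eclipse.org/releases/indigo"/>
      <unit id="org.eclipse.rcp.feature.group" version="0.0.0"/>
    </location>
  </locations>
</target>
```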

You need to attach the .target file to the artifact, using the build helper:
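The build-helper-maven-plugin configuration could look roughly like this sketch (the file name, classifier, and execution id are placeholders):

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>attach-target</id>
      <phase>package</phase>
      <goals>
        <goal>attach-artifact</goal>
      </goals>
      <configuration>
        <artifacts>
          <artifact>
            <file>example.target</file>
            <type>target</type>
            <classifier>example</classifier>
          </artifact>
        </artifacts>
      </configuration>
    </execution>
  </executions>
</plugin>
```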



Then, in the parent POM or in the plug-in projects that use that target definition file, you need to configure the "target" parameter of the target-platform-configuration Maven plugin, for example:
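A sketch of that configuration (the groupId, artifactId, version, and classifier are placeholders and must match the project that attached the .target file):

```xml
<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>target-platform-configuration</artifactId>
  <version>0.12.0</version>
  <configuration>
    <target>
      <artifact>
        <groupId>com.example</groupId>
        <artifactId>example-parent</artifactId>
        <version>1.0.0-SNAPSHOT</version>
        <classifier>example</classifier>
      </artifact>
    </target>
  </configuration>
</plugin>
```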



Then your project(s) should build very nicely using Tycho. :-) If your .target references remote p2 repositories and the artifacts are not already in the p2 bundle pool, they will be downloaded automatically.

Good luck!

Known Issue:

[WARNING] Target location type: Profile is not supported 

As of Tycho 0.12.0, it means the "Eclipse Installation" target source type cannot be used with Tycho (yet?), along with "Directory" and "Features".

Solution: Use the "Update Site" target source.

If you don't have an update site yet, here's how to generate one from an Eclipse installation (or from any folder containing bundles, for that matter):

/opt/eclipse_rcp/eclipse -consolelog -nosplash -verbose \
  -application org.eclipse.equinox.p2.publisher.FeaturesAndBundlesPublisher \
  -metadataRepository file:/home/ceefour/p2/bonita/ \
  -artifactRepository file:/home/ceefour/p2/bonita/ \
  -source /home/ceefour/BOS-5.5.1/studio/


  • change /opt/eclipse_rcp to your own Eclipse SDK installation
  • metadataRepository and artifactRepository point to the folder where the new update site will be created
  • source is --you guessed it-- the folder/installation containing the original bundles

To learn more about Eclipse platform programming, I highly recommend Eclipse Rich Client Platform (2nd Edition)

Thursday, June 23, 2011

Hot Deploy & F5/Refresh-Driven Web Application Development Are Ancient Compared to Eclipse RAP!


Most web application developers would be very familiar with F5/refresh-driven development. You know, make a little change, press F5 in the web browser, and you can view the updated page. Those were the good old PHP days.

Java EE web application developers used to be not that lucky. While some changes, like JSP pages, JSF facelets, etc., take effect immediately (and thus, "refresh-driven"), in some cases they have to "redeploy". This usually means the developer changed a Java backing-bean class or an "important" file like web.xml. Redeploying means undeploying the web app from the Java EE container or application server, then deploying the web app or WAR again. IDEs like the excellent Eclipse IDE Indigo (yay!) automate this, but a redeploy can take anything from a few seconds to... minutes! I think typical web apps deploy in about 20-30 seconds, so that is painful.

JRebel from ZeroTurnaround (which just won the Most Innovative Java Technology in JAX Innovation Awards 2011, congratulations guys!) really helps here, by allowing most common changes to not cause a full redeploy, but just... hot deploy! It's like JSP/JSF but for the rest of Java app, Spring beans, etc. JRebel is a commercial plug-in but is definitely worth it.

But I'd argue Eclipse RAP should have won the Most Innovative Java Technology title... Here's why!

(Eclipse Rich Ajax Platform/RAP is a framework for easily developing AJAX-powered web applications based on the Eclipse RCP programming model; see Eclipse Rich Client Platform (2nd Edition) for more information.)

I've just noticed something today. I know I should've noticed this long ago, but when you launch an Eclipse RAP rich internet application from Eclipse IDE using Debug (F11 key), ALL your code changes take effect immediately! No exceptions!

No need to even refresh the web browser!

Change the code for a menu item or a view or an action, save the .java file, go to the browser and click it... your new code is there!

"No refresh? But how can it be!"

Part of the magic is due to OSGi Dynamic Module System, that is brilliantly integrated as part of the Eclipse platform itself.

So when you save a Java file, Eclipse IDE will compile your class (and only your class, due to incremental builder feature, so it's very fast!), then update the OSGi bundle or Eclipse plug-in in the Eclipse RAP application. And only your bundle/plug-in is updated/refreshed in the application, so again, even if it's a different process it's also very fast. The whole process typically takes less than a second on a typical developer workstation, even on moderately complex apps! Most of the time the process is already done before you have a chance to hit Alt+Tab. ;-)

The other part of the magic is that even though an Eclipse RAP application comes with full AJAX features by default (it's not an option, it's actually a requirement), most of the business logic is server-side Java. So even though most of the presentation layer renders as JavaScript/HTML in the web browser, when you perform an action, for example by clicking a menu item, it triggers a server request...

Which means your updated code! Yay! :)

Another important feature of Eclipse RAP is that for background/long-running jobs or "server push" operations, it supports several approaches: the Eclipse Jobs API or a session-long UICallback.

This is pretty much automatic if you're already an Eclipse RCP programmer utilizing Jobs API. There's no need to do workarounds and hacks like traditional AJAX web development or learn yet another new API (and programming model) just for server push.

To learn more about Eclipse platform programming, I highly recommend Eclipse Rich Client Platform (2nd Edition). It's really good for learning Eclipse RCP/RAP development; most of the things that apply to RCP also apply to RAP. In fact, you can single-source an application to two target platforms (RCP for desktop, RAP for web) simultaneously. :-)

Eclipse Virgo IDE Tooling 1.0.0.M01 Released

Martin Lippert from SpringSource announced:

I am happy to announce that we released the first milestone build of the Virgo IDE tooling. For installation instructions, please take a look at this wiki page:

This is the first milestone build after the code contribution from SpringSource and there aren't that much changes with regards to features or bugs in there compared to the latest dm server tooling releases. But this will change from now on... :-)


Eclipse Virgo Web Server / Kernel is an Enterprise OSGi web server, capable of serving dynamic OSGi web applications via OSGi Web Bundles (WABs). It works with Eclipse Gemini project to provide Java EE 6 capabilities to server-side OSGi applications.

For more in-depth explanation on using OSGi for enterprise applications, I highly recommend OSGi in Action: Creating Modular Applications in Java.

Thursday, June 16, 2011

How to Create Felix GoGo Commands with OSGi Blueprint in Eclipse Virgo Kernel / Web Server

Felix GoGo aka OSGi RFC-147 is a standard-based way to implement modular CLI (command-line interface / console / shell) commands in Java.

Apache Karaf supports the Felix GoGo console out of the box, along with MINA SSH server integration. This blog post's focus is Eclipse Virgo Web Server (Eclipse Virgo Kernel) and how we can leverage the same CLI support, along with SSH, in Virgo. Now, this article would be very short if I were using Apache Karaf, because it already has the right ingredients built in and well integrated: Felix GoGo, an SSH server, and Apache Aries as the OSGi Blueprint implementation. I hope these features will also be in Eclipse Virgo soon (here's hoping)! :-)

Modular Java programming uses OSGi technology, if you aren't yet familiar with OSGi I recommend reading OSGi in Action: Creating Modular Applications in Java.

Enhancing Eclipse Virgo with Felix GoGo and SSH

I hope Virgo will soon have built-in support for Felix GoGo and SSH server, but for now, we must do this ourselves.

Hristo Iliev from SAP has written an excellent tutorial on how to enable Felix GoGo and SSH support in Eclipse Virgo, so I'll pretty much just paste it here :-)

Step 1: Download Gogo shell

To run Gogo you will need three bundles from the Apache Felix downloads: Gogo Runtime, Gogo Shell, and Gogo Command (version 0.8.0 at the time of writing).

Step 2: Equinox and RFC-147

To enable  RFC-147 integration in Equinox you will need some console supportability features that are provided by the Console supportability project in Equinox Incubator.

The 20110611 build works fine for me. Download the Incubator build that includes the bundle we'll need, and extract only the supportability JAR (org.eclipse.equinox.console.supportability) from the plugins folder.

Step 3: Enabling SSH Support

First download Apache Mina's binary distribution for SSHD, and then from the lib directory of the archive extract these two bundles:

  • sshd-core

  • mina-core

Step 4: Setting up Virgo

  • Copy the console supportability bundle (org.eclipse.equinox.console.supportability) in lib directory

  • Place the rest of the bundles (3xGogo, SSHD and Mina) in lib/kernel directory

  • Edit config/ file and add the bundles to base bundles list:

baseBundles = \
  • In lib/ remove the old Virgo shell  by deleting the line org.eclipse.virgo.osgi.console.telnet.hook.TelnetHookConfigurator. The hooks entry should look like this afterwards:


  • Delete (or move outside lib) the bundle lib/org.eclipse.virgo.osgi.console-3.0.0.M05.jar

Step 5: Configuring Virgo

  • Comment or delete osgi.console property in lib/ (should already be commented)

  • Add the following property entries to lib/

  • Add in config/org.eclipse.virgo.kernel.authentication.config file JAAS configuration for the SSH:

equinox_console {
    org.eclipse.equinox.console.jaas.SecureStorageLoginModule REQUIRED;
};
  • Edit the bin/dmk.bat file to add org.eclipse.equinox.console.jaas.file and ssh.server.keystore VM system properties. After the changes the file should look as follows: 

set KERNEL_JAVA_PARMS=%KERNEL_JAVA_PARMS% -Dorg.eclipse.virgo.kernel.authentication.file="%CONFIG_DIR%\"
set KERNEL_JAVA_PARMS=%KERNEL_JAVA_PARMS% -Dorg.eclipse.equinox.console.jaas.file="%CONFIG_DIR%/store"
set KERNEL_JAVA_PARMS=%KERNEL_JAVA_PARMS% -Dssh.server.keystore="%CONFIG_DIR%/hostkey.ser"
  • Edit bin/ as follows: (following is in diff format)

diff --git a/bin/ b/bin/
index 5f8112b..4122d55 100755
--- a/bin/
+++ b/bin/
@@ -174,6 +174,8 @@
 then
     -XX:HeapDumpPath=$KERNEL_HOME/serviceability/heap_dump.hprof \
     $CONFIG_DIR/org.eclipse.virgo.ker
     -Dorg.eclipse.virgo.kernel.authentication.file=$CONFIG_DIR/org.ecli
+    -Dorg.eclipse.equinox.console.jaas.file=$CONFIG_DIR/store \
+    -Dssh.server.keystore=$CONFIG_DIR/hostkey.ser \
     $TMP_DIR \
     -Dorg.eclipse.virgo.kernel.home=$KERNEL_HOME \
     -classpath $CLASSPATH \

Step 6: Connecting to Virgo

Point your favourite SSH client at the specified port (2422, for instance).

$ ssh -p2422 equinox@localhost

  • Login with the default user and password (equinox/equinox) 

  • Create a new user and password (roles are optional)

Step 7: Fun with Gogo

Check the features provided by GoGo and Equinox Console Supportability:

  • Tab Completion (...doesn't yet work for me... any suggestions?) / Line editing

  • Built-in commands like lb, ls, cat to name a few

  • Create some scripts

  • Create some commands using RFC-147

Step 8: Disconnecting

Do not use "exit", since this will exit the Virgo/OSGi framework. To end the session you'll have to close the console/SSH window; there is currently no command to close the session.

Create the Felix GoGo Command Implementation Project

You'll need the Eclipse IDE for RCP Development distribution, or the PDE plug-in, for this. Create a new Eclipse PDE plug-in project with the following class:

package gogosimple1;

public class SimpleCommand {

    public void walk() {
        System.out.println("Walking..."); // body is illustrative
    }

    public void talk() {
        System.out.println("Talking..."); // body is illustrative
    }
}

Simple isn't it? And no external dependency on Felix GoGo API whatsoever (at this point).

I will use OSGi Blueprint to publish that class as a service and register a command scope with Felix GoGo. Create the src/OSGI-INF/blueprint/gogosimple1.xml file with the following content:

<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">

    <bean id="simpleCommand" class="gogosimple1.SimpleCommand" />

    <service ref="simpleCommand" auto-export="all-classes">
        <service-properties>
            <entry key="osgi.command.scope"><value>simple</value></entry>
            <entry key="osgi.command.function">
                <array value-type="java.lang.String">
                    <value>walk</value>
                    <value>talk</value>
                </array>
            </entry>
        </service-properties>
    </service>

</blueprint>

For your convenience, the gogosimple1 example project is available on GitHub.

Installing Eclipse Gemini Blueprint on Virgo Web Server

Virgo 3.0 milestones ship with (and unfortunately, will still ship with) the ancient Spring DM 1.2.1, so to enable OSGi Blueprint we need to install an OSGi Blueprint implementation ourselves.

Fortunately Eclipse Gemini Blueprint seems to work just fine.

Download Eclipse Gemini Blueprint.

Copy the following files to Virgo's pickup/ folder: (in order)

  1. gemini-blueprint-core-1.0.0.M1.jar

  2. gemini-blueprint-extender-1.0.0.M1.jar

  3. gemini-blueprint-io-1.0.0.M1.jar

Alternative: Apache Aries Blueprint

Actually, I originally used Apache Aries Blueprint implementation for this experiment.

Download Apache Aries Blueprint.

Put these bundles in pickup/ folder: (in order)




  4. org.apache.aries.util-0.3.jar

  5. org.apache.aries.proxy-0.3.jar

  6. org.apache.aries.blueprint-0.3.jar

Trying the Commands

Build the gogosimple1 project (right-click the project > Export... > Deployable plug-ins and fragments), and copy the resulting bundle JAR to Virgo's pickup/ folder.
If you don't feel like compiling, you can download the prebuilt gogosimple1 JAR here (hint: it's very small, just 12 KB).
Virgo should install it, and Blueprint should register the SimpleCommand bean as an OSGi service:

g! lb | grep -i gogo
   44|Active     |    1|Apache Felix Gogo Command (0.8.0)
   45|Active     |    1|Apache Felix Gogo Runtime (0.8.0)
   46|Active     |    1|Apache Felix Gogo Shell (0.8.0)
  128|Active     |    1|Gogosimple1 (1.0.0.qualifier)

g! b 128
gogosimple1_1.0.0.qualifier [128]
  Id=128, Status=ACTIVE      Data Root=/home/ceefour/project/Soluvas/web-as/virgo-tomcat-server-3.0.0.M05/work/osgi/configuration/org.eclipse.osgi/bundles/128/data
  "Registered Services"
    {gogosimple1.SimpleCommand}={osgi.command.scope=simple, osgi.command.function=[walk,talk],, osgi.service.blueprint.compname=simpleCommand, Bundle-SymbolicName=gogosimple1, Bundle-Version=1.0.0.qualifier,}
    {org.osgi.service.blueprint.container.BlueprintContainer}={Bundle-SymbolicName=gogosimple1, Bundle-Version=1.0.0.qualifier, osgi.blueprint.container.version=1.0.0.qualifier, osgi.blueprint.container.symbolicname=gogosimple1,}
    {org.eclipse.gemini.blueprint.context.DelegatedExecutionOsgiBundleApplicationContext, org.eclipse.gemini.blueprint.context.ConfigurableOsgiBundleApplicationContext, org.springframework.context.ConfigurableApplicationContext, org.springframework.context.ApplicationContext, org.springframework.context.Lifecycle, org.springframework.beans.factory.ListableBeanFactory, org.springframework.beans.factory.HierarchicalBeanFactory, org.springframework.context.MessageSource, org.springframework.context.ApplicationEventPublisher,, org.springframework.beans.factory.BeanFactory,, org.springframework.beans.factory.DisposableBean}={, Bundle-SymbolicName=gogosimple1, Bundle-Version=1.0.0.qualifier,}
  Services in use:
    {org.xml.sax.EntityResolver}={, spring.osgi.core.bundle.timestamp=1308232521056,}
    {org.springframework.beans.factory.xml.NamespaceHandlerResolver}={, spring.osgi.core.bundle.timestamp=1308232521056,}
    {org.osgi.service.packageadmin.PackageAdmin}={service.ranking=2147483647,, - Equinox,}
  No exported packages
  Imported packages
    org.osgi.framework; version="1.6.0"<org.eclipse.osgi_3.7.0.v20110224 [0]>
  No fragment bundles
  Named class space
    gogosimple1; bundle-version="1.0.0.qualifier"[provided]
  No required bundles

You can now try the GoGo commands you just installed:

g! simple:walk
g! simple:talk

"help" command is also available to list all available commands:

g! help

Using Virgo Tools Eclipse IDE Plugin to Launch Eclipse Virgo Web Server

Building the project and manually copying the JAR is boring, not to mention unhelpful during development/debugging.
Thankfully there is Virgo Tools, which has been part of the SpringSource Tool Suite. You can install it into your Eclipse IDE using one of the following update sites (depending on your preference for stability; I use the "milestone" one and am not having problems):

Choose: Core / dm Server Tools (as of Virgo Tools 2.7.0-M2)

Open the Servers view (Shift+Alt+Q Q, choose Servers), then add a new server, pick Eclipse Virgo Web Server.

Before launching, you need to edit the launch configuration to enable SSH server.
Double-click Eclipse Virgo Web Server in the Servers view, then click "Edit launch configuration".

On Arguments > VM arguments, add the following:

-Dorg.eclipse.equinox.console.jaas.file="$CONFIG_DIR/store" -Dssh.server.keystore="$CONFIG_DIR/hostkey.ser"

You need to manually replace $CONFIG_DIR above with the Virgo Web Server's config folder.

Now you can launch Eclipse Virgo Web Server from Eclipse IDE, develop & debug, inspect its configuration, deploy artifacts, etc. right inside the Eclipse IDE.

Deploying PDE Plug-in Projects into Eclipse Virgo Web Server

Currently Virgo Tools is integrated with WTP, but not with PDE.  Which means:

  • Eclipse plugin projects cannot be deployed straightforwardly into Virgo Web Server using Virgo Tools plugin

  • What can be deployed are VWS-specific OSGi Plan projects and OSGi Bundle projects. You can right-click a "plain" Eclipse plug-in project > Spring Tools > Add OSGi Bundle project nature to convert it to a VWS OSGi Bundle project.

  • OSGi Bundle nature has its own way of generating MANIFEST.MF, it doesn't use Eclipse PDE's META-INF/MANIFEST.MF file.

State as of 13 June 2011: (source)

Unfortunately there is no integration of STS and PDE. The Libra project is doing a great job at integrating WTP and PDE and looks like a great way of developing OSGi standard bundles, although it doesn't yet have a way to launch Virgo. The Virgo tooling, contributed to Eclipse from STS, is also developing, is able to launch Virgo (of course), and we hope to align it with Libra in due course.

For now, why not ditch the PDE manifest and run with STS and Maven? That's the approach we took in the Greenpages sample application which you may like to refer to. See also "Creating an Application with Virgo" which goes through the Greenpages sample in some detail.

Better tooling for OSGi, Eclipse Virgo, Eclipse Gemini, and integration with PDE are goals of Eclipse Libra project. Maybe you can help? :-)

To deploy gogosimple1 using Virgo Tools, first you need to enable the OSGi bundle project nature: right-click on the project, click Spring Tools > Add OSGi Bundle project nature.

Then right-click on the Virgo Web Server in Servers view, click Add and Remove, now you can choose gogosimple1 to deploy the project as artifact to Virgo.

By doing the steps above, the development cycle with Virgo and the Eclipse IDE becomes very convenient: anytime you change the project's source code, resource files, etc., Virgo will immediately pick them up. You can also launch Eclipse Virgo in debug mode and have all the Eclipse IDE debugging tools at your fingertips!

Recommended Resources

For more in-depth explanation on OSGi, I highly recommend OSGi in Action: Creating Modular Applications in Java.

Thursday, April 28, 2011

Hot Deploy OSGi CLI Commands in Apache Karaf with Eclipse PDE

Apache Felix GoGo is a really cool command line interface (CLI, aka shell aka console) framework that runs in an OSGi runtime.
Apache Karaf (based on the Apache Felix OSGi runtime) just so happens to include Felix GoGo built in, and the console is very extensible too, making it easy to add more commands at will.

Karaf / Felix GoGo Features

A very nice feature of Apache Karaf is its built-in SSH server, so you can literally and practically just ssh to the Karaf server:

ceefour@annafi:/opt/apache-karaf-2.2.0$ ssh -p 8102 karaf@localhost
karaf@localhost's password: 
        __ __                  ____      
       / //_/____ __________ _/ __/      
      / ,<  / __ `/ ___/ __ `/ /_        
     / /| |/ /_/ / /  / /_/ / __/        
    /_/ |_|\__,_/_/   \__,_/_/         

  Apache Karaf (2.2.0)

Hit '<tab>' for a list of available commands
and '[cmd] --help' for help on a specific command.
Hit '<ctrl-d>' or 'osgi:shutdown' to shutdown Karaf.

karaf@bipposhell> erp:hello
Hello Ibrahim!

The default credentials are user=karaf, password=karaf.
Even more amazing (this I just found out!): you can run a command directly from ssh:

$ ssh -p 8102 karaf@localhost erp:hello
karaf@localhost's password: 
Hello Ibrahim!

Isn't it amazing? What's more, the Karaf/Felix GoGo command line also supports POSIX/GNU-style option arguments, Tab completion, and UNIX-like pipes! (Yes, pipes, where you can do 'grep' and such.)

The other, immensely useful feature is the ability to hot deploy exploded OSGi bundles! Hot deploy in itself is really useful; the ability to deploy exploded bundles is even better. We'll get to that soon.

What About Virgo?

Some of you may (almost) cry that I didn't mention Eclipse Virgo Kernel, which is, of course, a direct competitor to Apache Karaf.
I haven't used Virgo yet (I should try it one day); since I have used Karaf in the past, I'm sticking with Karaf for now, and I know it has a cool built-in CLI ;-)
I think the steps below can be adapted to Virgo with little (or not so little?) work. Let me know if you have tips, ideas, etc. :-) I will appreciate it a lot.

Developing CLI on Felix GoGo

For my next Bippo Shell project I plan to use Felix GoGo as the CLI framework. I'll share my first-steps experience developing CLI commands with Karaf and Felix GoGo.
To get started, I first downloaded the Apache Karaf OSGi runtime.

Felix GoGo looks for command services exposed by other OSGi bundles. To develop this OSGi bundle, my IDE of choice happens to be... (surprise!) the Eclipse IDE. ;-)
There are several approaches to developing OSGi bundles in the Eclipse IDE, but the path I want to take right now is Eclipse PDE (Plug-in Development Environment), at least for developing the bundle (i.e. Eclipse plugin) itself, not for running/launching the runtime (yet...).

I assume you are already somewhat familiar with Eclipse IDE and Eclipse PDE. If not, it's not too late to reach enlightenment ;-) get Eclipse Rich Client Platform (2nd Edition) book.

Creating the Target Platform

The first thing I need to do before developing anything (those of you familiar with Eclipse RCP, or any Eclipse-based project for that matter, will already know this) is to prepare the target platform.
The target platform contains all the OSGi bundles (Eclipse plugins) that are already in the target runtime environment, in this case: Apache Karaf.

So, copy these files from the Apache Karaf distribution (hint: look in the lib/ and system/ folders) to another folder. These little commands may help you:

mkdir -vp ~/felix-karaf-target-platform/
cp -v lib/karaf-jaas-boot.jar ~/felix-karaf-target-platform
find system -iname '*.jar' -exec cp -v '{}' ~/felix-karaf-target-platform \;

Then I created a target platform in Eclipse PDE, added content based on the directory containing the Karaf JARs, and activated the target platform.

Creating the Eclipse Plugin Project

Although Karaf only needs an OSGi bundle, I decided to create an Eclipse plugin project. Why not, since an Eclipse plugin is always an OSGi bundle anyway! And it might be useful if I want to use my plugin projects in an Eclipse RCP project, for example.

First I create an Eclipse plugin project, then set the Import-Package in META-INF/MANIFEST.MF to:

Import-Package: org.apache.felix.gogo.commands;version="0.6.1",
 org.apache.karaf.shell.console;version="2.2.0"

This allows me to refer to the Felix GoGo API classes and interfaces.

Writing the Command Implementation

Thanks to annotations in a similar spirit to JAX-RS, writing the CLI command implementation class is simple:


import org.apache.felix.gogo.commands.Command;
import org.apache.karaf.shell.console.OsgiCommandSupport;

@Command(scope = "erp", name = "hello", description = "Says hello")
public class ErpCommand extends OsgiCommandSupport {

    @Override
    protected Object doExecute() throws Exception {
        System.out.println("Hello Ibrahim!");
        return null;
    }
}

I think the annotations and what this class does are pretty much self-explanatory.

Registering the Command Service using OSGi Blueprint

To register the command service implementation, OSGi Blueprint is used. (It's not mandatory; you can use manual OSGi wiring if you want, but why do it the hard way?)
Apache Karaf uses Apache Aries Blueprint as its OSGi Blueprint implementation, but I believe it should work with Eclipse Gemini Blueprint as well.

Put this in OSGI-INF/blueprint/myshell.xml (the actual filename doesn't matter):

    <command-bundle xmlns="http://karaf.apache.org/xmlns/shell/v1.0.0">
        <command name="erp/hello">
            <!-- use the fully-qualified name of your ErpCommand class -->
            <action class="com.example.shell.ErpCommand"/>
        </command>
    </command-bundle>

This registers the command that can be executed by typing: erp:hello

I wondered why the command name is repeated here. It turns out that the Blueprint declaration actually registers the command under the specified name; the class annotations are only used for the usage help, i.e. when typing: erp:hello --help
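
To make that registration/annotation split concrete, here is a toy, stdlib-only sketch (NOT the real Felix GoGo/Karaf API, all names are hypothetical) of a shell resolving "scope:name" commands from a registry that bundles contribute to:

```java
import java.util.HashMap;
import java.util.Map;

// Toy sketch of a command registry (not the real GoGo/Karaf API):
// the shell resolves "scope:name" against whatever was registered.
public class ToyCommandRegistry {
    interface Action { Object execute() throws Exception; }

    private final Map<String, Action> commands = new HashMap<>();

    // Like the Blueprint <command name="erp/hello"> element: the action is
    // registered under the scope/name given here, not under its annotations.
    public void register(String scopeAndName, Action action) {
        commands.put(scopeAndName.replace('/', ':'), action);
    }

    public Object run(String commandLine) throws Exception {
        Action action = commands.get(commandLine);
        if (action == null) throw new IllegalArgumentException("Unknown command: " + commandLine);
        return action.execute();
    }

    public static void main(String[] args) throws Exception {
        ToyCommandRegistry shell = new ToyCommandRegistry();
        shell.register("erp/hello", () -> { System.out.println("Hello Ibrahim!"); return null; });
        shell.run("erp:hello");
    }
}
```

In the real shell the annotations would then only feed the --help output, while the registry key decides what erp:hello dispatches to.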

Deploying the Command Bundle/Plugin

I'm not using any fancy PDE launcher features here, just the old-school style:
  1. Export the plugin as a binary JAR bundle to a folder
  2. Copy the plugin JAR to Karaf's deploy/ folder
Karaf will detect it and register the command:

karaf@bipposhell> la
START LEVEL 100 , List Threshold: 0
   ID   State         Blueprint      Level  Name
[   0] [Active     ] [            ] [    0] System Bundle (3.0.8)
[   1] [Active     ] [            ] [    5] OPS4J Pax Url - wrap: (1.2.5)
[  40] [Active     ] [Created     ] [   30] Apache Karaf :: Deployer :: Blueprint (2.2.0)
[  41] [Active     ] [Created     ] [   30] Apache Karaf :: Shell :: ConfigAdmin Commands (2.2.0)
[  42] [Active     ] [Created     ] [   60] Bippo Shell (1.0.0.qualifier)

karaf@bipposhell> erp:hello
Hello Ibrahim!

karaf@bipposhell> erp:hello --help

        Says hello

        erp:hello [options]

        --help
                Display this help message

Hot Deploying Eclipse Plugins

Karaf has a very useful ability to hot deploy OSGi bundles during development. (Actually, being OSGi, it can hot deploy anytime, not just "during development".)
The osgi:install and dev:watch Karaf commands will help you here.

But I prefer the low-tech solution:

ceefour@annafi:~/project/bipponext/workspace/$ ln -s $PWD/bin /opt/apache-karaf-2.2.0/instances/bipposhell/deploy/

This creates a symlink to the ${project_loc}/bin folder inside the Karaf's deploy/ folder.
You may also want to copy the META-INF/MANIFEST.MF file. If you don't, it will still work in most cases, because Karaf is smart enough to wrap non-OSGi "bundles" to be OSGi-friendly.
This magic works because Karaf supports "exploded bundles", i.e. OSGi bundles not packaged in a JAR but laid out as a directory structure.

karaf@bipposhell> erp:hello
Hello Ibrahim!
(......edit the source in Eclipse, then Save...and......)
karaf@bipposhell> erp:hello
Hello Yudha!

No need to restart, nothing! You don't even need to build/package the JAR!
Pretty nifty, eh? :-)

Note: While doing this, I uncovered bug KARAF-602.
Karaf tip: If something goes wrong, the log:tail, log:set, and dev:show-tree Karaf commands are your friends. :-)

What's Next ?

As demonstrated, OSGi development can be easy, and in some cases even easier than "plain" Java development.

Actually, I've already managed to do a headless build using Eclipse Tycho (formerly the Sonatype Tycho Maven plugin), so I hope to write about that in a later post.

To learn more about Eclipse plugins, I recommend the Eclipse Rich Client Platform (2nd Edition) book.
For a thorough explanation of developing OSGi applications, check out the recently released OSGi in Action: Creating Modular Applications in Java.

Sunday, January 2, 2011

Unit Testing EMF Models with EasyMock

During unit testing, I use EasyMock to mock an EMF-generated interface (I don't plan to persist the interface or its implementation).

Example usage is like this:

protected void setUp() {
    voucher = AbispulsaFactory.eINSTANCE.createVoucher();
    owner = new DummyOwner();
}

refillListener will be used somewhere in the implementation:

try {
    // ... perform the refill, notifying refillListener ...
} catch (IOException e) {
    throw new AbispulsaException(e);
}
return refill;

The intention is to use a mock object for the refillListener attribute/interface so I don't have to provide a real object.

But I got this error:

java.lang.ClassCastException: $Proxy0 cannot be cast to org.eclipse.emf.ecore.InternalEObject
at org.eclipse.emf.ecore.impl.EStructuralFeatureImpl$InternalSettingDelegateSingleEObject.dynamicGet(
at org.eclipse.emf.ecore.impl.BasicEObjectImpl.eDynamicGet(
at com.abispulsa.impl.DealerImpl.getRefillListener(
at com.abispulsa.impl.DealerImpl$RefillTemplate.create(
at com.abispulsa.impl.DealerImpl.refill(
at com.abispulsa.tests.DealerTest.testRefill__Voucher_String_RefillOwner(
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(
at sun.reflect.DelegatingMethodAccessorImpl.invoke(
at java.lang.reflect.Method.invoke(
at junit.framework.TestCase.runTest(
at junit.framework.TestCase.runBare(
at junit.framework.TestResult$1.protect(
at junit.framework.TestResult.runProtected(
at junit.framework.TestSuite.runTest(
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(
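
The root cause of that ClassCastException: EasyMock returns a JDK dynamic proxy that implements only the mocked interface, while EMF's reflective getter casts the stored value to InternalEObject. A stdlib-only sketch with stand-in interfaces (not the real EMF/EasyMock types) reproduces the behavior:

```java
import java.lang.reflect.Proxy;

public class ProxyCastDemo {
    interface RefillListener { void refillDone(); }  // stand-in for the EMF-generated interface
    interface InternalEObject { }                    // stand-in for org.eclipse.emf.ecore.InternalEObject

    public static void main(String[] args) {
        // A dynamic proxy (like an EasyMock mock) implements ONLY the listed interfaces
        RefillListener mock = (RefillListener) Proxy.newProxyInstance(
                ProxyCastDemo.class.getClassLoader(),
                new Class<?>[] { RefillListener.class },
                (proxy, method, a) -> null);
        System.out.println(mock instanceof InternalEObject);  // false
        try {
            InternalEObject eObject = (InternalEObject) mock;  // roughly what EMF's dynamicGet attempts
        } catch (ClassCastException e) {
            System.out.println("ClassCastException, as in the stack trace above");
        }
    }
}
```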

The solution is to mark the attribute with Resolve Proxies = false. (In addition, I also mark Transient = true.)

I'm not sure how this affects persistence mechanisms such as Teneo or CDO, or whether this is the proper way. If you know the proper way, please advise. Thanks.

Saturday, January 1, 2011

POJO Remoting over Camel-XMPP within Eclipse App

I spent many hours trying to make the remote observer pattern with Camel over XMPP work inside my Eclipse RCP application, so here's my guide to making it run successfully.

Getting Camel

Getting Apache Camel itself to run is the easiest part, because Camel is already packaged as OSGi bundles, also available from the Maven Central repository.

Here are the artifacts that I needed:

  1. org.apache.camel:camel-core:2.5.0 (dependency of the ones below)
  2. org.apache.camel:camel-blueprint:2.5.0
  3. org.apache.camel:camel-eclipse:2.5.0
  4. org.apache.camel:camel-xmpp:2.5.0
  5. org.apache.camel:camel-xstream:2.5.0
These will pull in additional dependencies; here are the JARs:


You'll notice that this includes the Apache Aries Blueprint container.
Camel can use the Blueprint service to run in OSGi. (It seems Camel can work in Eclipse without Blueprint, but I haven't tried it. Besides, I think Blueprint is a nice touch, hehe..)
It's nice to know that Apache Aries Blueprint works well within Eclipse/Equinox.
(I was tempted to use Eclipse Gemini Blueprint, but it seems "not ready yet", and I suspect it brings in Spring Framework dependencies, which I'm not interested in at the moment.)

I need to replace some of the dependencies with OSGi-ready bundles from the Apache ServiceMix bundle repositories:

Note that I've left out the Smack JARs, for reasons explained below.

Hint: to know which OSGi-ready dependencies are available, check the Camel features list for Karaf here:

Starting Apache Camel Bundles

Apache Camel and Apache Aries Blueprint need to be explicitly started, so after copying these files to the target platform, I need to go to the Eclipse launch configuration and mark these bundles as "start". I'm not sure exactly which bundles need to be started, but I just start all camel-* bundles, all their dependencies, and Apache Aries Blueprint.

If there is trouble, it might be useful to tweak the start order, but I hope that's not necessary.

Using Camel with Blueprint

There are several ways to use Camel inside Eclipse. The traditional way, without Blueprint, is like this:

PackageScanClassResolver eclipseResolver = new EclipsePackageScanClassResolver();
CamelContext context = new DefaultCamelContext();
context.setPackageScanClassResolver(eclipseResolver);

To use Camel from a Blueprint beans XML file, create an OSGI-INF/blueprint/config.xml file (directly below your plugin project folder, not inside the src folder) like this:

<?xml version="1.0" encoding="UTF-8"?>
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">

    <camelContext xmlns="http://camel.apache.org/schema/blueprint" id="camelContext" autoStartup="true">
        <route>
            <from uri="direct:invoice" />
            <to uri="seda:invoice.async" />
        </route>
    </camelContext>

    <!-- use the fully-qualified name of the Invoicing class below -->
    <bean id="invoicing" class="com.example.Invoicing"
            init-method="init" destroy-method="destroy">
        <property name="camelContext" ref="camelContext" />
    </bean>

</blueprint>
Note that autoStartup is true by default; you can change it to false.

The CamelContext instance will be injected into the class by Blueprint. I created a class like this:

import org.apache.camel.CamelContext;
import org.apache.camel.builder.ProxyBuilder;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.eclipse.EclipsePackageScanClassResolver;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class Invoicing {

    private CamelContext camelContext;
    private Logger logger = LoggerFactory.getLogger(getClass());
    protected InvoiceListener loggerInvoiceListener = new LoggerInvoiceListener();

    public void init() throws Exception {
        camelContext.setPackageScanClassResolver(new EclipsePackageScanClassResolver());
        camelContext.addRoutes(new RouteBuilder() {

            public void configure() throws Exception {
                // .. configure additional routes ...
            }
        });

        // Proxy the direct:invoice endpoint as an InvoiceListener
        InvoiceListener invoiceListener = new ProxyBuilder(camelContext)
                .endpoint("direct:invoice").build(InvoiceListener.class);
        invoiceListener.invoiceCreated(123, "Bippo Indonesia");
    }

    public void destroy() throws Exception {
    }

    public void setCamelContext(CamelContext camelContext) {
        this.camelContext = camelContext;
    }
}


The simple application seems to run fine without setting EclipsePackageScanClassResolver, but I guess it's safer to use it.

Another way to create a CamelContext using Blueprint XML is by instantiating it directly:

<bean id="camelContext" class="org.apache.camel.blueprint.BlueprintCamelContext">
    <property name="bundleContext" ref="blueprintBundleContext" />
    <property name="blueprintContainer" ref="blueprintContainer" />
    <property name="packageScanClassResolver">
        <bean class="org.apache.camel.component.eclipse.EclipsePackageScanClassResolver" />
    </property>
</bean>

Using the factory bean directly:

<bean id="camelFactory" class="org.apache.camel.blueprint.CamelContextFactoryBean">
    <property name="bundleContext" ref="blueprintBundleContext" />
    <property name="blueprintContainer" ref="blueprintContainer" />
    <property name="autoStartup" value="false" />
</bean>
<bean id="camelContext" factory-ref="camelFactory" factory-method="getContext" />

XStream Dependencies

XStream is needed to marshal/serialize objects to/from XML and/or JSON representations.

XStream requires several dependencies:

  • javax.xml.stream (aka the StAX API)
  • xmlpull
  • xpp3 (aka xml.pull.mxp1)
  • Jettison

These are available from Apache ServiceMix repositories (above).

Making XStream work with OSGi

Camel has a shortcut for marshalling/unmarshalling objects with XStream in the route DSL:

from("direct:invoice").marshal().xstream();
However, with OSGi we need to set the classloader used to resolve marshalled classes. The configuration becomes:

import org.apache.camel.dataformat.xstream.XStreamDataFormat;
import com.thoughtworks.xstream.XStream;

camelContext.addRoutes(new RouteBuilder() {
    public void configure() throws Exception {
        XStream xstream = new XStream();
        // resolve marshalled classes using this bundle's classloader
        xstream.setClassLoader(getClass().getClassLoader());
        XStreamDataFormat xstreamDataFormat = new XStreamDataFormat(xstream);
        // ... use xstreamDataFormat in marshal()/unmarshal() ...
    }
});

To use the JSON format with XStream, we need to configure the XStream driver:

xstreamDataFormat.setXstreamDriver(new JettisonMappedXmlDriver());

OSGi-fying Smack

In the plain Java world, we'd be done. But not so with Eclipse and OSGi.

I had the most trouble with the Smack library. The repackaged Smack library from Apache ServiceMix also didn't just work, due to a missing META-INF/smack.providers file and XMPPConnection not registering ServiceDiscoveryManager, which seems to happen because of static { ... } initializer blocks.

One problem is this code from Smack:

    public static Collection<String> getServiceNames(XMPPConnection connection) throws XMPPException {
        final List<String> answer = new ArrayList<String>();
        ServiceDiscoveryManager discoManager = ServiceDiscoveryManager.getInstanceFor(connection);
        DiscoverItems items = discoManager.discoverItems(connection.getServiceName());
        for (Iterator<DiscoverItems.Item> it = items.getItems(); it.hasNext();) {
            // ... collect each discovered item into answer ...
        }
        return answer;
    }
discoManager returns null, because the map of instances is empty.

The static code block here is not being called within OSGi runtime:

    // Create a new ServiceDiscoveryManager on every established connection
    static {
        XMPPConnection.addConnectionCreationListener(new ConnectionCreationListener() {
            public void connectionCreated(XMPPConnection connection) {
                new ServiceDiscoveryManager(connection);
            }
        });
    }
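
Why the static block matters: a static initializer executes only when its class is first initialized, and in an OSGi runtime the class may simply never be touched, so the listener is never registered. A stdlib-only sketch of that behavior (hypothetical names, not the real Smack classes):

```java
import java.util.ArrayList;
import java.util.List;

public class StaticInitDemo {
    static final List<String> listeners = new ArrayList<String>();

    // Stand-in for Smack's ServiceDiscoveryManager with its static block
    static class Discovery {
        static {
            listeners.add("connection-creation listener registered");
        }
    }

    public static void main(String[] args) throws Exception {
        // Discovery has not been initialized yet, so its static block has not run
        System.out.println(listeners.size());  // 0
        // Touching the class (here via Class.forName) triggers the static block
        Class.forName(StaticInitDemo.class.getName() + "$Discovery");
        System.out.println(listeners.size());  // 1
    }
}
```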

So I had to tweak Smack and published the changes to my Smack repository on GitHub.

I've also published the tweaked Smack 3.1.0 library as Eclipse plugin to Bippo OSS p2 Repository.

You can also download the bundle directly from:

Note that my Smack bundle now requires SLF4j API for logging.

Tip: A nice way to debug Smack is by adding -Dsmack.debugEnabled=true to VM launch options.

I also learned several (re-)packaging tips with Eclipse plugins:

  • make sure to add "." to the Runtime classpath when adding other classpath entries (the Apache ServiceMix repackaged Smack library includes other dependencies in the folder containing the .class files)
  • workarounds for missing files can potentially be done using OSGi fragment bundles, for example to augment META-INF/smack.providers if so desired

The App

Here's how the Camel XMPP test Eclipse app looks:

import org.apache.camel.CamelContext;
import org.apache.camel.builder.ProxyBuilder;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.eclipse.EclipsePackageScanClassResolver;
import org.apache.camel.dataformat.xstream.XStreamDataFormat;
import org.apache.camel.model.dataformat.JsonDataFormat;
import org.apache.camel.model.dataformat.JsonLibrary;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.thoughtworks.xstream.XStream;
import com.thoughtworks.xstream.io.json.JettisonMappedXmlDriver;

public class Invoicing {

    private CamelContext camelContext;
    private Logger logger = LoggerFactory.getLogger(getClass());
    protected InvoiceListener loggerInvoiceListener = new LoggerInvoiceListener();

    public void init() throws Exception {
        logger.info("Invoicing started");
        camelContext.setPackageScanClassResolver(new EclipsePackageScanClassResolver());
        camelContext.addRoutes(new RouteBuilder() {

            public void configure() throws Exception {
                JsonDataFormat dataFormat = new JsonDataFormat(JsonLibrary.XStream);
                XStream xstream = new XStream();
                XStreamDataFormat xstreamDataFormat = new XStreamDataFormat(xstream);
                xstreamDataFormat.setXstreamDriver(new JettisonMappedXmlDriver());

                final String xmppUri = "xmpp://abispulsabot@geulis.local/?room=invoice&password=test";
                // ... routes wiring direct:invoice to the XMPP endpoint and back (elided) ...
            }
        });

        InvoiceListener invoiceListener = new ProxyBuilder(camelContext)
                .endpoint("direct:invoice").build(InvoiceListener.class);
        invoiceListener.invoiceCreated(123, "Bippo Indonesia");
    }

    public void destroy() throws Exception {
        camelContext.stop();
        logger.info("Invoicing stopped");
    }

    public void setCamelContext(CamelContext camelContext) {
        this.camelContext = camelContext;
    }
}


Log output:

04:41:04.302 [Blueprint Extender: 2] WARN  o.a.a.b.c.BlueprintContainerImpl - Bundle is waiting for namespace handlers [(&(objectClass=org.apache.aries.blueprint.NamespaceHandler)(osgi.service.blueprint.namespace=]
04:41:04.301 [Start Level Event Dispatcher] INFO  org.apache.camel.impl.osgi.Activator - Camel activator starting
04:41:04.318 [Start Level Event Dispatcher] INFO  org.apache.camel.impl.osgi.Activator - Camel activator started
04:41:05.683 [Blueprint Extender: 2] INFO  org.apache.camel.impl.osgi.Activator - Found 13 @Converter classes to load
04:41:05.707 [Blueprint Extender: 2] INFO  o.a.c.b.BlueprintCamelContext - Starting Apache Camel as property ShouldStartContext is true
04:41:05.708 [Blueprint Extender: 2] INFO  o.a.camel.impl.DefaultCamelContext - Apache Camel 2.5.0 (CamelContext: camelContext) is starting
04:41:05.708 [Blueprint Extender: 2] INFO  o.a.camel.impl.DefaultCamelContext - JMX enabled. Using ManagedManagementStrategy.
04:41:05.711 [Blueprint Extender: 2] WARN  o.a.camel.impl.DefaultCamelContext - Could not find needed classes for JMX lifecycle strategy. Needed class is in spring-context.jar using Spring 2.5 or newer (spring-jmx.jar using Spring 2.0.x). NoClassDefFoundError: org/springframework/jmx/export/metadata/JmxAttributeSource
04:41:05.711 [Blueprint Extender: 2] WARN  o.a.camel.impl.DefaultCamelContext - Cannot use JMX. Fallback to using DefaultManagementStrategy (non JMX).
04:41:05.717 [Blueprint Extender: 2] INFO  o.a.camel.impl.DefaultCamelContext - Total 0 routes, of which 0 is started.
04:41:05.718 [Blueprint Extender: 2] INFO  o.a.camel.impl.DefaultCamelContext - Apache Camel 2.5.0 (CamelContext: camelContext) started in 0.010 seconds
04:41:05.718 [Blueprint Extender: 2] INFO - Invoicing started
04:41:06.613 [Smack Packet Reader (0)] ERROR org.jivesoftware.smack.PacketReader - Empty IQ packet! </iq>
04:41:06.715 [Blueprint Extender: 2] INFO  o.a.c.c.xmpp.XmppGroupChatProducer - Joined room: invoice@conference.geulis as: abispulsabot
04:41:06.729 [Blueprint Extender: 2] INFO  o.a.camel.impl.DefaultCamelContext - Route: route1 started and consuming from: Endpoint[direct://invoice]
04:41:06.779 [Blueprint Extender: 2] INFO  o.a.c.component.xmpp.XmppConsumer - Joined room: invoice@conference.geulis as: abispulsabot
04:41:06.780 [Blueprint Extender: 2] INFO  o.a.camel.impl.DefaultCamelContext - Route: route2 started and consuming from: Endpoint[xmpp://abispulsabot@geulis.local/?password=******&room=invoice]
04:41:06.805 [Blueprint Extender: 2] INFO  o.a.camel.impl.DefaultCamelContext - Route: route3 started and consuming from: Endpoint[seda://loggerInvoiceBean]
04:41:06.806 [Blueprint Extender: 2] INFO  o.a.camel.impl.DefaultCamelContext - Apache Camel 2.5.0 (CamelContext: camelContext) is starting
04:41:06.806 [Blueprint Extender: 2] INFO  o.a.camel.impl.DefaultCamelContext - Total 3 routes, of which 3 is started.
04:41:06.806 [Blueprint Extender: 2] INFO  o.a.camel.impl.DefaultCamelContext - Apache Camel 2.5.0 (CamelContext: camelContext) started in 0.000 seconds
04:41:06.846 [Camel Thread 1 - Threads] INFO  xmpp-output - Exchange[ExchangePattern:InOut, BodyType:byte[], Body:{"org.apache.camel.component.bean.BeanInvocation":{"org.apache.camel.component.bean.MethodBean":{"name":"invoiceCreated","type":"","parameterTypes":{"java-class":["int","java.lang.String"]}},"object-array":{"int":123,"string":"Bippo Indonesia"}}}]
04:41:06.858 [Camel Thread 2 - Threads] INFO  xmpp-input - Exchange[ExchangePattern:InOnly, BodyType:String, Body:{"org.apache.camel.component.bean.BeanInvocation":{"org.apache.camel.component.bean.MethodBean":{"name":"invoiceCreated","type":"","parameterTypes":{"java-class":["int","java.lang.String"]}},"object-array":{"int":123,"string":"Bippo Indonesia"}}}]
04:41:06.880 [Camel Thread 0 - seda://loggerInvoiceBean] INFO  i.c.b.usexmpp.LoggerInvoiceListener - Invoice #123 name: Bippo Indonesia created
04:41:08.380 [Camel Thread 0 - seda://loggerInvoiceBean] INFO  i.c.b.usexmpp.LoggerInvoiceListener - 123/Bippo Indonesia done!
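
The BeanInvocation JSON in the log shows what actually travels over XMPP: a serialized method invocation. The mechanics of this remote observer pattern can be sketched with only the JDK, using a dynamic proxy that turns listener calls into messages on a queue (standing in for the XMPP room) and a consumer that replays them. The types here are hypothetical stand-ins, not Camel's implementation:

```java
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class RemoteObserverSketch {
    interface InvoiceListener { void invoiceCreated(int id, String name); }

    // Like Camel's BeanInvocation: method name + arguments, ready to serialize
    static class Invocation {
        final String method;
        final Object[] args;
        Invocation(String method, Object[] args) { this.method = method; this.args = args; }
    }

    public static void main(String[] args) throws Exception {
        // The queue stands in for the XMPP room transport
        BlockingQueue<Invocation> transport = new LinkedBlockingQueue<Invocation>();

        // Producer side (like ProxyBuilder): listener calls become messages
        InvoiceListener proxy = (InvoiceListener) Proxy.newProxyInstance(
                RemoteObserverSketch.class.getClassLoader(),
                new Class<?>[] { InvoiceListener.class },
                (o, method, a) -> { transport.put(new Invocation(method.getName(), a)); return null; });
        proxy.invoiceCreated(123, "Bippo Indonesia");

        // Consumer side: replay the invocation on the real listener via reflection
        InvoiceListener real = (id, name) ->
                System.out.println("Invoice #" + id + " name: " + name + " created");
        Invocation inv = transport.take();
        Method m = InvoiceListener.class.getMethod(inv.method, int.class, String.class);
        m.invoke(real, inv.args);
    }
}
```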


Apache Camel is an unobtrusive way to add the flexibility of routing, messaging, and integration patterns over XMPP or the other connectors Camel supports.
(You can create your own connectors if you want...)

I highly recommend the Camel in Action book as the best in-depth guide, with examples, to using Camel to develop your applications more productively.