Wednesday, March 7, 2012

Testing Akka actors from Java

If you're looking for a general introduction to using Akka from Java, have a look at this post.

In a recent project I've been using Akka for a concurrent producer-consumer setup. Akka is an actor framework for the JVM that is implemented in Scala but provides a Java API, so normally you don't notice that you're dealing with a Scala library.

Most of my business code is encapsulated in services that don't depend on Akka and can therefore be tested in isolation. But in some cases I've been looking for a way to test the behaviour of the actors themselves. As I struggled with this for a while and didn't find a real how-to on testing Akka actors from Java, I hope my notes might be useful for other people as well.

The main problem when testing actors is that they are managed objects and you can't just instantiate them. Akka comes with a test module that is well documented for use from Scala. But besides the note that it's possible, you won't find a lot of information on using it from Java.

When using Maven you need to make sure that you have the akka-testkit dependency in place:

<dependency>
    <groupId>com.typesafe.akka</groupId>
    <artifactId>akka-testkit</artifactId>
    <version>2.1-SNAPSHOT</version>
    <scope>test</scope>
</dependency>

I will show you how to implement a test for the actors that are introduced in the Akka Java tutorial. It involves one actor that performs a substep of calculating Pi for a given start number and number of elements.

To test this actor we need a way to set it up. Akka-testkit provides a helper, TestActorRef, for exactly this purpose. In Scala this seems to be rather simple:

val testActor = TestActorRef[Worker]

If you try to do this from Java you will notice that you can't use a similar call. I have to admit that I am not quite sure yet what is going on. I would have expected that there is an apply() method on the TestActorRef companion object that uses some kind of implicits to instantiate the Worker object. But when inspecting the sources, the thing that comes closest is this definition:

def apply[T <: Actor](factory: ⇒ T)(implicit system: ActorSystem)

No sign of an implicit for the factory. Something I still have to investigate further.

To use it from Java you can call the apply method that takes a Function0 and an ActorSystem. The actor system itself can be set up easily using

actorSystem = ActorSystem.apply();

The apply() method is very important in Scala as it's kind of the default method for objects: for example, myList(1) internally calls myList.apply(1).

If you're like me and expect Function0 to be a single-method interface, you will be surprised. It contains a lot of strange-looking methods that you really don't want cluttering your test code:

TestActorRef workerRef = TestActorRef.apply(new Function0() {

    @Override
    public Worker apply() {
        throw new UnsupportedOperationException("Not supported yet.");
    }

    @Override
    public void apply$mcV$sp() {
        throw new UnsupportedOperationException("Not supported yet.");
    }

    @Override
    public boolean apply$mcZ$sp() {
        throw new UnsupportedOperationException("Not supported yet.");
    }

    @Override
    public byte apply$mcB$sp() {
        throw new UnsupportedOperationException("Not supported yet.");
    }

    @Override
    public short apply$mcS$sp() {
        throw new UnsupportedOperationException("Not supported yet.");
    }

    @Override
    public char apply$mcC$sp() {
        throw new UnsupportedOperationException("Not supported yet.");
    }

    @Override
    public int apply$mcI$sp() {
        throw new UnsupportedOperationException("Not supported yet.");
    }

    @Override
    public long apply$mcJ$sp() {
        throw new UnsupportedOperationException("Not supported yet.");
    }

    @Override
    public float apply$mcF$sp() {
        throw new UnsupportedOperationException("Not supported yet.");
    }

    @Override
    public double apply$mcD$sp() {
        throw new UnsupportedOperationException("Not supported yet.");
    }
}, actorSystem);

The only method we are really interested in is the plain apply method. So where do all those other methods come from? There is no obvious hint in the scaladocs.

While searching for a solution I found a mailing list thread that explains some of the magic. The methods are performance optimizations for boxing and unboxing that the Scala compiler generates automatically for the @specialized annotation. Still, I am not sure why this happens exactly. According to this presentation I would have expected to be using the specialized instance for Object; maybe there is something special about traits?

Fortunately we don't need to implement the interface ourselves: there's an adapter class, AbstractFunction0, that makes the code look much nicer:

@Before
public void initActor() {
    actorSystem = ActorSystem.apply();
    actorRef = TestActorRef.apply(new AbstractFunction0() {

        @Override
        public Pi.Worker apply() {
            return new Pi.Worker();
        }

    }, actorSystem);
}

This is how I would have expected it to behave in the first place.

Now that we have set up our test, we can use the TestActorRef to actually test the actor. For example, we can check that the actor doesn't do anything for a String message:

@Test
public void doNothingForString() {
    TestProbe testProbe = TestProbe.apply(actorSystem);
    actorRef.tell("Hello", testProbe.ref());

    testProbe.expectNoMsg(Duration.apply(100, TimeUnit.MILLISECONDS));
}

TestProbe is another helper that can be used to check the messages that are sent between cooperating actors. In this example we are checking that no message is passed back to the sender for 100 milliseconds, which should be enough time for execution.

Let's test some real functionality: send a message to the actor and check that the result message is sent:

@Test
public void calculatePiFor0() {
    TestProbe testProbe = TestProbe.apply(actorSystem);
    Pi.Work work = new Pi.Work(0, 0);
    actorRef.tell(work, testProbe.ref());

    testProbe.expectMsgClass(Pi.Result.class);
    TestActor.Message message = testProbe.lastMessage();
    Pi.Result resultMsg = (Pi.Result) message.msg();
    assertEquals(0.0, resultMsg.getValue(), 0.0000000001);
}

This time we use the TestProbe to block until a message arrives. Once it's there, we can inspect it using lastMessage().

You can look at the rest of the test on GitHub. Comments are more than welcome, as I am pretty new to Scala as well as Akka.

Update

As Jonas Bonér points out, I've been using the Scala API. Using the Props class the setup is easier:

@Before
public void initActor() {
    actorSystem = ActorSystem.apply();
    actorRef = TestActorRef.apply(new Props(Pi.Worker.class), actorSystem);
}

Sunday, February 19, 2012

Legacy Code Retreat

Yesterday I attended the first German Legacy Code Retreat in Bretten. The event was organized by Softwerkskammer, the German software craftsmanship community.

A legacy code retreat doesn't work like a regular code retreat, where you implement a certain piece of functionality again and again. Instead it starts with some really flawed code, and the participants apply different refactoring techniques to make it more testable and maintainable. There are six 45-minute iterations, each with a different task or aim. In each iteration you work with a different partner, and after a short retrospective with all participants you usually start again from the original code.

The GitHub repository for the legacy code contains the code in several languages, among them Java, C++, C# and Ruby.

Iteration 1

The first iteration was used to get to know the functionality of the code. There were no real rules, so the participants were free to explore the code in any way they liked.

I paired with Heiko Seebach, whom I already knew to be a Ruby guy. We were looking at the code in a plain text editor, which already felt quite unfamiliar compared to everyday Java IDE work. I have picked up enough Ruby knowledge to understand code when I see it, so this was no problem. For quite some time we tried to understand a certain behaviour that occurred when running the code. It turned out to be a bug in the Ruby version of the code. Next we tried to set up RSpec and get started with some tests.

During this iteration I didn't learn that much about the legacy code, but quite a bit about Ruby.

Iteration 2

The goal of the second iteration was to prepare a golden master test that could be used during all of the following iterations. The original legacy code is driven by random input (in the Java version using java.util.Random) and writes all its state to System.out. We were to capture the output for a certain input sequence and write it to a file. That file can then be compared automatically with the output of a modified version: if both outputs are the same, there are likely no regressions in the code.
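To make this concrete, here is a rough sketch of what such a test can look like. GameRunner.run and the file name are made-up stand-ins, not the retreat's actual code, and I'm using commons-io to read the recorded file:

import static org.junit.Assert.assertEquals;

import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.PrintStream;
import java.util.Random;

import org.apache.commons.io.FileUtils;
import org.junit.Test;

public class GoldenMasterTest {

    @Test
    public void outputMatchesGoldenMaster() throws Exception {
        // Redirect System.out so we can capture everything the code prints.
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        PrintStream originalOut = System.out;
        System.setOut(new PrintStream(buffer));
        try {
            // A fixed seed makes the "random" input sequence repeatable.
            GameRunner.run(new Random(42));
        } finally {
            System.setOut(originalOut);
        }

        // golden-master.txt was recorded once from the untouched code.
        String expected = FileUtils.readFileToString(new File("golden-master.txt"));
        assertEquals(expected, buffer.toString());
    }
}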

I paired with another Java guy and we worked on my machine in Netbeans. I noticed how unfamiliar I am with a standard Netbeans project setup, as I am using Maven most of the time. We wrote the test and started some refactorings; all in all a quite productive iteration. Things I learned: java.util.Random is completely determined by its seed, so if you use the same seed again and again you always get the same sequence. Also, when doing file handling in plain Java I really miss commons-io.

Iteration 3

In iteration 3 we were supposed to use an antipattern for testing: Subclass to Test. You take the original class and override some of the methods that are called from the method under test.
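The idea in a minimal, made-up example (not the retreat's actual code): the unpredictable part of a method sits in a protected method that the test can override.

import static org.junit.Assert.assertEquals;

import java.util.Random;

import org.junit.Test;

// A made-up class in the spirit of the exercise: play() depends on the
// protected roll() method, which is the unpredictable part.
class Game {

    public String play() {
        return "Rolled a " + roll();
    }

    protected int roll() {
        return new Random().nextInt(6) + 1;
    }
}

public class GameTest {

    @Test
    public void reportsTheRoll() {
        // Subclass to Test: override the unpredictable method in the test.
        Game game = new Game() {
            @Override
            protected int roll() {
                return 4; // a fixed value makes the behaviour deterministic
            }
        };
        assertEquals("Rolled a 4", game.play());
    }
}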

It turned out that the original code is not well suited for this approach. There are only a few methods that really rely on other methods; most of them access the state via the fields directly. My partner and I therefore didn't really override methods but instead used an instance initializer block to prepare the state for the method calls. This is similar to an approach for Map initialization that I started to apply only recently:

Map<String, String> data = new HashMap<String, String>() {
    {
        put("key", "value");
    }
};

The approach worked quite well for the given code, but it's probably true that such tests won't stay maintainable.

Iteration 4

Iteration 4 built on the previous iteration. All the methods that had been subclassed for testing were to be moved to delegates and passed into the original class using a dependency injection approach.
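Sticking with the made-up Game example from the previous sketch, the overridden method becomes a small collaborator that is passed in via the constructor:

import java.util.Random;

// The behaviour that was overridden for testing moves to a delegate.
interface Dice {
    int roll();
}

class RandomDice implements Dice {

    private final Random random = new Random();

    @Override
    public int roll() {
        return random.nextInt(6) + 1;
    }
}

// Game no longer needs to be subclassed: a test simply injects a stub
// Dice that always returns a fixed value.
class Game {

    private final Dice dice;

    Game(Dice dice) {
        this.dice = dice;
    }

    public String play() {
        return "Rolled a " + dice.roll();
    }
}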

I paired with a C++ guy who does embedded work in his day job. It turned out that we had quite different opinions and experiences. He was really focused on performance and couldn't understand why you would want to move methods to another class just to delegate to them, as you are introducing overhead with the extra method call.

I haven't done any C++ programming since university. Eclipse seems to be well suited for C++ development, but compared to its Java support it still seems to lack a lot of convenience functionality.

Iteration 5


In iteration 5 I paired with Tilman, a clean code aficionado whom I already knew from our local Java User Group. We were supposed to change as many methods as possible into real functions that don't work on fields but only on parameter values.

A lot of people struggled with this approach at first. But it turns out that once you have done it, you are in a really good position to do further refactorings easily.
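A made-up before/after sketch (again not the actual retreat code) of what this change looks like:

class Board {

    private int place;
    private int lastRoll;

    // Before: the method works on the fields above, so a test has to set
    // up the whole object state first.
    void movePlayer() {
        place = (place + lastRoll) % 12;
    }

    // After: a real function; all inputs and outputs are explicit, and it
    // can be tested with nothing but parameter values and an assertion,
    // e.g. assertEquals(3, Board.newPlace(10, 5));
    static int newPlace(int place, int roll) {
        return (place + roll) % 12;
    }
}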

My partner did most of the coding, with some input from me. We took some directions I wouldn't have taken by myself, but the resulting code was really well structured and could be reduced in size. We also worked with an interesting Eclipse plugin I had seen before: Infinitest always runs the tests in the background, so there's no need to run them manually. I have to check if something like this is available for Netbeans as well.

Iteration 6

To be honest, I don't know what the goal of the sixth iteration really was. I was pairing with a developer who was still fighting with the failing tests from the previous iteration. For most of the iteration we tried to get those running again. In the last few minutes we managed to extract some classes and clean up some code.

Conclusion

The first German legacy code retreat really was a great experience. I learned a lot and, probably even more importantly, had a lot of fun.

The food and the location were both excellent. Thanks to the organizers Nicole and Andreas as well as the sponsors for making it possible. It's great to be able to attend a high-quality event completely for free.

Tuesday, January 10, 2012

Running my Tests again

For some time I've been bugged by a Netbeans problem that I couldn't find any solution to. When running a unit test from within Netbeans it happened from time to time that the tests just failed: they seemed to be executed in an old state, and running them again didn't help either, as some parts of the project apparently didn't get recompiled. When executing the tests from a command line Maven build there were never any problems, and afterwards the tests could be run again from Netbeans. The problem occurred only very infrequently, but it was nevertheless really annoying. I stopped running the tests from Netbeans at all and only used Maven. That is not a good solution either, as you either run all tests or have to edit the command line all the time to run only a single test.

Recently I noticed what caused the problem: Netbeans has its compile-on-save feature turned on for tests. This means it is using its internal incremental compilation, which doesn't seem to work reliably, at least for some project setups.


You can disable it in the project properties under the Build/Compile node. I haven't seen any problems since disabling it. It saves me a lot of time to be able to run the tests from the IDE again.

Thursday, December 29, 2011

Talking about Code

Yesterday I attended the Softwerkskammer Karlsruhe meetup for the first time. Softwerkskammer tries to connect the software craftsmanship community in Germany.

The topic for the evening was simple: more code. We looked at a lot of samples from a real project and discussed what was wrong with them and what could be done better. There were a lot of different opinions, but that's a good thing, as it made me question some habits I have when programming.

This was the first time I've been to a meeting with a lively discussion like this. The conferences and user groups I attend mostly have classic talks with one speaker and far less audience participation. Talking about code is a really good way to learn, and this won't be the last time I attend a meetup. Thanks to the organizers.

Monday, December 26, 2011

Spring in Action

Sometimes it's comfortable not to be an absolute expert in a certain technology. It makes it really easy to learn new stuff, e.g. by such mundane methods as reading a book. Even if you are a Spring expert it is still likely that you will take something away from the latest edition of Spring in Action by Craig Walls, as it covers a wide range of topics. I haven't read any of its predecessors, but people have told me that those are even better.

Having finished the book recently, I just wanted to take the time to write down two small but interesting configuration features that I learned from it.

p-Namespace

A feature that I just didn't know about before, but that seems to be quite useful, is the p-namespace. It's a namespace that is not backed by a schema and allows you to configure beans in a really concise way. For example, look at how a bean might normally be configured:

<bean id="foo" class="foo.bar.Baz">
    <property name="myLongProperty" value="2"/>
    <property name="myStringProperty" value="Hallo"/>
</bean>

The properties we'd like to set are children of the bean node. Netbeans comes with nice autocompletion support for the property names, as you can see from the screenshot.

The p-namespace is a more concise alternative where the property names themselves become attributes of the bean node (after declaring xmlns:p="http://www.springframework.org/schema/p" on the beans element):

<bean id="foo" class="foo.bar.Baz"
      p:myLongProperty="2" p:myStringProperty="Hallo"/>

Note that Netbeans is clever enough to offer code completion here as well.

I am not sure if I will use the short form of the p-namespace a lot. Consistent use of such features within a project is quite important, so I think if the short form is used, it should be used everywhere in the project.

Accessing Constants

Sometimes you need to access constants in your Spring configuration files. There are several ways to handle this, one of them using the util namespace:

<property name="day">
    <util:constant static-field="java.util.Calendar.WEDNESDAY"/>
</property>

Another way is to use the Spring Expression Language:

<property name="day" value="#{T(java.util.Calendar).WEDNESDAY}"/>

I think this can be used more widely, as the value doesn't need to be declared as a subnode. For example, I had some problems using util:constant as a key or value in a util:map; that would have been easy with the EL version.

Wednesday, December 7, 2011

Not another Diamond Operator Introduction

I just returned from the talk "Lucky Seven" at our local Java User Group. It was far better than I expected. Not that I expected Wolfgang Weigend to be a bad speaker, but even though I organized the event I had the feeling that I had seen one too many Java 7 introductions already. But there was more...

One of the interesting aspects that I hadn't been paying much attention to is the merge of the JRockit and HotSpot VMs. HotSpot will be the base of the new development, and JRockit features will be merged in. Some of these features will already be available in OpenJDK during the JDK 7 timespan.

One of the changes has received some attention lately: the PermGen space will be removed. This sounds like a major change, but once it works it will definitely be a huge benefit.

JRockit has been highly respected for its monitoring features. Among those is the interesting Java Flight Recorder, which reminds me of the commercial product Chronon. It will be an always-on recording of data in the JVM that can be used for diagnostic purposes. Sounds really interesting!

The overall goal of the convergence is to have a VM that can tune itself. Looking forward to it!

The (mixed German and English) slides of the talk are available for download.

Wednesday, October 26, 2011

Getting started with Gradle

Maven has been my build tool of choice for some years now. Coming from Ant, the declarative approach, the useful conventions, and the dependency management offered a huge benefit. But as with most technologies, the more you use it the more minor and major flaws appear. A big problem is that Maven builds are sometimes not reproducible: the outcome of the build is influenced by the state of your local repository.

Gradle is a Groovy-based build system that is often recommended as a more advanced alternative. The features that make it appealing to me are the terser syntax and the more advanced dependency cache.

For a recent project that I just uploaded for someone else I needed to add a simple way to build the jar. Time to do it with Gradle and see what it feels like.

The build script


The purpose of the build is simple: compile some classes with some dependencies and package them into a jar file. Like Maven and Ant, Gradle needs at least one file that describes the build. This is what build.gradle looks like:

apply plugin: 'java'

repositories {
    mavenCentral()
    mavenRepo url: "http://bp-cms-commons.sourceforge.net/m2repo"
}

dependencies {
    compile group: 'org.opencms', name: 'opencms-core', version: '7.5.4'
    compile group: 'javax.servlet', name: 'servlet-api', version: '2.5'
}

sourceSets {
    main {
        java {
            srcDir 'src'
        }
    }
}


Let's step through the file. The first line tells Gradle to use the java plugin, which ships with tasks for compiling and packaging Java classes.

In the next block we declare the dependency repositories. Luckily Gradle supports Maven repositories, so existing ones like Maven Central can be reused; I guess without this feature Gradle would not gain much adoption at all. Two repositories are declared: Maven Central, where most of the common dependencies are available, and a custom repository that provides the OpenCms dependencies.

The next block declares which dependencies are necessary for the build. Gradle also supports scopes (called configurations in Gradle), so you can, for example, declare that some jars are only needed during the test run by putting them in the testCompile configuration. The dependency declaration in this case resembles the Maven coordinates, but Gradle also supports more advanced features like version ranges.

The last block isn't really necessary. It's only there because my Java sources are located in src instead of the default src/main/java. Gradle reuses a lot of the Maven conventions, so it's really easy to migrate builds.

Building


To build the project you need Gradle installed. You can download a single distribution that already packages Groovy and all the needed files; you only need to add the bin folder to your path.

Packaging the jar is easy: you just run the jar task of the java plugin with gradle :jar. Gradle will start to download all direct and transitive dependencies. The fun part: it uses a nice command line library that can display text in bold, rewrite lines and the like. Fun to watch.

I like the simplicity and readability of the build script. You don't need to declare anything you don't actually need: no coordinates, no schema declaration, nothing. I hope I will find time to use it in a larger project, so I can see what it really feels like in daily project work.