
Overview of Spark and HTTP Testing with JUnit

Apr 26, 2018

TLDR: Spark is accessed statically, so starting it in an @BeforeClass means the server is running before the HTTP request tests begin.

I use Spark as the embedded web server in my applications, and I run simple HTTP tests against it as part of my local Maven build. I start Spark within the JUnit tests themselves. In this post I’ll show how.

We all know that there are good reasons for not running integration tests during our TDD Red/Green/Refactor process. We also know that we can run subsets of tests during this process and avoid any integration tests. And hopefully we recognise that fast, expedient automated integration verification can be useful.

What is Spark?

Spark is a small embedded web server for Java that is easy to add to your project with a single Maven dependency.

I use it for my TestingApp.

Spark is easy to configure within code:

get("/games/", (req, res) -> {res.redirect("/games/buggygames/index.html"); return "";});

And it will look in a resource directory for the files:

staticFileLocation("/web");

And it is easy to change the port (by default 4567):

Spark.port(1234);
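
Putting those snippets together, a minimal runnable app looks something like this (a sketch of my own; the class name is mine, and it assumes the spark-core Maven dependency is on the classpath):

    import static spark.Spark.*;

    public class MinimalSparkApp {
        public static void main(String[] args) {
            // configure the port and static file location
            // before mapping any routes
            port(1234);
            staticFileLocation("/web");

            // redirect to a static page served from /web
            get("/games/", (req, res) -> {
                res.redirect("/games/buggygames/index.html");
                return "";
            });
        }
    }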

And I can do fairly complicated routings, covering all the HTTP verbs, if I want to:

        // route GET to the API's heartbeat handler
        get(ApiEndPoint.HEARTBEAT.getPath(),
                (request, response) -> {
                    return api.getHeartbeat(
                        new SparkApiRequest(request),
                        new SparkApiResponse(response)).getBody();
                });

        // respond to OPTIONS with the allowed verbs
        options(ApiEndPoint.HEARTBEAT.getPath(),
                (request, response) -> {
                    response.header("Allow", "GET");
                    response.status(200);
                    return "";
                });

        // reject any verb the API does not allow with a 405
        path(ApiEndPoint.HEARTBEAT.getPath(), () -> {
            before("", (request, response) -> {
                if(!api.isMethodAllowed(ApiEndPoint.HEARTBEAT.getPath(),
                                        new SparkApiRequest(request))){
                    halt(405);
                }
            });
        });
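
With that routing in place, any verb the API does not allow on the heartbeat endpoint should come back as a 405. A sketch of a JUnit check for that, assuming the server is running on localhost:4567 and the heartbeat path is /heartbeat:

    import org.junit.Assert;
    import org.junit.Test;

    import java.net.HttpURLConnection;
    import java.net.URL;

    public class HeartbeatVerbsTest {
        @Test
        public void deleteIsNotAllowed() throws Exception {
            // DELETE is not in the allowed methods, so the
            // before filter should halt with a 405
            HttpURLConnection con = (HttpURLConnection)
                    new URL("http://localhost:4567/heartbeat").openConnection();
            con.setRequestMethod("DELETE");
            Assert.assertEquals(405, con.getResponseCode());
        }
    }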

I tend to use abstraction layers so I have:

  • Classes to handle Spark routing
  • Application classes to handle functionality, e.g. api
  • Domain objects to bridge between domains, e.g. SparkApiRequest represents the details of an HTTP request without having Spark bleed through into my application (a sketch of this follows).
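
As an illustration, a bridging object like SparkApiRequest might look something like this (a sketch under my own assumptions, not the actual TestingApp code):

    import spark.Request;

    // wraps Spark's Request so the application layer
    // never depends on Spark types directly
    public class SparkApiRequest {

        private final String path;
        private final String method;
        private final String body;

        public SparkApiRequest(Request request) {
            this.path = request.pathInfo();
            this.method = request.requestMethod();
            this.body = request.body();
        }

        public String getPath(){ return path; }
        public String getMethod(){ return method; }
        public String getBody(){ return body; }
    }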

Running it for Testing

It is very easy, when using Spark, to simply call the main method to start the server and then run HTTP requests against it.

String[] args = {};
Main.main(args);

Once Spark is running, because it is all statically accessed, the server stays running while our @Test methods run.
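
For example, the pattern might look like this in a JUnit class (a sketch, assuming the app’s Main starts Spark on the default port and serves a /heartbeat endpoint):

    import org.junit.Assert;
    import org.junit.BeforeClass;
    import org.junit.Test;

    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ServerIsUpTest {

        @BeforeClass
        public static void startServer(){
            // Spark is started once and, being static,
            // stays up for every @Test method
            Main.main(new String[]{});
        }

        @Test
        public void canGetHeartbeat() throws Exception {
            HttpURLConnection con = (HttpURLConnection)
                    new URL("http://localhost:4567/heartbeat").openConnection();
            Assert.assertEquals(200, con.getResponseCode());
        }
    }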

I’m more likely to start my Spark using the specific Spark abstraction I have for my app:

    public void startServer() {
        server = new RestServer("");
    }

We just have to make sure we don’t try to start it again when it is already running, so I use a polling mechanism to handle that.

Because this is fairly common code now, I have an abstraction called SparkStarter which I use.

This has a simple polling start mechanism:

public void startSparkAppIfNotRunning(int expectedPort){

    sparkport = expectedPort;

    try {
        if(!isRunning()) {
            startServer();
        }
    } catch(IllegalStateException e){
        // e.g. the server was already starting
        e.printStackTrace();
    }

    try {
        // ask Spark which port it is actually listening on
        sparkport = Spark.port();
    } catch(Exception e){
        System.out.println("Warning: could not get actual Spark port");
    }

    waitForServerToRun();
}

And the wait is:

private void waitForServerToRun() {
    // poll for up to 10 seconds, checking once a second
    int tries = 10;
    while(tries > 0) {
        if(!isRunning()){
            try {
                Thread.sleep(1000);
            } catch (InterruptedException e1) {
                e1.printStackTrace();
            }
        } else {
            return;
        }
        tries--;
    }
}

These methods are on an abstract class, so I create a specific ‘starter’ for my application that knows how to:

  • check if it is running
  • start the server

For example:
    public boolean isRunning(){

        try{
            // hit the heartbeat endpoint and treat a 200 as running
            HttpURLConnection con = (HttpURLConnection)
                        new URL("http", host, sparkport, heartBeatPath).
                                openConnection();
            return con.getResponseCode()==200;
        }catch(Exception e){
            return false;
        }

    }

    @Override
    public void startServer() {
        server = CompendiumDevAppsForSpark.runLocally(expectedPort);
    }
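
So the overall shape of the abstract class is something like this (a sketch; only the skeleton is shown, and the method bodies are as above):

    // subclasses supply the app-specific check and start
    public abstract class SparkStarter {

        protected int sparkport;

        public abstract boolean isRunning();
        public abstract void startServer();

        // startSparkAppIfNotRunning(int) and waitForServerToRun()
        // are implemented here, as shown earlier
    }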

You can see an example of this in CompendiumAppsAndGamesSparkStarter.java.

And in the JUnit code:

    @BeforeClass
    public static void ensureAppIsRunning(){
        CompendiumAppsAndGamesSparkStarter.
                get("localhost", "/heartbeat").
                startSparkAppIfNotRunning(4567);
    }

e.g. PageRoutingsExistForAppsTest.java

You can find examples of this throughout my TestingApp.

Because it is all statically accessed, the server will stay running across all my tests.

Pretty simple, and I find it very useful for the simple projects that I am working on.

Bonus Video

“Spark Java Embedded WebServer And Testing Overview”

https://www.youtube.com/watch?v=7b0SnEznYnk

Spark is a simple embedded Java WebServer. I can also spin it up during JUnit tests to make my testing easy.

In this video I show:

  • An overview of Spark Java Embedded Web Server
  • How to use it during JUnit execution
  • Abstraction code separating Spark from my Application