Build REST Service with Kotlin, Spark Java and Requery — Part 1

In this article I would like to give you a quick introduction to using Kotlin together with the Spark Java Micro Framework and the Requery library.


List of all parts:


We will create a simple, small REST service in the Kotlin language and run it using the Spark micro framework, which is my favorite lightweight framework for running microservices.

The database schema will be created and handled by the Requery framework.

Because I’m a big fan of dependency injection, we will also utilize the Dagger 2 DI library.

I will split the whole tutorial into small parts. In this article (part 1) we will create a new Gradle Kotlin project for our service and set up Gradle and its dependencies. Then we will create a simple REST API that allows communication with our service using the Spark Java Micro Framework.

In part 2 we will add more logic to our service (more REST functions) and start using the Dagger 2 DI framework and the Retrofit library.

In the final part 3, you will learn how to use the Requery framework for our persistence layer.

Import project from GitHub

You can clone the source code from my GitHub repo here, or you can follow this post to create the complete project step by step.

Branches:

  • master: the complete project created in parts 1–3
  • part1, part2 and part3: the code for each individual part
  • base-project: an empty project with Kotlin, Gradle and dependencies (everything from the first section, “Creating a New Project”)

Creating a New Project

For this tutorial I’m using IntelliJ IDEA. Create a new Gradle Kotlin Java project with Java 1.8 (Spark requires at least Java 1.8).

Next, fill in your groupId and artifactId and press the Next button. On the next window you can leave everything checked by default; just verify that your Gradle JVM is 1.8 and press Next again.

Now set your project name and location, e.g.:

Press the Finish button. IntelliJ will now create a new Kotlin project with the Gradle build tool.


Note: A Kotlin Gradle Plugin error may appear (depending on which version of Kotlin you already have). In that case, please update your Kotlin version to 1.1.


Configure Gradle

If you look at your build.gradle file, you should see the Kotlin integration created by IntelliJ during project creation. It can happen that you have an EAP Kotlin version or some old version there. You can edit it manually, or via Tools -> Kotlin -> Configure Kotlin in Project. I prefer the manual way because then I know exactly what is in my Gradle files.

Your build.gradle file should be similar to mine:

group 'cz.kotliners.kotlin'
version '1.0.0-SNAPSHOT'

buildscript {
    ext.kotlin_version = '1.1.0'

    repositories {
        jcenter()
        mavenCentral()
    }
    dependencies {
        classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
    }
}

apply plugin: 'java'
apply plugin: 'kotlin'

sourceCompatibility = 1.8

repositories {
    jcenter()
    mavenCentral()
}

dependencies {
    compile "org.jetbrains.kotlin:kotlin-stdlib-jre8:$kotlin_version"

    testCompile group: 'junit', name: 'junit', version: '4.12'
}

You can see that the Kotlin plugin added a definition to the ext block specifying which version of Kotlin we are using (1.1.0), and added kotlin-gradle-plugin to the classpath.

Project Structure

Next, please create the source folder src/main/kotlin and define your package. You can see my project structure in the image below:

Now please create a new Kotlin file named ContainerRunner.kt and a new Kotlin class named ServiceRunner.kt via New -> Kotlin File/Class:

ContainerRunner.kt
ServiceRunner.kt

ContainerRunner.kt — In this file we define the main method, which is required by the Spark framework because Spark will wrap it.

ServiceRunner.kt — The main runner of the Kotlin-Spark-Requery demo service. It initializes the whole service and sets up the REST API.
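The exact file contents are not shown above, so here is a minimal sketch of how the two files could start out before Spark is added; the greeting() helper is my own addition, extracted so the message is easy to test:

```kotlin
// ContainerRunner.kt -- defines the main method that Spark will wrap later
fun main(args: Array<String>) {
    ServiceRunner().run()
}

// ServiceRunner.kt -- main runner of the service; for now it only prints a greeting
class ServiceRunner {

    fun run() {
        println(greeting())
    }

    // hypothetical helper, not from the original article
    fun greeting(): String = "Hello Kotlin!!!"
}
```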

Now run our “application” to see that everything is working correctly and the text Hello Kotlin!!! is shown. You can run it easily (for the first time) by clicking the Kotlin icon in the ContainerRunner.kt file.

You should see:

Hello Kotlin!!!
Process finished with exit code 0

Nothing interesting yet. But now we will start playing with the Spark Java Micro Framework.

Spark Java Micro Framework

Spark is a micro framework for creating web applications in Java 8 with minimal effort:

  • Tiny framework for setting up routes
  • No XML configuration
  • No annotations
  • Starts a web server automatically

Request, Parameters, and Response objects

These provide full control over HTTP headers and functionality. We can access query and URL parameters, and specific headers like Content-Type are provided by a dedicated method.

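As a sketch of these APIs in Kotlin (the /greet route and the buildMessage helper are my own invention, not from the original project):

```kotlin
import spark.Spark.get

// Pure formatting logic, kept out of the route so it is easy to test;
// buildMessage is a hypothetical helper name.
fun buildMessage(name: String, shout: String?): String {
    val message = "Hello, $name!"
    return if (shout == "true") message.toUpperCase() else message
}

fun main(args: Array<String>) {
    // e.g. GET /greet/Alice?shout=true -> "HELLO, ALICE!"
    get("/greet/:name") { req, res ->
        res.type("text/plain")              // Content-Type via a dedicated method
        buildMessage(
            req.params(":name"),            // URL parameter
            req.queryParams("shout"))       // query parameter (null if absent)
    }
}
```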

Filters, Static Files, Response Transformers

Static files can be served from a location on the classpath:

staticFileLocation("/public")

  • Filters are a series of hooks that can be inserted before or after the execution of a request.
  • For instance, you may want to check whether a user is logged in for each request before you handle it, or simply log the request.
  • Perhaps you want to compress the result using gzip, or set the allowed methods. Spark offers the before() and after() methods where you can specify this logic:

before { request, response ->
    response.header("Access-Control-Allow-Origin", "*")
    response.header("Access-Control-Allow-Methods", "GET, POST, DELETE")
}
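The after() hook works the same way, running once a request has been handled. As a sketch (check this against the Spark docs for the version you use), gzip compression can be requested like this:

```kotlin
import spark.Spark.after

fun main(args: Array<String>) {
    // after-filter runs once the route has produced its result;
    // with this header set, Spark gzips the body for clients
    // that sent Accept-Encoding: gzip
    after { request, response ->
        response.header("Content-Encoding", "gzip")
    }
}
```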

Routes

  • Routes can be provided in sequential order, each specified for a particular HTTP method like GET or POST, with a callback that will be executed.
  • Routes are useful for specifying a mapping between the URLs your server handles and the code that handles each request.
  • Routes are matched in the order in which they are specified; the first route that matches the incoming request will be executed.
Route /hello
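The ordering rule above can be illustrated with a toy model; firstMatch is a made-up helper that simulates first-match lookup for this sketch, not part of Spark's API:

```kotlin
// Toy simulation of Spark's "first matching route wins" rule.
fun firstMatch(routes: List<Pair<String, String>>, path: String): String? =
    routes.firstOrNull { (pattern, _) ->
        pattern == path ||
            (pattern.endsWith("/*") && path.startsWith(pattern.removeSuffix("*")))
    }?.second

fun main(args: Array<String>) {
    val routes = listOf("/hello" to "hello handler", "/*" to "catch-all handler")
    println(firstMatch(routes, "/hello"))  // the specific route, declared first, wins
    println(firstMatch(routes, "/other"))  // only the wildcard matches
}
```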

For detailed information and a full description, see the official documentation here.

Less theory, more code & fun

Open build.gradle and add the Spark version to the buildscript block:

buildscript {
    ...
    ext.spark_version = '2.5.5'
    ...
}

And add the Spark dependency:

dependencies {
    ...
    compile "com.sparkjava:spark-core:$spark_version"
    ...
}

And that’s all :) Now we have Spark in our project, ready to run. It will do nothing yet, but it is there. So let’s add a first REST function:

import spark.Spark

class ServiceRunner {

    fun run() {
        Spark.get("/hello", { req, res -> "Hello Spark!!!" })
    }

}

All Spark REST members live inside spark.Spark. Import everything from spark.Spark to get a nicer, more readable REST API:

import spark.Spark.*

class ServiceRunner {

    fun run() {
        get("/hello", { req, res -> "Hello Spark!!!" })
    }

}

We now have our first REST method written in Spark, and the last step, for now, is to check whether the service is running. The default port is 4567. Enter the following URL into your browser:

http://localhost:4567/hello
Hello Spark Example

It is cool and so easy, don’t you think?

Now look at the log output, where you will see:

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.

To enable logging, add an SLF4J dependency to your build.gradle and run the project again:

compile "org.slf4j:slf4j-simple:1.7.24"

Ta-da, Spark is logging:

[Thread-0] INFO org.eclipse.jetty.util.log - Logging initialized @698ms
[Thread-0] INFO spark.embeddedserver.jetty.EmbeddedJettyServer - == Spark has ignited ...
[Thread-0] INFO spark.embeddedserver.jetty.EmbeddedJettyServer - >> Listening on 0.0.0.0:4567
[Thread-0] INFO org.eclipse.jetty.server.Server - jetty-9.3.6.v20151106
[Thread-0] INFO org.eclipse.jetty.server.ServerConnector - Started ServerConnector@4f4279{HTTP/1.1,[http/1.1]}{0.0.0.0:4567}
[Thread-0] INFO org.eclipse.jetty.server.Server - Started @1011ms

How to change the default port?

If you want to change the default port, just call Spark’s port method with a new port number. This must be done before any route mapping:

fun run() {
    port(9091)

    get("/hello", { req, res -> "Hello Spark!!!" })
}

Then run the service again and check the port number in the log output, or simply enter the URL mentioned above with port 9091:

...
[Thread-0] INFO spark.embeddedserver.jetty.EmbeddedJettyServer - >> Listening on 0.0.0.0:9091
...

Creating Controller

It is good practice to keep the Spark REST definitions inside a controller class. We will simply move our hello GET REST function to a new Kotlin class, e.g. ServiceController.kt:

package cz.kotliners.kotlin.spark_requery.controller

import spark.Spark.*

class ServiceController {

    init {
        get("/hello", { req, res -> "Hello Spark!!!" })
    }

}

ServiceRunner.kt should then look like this:

class ServiceRunner {

    fun run() {
        initControllers()
    }

    private fun initControllers() {
        ServiceController()
    }

}

Summary of Part 1

In this part 1, you learned how to create and set up a new Kotlin Gradle project. Next, you got an introduction to the Spark Java Micro Framework, and you now know how to create a simple REST API with Spark.

In the next part 2, you will learn how to return a JSON response and how to easily use Dagger 2 as a tool for dependency injection.

Thank you for reading and stay tuned!

Next article: Part 2: Improving REST API & Dagger 2 Integration

