Take a step back in history with the archives of PragPub magazine. The Pragmatic Programmers hope you’ll find that learning about the past can help you make better decisions for the future.

FROM THE ARCHIVES OF PRAGPUB MAGAZINE SEPTEMBER 2017

Refactoring to Functional Style in Java 8: Elegant Ways to Work with Files

By Venkat Subramaniam

PragPub
The Pragmatic Programmers
Aug 25, 2022


In this installment of his series, Venkat looks at common operations for processing text files in the imperative style and then explores how to perform those operations in the functional style.

https://pragprog.com/newsletter/

Suppose we wanted to write a program to count the number of lines in a file. The (imperative) code might look something like the following:

import java.io.*;

public class CountLinesImperative {
  public static void main(String[] args) throws Exception {
    BufferedReader reader =
      new BufferedReader(new FileReader("CountLinesImperative.java"));
    long count = 0;
    String line = null;
    while((line = reader.readLine()) != null) {
      count++;
    }
    System.out.println("Number of lines in the file: " + count);
  }
}

The program is straightforward: it counts the number of lines in the file that contains its own source code. Most Java programmers are used to this kind of code, but it is noisy and verbose. The condition we provide to the while loop is also smelly: we read each line, check whether it is null, and only if it is not, continue to process that line.

We can make use of the Stream API in JDK 8 to improve this code quite a bit. The Files class provides a convenience method to read the contents of a file into a Stream. Let's rework the example using the lines method of Files, like so:

import java.io.*;
import java.nio.file.*;

public class CountLinesFunctional {
  public static void main(String[] args) throws Exception {
    long count =
      Files.lines(Paths.get("CountLinesFunctional.java")).count();
    System.out.println("Number of lines in the file: " + count);
  }
}

Ignoring blank lines, we went from seven to two lines of code. The real benefit is not the reduced lines of code but the lack of ceremony and intermediate steps.

The lines method returns a Stream, which is a lazy internal iterator, as we have seen before in previous articles in this series. Each line in the file becomes a value provided or yielded by the Stream. The count method readily returns the number of values present in the Stream, that is, the number of lines in this example.
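One caveat worth noting: Files.lines keeps the underlying file handle open, so in anything longer-lived than a small script it is good practice to close the Stream when done. A minimal sketch, using try-with-resources and a temporary file as a hypothetical stand-in for a real input file:

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.Arrays;
import java.util.stream.Stream;

public class CountLinesClosed {
  // Count the lines in a file, closing the underlying handle when done.
  static long countLines(Path path) throws IOException {
    // Files.lines holds an open file handle; try-with-resources closes it.
    try (Stream<String> lines = Files.lines(path)) {
      return lines.count();
    }
  }

  public static void main(String[] args) throws IOException {
    // Create a throwaway file so the example is self-contained.
    Path tmp = Files.createTempFile("demo", ".txt");
    Files.write(tmp, Arrays.asList("first", "second", "third"));
    System.out.println(countLines(tmp)); // prints 3
    Files.delete(tmp);
  }
}
```

Because Stream implements AutoCloseable, the try-with-resources form costs only one extra line over the inline version.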

Since the lines method returns a Stream, we can use the functional pipeline to operate on the contents of the files quite elegantly. We will explore some examples of that next.

Extracting Lines with a Select Word

Instead of counting the number of lines, suppose we wanted to extract the lines from the file that contain a select word, say class. Imperative-style code for that will look like the following:

import java.io.*;

public class PrintLinesWithWordImperative {
  public static void main(String[] args) throws Exception {
    BufferedReader reader =
      new BufferedReader(
        new FileReader("PrintLinesWithWordImperative.java"));
    String line = null;
    while((line = reader.readLine()) != null) {
      if(line.contains("class")) {
        System.out.println(line);
      }
    }
  }
}

Inside the loop we check if the line contains the word we're looking for, and if so, print the line. We can trade the if statement for a filter in the functional style, like so:

import java.io.*;
import java.nio.file.*;

public class PrintLinesWithWordFunctional {
  public static void main(String[] args) throws Exception {
    Files.lines(Paths.get("PrintLinesWithWordFunctional.java"))
         .filter(line -> line.contains("class"))
         .forEach(System.out::println);
  }
}

Once we obtain a reference to the Stream, the rest of the code reads much like any functional pipeline. We compose the filter operation on the Stream, followed by a call to the forEach method. The filter passes through only the lines we care about, and forEach prints each selected line.

In addition to filter, the functional pipeline may include other operations like map, to transform values as they flow through the pipeline, and reduce, to accumulate the values.
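To make that concrete, here is a small sketch combining all three operations to total the characters in the non-blank lines of some input. The lines come from an in-memory list rather than a file, purely to keep the example self-contained; the helper name is illustrative:

```java
import java.util.Arrays;
import java.util.List;

public class MapReducePipeline {
  // Total characters across all non-blank lines: filter, map, reduce.
  static int totalChars(List<String> lines) {
    return lines.stream()
                .filter(line -> !line.trim().isEmpty()) // keep non-blank lines
                .map(String::length)                    // transform each line to its length
                .reduce(0, Integer::sum);               // accumulate the lengths
  }

  public static void main(String[] args) {
    List<String> lines = Arrays.asList("hello", "", "world!");
    System.out.println(totalChars(lines)); // prints 11
  }
}
```

To run the same pipeline over a file, replace lines.stream() with Files.lines(...).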

Let’s next look at a variation of the previous example that requires a bit more effort.

Counting Occurrences of a Word

Instead of counting the number of lines, what if we wanted to count the number of times a string appears in a file? Here's an imperative-style version:

import java.io.*;

public class CountWordsImperative {
  public static void main(String[] args) throws Exception {
    BufferedReader reader =
      new BufferedReader(new FileReader("CountWordsImperative.java"));
    long count = 0;
    String line = null;
    while((line = reader.readLine()) != null) {
      for(String word : line.split(" ")) {
        if(word.contains("class")) {
          count++;
        }
      }
    }
    System.out.println(
      "Number of occurrences of the word class in this class: " + count);
  }
}

Within the while loop, we split the line into separate words delimited by space and then check if each word contains the (sub)string class. Finally, we print the count of the number of occurrences of that string.

Let's rework this code in the functional style. At first thought, the transformation of each line into the words in that line appears to be a map operation. However, we generally use map when the function transforms one value into another value. In this case, the value, which is a line, transforms into a collection of words.

If we use the map method, then the Stream<String> will become a Stream<String[]>. However, we want to examine each word, so we would like to transform the Stream<String> representing lines into a Stream<String> representing words.
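The type difference is easy to see in a small self-contained sketch; the sample lines and helper name here are hypothetical:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class MapVersusFlatMap {
  // flatMap flattens each line's words into a single Stream<String>.
  static List<String> toWords(List<String> lines) {
    return lines.stream()
                .flatMap(line -> Stream.of(line.split(" ")))
                .collect(Collectors.toList());
  }

  public static void main(String[] args) {
    List<String> lines = Arrays.asList("public class A", "class B");

    // map: each line becomes one String[], yielding a Stream<String[]>
    long arrays = lines.stream()
                       .map(line -> line.split(" "))
                       .count();
    System.out.println(arrays); // prints 2, one array per line

    // flatMap: the words of all lines form one flat stream
    System.out.println(toWords(lines).size()); // prints 5
  }
}
```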

When the transformation function converts a value to a collection but we want the overall result of transformation to be a Stream of values, the candidate function for that is flatMap. Let’s use that method to reimplement the previous code but this time in functional style.

import java.io.*;
import java.nio.file.*;
import java.util.stream.*;

public class CountWordsFunctional {
  public static void main(String[] args) throws Exception {
    long count = Files.lines(Paths.get("CountWordsFunctional.java"))
      .flatMap(line -> Stream.of(line.split(" ")))
      .filter(word -> word.contains("class"))
      .count();
    System.out.println(
      "Number of occurrences of the word class in this class: " + count);
  }
}

We pass the Stream returned by the lines method through the flatMap operation to convert the Stream of lines into a Stream of words. Then we keep only the words that contain the desired string and finally count them. We can see the functional pipeline in action again in this example, although we used a different combination of methods here.
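One practical wrinkle: splitting on a single space produces empty tokens when a line contains tabs or consecutive spaces. Splitting on the regular expression \s+ (any run of whitespace) avoids that. A sketch of the same pipeline over an in-memory list, with an illustrative helper name:

```java
import java.util.Arrays;
import java.util.List;

public class CountWordsRobust {
  // Count words containing a target substring; split on runs of
  // whitespace so tabs and double spaces don't yield empty tokens.
  static long countOccurrences(List<String> lines, String target) {
    return lines.stream()
                .flatMap(line -> Arrays.stream(line.trim().split("\\s+")))
                .filter(word -> word.contains(target))
                .count();
  }

  public static void main(String[] args) {
    List<String> lines =
      Arrays.asList("public  class A", "\tclass B extends A");
    System.out.println(countOccurrences(lines, "class")); // prints 2
  }
}
```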

Conclusion

The lines method of the Files class is a great way to get a Stream of the lines in a text file. Once we get hold of that Stream, we can apply operations like filter and map on it to extract specific contents or transform the contents. Using these methods, we can create concise and expressive code to work with data from text files.

About Venkat Subramaniam

Dr. Venkat Subramaniam is an award-winning author, founder of Agile Developer, Inc., and an instructional professor at the University of Houston. He has trained and mentored thousands of software developers in the U.S., Canada, Europe, and Asia, and is a regularly invited speaker at several international conferences. Venkat helps his clients effectively apply and succeed with agile practices on their software projects. Venkat is a (co)author of multiple books, including the 2007 Jolt Productivity award-winning book Practices of an Agile Developer.

Cover from PragPub magazine, September 2017, featuring three women looking up through paper eclipse glasses



The Pragmatic Programmers bring you archives from PragPub, a magazine on web and mobile development (by editor Michael Swaine, of Dr. Dobb’s Journal fame).