If I want to have application specific logging in spark, say I have 5 spark applications and I want…

In your application's conf directory you can keep your application-specific settings; in particular, you want to put a log4j.properties file there. This file looks like the one we saw in the post, and in it you can define where your logs should go. The only problem is that every Spark node (worker) will use this file, so each of them will create its own log file. For example, if your logs go to /myspark_logs/, each worker will create that folder and write its own logs there (at the worker level). If you really want exactly one log file per application, a different approach is needed. I already wrote about it; please find it here: https://medium.com/@anicolaspp/how-to-log-in-apache-spark-a-functional-approach-e48ffbbd935b#.sic3ziiy1
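As a rough sketch, such a log4j.properties file could look like the following. This assumes the log4j 1.x syntax (Spark's logging backend at the time) and uses the /myspark_logs/ directory from the example above; the file name my-app.log is just a placeholder for your application's name:

```properties
# Hedged sketch of a per-application log4j.properties (log4j 1.x syntax).
# /myspark_logs/ and my-app.log are placeholder values for this example.
log4j.rootLogger=INFO, file

# Write log events to a file instead of the console
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.File=/myspark_logs/my-app.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

Again, keep in mind that with this file alone, every worker that loads it will open its own copy of /myspark_logs/my-app.log on its own machine.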

After reading about the functional approach, you will have one configuration per application (as I just explained) and a single place where your application writes its logs (the application driver), so you will not have to search every node for your logs.
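To tie the two pieces together: you can point each application's driver at its own log4j.properties when submitting the job. This is a sketch of a spark-submit invocation, assuming the properties file lives next to your application; --files ships the file to the cluster, and the -Dlog4j.configuration option tells the driver's JVM to use it:

```
# Hedged sketch: one log4j.properties per application, passed at submit time.
# app-a/log4j.properties is an assumed path for this example.
spark-submit \
  --files app-a/log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:app-a/log4j.properties" \
  --class com.example.AppA \
  app-a.jar
```

Each of your 5 applications would then get its own properties file and, combined with driver-side logging from the functional approach, its own single log file.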

Remember, we had two problems: 1) having application-specific logs, and 2) having exactly one log file per application. I have just shown you how to solve both.

Please leave me your feedback about this response. Thanks!