Spark is being used in large-scale projects: data processing at scale, stream processing (life doesn't happen in batches anymore), machine learning, and more. But even though Scala is the language behind Spark, this is not only about Spark. It is really about functional programming. The rise of multicore computers and cluster computing is pushing us toward new models of computation that allow high concurrency. Functional programming enables us to build these kinds of environments, and Scala is taking its place here. Scala runs on the JVM, which is great; it offers the flexibility of mixing functional and imperative programming; and it is statically typed, which is good in a lot of cases.
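As a minimal sketch of what that flexibility looks like (the names here are illustrative, not from any particular project): a pure function over an immutable collection has no shared mutable state, which is exactly the property that makes functional code easy to run concurrently, while Scala still lets you drop into imperative style when you want it.

```scala
object FunctionalSketch {
  // Pure function: the result depends only on the input, no side effects.
  // Pure functions over immutable data are safe to evaluate in parallel.
  def square(x: Int): Int = x * x

  def main(args: Array[String]): Unit = {
    val numbers = List(1, 2, 3, 4, 5)   // immutable list

    // Functional style: transform without mutating anything.
    val squares = numbers.map(square)
    println(squares)                     // List(1, 4, 9, 16, 25)

    // The same logic in imperative style, also valid Scala.
    var total = 0
    for (n <- numbers) total += n * n
    println(total)                       // 55
  }
}
```

Because `square` touches no shared state, the `map` could be distributed across cores or a cluster without changing its meaning, which is the model Spark builds on.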
Again, it's not only about Spark; it's about the needs of new models of computation.