Spark error: "Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources"

If you're trying to run something on a Spark cluster and are getting

Your hostname, resolves to a loopback/non-reachable address: , but we couldn’t find any external IP address

and subsequently

Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

and can't figure out why, check whether the machine you are submitting the job from (i.e. the Spark driver) is reachable from the cluster's workers.
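If the driver keeps picking up a loopback or otherwise unreachable address, you can tell Spark explicitly which address to advertise via the `spark.driver.host` setting, and pin `spark.driver.port` so you can open exactly that port in your firewall. Below is a minimal PySpark sketch; the master URL, driver IP, and port are placeholders you would replace with your own values.

```python
# Minimal sketch: make the driver advertise an address the workers can reach.
# The master URL, IP address, and port below are hypothetical placeholders.
from pyspark import SparkConf, SparkContext

conf = (
    SparkConf()
    .setAppName("driver-reachability-check")
    .setMaster("spark://master-host:7077")      # placeholder master URL
    # Advertise an address the workers can actually route to, instead of
    # whatever loopback/non-reachable address the hostname resolves to.
    .set("spark.driver.host", "192.168.1.10")   # placeholder driver IP
    # Pin the driver port so a firewall rule can allow it explicitly.
    .set("spark.driver.port", "35000")
)

sc = SparkContext(conf=conf)

# A trivial job: if this returns a result, the workers managed to
# register and connect back to the driver.
print(sc.parallelize(range(100)).sum())
sc.stop()
```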

In my case, the problem was that I had the built-in OSX firewall turned on and set to `Block all incoming connections`.
