PySpark Py4JJavaError ("An error occurred while calling ...") and OutOfMemoryError

Software1453 · Nov 16, 2019 · 1 min read

Increase the default memory configuration of your Spark session. In most cases you need to raise the driver and executor memory. To do this, change a few settings in the Spark installation directory.

Find the spark-defaults.conf file in the conf directory of the installation (if only spark-defaults.conf.template exists, copy it to spark-defaults.conf first) and add the following settings.

  • spark.driver.memory 16g
  • spark.executor.heartbeatInterval 3600s
  • spark.executor.memory 16g
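If you prefer to keep the configuration in code, the same values can be passed when building the SparkSession. This is only a sketch: the app name is a placeholder, the memory values should be sized to your machine, and spark.driver.memory in particular takes effect only if it is set before the driver JVM starts, so spark-defaults.conf or spark-submit --driver-memory remains the more reliable route.

    from pyspark.sql import SparkSession

    # Sketch: the same memory settings applied programmatically.
    # "memory-tuned-app" is a placeholder; size the values to your RAM.
    spark = (
        SparkSession.builder
        .appName("memory-tuned-app")
        .config("spark.driver.memory", "16g")
        .config("spark.executor.memory", "16g")
        .config("spark.executor.heartbeatInterval", "3600s")
        .getOrCreate()
    )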

If the error still occurs, inspect the data set you are using and drop its high-dimensional fields. When the machine's resources are not sufficient, these fields put too much load on memory.
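As an illustration, the wide columns can be dropped right after the data is loaded; a minimal sketch, assuming a Parquet input and hypothetical column names:

    # Sketch: drop the high-dimensional columns before heavy processing.
    # "data.parquet", "raw_text" and "feature_vector" are hypothetical names.
    df = spark.read.parquet("data.parquet")
    slim_df = df.drop("raw_text", "feature_vector")
    slim_df.printSchema()  # confirm the wide fields are gone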
