Hi, thanks for this post.
Larissa Leite

Hi Larissa,

You can indeed run the tasks in parallel by using Airflow's LocalExecutor and setting the parallelism to however many tasks you would like to run simultaneously. Both are settings in airflow.cfg: you would need to change the "executor" and "parallelism" options.
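
As a minimal sketch, the relevant airflow.cfg entries would look something like this (the value 4 is just an example; pick whatever concurrency you need):

```
[core]
# run tasks as local subprocesses instead of one at a time
executor = LocalExecutor
# maximum number of task instances allowed to run concurrently
parallelism = 4
```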

This should not be a problem for Spark, but you do have to be careful with the configuration of your Spark jobs. You would, for example, need to make sure the resources you request when submitting the jobs (number of executors, cores per executor, driver memory, etc.) are small enough that several tasks can run on the cluster at the same time without starving each other.
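
Here is a rough sketch of what that could look like with Airflow's SparkSubmitOperator, with explicit resource requests per task. The DAG name, application path, and the concrete numbers are placeholders, and the import path assumes Airflow 2.x with the Spark provider installed (in Airflow 1.x the operator lives under airflow.contrib):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    "parallel_spark_jobs",           # placeholder DAG name
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
) as dag:
    # Four tasks that may run simultaneously under LocalExecutor.
    for i in range(4):
        SparkSubmitOperator(
            task_id=f"spark_job_{i}",
            application="/path/to/job.py",  # placeholder application path
            conn_id="spark_default",        # assumes a configured Spark connection
            # Keep per-job requests small enough that four concurrent jobs
            # fit within the cluster's total cores and memory.
            num_executors=2,
            executor_cores=2,
            executor_memory="2g",
            driver_memory="1g",
        )
```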
