Describe the Spark execution model.

The Spark execution model is based on directed acyclic graphs (DAGs). A job consists of a sequence of lazy transformations and actions applied to RDDs, which together form a logical execution plan. When an action is triggered, Spark compiles the plan into a physical plan, splits it into stages at shuffle boundaries, and schedules the resulting tasks on worker nodes. Within a stage, narrow transformations are pipelined: each task applies the whole chain of operations to its partition without materializing intermediate results, and data is shuffled only at stage boundaries.