Published in Data Prophet

Essential concepts in Spark

Photo by Warren Wong on Unsplash

Spark Core is the basic execution engine of the Apache Spark platform; all other Spark components are built on top of it. It provides in-memory computation to improve speed, as well as a general execution model that supports a wide variety of applications, and it exposes Java, Scala, and Python APIs for application development. Spark Core is built on a unified abstraction, the RDD (Resilient Distributed Dataset), which allows the various components of Spark to be…
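The key property of the RDD abstraction is that transformations such as `map` and `filter` are lazy: they only record the operation, and nothing executes until an action such as `collect` is called. The following is a minimal plain-Python sketch of that idea; `MiniRDD` is a hypothetical illustrative class, not the real `pyspark` API.

```python
# Illustrative stand-in for Spark's RDD abstraction (a sketch, not pyspark):
# transformations (map, filter) are lazy -- they only record the step --
# while an action (collect) triggers the whole pipeline.
class MiniRDD:
    def __init__(self, data, ops=None):
        self._data = data          # underlying source data
        self._ops = ops or []      # recorded, not-yet-executed transformations

    def map(self, fn):
        # Lazy: return a new MiniRDD that remembers the map step.
        return MiniRDD(self._data, self._ops + [("map", fn)])

    def filter(self, fn):
        # Lazy: return a new MiniRDD that remembers the filter step.
        return MiniRDD(self._data, self._ops + [("filter", fn)])

    def collect(self):
        # Action: replay all recorded transformations over the data.
        out = iter(self._data)
        for kind, fn in self._ops:
            out = map(fn, out) if kind == "map" else filter(fn, out)
        return list(out)

rdd = MiniRDD(range(10))
squares_of_evens = rdd.filter(lambda x: x % 2 == 0).map(lambda x: x * x)
# Nothing has run yet; collect() is the action that executes the pipeline.
print(squares_of_evens.collect())  # [0, 4, 16, 36, 64]
```

In real Spark the shape is the same, e.g. `sc.parallelize(range(10)).filter(...).map(...).collect()`; laziness lets Spark plan and optimize the whole chain before touching the data.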


This blog is about big data and data analysis.

Sajjad Hussain

Digital Nomad
