swetha murali

Apache Spark — How Much Resource Does My Spark Job Require (Approximately)?
We need to consider the factors below. (Apr 10)

Apache Spark — Executors
Executors are responsible for actually executing the work that the driver assigns them. All the computation requires a certain amount of… (Dec 21, 2023)

Apache Spark — Caching an RDD
Generally we cache data so it can be accessed faster. The same concept applies here as well. (Dec 15, 2023)

Apache Spark — RDDs
Spark jobs are typically executed against RDDs (Resilient Distributed Datasets). They are the basic units which hold records of data that… (Dec 15, 2023)

Exploring Thailand
We wish to share our travel experience exploring some places in Thailand, which we thoroughly enjoyed. (Jul 17, 2019)