Create SparkSession in Scala Spark

Parmanand
Nov 16, 2020

Every Spark application must have a SparkSession, which acts as the entry point for the application. It was added in Spark 2.0; before that, SparkContext was the entry point of any Spark application. It allows you to control your Spark application through a driver process called the SparkSession.

Let’s get started!

In interactive mode: the session is available as the variable spark when you start the console by typing spark-shell.
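For example, you can inspect the prebuilt session right away (the exact version string will depend on your installation):

$ spark-shell
scala> spark.version
res0: String = 3.5.0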

What does it contain?

  1. SparkContext
  2. SQLContext
  3. HiveContext
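All three are reachable from a single session; here is a minimal sketch, assuming a session named spark is already in scope (as in spark-shell):

// Accessing the underlying contexts from an existing SparkSession
val sc  = spark.sparkContext   // the SparkContext
val sql = spark.sqlContext     // the SQLContext wrapper
// Hive functionality is exposed through the same SQL interface once
// enableHiveSupport() has been called (shown later in this article).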

Let’s create a SparkSession using the SparkSession.builder method:

import org.apache.spark.sql.SparkSession

object Main {
  def main(args: Array[String]): Unit = {
    // Build a new SparkSession (or reuse an existing one)
    val sparkSession = SparkSession.builder
      .appName("TestAPP")
      .master("local[2]")
      .getOrCreate()
  }
}

Explanations:

appName - sets a name for the application, which will be shown in the Spark web UI.

master - where the program is going to run; "local[2]" runs locally with 2 cores.

getOrCreate - gets an existing SparkSession or, if there is no existing one, creates a new one based on the options set in this builder.
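Because getOrCreate reuses an active session, invoking the builder a second time in the same JVM hands back the same object; a quick sketch:

val s1 = SparkSession.builder.appName("TestAPP").master("local[2]").getOrCreate()
val s2 = SparkSession.builder.getOrCreate()
// Both references point to the same underlying session
assert(s1 eq s2)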

To enable Hive support:

val sparkSession = SparkSession.builder
  .appName("TestAPP")
  .master("local[2]")
  .enableHiveSupport()  // requires Hive classes and configuration on the classpath
  .getOrCreate()
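Once the session exists you can start working with data right away. Here is a minimal sketch continuing from the sparkSession built above; the sample rows and view name are hypothetical, just to exercise the session:

// Needed for the toDF conversion on local collections
import sparkSession.implicits._

// Hypothetical in-memory data, purely for illustration
val people = Seq(("Alice", 30), ("Bob", 25)).toDF("name", "age")
people.createOrReplaceTempView("people")

sparkSession.sql("SELECT name FROM people WHERE age > 26").show()

// Stop the session when the application is done
sparkSession.stop()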

Thanks for reading!

Please do share the article if you liked it. Any comments or suggestions are welcome.
