Unlike in the Jupyter notebooks, when you use the spark-submit command you need to prepare the SparkSession yourself and configure it so that your application runs properly.
In this section, we will learn how to create and configure the SparkSession
as well as how to use modules external to Spark.
Note
If you have not created a free account with Databricks, Microsoft, or any other Spark provider, do not worry: we will still be using your local machine, as this is the easier way to get started. However, if you decide to take your application to the cloud, it will only require changing the --master parameter when you submit the job.
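To illustrate, submitting a job locally versus to a cluster differs only in the --master value; the script name and cluster URL below are hypothetical:

```shell
# Run the application on the local machine, using all available cores
spark-submit --master "local[*]" my_app.py

# The same job against a standalone cluster or YARN: only --master changes
# spark-submit --master spark://host:7077 my_app.py
# spark-submit --master yarn my_app.py
```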