If neither sbt nor Maven suits your needs, you may decide to use another build system. Thankfully, Spark supports building a fat JAR file that bundles all of Spark's dependencies, which makes it easy to include in the build system of your choice. Simply run sbt/sbt assembly in the Spark directory, copy the resulting assembly JAR from core/target/spark-core-assembly-0.7.0.jar into your build's dependencies, and you are good to go.
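Once the assembly JAR is on your classpath, any build system can compile Spark jobs against it with no further dependency management. As a minimal sketch (the class name and job logic here are illustrative, not from the original text), a job like the following compiles with nothing more than scalac -classpath spark-core-assembly-0.7.0.jar SimpleJob.scala:

import spark.SparkContext

object SimpleJob {
  def main(args: Array[String]) {
    // Run locally; with the assembly JAR on the classpath, no
    // build-tool-managed dependencies are needed at all.
    val sc = new SparkContext("local", "Simple Job")
    val evens = sc.parallelize(1 to 1000).filter(_ % 2 == 0).count()
    println("Even numbers: " + evens)
    sc.stop()
  }
}

Note that in the 0.7.x series the Spark classes live in the spark package rather than under org.apache.spark as in later releases.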
Tip
No matter what your build system is, you may find yourself wanting to use a patched version of the Spark libraries. In that case, you can deploy your patched Spark library locally. I recommend giving it a different version number so that sbt or Maven picks up the modified version rather than the stock release. You can change the version by editing project/SparkBuild.scala and updating the version := setting, as sketched below. If you are using sbt, run sbt update in the project that imports the custom version so that the new artifact is resolved. For other build systems, you just need to make sure that the new assembly JAR is the one used in your build.
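As a rough sketch of what that edit looks like, here is the relevant fragment of project/SparkBuild.scala with the version bumped. The surrounding settings are elided, the version string 0.7.0-patched is my own placeholder, and the exact contents of the file in your checkout may differ:

def sharedSettings = Defaults.defaultSettings ++ Seq(
  organization := "org.spark-project",
  // A distinct version string so that sbt or Maven resolves your patched
  // build instead of the stock 0.7.0 release.
  version := "0.7.0-patched",
  scalaVersion := "2.9.2"
)

After you rebuild and publish the patched library locally, any project that declares a dependency on 0.7.0-patched will pick up your modified JARs instead of the official ones.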