Spark 1.5.2 - Programmatically launching Spark in yarn-client mode


We have been using Spark 1.3.1, launching our Spark jobs in yarn-client mode programmatically by creating the SparkConf and SparkContext objects manually, inspired by the Spark self-contained application example here:

https://spark.apache.org/docs/1.5.2/quick-start.html#self-contained-applications

The only additional configuration we provide relates to the YARN executors: number of instances, cores, etc.
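A minimal sketch of what our launcher does (the app name and executor values are placeholders, not our actual settings):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object YarnClientJob {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf()
      .setAppName("MyJob")
      .setMaster("yarn-client")               // yarn-client mode, Spark 1.x master syntax
      .set("spark.executor.instances", "4")   // placeholder executor settings
      .set("spark.executor.cores", "2")
      .set("spark.executor.memory", "2g")

    // This construction worked on 1.3.1 but fails after the upgrade to 1.5.2.
    val sparkContext = new SparkContext(sparkConf)
    try {
      // ... job logic ...
    } finally {
      sparkContext.stop()
    }
  }
}
```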

However, after upgrading to Spark 1.5.2, the application breaks on the line val sparkContext = new SparkContext(sparkConf).

It throws the following in the driver application:

16/01/28 17:38:35 ERROR util.Utils: Uncaught exception in thread main
java.lang.NullPointerException
    at org.apache.spark.network.netty.NettyBlockTransferService.close(NettyBlockTransferService.scala:152)
    at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1228)
    at org.apache.spark.SparkEnv.stop(SparkEnv.scala:100)
    at org.apache.spark.SparkContext$$anonfun$stop$12.apply$mcV$sp(SparkContext.scala:1749)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1185)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1748)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:593)

Is this approach still supposed to work? Or must we use the SparkLauncher class to launch a Spark job programmatically in yarn-client mode on Spark 1.5.2?
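For reference, launching the same job through SparkLauncher (available since Spark 1.4) would look roughly like this; the jar path and main-class name are placeholders:

```scala
import org.apache.spark.launcher.SparkLauncher

object LaunchViaSparkLauncher {
  def main(args: Array[String]): Unit = {
    // SparkLauncher spawns spark-submit as a child process instead of
    // building the SparkContext inside this JVM.
    val process = new SparkLauncher()
      .setAppResource("/path/to/my-job.jar")                // placeholder jar
      .setMainClass("com.example.MyJob")                    // placeholder class
      .setMaster("yarn-client")
      .setConf(SparkLauncher.EXECUTOR_MEMORY, "2g")
      .setConf("spark.executor.instances", "4")
      .launch()

    val exitCode = process.waitFor()  // block until the job finishes
    println(s"Spark job exited with code $exitCode")
  }
}
```

The trade-off is that the job no longer shares the driver JVM, so anything the caller needs back from the job has to go through an external channel rather than a shared SparkContext.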
