Spark Executor: Invalid initial heap size: -Xms0M


I have configured Spark to query a Hive table.

I run the Thrift JDBC/ODBC server using the command below:

cd $SPARK_HOME
./sbin/start-thriftserver.sh --master spark://myhost:7077 --hiveconf hive.server2.thrift.bind.host=myhost --hiveconf hive.server2.thrift.port=9999
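(For reference, a minimal way to exercise such a Thrift server from a JDBC client, assuming the Beeline client bundled with Spark and the host/port above, would be:)

# connect with the bundled Beeline client and run a test query at its prompt
./bin/beeline -u jdbc:hive2://myhost:9999
# 0: jdbc:hive2://myhost:9999> show tables;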

Then I checked the Spark worker UI: executor startup is failing with the error below. JVM initialization fails because of a wrong -Xms value:

Invalid initial heap size: -Xms0M
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.

The following configurations are changed in conf/spark-env.sh:

export SPARK_JAVA_OPTS="-Dspark.executor.memory=512m"
export SPARK_EXECUTOR_MEMORY=1g
export SPARK_DRIVER_MEMORY=512m
export SPARK_WORKER_MEMORY=2g
export SPARK_WORKER_INSTANCES=1

I have no clue where the value -Xms0M is coming from or how it has been derived. Please help me understand the issue and how to change this value.

It is working now...

The Thrift server was not picking up the executor memory from spark-env.sh, so I added it to the Thrift server startup script explicitly.

./sbin/start-thriftserver.sh

exec "$fwdir"/sbin/spark-daemon.sh spark-submit $class 1 --executor-memory 512m "$@" 

With this, the executors start with valid memory and the JDBC queries return results.

conf/spark-env.sh (these executor memory configurations are not picked up by the Thrift server):

export SPARK_JAVA_OPTS="-Dspark.executor.memory=512m"
export SPARK_EXECUTOR_MEMORY=512m
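(Another place worth trying, although I have not verified it with the Thrift server here, is conf/spark-defaults.conf, which spark-submit reads directly. A sketch, assuming the default conf directory is used:)

# conf/spark-defaults.conf
spark.executor.memory   512m
spark.driver.memory     512m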
