How to force close Spark Hive metastore connections to MySQL after stopping the context?


I am using Spark to query Hive, followed by transformations. My Scala app creates multiple Spark applications. A new Spark application is created only after closing the SparkSession and SparkContext of the previous one.

However, on stopping sc and spark, the connections to the Hive metastore (MySQL) are somehow not destroyed properly. For every Spark app, I can see around 5 MySQL connections being created (with the old connections still active!). Eventually, MySQL starts rejecting new connections once 150 connections are open. How can I force Spark to close the Hive metastore connections to MySQL (after spark.stop() and sc.stop())?
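A possible cleanup sketch (not from the original post): besides stopping the session, explicitly clear Spark's session references and close the thread-local Hive metastore client via `Hive.closeCurrent()`. Whether the leaked connections actually belong to that client is an assumption; verify the behavior against your Spark/Hive versions. This requires a Spark-with-Hive environment to run.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.hadoop.hive.ql.metadata.Hive

val spark = SparkSession.builder()
  .appName("metastore-cleanup-sketch")   // hypothetical app name
  .enableHiveSupport()
  .getOrCreate()

// ... queries and transformations ...

spark.stop()                        // also stops the underlying SparkContext
SparkSession.clearActiveSession()   // drop thread-local session reference
SparkSession.clearDefaultSession()  // drop the default session reference
Hive.closeCurrent()                 // close the thread-local metastore client
                                    // (assumption: the lingering MySQL
                                    // connections are held by this client)
```

If the connections are opened in executor or thrift-server JVMs rather than the driver, this driver-side cleanup will not reach them.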

Note: I am using Spark 2.1.1, and Spark's thriftserver instead of HiveServer, so I don't think I am running a standalone Hive metastore service.
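As a mitigation (my suggestion, not from the original post), the roughly 5 connections per app are consistent with the DataNucleus connection pool that backs the metastore client; its size can be capped in hive-site.xml (or passed as `--conf spark.hadoop.datanucleus.connectionPool.maxPoolSize=...`). Verify the property name against your Hive/DataNucleus version:

```xml
<!-- hive-site.xml: cap the metastore client's JDBC connection pool -->
<property>
  <name>datanucleus.connectionPool.maxPoolSize</name>
  <value>2</value> <!-- example value; tune per concurrent app count -->
</property>
```

This does not close leaked connections, but it bounds how many each Spark app can hold open against MySQL.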

