python - SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243)
I have seen several posts about the same error I'm receiving, but none of them has led me to a fix for my code. I have used this exact same code many times with no issue, but now I'm having problems. Here is the error I receive:
    py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
    : org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243).
Here is how I start the context within my Python script:
    spark = ps.sql.SparkSession.builder \
        .master("local[*]") \
        .appName("collab_rec") \
        .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/bgg.game_commen$
        .getOrCreate()

    sc = spark.sparkContext
    sc.setCheckpointDir('checkpoint/')
    sqlContext = SQLContext(spark)
Please let me know if you have any suggestions.
SparkSession is the new entry point in Spark 2.x. It is a replacement for SQLContext, and it uses SQLContext internally.

Everything you could do with SQLContext should be possible with SparkSession.

If you still want to use SQLContext, use the spark.sqlContext variable.
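The "only one SparkContext per JVM" constraint is also why SparkSession.builder.getOrCreate() hands back the existing session instead of constructing a second one. A minimal pure-Python sketch of that singleton pattern (the class names here are hypothetical stand-ins for illustration, not PySpark itself):

```python
class SparkContextSketch:
    """Toy stand-in for SparkContext: at most one active instance allowed."""
    _active = None

    def __init__(self, master, app_name):
        if SparkContextSketch._active is not None:
            # Mirrors: SparkException: Only one SparkContext may be running
            raise RuntimeError(
                "Only one SparkContext may be running in this JVM (see SPARK-2243)"
            )
        self.master = master
        self.app_name = app_name
        SparkContextSketch._active = self


class SessionBuilderSketch:
    """Toy stand-in for SparkSession.builder with getOrCreate()."""

    def __init__(self):
        self._conf = {}

    def master(self, m):
        self._conf["master"] = m
        return self

    def appName(self, n):
        self._conf["app_name"] = n
        return self

    def getOrCreate(self):
        # Reuse the active context instead of constructing a new one;
        # this is what prevents the SPARK-2243 error.
        if SparkContextSketch._active is not None:
            return SparkContextSketch._active
        return SparkContextSketch(
            self._conf.get("master", "local[*]"),
            self._conf.get("app_name", "app"),
        )


first = SessionBuilderSketch().master("local[*]").appName("collab_rec").getOrCreate()
second = SessionBuilderSketch().appName("other").getOrCreate()
print(first is second)  # the second builder returns the existing context
```

The practical takeaway is the same in real PySpark: build one session via getOrCreate() and derive sc and the SQL interface from it (spark.sparkContext, spark.sqlContext) rather than constructing a second context.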