scala - SBT cannot build with spark-hive dependency
I am trying to compile a Scala script that I will submit as a job through spark-submit. I am using sbt from the Windows command line to compile, and the directory structure is as sbt defines it.
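For context, the eventual submission would look roughly like this (a sketch only; the object name TestQuery and the jar path are assumptions derived from the project name, version, and scalaVersion in the build file shown below):

spark-submit --class TestQuery target\scala-2.11\testquery_2.11-1.0.jar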
path = c:\users\{$username}
{$path}\test
|-src
| |-main
| | |-scala
| | |-java
| | |-resources
| |-test
| | |-scala
| | |-java
| | |-resources
|-target
build.sbt
Here is the build file, build.sbt:
name := "testquery"

version := "1.0"

scalaVersion := "2.11.8"

libraryDependencies ++= {
  val sparkVer = "2.1.0"
  Seq(
    "org.apache.spark" %% "spark-core" % sparkVer % "provided" withSources(),
    "org.apache.spark" %% "spark-hive" % sparkVer % "provided"
  )
}
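One way to confirm that sbt actually picked up this declaration is to inspect the resolved dependency list from the sbt shell (show libraryDependencies is a standard sbt command; the exact output format varies by sbt version):

> show libraryDependencies

If spark-hive 2.1.0 is listed there, the declaration itself was read and the failure lies elsewhere.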
My testquery.scala file is under ./test/src/main/scala/testquery.scala.
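The question does not show the file itself, but the compiler error below points at line 2, column 29, which lines up exactly with the word hive in an import of org.apache.spark.sql.hive. A minimal hypothetical sketch of such a file (everything here is an assumption, not the asker's actual code):

import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext // line 2: column 29 is the "h" of hive

object TestQuery {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext()
    // HiveContext requires spark-hive on the compile classpath
    val hiveCtx = new HiveContext(sc)
    hiveCtx.sql("SHOW TABLES").show()
    sc.stop()
  }
}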
From the Windows cmd prompt, I switch to the ./test directory and run sbt. When I run the compile command, sbt gives the following error:
[error] ./test/src/main/scala/testquery.scala:2:29: object hive is not a member of package org.apache.spark.sql
sbt uses the Maven2 repository, and spark-hive exists there under: https://repo1.maven.org/maven2/org/apache/spark/spark-hive_2.11/1.2.0/
Also, the import works in spark-shell (which runs Spark 2.1.1 and Scala 2.11.8).
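That check looks like this (a sketch of the REPL session; the import shown is the assumed one from the sketch above, and a Scala REPL echoes an import back when it succeeds):

scala> import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.hive.HiveContext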
Why can't sbt find it?