ClassNotFoundException for JDBC driver


#1

I am getting a ClassNotFoundException for the JDBC driver. In spite of copying the JDBC driver jar into the project folder, I still get the error. Please find the Spark code I am using, along with the output from the spark-shell:

val spark = SparkSession.builder().appName("XXX").enableHiveSupport().getOrCreate()
val tableRDD = spark.sql("select * from ")
val driver = "com.mapd.jdbc.MapDDriver"
val url = "jdbc:mapd:**::test"
val username = "XXX"
val password = "XXXXXX"
Class.forName(driver)
tableRDD.write.format("jdbc").option("url","url").option("driver","driver").option("dbtable","YYY").option("user","username").option("password","password").save()

java.lang.ClassNotFoundException: driver
at scala.reflect.internal.util.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:62)


#2

You can check this thread

I guess Spark cannot find the jar on the classpath.


#3

Thank you for reply.

I tried running the Spark code in three ways (shown below) but am still getting the same error.

./bin/spark-submit --class Hive2Mapd hive_2.11-2.0.jar --conf "spark.driver.extraClassPath=XXX/mapdjdbc-1.0-SNAPSHOT-jar-with-dependencies.jar"

./bin/spark-submit --jars XXX/mapdjdbc-1.0-SNAPSHOT-jar-with-dependencies.jar
--driver-class-path XXX/mapdjdbc-1.0-SNAPSHOT-jar-with-dependencies.jar
--conf spark.executor.extraClassPath=XXX/mapdjdbc-1.0-SNAPSHOT-jar-with-dependencies.jar
--class Hive2Mapd XXX/MapD/aster2hive_2.11-2.0.jar

./bin/spark-submit
--driver-class-path XXX/mapdjdbc-1.0-SNAPSHOT-jar-with-dependencies.jar
--conf spark.executor.extraClassPath=XXX/mapdjdbc-1.0-SNAPSHOT-jar-with-dependencies.jar
--class XXX/aster2hive_2.11-2.0.jar

Please advise.


#4

Hi,

Looking at your error message

java.lang.ClassNotFoundException: driver
at scala.reflect.internal.util.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:62)

That makes me think there is something amiss in the code: it is looking for a driver class literally named driver.

if you look at your call

tableRDD.write.format("jdbc").option("url","url").option("driver","driver").option("dbtable","YYY").option("user","username").option("password","password").save()

reformatted to make it readable:

tableRDD.write.format("jdbc").
    option("url","url").
    option("driver","driver").
    option("dbtable","YYY").
    option("user","username").
    option("password","password").
    save()

you will note you are passing the literal string "driver" to the driver option, where I suspect you meant to pass the value of the variable driver.

I assume you have obfuscated the url etc., but if you haven't, you will need to populate real values in all of these fields to get the connection to work.

A working set of options should look something more like

    option("url", "jdbc:mapd:localhost:9091:mapd").
    option("driver", "com.mapd.jdbc.MapDDriver").
    option("dbtable", "people").
    option("user", "mapd").
    option("password", "HyperInteractive").
    save();
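To illustrate the difference, here is a minimal plain-Scala sketch (no Spark required) of why loading the literal string "driver" fails exactly as in the stack trace, while a real class name on the classpath loads fine:

```scala
// Passing the literal word "driver" to Class.forName fails, because
// there is no class named "driver" on the classpath:
val loaded =
  try { Class.forName("driver"); "loaded" }
  catch { case _: ClassNotFoundException => "ClassNotFoundException: driver" }
println(loaded)  // ClassNotFoundException: driver

// Loading a class that really exists on the classpath succeeds:
val ok = Class.forName("java.lang.String")
println(ok.getName)  // java.lang.String
```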

regards


#5

Actually, I have defined variables with those values at the beginning of the code.

val driver = "com.mapd.jdbc.MapDDriver"
val url = "jdbc:mapd:**::test"
val username = "XXX"
val password = "XXXXXX"

And then passed those variables here

tableRDD.write.format("jdbc").
option("url","url").
option("driver","driver").
option("dbtable","YYY").
option("user","username").
option("password","password").
save()

So it is similar to what you stated, just written a different way, but I am still getting the error… :frowning:


#6

Hi,

When you call the option methods, you are calling them with string literals, not the variables you have set.

You will need to remove the " from the second parameter of each option call to get it to work.
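A small sketch of the distinction, using a plain Map in place of the DataFrameWriter options (the variable name driver is taken from the original post):

```scala
val driver = "com.mapd.jdbc.MapDDriver"

// With quotes, the option's value is the six-letter word "driver":
val wrong = Map("driver" -> "driver")
// Without quotes, the option's value is the contents of the variable:
val right = Map("driver" -> driver)

println(wrong("driver"))  // driver
println(right("driver"))  // com.mapd.jdbc.MapDDriver
```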

regards


#7

I tried both ways, i.e. calling with the variables as well as commenting out the variables and passing real values directly into the .option method, but it fails with the same error.
Just FYI, the same driver jar works fine from Java.


#8

Hi

Can you share the new code and the exact error message you see now?

Michael.


#9

Here is my code:

object Hive2Mapd {
  def main(args: Array[String]) {
    val conf = new SparkConf()
    //val sc = new SparkContext(conf)
    val spark = SparkSession.builder().appName("Hive2Mapd").enableHiveSupport().getOrCreate()
    val tableRDD = spark.sql("select * from default.xxx")
    val driver = "com.mapd.jdbc.MapDDriver"
    val url = "dbc:mapd:xxxxx:x:xx"
    val username = "xxx"
    val password = "xxxxx"
    Class.forName(driver)
    tableRDD.write.format("jdbc").option("url",url).option("driver",driver).option("dbtable","fannie_mae_mapped_a").option("user",username).option("password",password).save()
    //sc.stop()
  }
}

Spark submit command: (Tried two different commands)

/bin/spark-submit --jars */mapdjdbc-1.0-SNAPSHOT-jar-with-dependencies.jar
--driver-class-path /mapdjdbc-1.0-SNAPSHOT-jar-with-dependencies.jar
--conf spark.executor.extraClassPath=/mapdjdbc-1.0-SNAPSHOT-jar-with-dependencies.jar
--class Hive2Mapd /-2.0.jar

/bin/spark-submit
--driver-class-path /mapdjdbc-1.0-SNAPSHOT-jar-with-dependencies.jar
--conf spark.executor.extraClassPath=/mapdjdbc-1.0-SNAPSHOT-jar-with-dependencies.jar
--class Hive2Mapd /-2.0.jar

Here is output on console:

Exception in thread “main” java.lang.ClassNotFoundException: com.mapd.jdbc.MapDDriver
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at Hive2Mapd$.main(Hive2MapD.scala:24)
at Hive2Mapd.main(Hive2MapD.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
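Since the Class.forName call fails on the driver side (before any executor runs), one quick sanity check is to print the JVM's own classpath at startup and confirm the MapD jar is actually on it; a plain-JVM sketch:

```scala
// Print every entry on the JVM classpath, one per line; the MapD driver
// jar should appear here if --driver-class-path took effect.
val cp = System.getProperty("java.class.path")
val entries = cp.split(java.io.File.pathSeparator)
entries.foreach(println)
```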


#10

Hi,

it is now looking for the correct driver:

Exception in thread “main” java.lang.ClassNotFoundException: com.mapd.jdbc.MapDDriver

is the path to the jar actually the root directory?

--driver-class-path /mapdjdbc-1.0-SNAPSHOT-jar-with-dependencies.jar

Does the spark user have privileges and access to the / directory?
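One way to check that, sketched in plain Scala (the path below is the placeholder from the post, so on most machines this prints false):

```scala
import java.io.File

// Returns true only if the path exists, is a regular file, and is
// readable by the current user -- the minimum spark-submit needs.
def jarReadable(path: String): Boolean = {
  val f = new File(path)
  f.isFile && f.canRead
}

println(jarReadable("/mapdjdbc-1.0-SNAPSHOT-jar-with-dependencies.jar"))
```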

regards


#11

Hi,

It's not in the root; I removed the full path before sharing the details with you. Also, I checked, and Spark should be able to access the jar in that directory.


#12

Have you tried this with some other database and its driver jar in the same path as the MapD driver, and does it run into the same ClassNotFoundException?