Error while dumping data from Spark to MapD?


I am using a Spark cluster to read data from Azure Data Lake Store, run some queries over it with Spark SQL, and then dump the result into MapD for visualization in MapD Immerse. Reading from Azure Data Lake Store works, but while writing to MapD I get an error that I cannot resolve, and I am not sure whether it is caused by Spark or by MapD. I have already created the target table in MapD.

I am getting this error:

Exception in thread "main" java.sql.SQLException: Query failed : Syntax error at: "
at com.mapd.jdbc.MapDStatement.executeUpdate(
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.createTable(JdbcUtils.scala:692)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:89)
at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:518)

And my dataset schema is:
|-- CurrencyPair: string (nullable = true)
|-- ExchangeName: string (nullable = true)
|-- MachineTime: string (nullable = true)
|-- OrderId: string (nullable = true)
|-- OrderSide: string (nullable = true)
|-- Price: double (nullable = true)
|-- Quantity: double (nullable = true)

public class App {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("Java Spark SQL basic example")
                .master("local[2]")
                .getOrCreate();

        Dataset<Row> df = spark.read().json("C:\\Users\\test\\Downloads\\BITFINEX_DSHBTC_ORDER.json");
        df.createOrReplaceTempView("trade");

        Dataset<Row> sqlDF = spark.sql("SELECT * FROM trade");
        sqlDF.write().format("jdbc")
                .option("url", "jdbc:mapd:**.**.**.***:9091:mapd")
                .option("driver", "com.mapd.jdbc.MapDDriver")
                .option("dbtable", "expample")
                .option("user", "mapd")
                .option("password", "HyperInteractive")
                .save();
    }
}





It is reporting that the command being submitted by Spark has syntax errors.

The error info is not very helpful here, so I have added some additional error reporting and uploaded a new JDBC driver here, which should give us some more information to go on.

Please rerun with this driver and let's see what SQL command it reports.
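One more thing worth noting: the stack trace points at `JdbcUtils.createTable`, which means Spark is generating its own `CREATE TABLE` DDL and MapD's parser is rejecting it. Since you say the table already exists in MapD, a sketch like the following (assuming the Spark 2.x `DataFrameWriter` API; `AppendToMapD` is just a hypothetical class name) writes with `SaveMode.Append` so Spark skips table creation and only issues inserts:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class AppendToMapD {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("MapD append example")
                .master("local[2]")
                .getOrCreate();

        Dataset<Row> sqlDF = spark.read()
                .json("C:\\Users\\test\\Downloads\\BITFINEX_DSHBTC_ORDER.json");

        sqlDF.write()
                .format("jdbc")
                // The table already exists in MapD, so append instead of
                // letting Spark generate a CREATE TABLE statement.
                .mode(SaveMode.Append)
                .option("url", "jdbc:mapd:**.**.**.***:9091:mapd")
                .option("driver", "com.mapd.jdbc.MapDDriver")
                .option("dbtable", "expample")
                .option("user", "mapd")
                .option("password", "HyperInteractive")
                .save();

        spark.stop();
    }
}
```

If you do want Spark to create the table itself, the JDBC writer's `createTableColumnTypes` option lets you override the column types in the generated DDL, which can help when Spark's defaults don't match what the target database accepts.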