Issue Description and Expected Result
dbWriteTable() creates a table but fails to populate it. Subsequent calls to dbAppendTable() produce the same error. The same error appears in #422.
dbWriteTable(conn, 'an_test', data.frame(x = 1, y = 'a'), overwrite = FALSE, row.names = NULL)
#> Error: nanodbc/nanodbc.cpp:1655: 00000: [Simba][Hardy] (80) Syntax or semantic analysis error thrown in server while executing query. Error message from server: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.catalyst.parser.ParseException:
#> no viable alternative at input '(?'(line 2, pos 8)
#>
#> == SQL ==
#> INSERT INTO `an_test` (`x`, `y`)
#> VALUES (?, ?)
#> --------^^^
#>
#> at org.apac
#> <SQL> 'INSERT INTO `an_test` (`x`, `y`)
#> VALUES (?, ?)'
Database
Simba Spark 64-bit, Spark SQL Version: 3.1.2
Reproducible Example
library(odbc)
library(DBI)
conn <- dbConnect(
  odbc::odbc(),
  Driver = "Simba Spark 64-bit",
  ...,
  ThriftTransport = 2,
  UseNativeQuery = 1,
  SSL = 1
)
dbRemoveTable(conn, 'an_test') # case sensitive
dbWriteTable(conn, 'an_test', data.frame(x = 1, y = 'a'), overwrite = FALSE, row.names = NULL)
#> Error: nanodbc/nanodbc.cpp:1655: 00000: [Simba][Hardy] (80) Syntax or semantic analysis error thrown in server while executing query. Error message from server: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.catalyst.parser.ParseException:
#> no viable alternative at input '(?'(line 2, pos 8)
#>
#> == SQL ==
#> INSERT INTO `an_test` (`x`, `y`)
#> VALUES (?, ?)
#> --------^^^
#>
#> at org.apac
#> <SQL> 'INSERT INTO `an_test` (`x`, `y`)
#> VALUES (?, ?)'
The table exists, but with no rows
dbReadTable(conn, 'an_test')
#> [1] x y
#> <0 rows> (or 0-length row.names)
dbAppendTable() produces the same error as above. Manually inserting with dbExecute() works.
dbExecute(conn, "INSERT INTO `an_test` (`x`, `y`)
VALUES (1, 'a')")
#> [1] 0
dbReadTable(conn, 'an_test')
#> x y
#> 1 1 a
Attempting to use dbExecute() with sqlAppendTable() produces a potentially related error.
sql_append <- sqlAppendTable(conn, 'an_test', value = data.frame(x=2, y='b'))
dbExecute(conn, sql_append)
#> Error: nanodbc/nanodbc.cpp:1655: 00000: [Simba][Hardy] (80) Syntax or semantic analysis error thrown in server while executing query. Error message from server: org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: cannot resolve '`b`' given input columns: []; line 4 pos 6;
#> 'InsertIntoStatement 'UnresolvedRelation [an_test], [], false, [x, y], false, false
#> +- '
#> <SQL> 'INSERT INTO `an_test`
#> (`x`, `y`)
#> VALUES
#> (2, b)'
Manually quoting the character column (adding an extra layer of quotes inside the string) seems to work:
sql_append <- sqlAppendTable(conn, 'an_test', value = data.frame(x=2, y="'b'"))
dbExecute(conn, sql_append)
#> [1] 0
dbReadTable(conn, 'an_test')
#> x y
#> 1 2 b
#> 2 1 a
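This manual quoting can be generalized with DBI's quoting helpers. The snippet below is only a sketch, untested against this driver, and assumes (as the output above suggests) that sqlAppendTable() embeds character values verbatim on this connection:
# Pre-quote character columns with dbQuoteString() so the generated
# INSERT contains proper string literals rather than bare identifiers.
df <- data.frame(x = 3, y = "c", stringsAsFactors = FALSE)
is_chr <- vapply(df, is.character, logical(1))
df[is_chr] <- lapply(df[is_chr], function(col) as.character(dbQuoteString(conn, col)))
dbExecute(conn, sqlAppendTable(conn, 'an_test', df, row.names = FALSE))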
ahernnelson changed the title from "dbWriteTable() and dbAppendTable() produce parsing errors in Hive" to "dbWriteTable() and dbAppendTable() produce parsing errors in Spark SQL" on Feb 15, 2023.
Further digging has led me to believe that the issue is with the prepared statement on line 88 of Table.R, which is executed verbatim and whose result is never assigned.
I can't find any support for prepared statements in Spark-odbc, so I'm not sure there is much to do outside of writing my own template or attempting to write to S3, but I am hoping this is not the case.
This seems to be the case in both #500 (Snowflake) and #276 (Impala).
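For reference, the failing path is roughly the following prepared-statement pattern (a hedged reconstruction, not the exact code in Table.R):
# Roughly what dbAppendTable() attempts: send a parameterized INSERT,
# then bind the data. On this connection the `?` placeholders reach
# Spark SQL verbatim, which triggers the ParseException shown above.
stmt <- dbSendStatement(conn, "INSERT INTO `an_test` (`x`, `y`) VALUES (?, ?)")
dbBind(stmt, list(2, 'b'))
dbClearResult(stmt)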
Hi - I am not a Databricks user but thought I would ask: I am sure you have already tried this, but what happens when connecting without the UseNativeQuery option?
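That is, something along these lines (the same connection details as in the example above, just without the UseNativeQuery option):
conn <- dbConnect(
  odbc::odbc(),
  Driver = "Simba Spark 64-bit",
  ...,              # remaining connection details as in the original example
  ThriftTransport = 2,
  SSL = 1
)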
Actually, I think the problem is the use of UseNativeQuery=1: it suppresses the driver's default SQL translation, which breaks prepared query support. Unfortunately, I don't think there's anything we can do about this.