Query in the book:
INSERT INTO partitioned_flights
PARTITION (DEST_COUNTRY_NAME="UNITED STATES")
SELECT count, ORIGIN_COUNTRY_NAME FROM flights
WHERE DEST_COUNTRY_NAME='UNITED STATES'
LIMIT 12
In Spark 3.0, the above query fails with the following error:
AnalysisException: Cannot write incompatible data to table 'default.partitioned_flights':
Cannot safely cast 'count': string to bigint
So I modified the query as below to cast the count column to an integer:
INSERT INTO partitioned_flights
PARTITION (DEST_COUNTRY_NAME = "UNITED STATES")
SELECT ORIGIN_COUNTRY_NAME, cast(count as int) count1 FROM flights
WHERE DEST_COUNTRY_NAME = "UNITED STATES"
LIMIT 12
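Since the AnalysisException says the target column cannot safely take a string where a bigint is expected, a sketch of an alternative fix (assuming the partitioned_flights table stores count as BIGINT, as the error message suggests) would cast directly to BIGINT and keep the book's original column order:

```sql
-- Hedged sketch: assumes partitioned_flights.count is BIGINT,
-- which is what the AnalysisException implies.
INSERT INTO partitioned_flights
PARTITION (DEST_COUNTRY_NAME = "UNITED STATES")
SELECT CAST(count AS BIGINT) AS count, ORIGIN_COUNTRY_NAME FROM flights
WHERE DEST_COUNTRY_NAME = 'UNITED STATES'
LIMIT 12
```

Casting to BIGINT rather than INT avoids a second implicit widening cast at write time under Spark 3.0's stricter store-assignment checks.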
Could you please check on this?