hadoop - Sqoop - Error while exporting from Hive to MySQL
I have a problem using Sqoop to export Hive bigint data to MySQL. The column is of type bigint in both MySQL and Hive. I get the following error:
Caused by: java.lang.NumberFormatException: For input string: "3465195470" ... at java.lang.Integer.parseInt(Integer.java:583)
It seems the error occurs when the string stored in HDFS is converted to a numeric type. Both the Hive and MySQL columns are bigint, so how can I solve this problem?
Here is the Sqoop command I use:
sqoop export \
  --connect "jdbc:mysql://{url}/{db}?{option}" \
  --username {username} --password {password} \
  --table {username} \
  --columns "column1,column2,column3" \
  --export-dir /apps/hive/warehouse/tmp.db/{table} \
  --update-mode allowinsert --update-key column1 \
  --input-fields-terminated-by "\001" \
  --input-null-string "\\n" --input-null-non-string "\\n" \
  --null-string "\\n" --null-non-string "\\n"
The issue is due to a missing column or a wrong column position. The stack trace shows Integer.parseInt, and 3465195470 is larger than Integer.MAX_VALUE (2147483647), which suggests the value is being parsed for a column Sqoop maps to a Java int rather than a long, i.e. the exported fields and the target columns are not lining up.
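A quick way to check for such a mismatch is to compare the column order and types on both sides and peek at the raw export files. This is only a rough sketch; the values in braces are the placeholders from the question, not real names:

# Column order and types on the Hive side
hive -e "DESCRIBE tmp.{table}"

# Column order and types on the MySQL side
mysql -h {url} -u {username} -p -e "DESCRIBE {db}.{table}"

# Look at a few raw rows; cat -v shows the \001 delimiter as ^A
hdfs dfs -cat /apps/hive/warehouse/tmp.db/{table}/* | head -n 3 | cat -v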
Also, there is no need for --null-string and --null-non-string; those options are used in Sqoop import commands.
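With that in mind, the export command would look roughly like the sketch below. This is not a verified command: the brace placeholders are copied from the question, --table is assumed to take the target table name, and --columns must list every exported field in the same order as they appear in the files under the export directory.

# --columns must match both the MySQL table and the field order in the export files
# --null-string / --null-non-string are dropped; they apply to sqoop import, not export
sqoop export \
  --connect "jdbc:mysql://{url}/{db}?{option}" \
  --username {username} --password {password} \
  --table {table} \
  --columns "column1,column2,column3" \
  --export-dir /apps/hive/warehouse/tmp.db/{table} \
  --update-mode allowinsert --update-key column1 \
  --input-fields-terminated-by "\001" \
  --input-null-string "\\n" --input-null-non-string "\\n"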