jdbc - Hadoop Sqoop export to Teradata Error
I'm trying to export data to Teradata using Sqoop. The entire MapReduce job completes, but the data is not loaded, and the log shows the following.
15/07/08 01:27:36 INFO processor.TeradataOutputProcessor: input postprocessor com.teradata.connector.teradata.processor.TeradataBatchInsertProcessor starts at: 1436333256770
15/07/08 01:27:36 INFO processor.TeradataBatchInsertProcessor: insert from staget table to target table
15/07/08 01:27:36 INFO processor.TeradataBatchInsertProcessor: insert select sql starts at: 1436333256969
What's wrong?
I used the following script to load the data:
sqoop export --connect jdbc:teradata://172.xx.xx.xx/database=prd_xxx_xxx \
  --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
  --username gdw_xyv \
  --password 123 \
  --export-dir /user/xxxx/xxx_xxx/2001/ \
  --table prd_xxx_xxx.table_t_hd \
  --input-fields-terminated-by '|' \
  --input-escaped-by '\\' \
  --input-enclosed-by '\"' \
  --input-optionally-enclosed-by '\"' \
  --mapreduce-job-name sq_exp_xxx_xxx_2001 \
  --verbose \
  -m 20
As Max said, the split.by.partition method being used here creates a temporary staging table. You can either force the staging table to be created in a database you have access to, or use split.by.value, split.by.hash, or split.by.amp instead (see the sketch after the note below).
Note: split.by.hash creates a partitioned staging table if the input table is not partitioned. If the input table is partitioned, no staging table is created.
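As a rough sketch, the export command from the question could be adjusted along these lines. The -D property names (teradata.db.input.split.by.method, teradata.db.output.stage.database) and the staging database name PRD_STAGE_DB are assumptions based on older Teradata connector releases, not confirmed options; verify them against the documentation for your connector version before running this.

# Sketch only: the -D property names below are assumptions and vary between
# Teradata connector / TDCH versions; check your connector docs.
# PRD_STAGE_DB is a placeholder for a database you have CREATE TABLE rights in.
# Generic -D options must come immediately after "sqoop export", before tool arguments.
sqoop export \
  -Dteradata.db.input.split.by.method=split.by.hash \
  -Dteradata.db.output.stage.database=PRD_STAGE_DB \
  --connect jdbc:teradata://172.xx.xx.xx/database=prd_xxx_xxx \
  --connection-manager org.apache.sqoop.teradata.TeradataConnManager \
  --username gdw_xyv \
  --password 123 \
  --export-dir /user/xxxx/xxx_xxx/2001/ \
  --table prd_xxx_xxx.table_t_hd \
  -m 20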