hadoop - How to import data into Hive warehouse from SQL Server 2014 (Unicode) for a specific schema


I want to import data from SQL Server and query it in Hive.

I created a VirtualBox VM using the Cloudera QuickStart template and started working through the tutorial.

I was able to import data from SQL Server into Avro files using Sqoop, create a table in Hive, and load the data from the Avro files. I can query it in Hive.

But Sqoop's import-all-tables command only imports the tables in the "dbo" schema. What if I want to import the tables in the "dw" schema as well? I tried using the import command to import a specific table that exists in the dw schema, but it doesn't work.

Any idea how to import non-dbo schema tables from SQL Server into Avro using Sqoop? Or how to import data from SQL Server schemas other than dbo and load it directly into Hive?

Download the JDBC driver and copy it into the Sqoop directory:

$ curl -L 'http://download.microsoft.com/download/0/2/a/02aae597-3865-456c-ae7f-613f99f850a8/sqljdbc_4.0.2206.100_enu.tar.gz' | tar xz
$ sudo cp sqljdbc_4.0/enu/sqljdbc4.jar /var/lib/sqoop/

Import the table from SQL Server using Sqoop:

sqoop import \
  --driver="com.microsoft.sqlserver.jdbc.SQLServerDriver" \
  --connect="jdbc:sqlserver://sqlserver;database=databasename" \
  --username=username --password=passwordofuserprovidedinusername \
  --table="schemaname.tablename" \
  --split-by=primarykeyoftable \
  --compression-codec=snappy \
  --as-avrodatafile \
  --warehouse-dir=/user/hive/warehouse/tablename
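One thing to be aware of for the non-dbo schema question: specifying --driver makes Sqoop fall back to its generic JDBC manager. If you omit --driver, Sqoop selects its built-in SQL Server connector from the jdbc:sqlserver:// connect string, and that connector accepts a --schema argument after the -- separator (the default is dbo). A sketch of using it with import-all-tables to pull the dw schema; server, database, and credentials are the same placeholders as above:

# built-in SQL Server connector: no --driver, schema passed after the -- separator
sqoop import-all-tables \
  --connect="jdbc:sqlserver://sqlserver;database=databasename" \
  --username=username --password=passwordofuserprovidedinusername \
  --compression-codec=snappy \
  --as-avrodatafile \
  --warehouse-dir=/user/hive/warehouse \
  -- --schema dw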

Verify that the table was imported properly:

hadoop fs -ls /user/hive/warehouse
ls -l *.avsc

Create a new directory, give it the appropriate permissions, and copy the Avro schema files into it:

sudo -u hdfs hadoop fs -mkdir /user/examples
sudo -u hdfs hadoop fs -chmod +rw /user/examples
hadoop fs -copyFromLocal ~/*.avsc /user/examples

Start Hive:

hive 

Import the table schema and data from the Sqoop output into the Hive warehouse:

CREATE EXTERNAL TABLE tablename
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
LOCATION 'hdfs:///user/hive/warehouse/tablename'
TBLPROPERTIES ('avro.schema.url'='hdfs://quickstart.cloudera/user/examples/sqoop_import_schemaname_tablename.avsc');

Note: if you copy and paste this command, make sure the single quotes are not converted into smart quotes, and that there are no spaces in the paths or filenames.
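With the external table in place, a quick query from the shell confirms the Avro data is readable; tablename is the same placeholder used above:

# sanity check: row count and a few sample rows
hive -e 'SELECT COUNT(*) FROM tablename;'
hive -e 'SELECT * FROM tablename LIMIT 10;'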

