Sqoop does not import the VARCHAR2 data type

Sqoop does not import the VARCHAR2 data type into Hadoop. I have a table in an Oracle database and I want to import its data into HDFS. I am trying to do this with Sqoop, but the VARCHAR2 columns are not imported; that is, their data does not end up in the HDFS file. My sqoop command:

sqoop import -D mapred.job.name='default oraoop'  --driver oracle.jdbc.driver.OracleDriver --connect "jdbc:oracle:thin:MyIp:MyServiceName" --username "XXXX" --password "XX" --target-dir "My_dir" --query 'select * from MyTable where $CONDITIONS' --split-by "coulmn"  --boundary-query "SELECT min(splitColumn),max(SplitCoulmn)  FROM DUAL" --num-mappers 30
2 answers

You can try a lower ojdbc version instead of a higher one: use "ojdbc14" rather than "ojdbc6" or "ojdbc7". This solved the problem for me. Also, to avoid a class-not-found exception for some encoding classes when importing data from Oracle 9i, delete or rename "orai18n.jar".

You can find the paths to these jar files in $HADOOP_CLASSPATH and $SQOOP_HOME.
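A minimal sketch of what that cleanup looks like. The jar names and directory layout below are illustrative only (the demo builds a throwaway temp directory instead of touching a real Sqoop install):

```shell
#!/bin/sh
# Simulate a Sqoop lib directory (hypothetical layout for illustration)
SQOOP_HOME=$(mktemp -d)
mkdir -p "$SQOOP_HOME/lib"
touch "$SQOOP_HOME/lib/orai18n.jar" "$SQOOP_HOME/lib/ojdbc14.jar"

# Locate the globalization jar and rename it so it is no longer on the classpath
jar=$(find "$SQOOP_HOME/lib" -name 'orai18n*.jar')
mv "$jar" "$jar.disabled"

# The driver jar stays; only orai18n.jar is disabled
ls "$SQOOP_HOME/lib"
```

Renaming (rather than deleting) keeps a way back if some other job turns out to depend on the jar.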


Maybe Sqoop was not able to identify the corresponding Java type for VARCHAR2, so try --map-column-java.

Let's say column A is of type VARCHAR2; then your sqoop command will be:

sqoop import -D mapred.job.name='default oraoop' --driver oracle.jdbc.driver.OracleDriver --connect "jdbc:oracle:thin:MyIp:MyServiceName" --username "XXXX" --password "XX" --target-dir "My_dir" --query 'select * from MyTable where $CONDITIONS' --map-column-java A=String --split-by "coulmn" --boundary-query "SELECT min(splitColumn),max(SplitCoulmn) FROM DUAL" --num-mappers 30
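If more than one VARCHAR2 column comes through empty, --map-column-java accepts a comma-separated list of column=type pairs. The sketch below shows only the shape of the option; the column names A and B are placeholders, and the connection details are the same stand-ins used above, so this is not a tested command:

```shell
sqoop import \
  --driver oracle.jdbc.driver.OracleDriver \
  --connect "jdbc:oracle:thin:MyIp:MyServiceName" \
  --username "XXXX" --password "XX" \
  --target-dir "My_dir" \
  --query 'select * from MyTable where $CONDITIONS' \
  --map-column-java A=String,B=String \
  --split-by "coulmn" \
  --num-mappers 30
```

Note that the column names given to --map-column-java must match what the JDBC driver reports, which for Oracle is usually the upper-case form.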



Source: https://habr.com/ru/post/1660266/

