24 comments
Thank you, teacher Mars!
This is how it is done in Sqoop 1; does it also work in Sqoop 2?
Hello teacher Mars, I've had an error I couldn't solve for days and I'm at my wits' end.
Note: /tmp/sqoop-root/compile/7f6e950988adf217def102c75b2a5f5f/_class.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Job job_1485222347988_0010 failed with state FAILED due to: Application application_1485222347988_0010 failed 2 times due to AM Container for appattempt_1485222347988_0010_000002 exited with exitCode: -1000
Teacher, how should I solve this?
I ran a Sqoop import command with --delete-target-dir, so why do I still get the message below? Isn't delete supposed to remove the files on HDFS?
sqoop import --hive-import --connect $connect_str --verbose -m 1 --hive-database $hive_database --table $table_name --delete-target-dir --create-hive-table --hive-table $hive_table
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. AlreadyExistsException(message:Table customer already exists)
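A hedged note on the error above: --delete-target-dir only clears the HDFS staging directory, while the AlreadyExistsException is raised by the Hive metastore because --create-hive-table insists on creating a table that is already registered. A sketch of one common workaround, reusing the variables from the command above (not a verified fix for this exact environment):

```shell
# Sketch: keep --delete-target-dir for the HDFS staging dir, drop
# --create-hive-table, and let --hive-overwrite replace the table contents.
sqoop import \
  --connect "$connect_str" --verbose -m 1 \
  --table "$table_name" \
  --delete-target-dir \
  --hive-import \
  --hive-database "$hive_database" \
  --hive-table "$hive_table" \
  --hive-overwrite   # overwrites existing Hive table data instead of failing on CREATE
```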
The command hadoop dfs -rm -R /user/hive/warehouse/ods.db/customer does delete the files, but show tables still lists the customer table.
If I use drop table customer instead, the /user/hive/warehouse/ods.db/customer files are removed as well.
So -rm -R alone doesn't do the job, even though it worked in the video.
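What the two commands above observe is, plausibly, the split between Hive's metastore and HDFS: a managed table's metadata lives in the metastore while its data lives under the warehouse directory, so deleting only the HDFS directory leaves the table listed. A short sketch of the distinction:

```shell
# Removing the HDFS directory deletes the data but not the metastore entry,
# so "show tables" still lists the table afterwards:
hadoop fs -rm -r /user/hive/warehouse/ods.db/customer
# DROP TABLE removes the metastore entry and, for a managed table, the data too:
hive -e 'USE ods; DROP TABLE IF EXISTS customer;'
```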
What is the essential difference between --warehouse-dir and --target-dir in Sqoop? With --target-dir a full-table import reports that the directory already exists, while with --warehouse-dir it works correctly. sqoop import --connect jdbc:mysql://172.11.11.11:3308/test --username test --password test --table parm_type --target-dir /user/test -m 1
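On the difference, a hedged summary: --warehouse-dir names a parent directory under which Sqoop creates one subdirectory per table, while --target-dir names the exact output directory for a single table (and fails if it already exists, unless --delete-target-dir is added). A sketch reusing the command above:

```shell
# --warehouse-dir: parent directory; output lands in /user/test/parm_type
sqoop import --connect jdbc:mysql://172.11.11.11:3308/test \
  --username test --password test --table parm_type -m 1 \
  --warehouse-dir /user/test

# --target-dir: exact directory; output lands in /user/test itself
sqoop import --connect jdbc:mysql://172.11.11.11:3308/test \
  --username test --password test --table parm_type -m 1 \
  --target-dir /user/test
```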
Hello teacher, on a Mac how do I use Sqoop to connect to a SQLite database? I'm not sure what the username and password should be.
Use --target-dir together with --delete-target-dir:
sqoop import --connect jdbc:mysql://IP:3308/test --username root --password root --table tables --target-dir /user/hive/warehouse/tables --delete-target-dir
Teacher, I'm using SQL Server 2008. Listing databases works, but when listing tables I get: ERROR manager.CatalogQueryManager: Failed to list tables
com.microsoft.sqlserver.jdbc.SQLServerException: The port number 1433/bigdata is not valid. What causes this? Please help me take a look, teacher; I haven't been able to solve it. Thanks.
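The "port number 1433/bigdata is not valid" message suggests a MySQL-style URL was used: SQL Server's JDBC driver does not accept /database after the port, and instead takes the database as a ;databaseName= property. A small sketch (the host is a placeholder):

```shell
host=172.11.11.11                                          # placeholder host
bad_url="jdbc:sqlserver://${host}:1433/bigdata"            # driver parses "1433/bigdata" as the port
good_url="jdbc:sqlserver://${host}:1433;databaseName=bigdata"
echo "$good_url"
```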
I ran sqoop import --connect jdbc:mysql://master:3306/mysql --username root --password mysql --table class --check-column last_mod_fs --incremental lastmodified --last-value "2017-05-17 15:05:08.0" --merge-key class_id -m 1, and this merge-key import fails with: ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Could not load jar /tmp/sqoop-hadoop/compile/1e42b4097de2a8f922e25dad4d15264f/class.jar into JVM. (Could not find class class.)
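A hedged reading of "Could not find class class.": the table is named class, so Sqoop generates a Java class also named class, which is a reserved word and cannot be loaded. Overriding the generated class name with --class-name may avoid this (ClassRecord below is a hypothetical name, not from the original command):

```shell
# Sketch: --class-name renames the class Sqoop generates for the table,
# sidestepping the reserved word "class". ClassRecord is a hypothetical
# identifier; any valid Java class name should do.
sqoop import --connect jdbc:mysql://master:3306/mysql \
  --username root --password mysql \
  --table class \
  --class-name ClassRecord \
  --check-column last_mod_fs --incremental lastmodified \
  --last-value "2017-05-17 15:05:08.0" \
  --merge-key class_id -m 1
```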
MarsJ replied to zzwzzwcool
Is the service running normally? And does your SQL Server allow remote access?
Teacher: in my hands-on practice it gets stuck here. Could you tell me what is likely causing it?
17/11/16 22:29:42 INFO db.DBInputFormat: Using read commited transaction isolation
17/11/16 22:29:42 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`class_id`), MAX(`class_id`) FROM `sqoop_user`
17/11/16 22:29:43 INFO mapreduce.JobSubmitter: number of splits:2
17/11/16 22:29:43 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1510816836630_0032
17/11/16 22:29:43 INFO impl.YarnClientImpl: Submitted application application_1510816836630_0032
17/11/16 22:29:43 INFO mapreduce.Job: The url to track the job: http://192.168.0.102:18088/proxy/application_1510816836630_0032/
17/11/16 22:29:43 INFO mapreduce.Job: Running job: job_1510816836630_0032
Thank you, teacher. Learned a lot.
Teacher, I haven't installed a database yet. Before practicing the Sqoop import operations, how do I install the database? And how do I install the database client tool you use?
MarsJ replied to 17621238112
Linux systems ship with MySQL, or you can install it with yum install. So easy.
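A sketch of the yum route on CentOS/RHEL (package names vary by release: mariadb-server on CentOS 7, mysql-server on older releases; run as root):

```shell
yum install -y mariadb-server     # MySQL-compatible server on CentOS 7
systemctl start mariadb           # start the server
systemctl enable mariadb          # start it on boot
mysql_secure_installation         # interactively set the root password
```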
When I use Sqoop to import a MySQL table into Hive, I always get this error. I've searched a lot online but found no solution. Teacher, how can I fix this? Thanks!
Warning: /root/sqoop-1.4.7.bin__hadoop-2.6.0/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /root/sqoop-1.4.7.bin__hadoop-2.6.0/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /root/sqoop-1.4.7.bin__hadoop-2.6.0/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /root/sqoop-1.4.7.bin__hadoop-2.6.0/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
18/04/01 14:03:59 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
18/04/01 14:03:59 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/04/01 14:03:59 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
18/04/01 14:03:59 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
18/04/01 14:03:59 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/04/01 14:03:59 INFO tool.CodeGenTool: Beginning code generation
Exception in thread "main" java.lang.NoClassDefFoundError: Could not initialize class org.apache.derby.jdbc.AutoloadedDriver40
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:340)
at java.sql.DriverManager.isDriverAllowed(DriverManager.java:556)
at java.sql.DriverManager.getConnection(DriverManager.java:661)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:904)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:59)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:763)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:786)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:289)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:246)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:327)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1872)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1671)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:501)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
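One possibility worth checking for the Derby NoClassDefFoundError above (this is an assumption, not a confirmed diagnosis): Sqoop 1.4.7 and Hive can each ship their own derby-*.jar, and mismatched versions on the same classpath can prevent the driver class from initializing. A diagnostic sketch:

```shell
# Compare the Derby jars that Sqoop and Hive put on the classpath; mismatched
# versions are a commonly reported cause of this NoClassDefFoundError.
ls -l "$SQOOP_HOME"/lib/derby*.jar "$HIVE_HOME"/lib/derby*.jar 2>/dev/null
# If the versions differ, aligning them (keeping one version in both
# directories) is a commonly suggested remedy; verify in your environment.
```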
Teacher, my job just hangs at 19/03/31 16:48:06 INFO mapreduce.Job: Running job: job_1554021812146_0001 and never moves past that point. Are there logs I can check to track down the cause? Thanks.
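For a job that hangs at "Running job", the YARN application logs are usually the place to look; the application id mirrors the job id. A sketch (assumes log aggregation is enabled on the cluster):

```shell
yarn application -list                                    # check the state (RUNNING vs ACCEPTED)
yarn logs -applicationId application_1554021812146_0001   # aggregated container logs
# A job stuck in ACCEPTED typically means YARN has no NodeManager resources
# free to start the ApplicationMaster.
```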
I ran into a problem connecting to a remote MySQL server; it keeps reporting a connection failure.
19/11/05 14:42:46 ERROR manager.SqlManager: Error executing statement: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
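"Communications link failure ... 0 milliseconds ago" means the TCP connection itself never succeeded. Some hedged first checks (host, port, and user below are placeholders, not values from the comment):

```shell
# 1) Can the Sqoop host reach MySQL at all?
mysql -h 192.168.0.102 -P 3306 -u root -p -e 'SELECT 1'
# 2) Is MySQL listening only on localhost? Check bind-address in my.cnf.
# 3) Does the user have a remote-host grant? e.g. in the mysql client:
#      GRANT ALL ON test.* TO 'root'@'%' IDENTIFIED BY '<password>';
#      FLUSH PRIVILEGES;
```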
I've watched the videos; they're very thorough and a great help in my work.