2016-08-19

I have seen many people with similar errors, but none of the suggested fixes works for my problem. Sqoop import fails with: Access denied for user 'root'@'localhost', even though the privileges look correct.

My command line:

sqoop import --connect jdbc:mysql://localhost/databaseY --username=root -P --table tableX --target-dir /user/ec2-user/databaseY/tableX --as-textfile --fields-terminated-by "\t" 

The error:

16/08/19 11:25:51 INFO mapreduce.Job: Job job_1471608424445_0028 running in uber mode : false 
16/08/19 11:25:51 INFO mapreduce.Job: map 0% reduce 0% 
16/08/19 11:25:58 INFO mapreduce.Job: map 25% reduce 0% 
16/08/19 11:26:04 INFO mapreduce.Job: map 50% reduce 0% 
16/08/19 11:26:06 INFO mapreduce.Job: Task Id :  attempt_1471608424445_0028_m_000000_0, Status : FAILED 
Error: java.lang.RuntimeException: java.lang.RuntimeException: java.sql.SQLException: Access denied for user 'root'@'localhost' (using password: YES) 

How is it possible that the map tasks start, and only then stop with this error?

It looks like I have all the rights I need, because both of the following work:

sqoop list-databases --connect jdbc:mysql://localhost --username root -P 

And

Logged into MySQL with the root account, I can run:

select * from databaseY.tableX 

--- EDIT ---

This command line works:

sqoop import --connect jdbc:mysql://localhost/databaseY --username root --password PASSWORD --query "select * from databaseY.tableX where number = 1474 AND \$CONDITIONS" --target-dir /tmp/ok --as-textfile --direct --split-by number 

But this one does not:

sqoop import --connect jdbc:mysql://localhost/databaseY --username root --password PASSWORD --query "select * from databaseY.tableX where 1 = 1 AND \$CONDITIONS" --target-dir /tmp/ok --as-textfile --direct --split-by number 

Then I realized that my Sqoop import works if I use -m 1. Only with -m 1.

Does that mean my cluster is badly configured? Why does my job only work with a single map task?

----- SOLUTION -----

It was just an IP address problem. I replaced localhost with the IP address, and it works fine now.
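One common explanation for this symptom: with several mappers, the map tasks are separate processes that may run on other nodes, so a JDBC URL pointing at localhost (and a grant limited to 'root'@'localhost') no longer matches the connecting client. A sketch of the fixed command, assuming 192.168.56.1 stands in for the real MySQL host IP:

```shell
# Sketch only: replace 192.168.56.1 with the actual address of the MySQL host.
sqoop import \
  --connect jdbc:mysql://192.168.56.1/databaseY \
  --username root -P \
  --table tableX \
  --target-dir /user/ec2-user/databaseY/tableX \
  --as-textfile \
  --fields-terminated-by "\t"
```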


Can you try the first command with '--username root' instead of '--username=root'? –


Yes, I tried with and without the '=' but it made no difference. That is not the issue; both forms work. – Selverine

Answers

2

That is just how Sqoop works. See the official doc:

If a table does not have a primary key defined and the --split-by is not provided, then import will fail unless the number of mappers is explicitly set to one with the --num-mappers 1 option or the --autoreset-to-one-mapper option is used. The option --autoreset-to-one-mapper is typically used with the import-all-tables tool to automatically handle tables without a primary key in a schema.
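In practice (the table, host, and directory names below are placeholders), this gives three ways to handle a table without a primary key:

```shell
# Option 1: force a single mapper, so no split column is needed.
sqoop import --connect jdbc:mysql://dbhost/databaseY --username root -P \
  --table tableX --target-dir /tmp/tableX --num-mappers 1

# Option 2: supply an explicit split column to keep the import parallel.
sqoop import --connect jdbc:mysql://dbhost/databaseY --username root -P \
  --table tableX --target-dir /tmp/tableX --split-by number

# Option 3: with import-all-tables, fall back to one mapper for keyless tables.
sqoop import-all-tables --connect jdbc:mysql://dbhost/databaseY --username root -P \
  --warehouse-dir /tmp/warehouse --autoreset-to-one-mapper
```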

0

The username and password here are the credentials of the database user you connect as; they are not an operating-system login. So specify the username and password of a user defined for the database you are trying to connect to.
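For example, rather than reusing root@localhost, a dedicated database user reachable from the cluster nodes could be created like this (user name and password below are illustrative):

```shell
# Illustrative: create a read-only user the Hadoop nodes can connect as.
mysql -u root -p -e "
  CREATE USER 'sqoop'@'%' IDENTIFIED BY 'secret';
  GRANT SELECT ON databaseY.* TO 'sqoop'@'%';
  FLUSH PRIVILEGES;"
```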

0

Late, but not least, I would like to add my part of the answer to this thread.

I faced a similar problem when using Sqoop to import table data from a Windows host into a VirtualBox guest running Ubuntu, simply because the MySQL server installed on my Windows host had no user with enough privileges granted for remote connections to the schema.

So, without dwelling on it too long, here is the problem I ran into:

[email protected]:~/Installations/sqoop-1.4.4$ bin/sqoop import --connect jdbc:mysql://192.168.56.1/india --table india_most_populated_cities --target-dir /user/vm4learning/remotedir/ --username root --password password -m 1 
no main manifest attribute, in /home/vm4learning/Installations/hbase-0.94.14/lib/coprocessor.jar 
find: paths must precede expression: sqoop-test-1.4.4.jar 
Usage: find [-H] [-L] [-P] [-Olevel] [-D help|tree|search|stat|rates|opt|exec] [path...] [expression] 
Warning: $HADOOP_HOME is deprecated. 

17/12/06 11:01:10 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead. 
17/12/06 11:01:11 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset. 
17/12/06 11:01:11 INFO tool.CodeGenTool: Beginning code generation 
17/12/06 11:01:13 ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: Access denied for user 'root'@'Administrator' (using password: YES) 
java.sql.SQLException: Access denied for user 'root'@'Administrator' (using password: YES) 

So I went to my Windows host and ran ipconfig from CMD to get the host machine's IP address (on Linux/Ubuntu you would use ifconfig):

Ethernet adapter VirtualBox Host-Only Network: 

    Connection-specific DNS Suffix . : 
    Link-local IPv6 Address . . . . . : fe80::54d4:4f16:4bdb:885%18 
    IPv4 Address. . . . . . . . . . . : 192.168.56.1 
    Subnet Mask . . . . . . . . . . . : 255.255.255.0 
    Default Gateway . . . . . . . . . : 

So my host IP address was 192.168.56.1. I then logged into MySQL on the Windows host and ran the following statements to grant the privileges:

mysql -u root -p 
Enter password: ********

mysql> GRANT ALL ON *.* TO 'root'@'192.168.56.1' IDENTIFIED BY 'password'; 
Query OK, 0 rows affected, 1 warning (0.00 sec)

mysql> flush privileges; 
Query OK, 0 rows affected (0.00 sec)

mysql> exit
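Before re-running Sqoop, it can be worth checking from the guest that the grant really allows a remote connection (host IP and credentials as above):

```shell
# From the Ubuntu guest: connect to the host's MySQL over the network, not via localhost.
mysql -h 192.168.56.1 -u root -p -e "SELECT current_user(); SHOW DATABASES;"
```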

That's it. Then I went back to the Ubuntu guest inside VirtualBox (connected over my Windows Wi-Fi network), re-ran the same sqoop command as above, and the table data I wanted was imported into the cluster.

I would like to share the output lines of my successful run:

[email protected]:~/Installations/sqoop-1.4.4$ bin/sqoop import --connect jdbc:mysql://192.168.56.1/india --table india_most_populated_cities --target-dir /user/vm4learning/remotedir/ --username root --password password -m 1 
no main manifest attribute, in /home/vm4learning/Installations/hbase-0.94.14/lib/coprocessor.jar 
find: paths must precede expression: sqoop-test-1.4.4.jar 
Usage: find [-H] [-L] [-P] [-Olevel] [-D help|tree|search|stat|rates|opt|exec] [path...] [expression] 
Warning: $HADOOP_HOME is deprecated. 

17/12/06 17:00:55 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead. 
17/12/06 17:00:55 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset. 
17/12/06 17:00:55 INFO tool.CodeGenTool: Beginning code generation 
17/12/06 17:00:58 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `india_most_populated_cities` AS t LIMIT 1 
17/12/06 17:00:58 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `india_most_populated_cities` AS t LIMIT 1 
17/12/06 17:00:58 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/vm4learning/Installations/hadoop-1.2.1 
Note: /tmp/sqoop-vm4learning/compile/fc8e526de8f7a74171941455a22f573f/india_most_populated_cities.java uses or overrides a deprecated API. 
Note: Recompile with -Xlint:deprecation for details. 
17/12/06 17:01:02 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-vm4learning/compile/fc8e526de8f7a74171941455a22f573f/india_most_populated_cities.jar 
17/12/06 17:01:02 WARN manager.MySQLManager: It looks like you are importing from mysql. 
17/12/06 17:01:02 WARN manager.MySQLManager: This transfer can be faster! Use the --direct 
17/12/06 17:01:02 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path. 
17/12/06 17:01:02 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql) 
17/12/06 17:01:02 INFO mapreduce.ImportJobBase: Beginning import of india_most_populated_cities 
17/12/06 17:01:06 INFO mapred.JobClient: Running job: job_201712061003_0002 
17/12/06 17:01:07 INFO mapred.JobClient: map 0% reduce 0% 
17/12/06 17:01:29 INFO mapred.JobClient: map 100% reduce 0% 
17/12/06 17:01:36 INFO mapred.JobClient: Job complete: job_201712061003_0002 
17/12/06 17:01:37 INFO mapred.JobClient: Counters: 18 
17/12/06 17:01:37 INFO mapred.JobClient: Job Counters 
17/12/06 17:01:37 INFO mapred.JobClient:  SLOTS_MILLIS_MAPS=24525 
17/12/06 17:01:37 INFO mapred.JobClient:  Total time spent by all reduces waiting after reserving slots (ms)=0 
17/12/06 17:01:37 INFO mapred.JobClient:  Total time spent by all maps waiting after reserving slots (ms)=0 
17/12/06 17:01:37 INFO mapred.JobClient:  Launched map tasks=1 
17/12/06 17:01:37 INFO mapred.JobClient:  SLOTS_MILLIS_REDUCES=0 
17/12/06 17:01:37 INFO mapred.JobClient: File Output Format Counters 
17/12/06 17:01:37 INFO mapred.JobClient:  Bytes Written=10406 
17/12/06 17:01:37 INFO mapred.JobClient: FileSystemCounters 
17/12/06 17:01:37 INFO mapred.JobClient:  HDFS_BYTES_READ=87 
17/12/06 17:01:37 INFO mapred.JobClient:  FILE_BYTES_WRITTEN=80423 
17/12/06 17:01:37 INFO mapred.JobClient:  HDFS_BYTES_WRITTEN=10406 
17/12/06 17:01:37 INFO mapred.JobClient: File Input Format Counters 
17/12/06 17:01:37 INFO mapred.JobClient:  Bytes Read=0 
17/12/06 17:01:37 INFO mapred.JobClient: Map-Reduce Framework 
17/12/06 17:01:37 INFO mapred.JobClient:  Map input records=271 
17/12/06 17:01:37 INFO mapred.JobClient:  Physical memory (bytes) snapshot=81043456 
17/12/06 17:01:37 INFO mapred.JobClient:  Spilled Records=0 
17/12/06 17:01:37 INFO mapred.JobClient:  CPU time spent (ms)=2530 
17/12/06 17:01:37 INFO mapred.JobClient:  Total committed heap usage (bytes)=49807360 
17/12/06 17:01:37 INFO mapred.JobClient:  Virtual memory (bytes) snapshot=973082624 
17/12/06 17:01:37 INFO mapred.JobClient:  Map output records=271 
17/12/06 17:01:37 INFO mapred.JobClient:  SPLIT_RAW_BYTES=87 
17/12/06 17:01:37 INFO mapreduce.ImportJobBase: Transferred 10.1621 KB in 33.9572 seconds (306.4444 bytes/sec) 
17/12/06 17:01:37 INFO mapreduce.ImportJobBase: Retrieved 271 records. 

Hope it helps you; I am also adding my post as an answer to this thread. Thanks.
