I want to insert data into a Hive table. My steps:
1) Create a database.
2) Create a table in that database.
3) Create a dummy table at a specific location.
4) Insert data into the main table from the dummy table.
The insert completes without any exception, but no data ends up in the table.
hive> create database final;
OK Time taken: 2.56 seconds
hive> create table final.abc (user_name string, password string)
> ROW FORMAT DELIMITED
> FIELDS TERMINATED BY ','
> LINES TERMINATED BY '\n'
> STORED AS TEXTFILE;
OK Time taken: 0.591 seconds
hive> create table foo (user string , password string)
> ROW FORMAT DELIMITED
> FIELDS TERMINATED BY ','
> LINES TERMINATED BY '\n'
> STORED AS TEXTFILE
> Location '/usr/hive/hive-0.10.0/fiels';
OK Time taken: 0.051 seconds
hive> insert into table final.abc select 'username','password' from foo;
Total MapReduce jobs = 3
Launching Job 1 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201306191046_0002, Tracking URL =/jobdetails.jsp?jobid=job_201306191046_0002
Kill Command = /usr/hadoop/hadoop-1.1.2/libexec/../bin/hadoop job -kill job_201306191046_0002
Hadoop job information for Stage-1: number of mappers: 0; number of reducers: 0
2013-06-19 12:04:36,870 Stage-1 map = 0%, reduce = 0%
2013-06-19 12:04:37,878 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201306191046_0002
Ended Job = -331805541, job is filtered out (removed at runtime).
Ended Job = -1750065493, job is filtered out (removed at runtime).
Moving data to: hdfs://localhost:9000/tmp/hive-root/hive_2013-06-19_12-04-32_830_4819535129373917658/-ext-10000
Loading data to table final.abc
Table final.abc stats: [num_partitions: 0, num_files: 0, num_rows: 0, total_size: 0, raw_data_size: 0]
MapReduce Jobs Launched:
Job 0: HDFS Read: 0 HDFS Write: 0 SUCCESS
Total MapReduce CPU Time Spent: 0 msec
OK
Time taken: 5.475 seconds
If anyone has any ideas, please advise me: where did I go wrong?
You should have at least one row of data in the dummy table's file. –
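Expanding on that comment: the dummy table `foo` points at an empty location, so the SELECT produces zero rows (note `number of mappers: 0` in the job output). The query also selects the string literals `'username'` and `'password'` rather than the table's columns. A minimal sketch of both fixes, assuming a comma-separated sample file at the hypothetical path `/tmp/users.csv`:

```sql
-- Put at least one row into the dummy table
-- (the local file path is an example, not from the original post)
LOAD DATA LOCAL INPATH '/tmp/users.csv' INTO TABLE foo;

-- Select the columns, not string literals
INSERT INTO TABLE final.abc SELECT user, password FROM foo;
```

With at least one row in `foo`, the job should launch a mapper and `final.abc` should receive data; `SELECT * FROM final.abc;` can confirm it.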