
Getting UnsatisfiedLinkError on BulkLoad into an HBase table with Snappy compression

When trying to bulk load from an M/R job into a table that has Snappy compression enabled, I get the following error:

ERROR mapreduce.LoadIncrementalHFiles: Unexpected execution exception during splitting 
java.util.concurrent.ExecutionException: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z 
    at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:252) 
    at java.util.concurrent.FutureTask.get(FutureTask.java:111) 
    at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplitPhase(LoadIncrementalHFiles.java:335) 
    at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.doBulkLoad(LoadIncrementalHFiles.java:234) 
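
For context, the bulk load step that hits this error goes through LoadIncrementalHFiles.doBulkLoad, roughly as in the sketch below (the driver class name and the HFile output path are placeholders, not the actual job):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles;

    public class BulkLoadDriver {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            // Directory of HFiles written by the M/R job (placeholder path).
            Path hfileDir = new Path("/tmp/bulkload/matrix_com");
            HTable table = new HTable(conf, "matrix_com");
            LoadIncrementalHFiles loader = new LoadIncrementalHFiles(conf);
            // doBulkLoad may split HFiles on the client side so they fit the
            // current region boundaries; splitting a Snappy-compressed HFile
            // needs the native Snappy codec in this JVM, which is where the
            // UnsatisfiedLinkError above is thrown.
            loader.doBulkLoad(hfileDir, table);
            table.close();
        }
    }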

The table description is:

DESCRIPTION                                                                 ENABLED
 {NAME => 'matrix_com', FAMILIES => [{NAME => 't', BLOOMFILTER => 'NONE',   true
 REPLICATION_SCOPE => '0', COMPRESSION => 'SNAPPY', VERSIONS => '12',
 TTL => '1555200000', MIN_VERSIONS => '0', BLOCKSIZE => '65536',
 IN_MEMORY => 'false', BLOCKCACHE => 'true'}]}

Hadoop has all the Snappy codecs installed, and HBase did not complain when the table was created with Snappy compression, so why am I getting this error?
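
For reference, the native call that fails in the trace, NativeCodeLoader.buildSupportsSnappy(), can be probed directly on the client JVM with a minimal check like the sketch below (the class name is just a placeholder); it throws the same UnsatisfiedLinkError whenever the native hadoop library is not on that JVM's java.library.path:

    import org.apache.hadoop.util.NativeCodeLoader;

    public class CheckNativeSnappy {
        public static void main(String[] args) {
            // True only if libhadoop was found and loaded by this JVM.
            System.out.println("native hadoop loaded: " + NativeCodeLoader.isNativeCodeLoaded());
            // Same native call as in the stack trace; throws UnsatisfiedLinkError
            // when the native hadoop library was not loaded.
            System.out.println("snappy supported: " + NativeCodeLoader.buildSupportsSnappy());
        }
    }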

Answer
