2016-04-01

I am reading a JSON file and trying to produce its contents to Kafka, but I get `kafka.common.FailedToSendMessageException`. Here is my code:

import java.io.FileNotFoundException;
import java.io.IOException;
import java.util.List;
import java.util.Properties;

import org.json.simple.parser.ParseException;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

public class FlatFileDataProducer {

    private String topic = "JsonTopic";
    private Producer<String, String> producer = null;
    KeyedMessage<String, String> message = null;
    public JsonReader reader;

    public void run(String jsonPath) throws ClassNotFoundException, FileNotFoundException, IOException, ParseException {
        reader = new JsonReader();
        System.out.println("---------------------");
        System.out.println("JSON FILE PATH IS : " + jsonPath);
        System.out.println("---------------------");

        Properties prop = new Properties();
        prop.put("metadata.broker.list", "192.168.63.145:9092");
        prop.put("serializer.class", "kafka.serializer.StringEncoder");
        // prop.put("partitioner.class", "example.producer.SimplePartitioner");
        prop.put("request.required.acks", "1");

        ProducerConfig config = new ProducerConfig(prop);
        producer = new Producer<String, String>(config);

        List<Employee> emp = reader.readJsonFile(jsonPath);
        for (Employee employee : emp) {
            System.out.println("---------------------");
            System.out.println(employee.toString());
            System.out.println("---------------------");
            message = new KeyedMessage<String, String>(topic, employee.toString());
            producer.send(message);
        }
        // Close the producer once, after the loop; closing it inside the
        // loop would break any send after the first one.
        producer.close();

        System.out.println("Messages sent to Kafka successfully");
    }
}

And the code that reads the JSON file is:

public List<Employee> readJsonFile(String path) throws FileNotFoundException, IOException, ParseException {
    Employee employee = new Employee();
    JSONParser parser = new JSONParser();
    Object obj = parser.parse(new FileReader(path));
    JSONObject jsonObject = (JSONObject) obj;
    employee.setId(Integer.parseInt(jsonObject.get("id").toString()));
    employee.setName((String) jsonObject.get("name"));
    employee.setSalary(Integer.parseInt(jsonObject.get("salary").toString()));
    List<Employee> list = new ArrayList<Employee>();
    list.add(employee);
    return list;
}
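The `Employee` class is not shown in the question. Judging from the setters used above and the `1,Smith,25` lines in the output, it is presumably a simple POJO along these lines (a hypothetical reconstruction, not the asker's actual class):

```java
// Hypothetical Employee POJO; the toString() format is inferred
// from the "1,Smith,25" lines in the console output.
class Employee {
    private int id;
    private String name;
    private int salary;

    public void setId(int id) { this.id = id; }
    public void setName(String name) { this.name = name; }
    public void setSalary(int salary) { this.salary = salary; }

    @Override
    public String toString() {
        return id + "," + name + "," + salary;
    }
}
```

The matching `customer.json` would then presumably look something like `{"id": 1, "name": "Smith", "salary": 25}`.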

But when I execute the program, Problem 1:

> [[email protected] ~]# java -jar sparkkafka.jar /root/customer.json 
> JSON FILE PATH IS : /root/customer.json 
> log4j:WARN No appenders could be found for logger (kafka.utils.VerifiableProperties). log4j:WARN Please 
> initialize the log4j system properly. 
> 1,Smith,25 
> Exception in thread "main" kafka.common.FailedToSendMessageException: Failed to send messages 
> after 3 tries. 
>   at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:91) 
>   at kafka.producer.Producer.send(Producer.scala:77) 
>   at kafka.javaapi.producer.Producer.send(Producer.scala:33) 
>   at com.up.jsonType.FlatFileDataProducer.run(FlatFileDataProducer.java:41) 
>   at com.up.jsonType.FlatFileDataProducer.main(FlatFileDataProducer.java:49) 

It reports the error, but when I check the consumer shell, I see the output below: for ONE line in the JSON file, I see 4 entries in the shell. Problem 2:

[root@sandbox bin]# ./kafka-console-consumer.sh --zookeeper localhost:2181 --topic JsonTopic --from-beginning

1,Smith,25 
1,Smith,25 
1,Smith,25 
1,Smith,25 

I am receiving the same row of data 4 times.

Answers


You need to remove both of the following properties:

//prop.put("request.required.acks", "1"); 
    //prop.put("producer.type","async"); 

These properties actually control how acknowledgements are handled.
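Following this answer, the producer configuration would shrink to just the broker list and the serializer, something like the sketch below (broker address taken from the question; the claim about defaults is my reading of the old 0.8 producer, not stated in the answer):

```java
import java.util.Properties;

class ProducerPropsSketch {
    // Reduced config as this answer suggests: "request.required.acks"
    // and "producer.type" are deliberately left out, so the old 0.8
    // producer falls back to its defaults.
    static Properties reducedProps() {
        Properties prop = new Properties();
        prop.put("metadata.broker.list", "192.168.63.145:9092");
        prop.put("serializer.class", "kafka.serializer.StringEncoder");
        return prop;
    }
}
```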

This solved the problem – Alka


Can you try adding the following property:

prop.put("producer.type","async"); 
This solved problem 1 – Alka