2017-07-17

Spring Cloud Data Flow - cannot convert messages from Kafka. I am using Spring Cloud Data Flow version 1.2.2 with the following configuration:

spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.type=kafka 
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.kafka.binder.brokers=<MY_BROKER> 
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.kafka.binder.zkNodes=<MY_ZK> 

I am trying to create a stream that reads from a specific topic and writes it to a log sink, as follows:

stream create --name metricsStream --definition ":metrics --spring.cloud.stream.bindings.input.binder=kafka1 --spring.cloud.stream.bindings.output.content-type='text/plain;charset=UTF-8' > bridge | log" --deploy 

Looking at the log file, I can see the following error:

2017-07-17 09:44:01,700 INFO -kafka-listener-1 log-sink:202 - [[email protected]
2017-07-17 09:44:01,700 ERROR -kafka-listener-1 o.s.c.s.b.k.KafkaMessageChannelBinder:283 - Could not convert message: 7B226D657472696354696D657374616D70223A313530303233373037302C226D65747269634E616D65223A22636577632E7265636F6E6E61697373616E63655F616E645F7363616E6E696E672E64726F70735F7065725F65787465726E616C5F736F757263655F69702E3131335F32395F3233365F313136222C224074696D657374616D70223A22323031372D30372D31365432303A33313A32352E3438325A222C22706F7274223A33363133302C226D657472696356616C7565223A302C224076657273696F6E223A2231222C22686F7374223A223137322E32362E312E313135222C226D657373616765223A22636577632E7265636F6E6E61697373616E63655F616E645F7363616E6E696E672E64726F70735F7065725F65787465726E616C5F736F757263655F69702E3131335F32395F3233365F31313620302031353030323337303730227D
java.lang.StringIndexOutOfBoundsException: String index out of range: 380
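The hex dump in that error is the raw message payload. Decoding its leading bytes (a quick command-line check using `xxd`; the excerpt below is just the first few bytes of the dump above) shows the payload is ordinary UTF-8 JSON, so the data itself arrives intact and the failure is in the conversion step:

```shell
# Decode the leading bytes of the hex payload from the error above.
# 7B226D657472696354696D657374616D70223A is hex-encoded UTF-8 text.
printf '7B226D657472696354696D657374616D70223A' | xxd -r -p
# prints: {"metricTimestamp":
```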

I also tried configuring the Kafka source with some consumer/producer properties:

stream create --name metricsStream --definition ":metrics --spring.kafka.consumer.valueDerserializer=org.apache.kafka.common.serialization.StringDeserializer --spring.cloud.stream.bindings.input.binder=kafka1 --spring.cloud.stream.bindings.output.content-type='text/plain;charset=UTF-8' --spring.cloud.stream.bindings.input.consumer.headerMode=raw --spring.cloud.stream.bindings.output.producer.headerMode=raw --outputType='text/plain;charset=UTF-8' > bridge | log" --deploy 

but I got the same result.

Here are the consumer details, as printed by Spring Data Flow:

2017-07-17 09:43:57,267 INFO main o.a.k.c.c.ConsumerConfig:180 - ConsumerConfig values: auto.commit.interval.ms = 100 auto.offset.reset = earliest bootstrap.servers = [172.26.1.63:9092] check.crcs = true client.id = consumer-2 connections.max.idle.ms = 540000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500  fetch.min.bytes 
= 1  group.id = metrics_KafkaToHdfs_5 heartbeat.interval.ms = 3000 interceptor.classes = null key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer max.partition.fetch.bytes = 1048576  max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2  metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]  receive.buffer.bytes = 65536 reconnect.backoff.ms = 50 request.timeout.ms = 305000  retry.backoff.ms = 100 sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter 
= 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.mechanism = GSSAPI  security.protocol = PLAINTEXT send.buffer.bytes = 131072 session.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = null ssl.key.password = null  ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS  ssl.protocol = TLS ssl.provider = null  ssl.secure.random.implementation = null  ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer 

I saw a similar question, but it has no working answer: what is the property to accept binary json message in spring-cloud-stream kafka binder

My Kafka metrics topic contains JSON lines. How should I configure Spring Data Flow to read the JSON-formatted topic from Kafka so the payload comes through as JSON (or at least as a string)?

Answer


Have you tried configuring the input content type?

spring.cloud.stream.bindings.input.content-type=application/json

Or perhaps with the Spring Cloud Data Flow prefix:

spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.bindings.input.content-type=application/json 
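Putting this together with the multi-binder setup from the question, the server-level common properties might look like this (a sketch only: `kafka1` and the placeholders come from the question, and whether binding properties belong at the top level or nested under the binder environment is an assumption, not verified against a running server):

```properties
# Binder definition (from the question)
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.type=kafka
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.kafka.binder.brokers=<MY_BROKER>
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.kafka.binder.zkNodes=<MY_ZK>
# Input content type applied to all stream apps (top-level variant of the suggestion above)
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.bindings.input.content-type=application/json
```

Alternatively, the same `spring.cloud.stream.bindings.input.content-type=application/json` property could be passed per stream inside the `stream create --definition` command, as with the other binding properties in the question.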

Hey @codependent, I never asked this question - somehow someone else logged into my account. If you are sure this is a good answer, let me know and I will accept it. –


I suggest you change your password... This is how I configure deserialization in my simple Spring Cloud Stream applications, so it should work. – codependent
