Jackson JSON, marshalling ByteBuffer

Is there any restriction on the minimum amount of data that must be present in a ByteBuffer for Jackson to be able to serialize it? When the buffer is small I get a BufferUnderflowException, but the same code works fine when the data is larger.
import java.nio.ByteBuffer;

import org.codehaus.jackson.map.ObjectMapper;
import org.junit.Test;

public class MyTest {

    private static class Wrapper {
        private ByteBuffer buffer;

        public void setBuffer(ByteBuffer buffer) {
            this.buffer = buffer;
        }

        public ByteBuffer getBuffer() {
            return buffer;
        }
    }

    @Test
    public void fails() throws Exception {
        // Fails: a small buffer triggers BufferUnderflowException during serialization
        ByteBuffer smallBuffer = ByteBuffer.wrap("small".getBytes());
        Wrapper wrapper1 = new Wrapper();
        wrapper1.setBuffer(smallBuffer);
        System.out.println(new ObjectMapper().writeValueAsBytes(wrapper1));
    }

    @Test
    public void works() throws Exception {
        // Works: the same code with a larger payload serializes without error
        ByteBuffer largerBuffer = ByteBuffer.wrap("larger string works, wonder why".getBytes());
        Wrapper wrapper1 = new Wrapper();
        wrapper1.setBuffer(largerBuffer);
        System.out.println(new ObjectMapper().writeValueAsBytes(wrapper1));
    }
}
Exception stack trace (this kind of failure tends to show up when serializing complex objects that have a lot of internal state and/or non-standard getter/setter methods):
org.codehaus.jackson.map.JsonMappingException: (was java.nio.BufferUnderflowException) (through reference chain: com.test.Wrapper["buffer"]->java.nio.HeapByteBuffer["int"])
at org.codehaus.jackson.map.JsonMappingException.wrapWithPath(JsonMappingException.java:218)
at org.codehaus.jackson.map.JsonMappingException.wrapWithPath(JsonMappingException.java:183)
at org.codehaus.jackson.map.ser.std.SerializerBase.wrapAndThrow(SerializerBase.java:140)
at org.codehaus.jackson.map.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:158)
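The reference chain in the trace (HeapByteBuffer["int"]) suggests that Jackson is introspecting the ByteBuffer as an ordinary bean and invoking its relative accessors (getInt(), getChar(), getLong(), ...) as if they were property getters. Each of those calls advances the buffer's position, so a small buffer runs out of bytes and throws BufferUnderflowException, while a larger one happens to survive. A minimal sketch of one possible workaround, assuming Jackson 1.x (org.codehaus.jackson) as in the trace, is a custom serializer that writes the buffer's remaining bytes as Base64 instead; the class name ByteBufferSerializer is made up for illustration:

import java.io.IOException;
import java.nio.ByteBuffer;

import org.codehaus.jackson.JsonGenerator;
import org.codehaus.jackson.map.JsonSerializer;
import org.codehaus.jackson.map.SerializerProvider;

// Illustrative custom serializer: write the buffer's remaining bytes as Base64
// instead of letting Jackson introspect HeapByteBuffer's accessor methods.
public class ByteBufferSerializer extends JsonSerializer<ByteBuffer> {
    @Override
    public void serialize(ByteBuffer value, JsonGenerator jgen, SerializerProvider provider)
            throws IOException {
        // Duplicate so the caller's buffer position is left untouched.
        ByteBuffer copy = value.duplicate();
        byte[] bytes = new byte[copy.remaining()];
        copy.get(bytes);
        jgen.writeBinary(bytes); // emits the content as a Base64-encoded JSON string
    }
}

The serializer could then be attached to the property with org.codehaus.jackson.map.annotate.JsonSerialize, for example:

@JsonSerialize(using = ByteBufferSerializer.class)
public ByteBuffer getBuffer() {
    return buffer;
}

Another option along the same lines would be to expose the data as a plain byte[] getter and mark the ByteBuffer getter with @JsonIgnore, so Jackson never touches the buffer's stateful accessors at all.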