Datanucleus JPA named query returns a deleted entity

I am using DataNucleus to perform CRUD. I delete an entity and then execute a named query, yet the deleted entity is still in the result list. Why?

First, delete the entity:

MyEntity e = manager.find(MyEntity.class, id); 
manager.remove(e); 

Then, run the query:

// Declared on the MyEntity entity class:
@NamedQueries({ 
    @NamedQuery(name = MyEntity.FIND_ALL, query = "SELECT a FROM MyEntity a ORDER BY a.updated DESC") 
}) 
public static final String FIND_ALL = "MyEntity.findAll"; 

// Executing the named query:
TypedQuery<MyEntity> query = manager.createNamedQuery(FIND_ALL, MyEntity.class); 
return query.getResultList(); 

The datanucleus.Optimistic property is configured in persistence.xml:

<property name="datanucleus.Optimistic" value="true" /> 

The named query unexpectedly returns a result list that still contains the deleted entity. With datanucleus.Optimistic=false the result is correct. Why does it not work with datanucleus.Optimistic=true?
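
For reference, the end-to-end flow being tested looks roughly like this. This is only a simplified sketch: in the real test the transaction is opened by Spring's test framework (see the logs below), and the variable names here are assumptions.

// All of this runs inside one optimistic transaction.
MyEntity e = new MyEntity();
// ... populate fields ...
manager.persist(e);                                   // 1. save - the write is deferred

MyEntity found = manager.find(MyEntity.class, id);    // id = generated entityId of the entity above
manager.remove(found);                                // 2. delete - also deferred

TypedQuery<MyEntity> query = manager.createNamedQuery(MyEntity.FIND_ALL, MyEntity.class);
List<MyEntity> result = query.getResultList();        // 3. query - the removed entity is still in the list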

More details about this case:

Below are the relevant CRUD logs:

1. Log of the save operation:

DEBUG: DataNucleus.Transaction - Transaction begun for ExecutionContext [email protected] (optimistic=true) 
INFO : org.springframework.test.context.transaction.TransactionalTestExecutionListener - Began transaction (1): transaction manager [[email protected]]; rollback [true] 
DEBUG: DataNucleus.Persistence - Making object persistent : "[email protected]" 
DEBUG: DataNucleus.Cache - Object with id "com.demo.MyEntity:07cad778-d1c3-4834-ace7-ac2e4ecacc24" not found in Level 1 cache [cache size = 0] 
DEBUG: DataNucleus.Cache - Object with id "com.demo.MyEntity:07cad778-d1c3-4834-ace7-ac2e4ecacc24" not found in Level 2 cache 
DEBUG: DataNucleus.Persistence - Managing Persistence of Class : com.demo.MyEntity [Table : (none), InheritanceStrategy : superclass-table] 
DEBUG: DataNucleus.Cache - Object "[email protected]a65f" (id="com.demo.MyEntity:07cad778-d1c3-4834-ace7-ac2e4ecacc24") added to Level 1 cache (loadedFlags="[YNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNN]") 
DEBUG: DataNucleus.Lifecycle - Object "[email protected]" (id="com.demo.MyEntity:07cad778-d1c3-4834-ace7-ac2e4ecacc24") has a lifecycle change : "HOLLOW"->"P_NONTRANS" 
DEBUG: DataNucleus.Persistence - Fetching object "[email protected]" (id=07cad778-d1c3-4834-ace7-ac2e4ecacc24) fields [entityId,extensions,objectType,openSocial,published,updated,url,actor,appId,bcc,bto,cc,content,context,dc,endTime,generator,geojson,groupId,icon,inReplyTo,ld,links,location,mood,object,odata,opengraph,priority,provider,rating,result,schema_org,source,startTime,tags,target,title,to,userId,verb] 
DEBUG: DataNucleus.Datastore.Retrieve - Object "[email protected]" (id="07cad778-d1c3-4834-ace7-ac2e4ecacc24") being retrieved from HBase 
DEBUG: org.apache.hadoop.hbase.zookeeper.ZKUtil - hconnection opening connection to ZooKeeper with ensemble (master.hbase.com:2181) 

.... 
DEBUG: org.apache.hadoop.hbase.client.MetaScanner - Scanning .META. starting at row=MyEntity,,00000000000000 for max=10 rows using org.apache.h[email protected]25c7f5b0 
... 
DEBUG: DataNucleus.Cache - Object with id="com.demo.MyEntity:07cad778-d1c3-4834-ace7-ac2e4ecacc24" being removed from Level 1 cache [current cache size = 1] 
DEBUG: DataNucleus.ValueGeneration - Creating ValueGenerator instance of "org.datanucleus.store.valuegenerator.UUIDGenerator" for "uuid" 
DEBUG: DataNucleus.ValueGeneration - Reserved a block of 1 values 
DEBUG: DataNucleus.ValueGeneration - Generated value for field "com.demo.BaseEntity.entityId" using strategy="custom" (Generator="org.datanucleus.store.valuegenerator.UUIDGenerator") : value=4aa3c4a8-b450-473e-aeba-943dc6ef30ce 
DEBUG: DataNucleus.Cache - Object "[email protected]" (id="com.demo.MyEntity:4aa3c4a8-b450-473e-aeba-943dc6ef30ce") added to Level 1 cache (loadedFlags="[YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY]") 
DEBUG: DataNucleus.Transaction - Object "[email protected]" (id="4aa3c4a8-b450-473e-aeba-943dc6ef30ce") enlisted in transactional cache 
DEBUG: DataNucleus.Persistence - Object "[email protected]" has been marked for persistence but its actual persistence to the datastore will be delayed due to use of optimistic transactions or "datanucleus.flush.mode" setting 

2. Log of the delete operation:

DEBUG: DataNucleus.Cache - Object "[email protected]" (id="com.demo.MyEntity:4aa3c4a8-b450-473e-aeba-943dc6ef30ce") taken from Level 1 cache (loadedFlags="[YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY]") [cache size = 1] 
DEBUG: DataNucleus.Persistence - Deleting object from persistence : "[email protected]" 
DEBUG: DataNucleus.Lifecycle - Object "[email protected]" (id="com.demo.MyEntity:4aa3c4a8-b450-473e-aeba-943dc6ef30ce") has a lifecycle change : "P_NEW"->"P_NEW_DELETED" 

3. Log of the named query operation:

DEBUG: DataNucleus.Cache - Query Cache of type "org.datanucleus.query.cache.SoftQueryCompilationCache" initialised 
DEBUG: DataNucleus.Cache - Query Cache of type "org.datanucleus.store.query.cache.SoftQueryDatastoreCompilationCache" initialised 
DEBUG: DataNucleus.Cache - Query Cache of type "org.datanucleus.store.query.cache.SoftQueryResultsCache" initialised 
DEBUG: DataNucleus.Query - JPQL Single-String with "SELECT a FROM MyEntity a ORDER BY a.updated DESC" 
DEBUG: DataNucleus.Persistence - ExecutionContext.internalFlush() process started using optimised flush - 0 to delete, 1 to insert and 0 to update 
DEBUG: org.apache.hadoop.ipc.HBaseClient - IPC Client (47) connection to namenode.hbase.com/192.168.1.99:60020 from user1 sending #7 
DEBUG: org.apache.hadoop.ipc.HBaseClient - IPC Client (47) connection to namenode.hbase.com/192.168.1.99:60020 from user1 got value #7 
DEBUG: org.apache.hadoop.ipc.RPCEngine - Call: exists 0 
DEBUG: DataNucleus.Datastore.Persist - Object "[email protected]" being inserted into HBase with all reachable objects 
DEBUG: DataNucleus.Datastore.Native - Object "[email protected]" PUT into HBase table "MyEntity" as {"totalColumns":3,"families":{"MyEntity":[{"timestamp":9223372036854775807,"qualifier":"DTYPE","vlen":8},{"timestamp":9223372036854775807,"qualifier":"userId","vlen":5},{"timestamp":9223372036854775807,"qualifier":"entityId","vlen":36}]},"row":"4aa3c4a8-b450-473e-aeba-943dc6ef30ce"} 
DEBUG: org.apache.hadoop.ipc.HBaseClient - IPC Client (47) connection to namenode.hbase.com/192.168.1.99:60020 from user1 sending #8 
DEBUG: org.apache.hadoop.ipc.HBaseClient - IPC Client (47) connection to namenode.hbase.com/192.168.1.99:60020 from user1 got value #8 
DEBUG: org.apache.hadoop.ipc.RPCEngine - Call: multi 2 
DEBUG: DataNucleus.Datastore.Persist - Execution Time = 123 ms 
DEBUG: DataNucleus.Persistence - ExecutionContext.internalFlush() process finished 
DEBUG: DataNucleus.Query - JPQL Query : Compiling "SELECT a FROM MyEntity a ORDER BY a.updated DESC" 
DEBUG: DataNucleus.Query - JPQL Query : Compile Time = 13 ms 
DEBUG: DataNucleus.Query - QueryCompilation: 
    [from:ClassExpression(alias=a)] 
    [ordering:OrderExpression{PrimaryExpression{a.updated} descending}] 
    [symbols: a type=com.demo.MyEntity] 
DEBUG: DataNucleus.Query - JPQL Query : Compiling "SELECT a FROM MyEntity a ORDER BY a.updated DESC" for datastore 
DEBUG: DataNucleus.Query - JPQL Query : Compile Time for datastore = 2 ms 
DEBUG: DataNucleus.Query - JPQL Query : Executing "SELECT a FROM MyEntity a ORDER BY a.updated DESC" ... 
DEBUG: DataNucleus.Datastore.Native - Retrieving objects for candidate=com.demo.MyEntity and subclasses 
DEBUG: org.apache.hadoop.hbase.client.ClientScanner - Creating scanner over MyEntity starting at key '' 
DEBUG: org.apache.hadoop.hbase.client.ClientScanner - Advancing internal scanner to startKey at '' 
DEBUG: org.apache.hadoop.ipc.HBaseClient - IPC Client (47) connection to namenode.hbase.com/192.168.1.99:60020 from user1 sending #9 
DEBUG: org.apache.hadoop.ipc.HBaseClient - IPC Client (47) connection to namenode.hbase.com/192.168.1.99:60020 from user1 got value #9 
DEBUG: org.apache.hadoop.ipc.RPCEngine - Call: openScanner 1 
DEBUG: org.apache.hadoop.ipc.HBaseClient - IPC Client (47) connection to namenode.hbase.com/192.168.1.99:60020 from user1 sending #10 
DEBUG: org.apache.hadoop.ipc.HBaseClient - IPC Client (47) connection to namenode.hbase.com/192.168.1.99:60020 from user1 got value #10 
DEBUG: org.apache.hadoop.ipc.RPCEngine - Call: next 0 
DEBUG: DataNucleus.Cache - Object "[email protected]" (id="com.demo.MyEntity:4aa3c4a8-b450-473e-aeba-943dc6ef30ce") taken from Level 1 cache (loadedFlags="[YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY]") [cache size = 1] 
DEBUG: org.apache.hadoop.ipc.HBaseClient - IPC Client (47) connection to namenode.hbase.com/192.168.1.99:60020 from user1 sending #11 
DEBUG: org.apache.hadoop.ipc.HBaseClient - IPC Client (47) connection to namenode.hbase.com/192.168.1.99:60020 from user1 got value #11 
DEBUG: org.apache.hadoop.ipc.RPCEngine - Call: next 0 
DEBUG: org.apache.hadoop.ipc.HBaseClient - IPC Client (47) connection to namenode.hbase.com/192.168.1.99:60020 from user1 sending #12 
DEBUG: org.apache.hadoop.ipc.HBaseClient - IPC Client (47) connection to namenode.hbase.com/192.168.1.99:60020 from user1 got value #12 
DEBUG: org.apache.hadoop.ipc.RPCEngine - Call: close 1 
DEBUG: org.apache.hadoop.hbase.client.ClientScanner - Finished with scanning at {NAME => 'MyEntity,,1457106265917.c6437b9afd33cd225c33e0ed52ff50d4.', STARTKEY => '', ENDKEY => '', ENCODED => c6437b9afd33cd225c33e0ed52ff50d4,} 
DEBUG: DataNucleus.Query - JPQL Query : Processing the "ordering" clause using in-memory evaluation (clause = "[OrderExpression{PrimaryExpression{a.updated} descending}]") 
DEBUG: DataNucleus.Query - JPQL Query : Processing the "resultClass" clause using in-memory evaluation (clause = "com.demo.MyEntity") 
DEBUG: DataNucleus.Query - JPQL Query : Execution Time = 14 ms 

Why do the following log lines (the entity in lifecycle state "P_NEW_DELETED" being PUT into the datastore) appear during the query operation, and how can this behavior be avoided?

DEBUG: DataNucleus.Datastore.Persist - Object "[email protected]" being inserted into HBase with all reachable objects 
DEBUG: DataNucleus.Datastore.Native - Object "[email protected]" PUT into HBase table "MyEntity" as {"totalColumns":3,"families":{"MyEntity":[{"timestamp":9223372036854775807,"qualifier":"DTYPE","vlen":8},{"timestamp":9223372036854775807,"qualifier":"userId","vlen":5},{"timestamp":9223372036854775807,"qualifier":"entityId","vlen":36}]},"row":"4aa3c4a8-b450-473e-aeba-943dc6ef30ce"} 

Is there a problem? –


Creating a "NAMED" query implies that you have defined a named query in some file. Do you? And in that case, where? –


Yes, I have updated my post. Thanks! – Michael

Answer


You turned on optimistic transactions, so all data write operations happen only at commit. You performed your delete before executing the query (and did not set a flush mode for the query), so the delete was not yet in the datastore when the query was executed.

Call

em.flush() 

before executing the query, or set

query.setFlushMode(FlushModeType.AUTO);
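
Put together, a minimal sketch of this fix applied to the code from the question (either the explicit flush or the query flush mode is sufficient on its own):

import javax.persistence.FlushModeType;

// ... inside the same optimistic transaction as the remove ...
manager.remove(e);
manager.flush();                                      // pushes the pending DELETE to the datastore now

TypedQuery<MyEntity> query = manager.createNamedQuery(MyEntity.FIND_ALL, MyEntity.class);
query.setFlushMode(FlushModeType.AUTO);               // or: have the provider flush before running the query
return query.getResultList();                         // the removed entity is no longer returned

With FlushModeType.AUTO the provider must make pending in-transaction changes visible to the query, so the deferred delete reaches HBase before the scan runs.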