
Strange MySQL SELECT behavior

When the variable NUMBER_OF_ITERATIONS is set to 1, everything works fine... but when I change it to any value greater than 1, I run into some problems.

First of all, in this case, when I print the value of res I get a very large number (such as 18446744073709551615).

Second, and most importantly, in this case the script cannot process the data, because the length of values is always 0...

if __name__ == '__main__':

    NUMBER_OF_ITERATIONS = 2

    conn = DBconnection()  # return a database connection

    for i in range(NUMBER_OF_ITERATIONS):
        cursor = conn.cursor()
        res = cursor.execute('''SELECT field
                                FROM table
                                WHERE other_field = 0
                                LIMIT 10 LOCK IN SHARE MODE''')
        print '# of selected rows: ' + str(res)

        values = []
        for elem in cursor.fetchall():
            if elem is not None:
                values.append(list(elem).pop())

        if len(values) != 0:
            pass  # do something...

        else:
            print 'NO VALUES AVAILABLE'
            cursor.close()
            break

    conn.close()
    print 'DONE'

I'm using the InnoDB storage engine, and while this script runs there is another Python script uploading data to the same table (using the LOAD DATA INFILE construct).

I think this may be caused by the table locks taken by LOAD DATA, but then what is the difference between one and two (or more) iterations? One iteration works fine while two (or more) do not, and I don't understand why.
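One way to check whether lock contention from the concurrent LOAD DATA INFILE is actually happening is to dump the InnoDB status while both scripts run. A minimal sketch, assuming a MySQLdb-style connection like the one above; SHOW ENGINE INNODB STATUS is a standard MySQL statement, the rest is illustrative and not part of the original script:

# Minimal sketch: inspect InnoDB lock waits while the two scripts run.
# Assumes an existing MySQLdb-style connection 'conn'.
cursor = conn.cursor()
cursor.execute('SHOW ENGINE INNODB STATUS')
_, _, status = cursor.fetchone()   # the third column holds the status text
print status                       # look for the TRANSACTIONS / lock wait sections
cursor.close()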

Answers


I cannot reproduce the problem with the code below. Could you modify it so that it demonstrates the error?

import config
import MySQLdb
import multiprocessing as mp
import random
import string
import time

def random_string(n):
    return ''.join(random.choice(string.letters) for _ in range(n))

def generate_data():
    conn=MySQLdb.connect(
        host=config.HOST,user=config.USER,
        passwd=config.PASS,db='test')
    cursor=conn.cursor()
    while True:
        with open('/tmp/test.dat','w') as f:
            for _ in range(20):
                f.write('{b}\n'.format(b=random_string(10)))
        # sql='LOCK TABLES foo WRITE'
        # cursor.execute(sql)
        sql="LOAD DATA INFILE '/tmp/test.dat' INTO TABLE test.foo"
        cursor.execute(sql)
        conn.commit()
        # sql='UNLOCK TABLES'
        # cursor.execute(sql)
        time.sleep(0.05)

def setup_innodb(connection):
    cursor=connection.cursor()
    sql='DROP TABLE IF EXISTS foo'
    cursor.execute(sql)
    sql='''\
        CREATE TABLE `foo` (
            `bar` varchar(10) NOT NULL
        ) ENGINE=InnoDB
        '''
    cursor.execute(sql)
    connection.commit()

if __name__ == '__main__':
    NUMBER_OF_ITERATIONS = 20
    conn=MySQLdb.connect(
        host=config.HOST,user=config.USER,
        passwd=config.PASS,db='test')
    setup_innodb(conn)

    # Start a process which is "simultaneously" calling LOAD DATA INFILE
    proc=mp.Process(target=generate_data)
    proc.daemon=True
    proc.start()

    for i in range(NUMBER_OF_ITERATIONS):
        cursor = conn.cursor()
        # sql='''SELECT field
        #     FROM table
        #     WHERE other_field = 0
        #     LIMIT 10 LOCK IN SHARE MODE'''
        # sql='LOCK TABLES foo READ'
        # cursor.execute(sql)
        sql='''SELECT *
               FROM foo
               LOCK IN SHARE MODE
               '''
        res = cursor.execute(sql)
        print '# of selected rows: ' + str(res)
        values = cursor.fetchall()
        # http://dev.mysql.com/doc/refman/5.0/en/innodb-locking-reads.html
        # Locks set by LOCK IN SHARE MODE and FOR UPDATE reads are released when
        # the transaction is committed or rolled back.
        conn.commit()
        time.sleep(0.1)

    conn.close()
    print 'DONE'

which produces

# of selected rows: 0 
# of selected rows: 40 
# of selected rows: 80 
# of selected rows: 120 
# of selected rows: 160 
# of selected rows: 180 
# of selected rows: 220 
# of selected rows: 260 
# of selected rows: 300 
# of selected rows: 340 
# of selected rows: 360 
# of selected rows: 400 
# of selected rows: 440 
# of selected rows: 460 
# of selected rows: 500 
# of selected rows: 540 
# of selected rows: 580 
# of selected rows: 600 
# of selected rows: 640 
# of selected rows: 680 
DONE 

I've edited the original question because I had forgotten one thing: I'm using the InnoDB storage engine, and while this script runs there is another Python script uploading data to the same table (using the LOAD DATA INFILE construct). –


I added code that runs 'LOAD DATA INFILE' in a child process, but I still cannot reproduce the error. Could you post a runnable script that demonstrates the problem? – unutbu


OK, it seems I was missing a commit after the SELECT! But now, after a few iterations, I get this error: Lock wait timeout exceeded; try restarting transaction –
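A minimal sketch of how that retry could look, assuming MySQLdb reports the lock wait timeout as an OperationalError with MySQL error code 1205; the table and column names below are placeholders rather than the real schema:

# Minimal sketch (placeholder schema): retry the SELECT when InnoDB reports
# error 1205, "Lock wait timeout exceeded; try restarting transaction".
import time
import MySQLdb

def select_with_retry(conn, max_retries=5):
    sql = '''SELECT field
             FROM some_table
             WHERE other_field = 0
             LIMIT 10 LOCK IN SHARE MODE'''
    for attempt in range(max_retries):
        cursor = conn.cursor()
        try:
            cursor.execute(sql)
            rows = cursor.fetchall()
            conn.commit()            # commit releases the LOCK IN SHARE MODE locks
            return rows
        except MySQLdb.OperationalError as e:
            if e.args[0] == 1205:    # ER_LOCK_WAIT_TIMEOUT
                conn.rollback()      # restart the transaction, then retry
                time.sleep(0.5 * (attempt + 1))
            else:
                raise
        finally:
            cursor.close()
    raise RuntimeError('gave up after %d lock wait timeouts' % max_retries)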