SCRAPY: how to store data in a MySQL database

I have a problem with Scrapy when trying to store the scraped data in a MySQL database; I get the following error: (screenshot here)

My pipelines.py code:
    import MySQLdb.cursors
    from scrapy import log
    from twisted.enterprise import adbapi


    class SQLStorePipeline(object):

        def __init__(self):
            # the first argument is the DB-API module name, not the host
            self.dbpool = adbapi.ConnectionPool('MySQLdb', host='localhost',
                db='python', user='root', passwd='',
                cursorclass=MySQLdb.cursors.DictCursor,
                charset='utf8', use_unicode=True)

        def process_item(self, item, spider):
            # run the db query in a thread pool
            query = self.dbpool.runInteraction(self._conditional_insert, item)
            query.addErrback(self.handle_error)
            return item

        def _conditional_insert(self, tx, item):
            # create the record if it doesn't exist;
            # this whole block runs in its own thread
            tx.execute("select * from test where name = %s", (item['name'],))
            result = tx.fetchone()
            if result:
                log.msg("Item already stored in db: %s" % item, level=log.DEBUG)
            else:
                tx.execute(
                    "insert into test (name, price) values (%s, %s)",
                    (item['name'], item['price'])
                )
                log.msg("Item stored in db: %s" % item, level=log.DEBUG)

        def handle_error(self, e):
            log.err(e)
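For the pipeline to run at all, it also has to be registered in the project's settings.py. A minimal sketch, assuming the project package is named myproject (the package name is an assumption, not from the question):

```python
# settings.py (sketch) -- enable the pipeline; the number (0-1000)
# sets its order relative to other pipelines
ITEM_PIPELINES = {
    'myproject.pipelines.SQLStorePipeline': 300,
}
```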
(I took it from here.)

My parse method is:
    def parse(self, response):
        item = DmozItem()
        item['name'] = response.xpath('//meta[@itemprop="name"]/@content').extract()[0]
        item['price'] = response.xpath('//meta[@itemprop="price"]/@content').extract()[0]
        yield item
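One pitfall worth noting in code like this: extract()[0] already returns a plain string, so subscripting the stored value again later (e.g. item['name'][0] in a pipeline) yields only its first character. A plain-Python illustration:

```python
# extract() returns a list of strings; extract()[0] is one string
extracted = ["Example Product"]
name = extracted[0]      # "Example Product" -- a plain string
first_char = name[0]     # indexing a string again gives one character

print(first_char)        # -> E
```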
I know this question has been asked before, but I tried all the different answers before asking here and none of them worked...

Can someone help me? Thanks in advance!
According to your screenshot, you have an indentation problem. Check your spacing. – multivac
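To illustrate the commenter's point: if a method body is indented inconsistently, Python either raises an IndentationError or silently defines the function at module level instead of inside the class, so Scrapy never calls it on the pipeline. A minimal sketch (the names here are illustrative):

```python
class SQLStorePipeline(object):
    def __init__(self):
        self.seen = set()

# Mis-indented: this def sits at module level, so it is NOT a
# method of SQLStorePipeline and will never run as one.
def process_item(self, item, spider):
    return item

print(hasattr(SQLStorePipeline, 'process_item'))  # -> False
```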