Problem with JSON and the Twitter API
So, I ran into this problem while trying to request statuses from my feed, and the error is so vague that I can't figure out what I'm doing wrong. Any help would be appreciated.
My Code:
with open("Output.txt") as input:
    lines = [line for line in input if line.strip()]
with open("Output.txt", "w") as output:
    for items in api.request('statuses/home_timeline', {'count': '200'}):
        x = ((items['text'] if 'text' in items else items).encode('ascii', 'ignore'))
        current_id = id(x)
        print(current_id)
        print(x.decode('ascii', 'ignore'))
        output.write((x.decode('ascii', 'ignore')) + '\n')
    for items in api.request('statuses/home_timeline', {'count': '200'}, {'max_id': str(current_id)}):
        x = ((items['text'] if 'text' in items else items).encode('ascii', 'ignore'))
        current_id = id(x)
        print(current_id)
        print(x.decode('ascii', 'ignore'))
        output.write((x.decode('ascii', 'ignore')) + '\n')
The traceback returned:
Traceback (most recent call last):
  File "C:/Users/Brandon/PycharmProjects/untitled/automated_timeline_collector.py", line 29, in <module>
    for items in api.request('statuses/home_timeline', {'count': '200'}, {'max_id': str(current_id)}):
  File "C:\Python27\lib\site-packages\TwitterAPI\TwitterAPI.py", line 140, in __iter__
    for item in self.get_iterator():
  File "C:\Python27\lib\site-packages\TwitterAPI\TwitterAPI.py", line 137, in get_iterator
    return RestIterator(self.response)
  File "C:\Python27\lib\site-packages\TwitterAPI\TwitterAPI.py", line 165, in __init__
    resp = response.json()
  File "C:\Python27\lib\site-packages\requests\models.py", line 756, in json
    return json.loads(self.content.decode(encoding), **kwargs)
  File "C:\Python27\lib\json\__init__.py", line 338, in loads
    return _default_decoder.decode(s)
  File "C:\Python27\lib\json\decoder.py", line 366, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "C:\Python27\lib\json\decoder.py", line 384, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded

Process finished with exit code 1
The program gets through the first 200 statuses just fine; the problem is that the current_id I'm using so the code can pick up where it left off isn't quite the right way to do it. Thanks for any help and feedback.
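For reference, two things stand out in the code above: id(x) is Python's built-in object-identity function (essentially a memory address), not the tweet's id field, and TwitterAPI's request() expects all query parameters in a single dict, so max_id belongs in the same dict as count rather than as a third argument. Below is a minimal sketch of paging backwards through the home timeline under those assumptions; the credential strings are placeholders, not working keys:

from TwitterAPI import TwitterAPI

# Placeholder credentials -- substitute your own application's keys.
api = TwitterAPI('CONSUMER_KEY', 'CONSUMER_SECRET',
                 'ACCESS_TOKEN_KEY', 'ACCESS_TOKEN_SECRET')

tweets = []
max_id = None
for _ in range(2):  # two pages of up to 200 tweets each
    params = {'count': 200}
    if max_id is not None:
        # max_id is inclusive, so step below the oldest tweet already seen.
        params['max_id'] = max_id - 1
    page = [item for item in api.request('statuses/home_timeline', params)
            if 'text' in item]
    if not page:
        break
    tweets.extend(page)
    # Page backwards using the tweet's own 'id' field, not Python's id().
    max_id = min(item['id'] for item in page)

Extending the loop retrieves older pages the same way, though the home_timeline endpoint itself only reaches back roughly 800 tweets.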
So if what you're getting back isn't JSON, then what **is** it? – 2014-09-25 03:18:36
Catch the exception and see whether it carries more detail. Hitting the occasional bad page may be normal; catch the exception and keep going. – tdelaney 2014-09-25 03:51:11
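Following up on tdelaney's suggestion, a quick way to see what actually came back: the traceback shows that the object returned by api.request() keeps the underlying requests response on .response, so you can catch the ValueError and dump the raw body. A diagnostic sketch for the failing second loop, reusing api and current_id from the code above:

r = api.request('statuses/home_timeline', {'count': '200'}, {'max_id': str(current_id)})
try:
    for item in r:
        print(item['text'] if 'text' in item else item)
except ValueError:
    # The body was not valid JSON; print whatever Twitter actually sent back.
    print(r.response.status_code)
    print(r.response.text)

Whatever status code and body appear here (an HTML error page, an empty body, a rate-limit message) should point at why the JSON decode fails.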