
I am trying to build a simple tokenizer package in Python. But when I try to use it after installing it locally in a virtual environment, it refuses to import. What is wrong with this small Python package?

pip install git+https://github.com/djokester/tokenizer 

Then

>>> import tokenizer 

Traceback (most recent call last): 
  File "<stdin>", line 1, in <module> 
ImportError: No module named 'tokenizer'

Could you please tell me what is wrong with the package? Here is the link: https://github.com/djokester/tokenizer


In setup.py I see name='Tokenizer'. Does that mean you need to do import Tokenizer instead? – Code-Apprentice


@Code-Apprentice No. NumPy can also be imported as numpy. – Djokester
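The name= argument in setup.py only sets the distribution name that pip and PyPI use; what you can actually import is determined by the modules or packages the setup script installs. A minimal sketch of such a setup.py, with purely illustrative names not taken from the repository:

# setup.py -- hypothetical single-module project
from setuptools import setup

setup(
    name='Tokenizer',          # distribution name, shown by pip as Tokenizer-0.1
    version='0.1',
    py_modules=['tokenizer'],  # ships tokenizer.py, so 'import tokenizer' works
)

With a layout like this, pip lists the project as Tokenizer while the interpreter imports it as tokenizer, mirroring the NumPy/numpy distinction mentioned in the comment above.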

Answer


The name of the module is tokenize, not tokenizer. The following works:

/ # pip install 'git+https://github.com/djokester/tokenize' --upgrade 
Collecting git+https://github.com/djokester/tokenize 
    Cloning https://github.com/djokester/tokenize to /tmp/pip-BOScTb-build 
Installing collected packages: Tokenize 
    Running setup.py install for Tokenize ... done 
Successfully installed Tokenize-0.1 
/# 
/# python 
Python 2.7.13 (default, Mar 3 2017, 23:23:44) 
[GCC 5.3.0] on linux2 
Type "help", "copyright", "credits" or "license" for more information. 
>>> 
>>> import tokenize 
>>> dir(tokenize) 
['AMPER', 'AMPEREQUAL', 'AT', 'BACKQUOTE', 'Binnumber', 'Bracket', 'CIRCUMFLEX', 'CIRCUMFLEXEQUAL', 'COLON', 'COMMA', 'COMMENT', 'Comment', 'ContStr', 'DEDENT', 'DOT', 'DOUBLESLASH', 'DOUBLESLASHEQUAL', 'DOUBLESTAR', 'DOUBLESTAREQUAL', 'Decnumber', 'Double', 'Double3', 'ENDMARKER', 'EQEQUAL', 'EQUAL', 'ERRORTOKEN', 'Expfloat', 'Exponent', 'Floatnumber', 'Funny', 'GREATER', 'GREATEREQUAL', 'Hexnumber', 'INDENT', 'ISEOF', 'ISNONTERMINAL', 'ISTERMINAL', 'Ignore', 'Imagnumber', 'Intnumber', 'LBRACE', 'LEFTSHIFT', 'LEFTSHIFTEQUAL', 'LESS', 'LESSEQUAL', 'LPAR', 'LSQB', 'MINEQUAL', 'MINUS', 'NAME', 'NEWLINE', 'NL', 'NOTEQUAL', 'NT_OFFSET', 'NUMBER', 'N_TOKENS', 'Name', 'Number', 'OP', 'Octnumber', 'Operator', 'PERCENT', 'PERCENTEQUAL', 'PLUS', 'PLUSEQUAL', 'PlainToken', 'Pointfloat', 'PseudoExtras', 'PseudoToken', 'RBRACE', 'RIGHTSHIFT', 'RIGHTSHIFTEQUAL', 'RPAR', 'RSQB', 'SEMI', 'SLASH', 'SLASHEQUAL', 'STAR', 'STAREQUAL', 'STRING', 'Single', 'Single3', 'Special', 'StopTokenizing', 'String', 'TILDE', 'Token', 'TokenError', 'Triple', 'Untokenizer', 'VBAR', 'VBAREQUAL', 'Whitespace', '__all__', '__author__', '__builtins__', '__credits__', '__doc__', '__file__', '__name__', '__package__', 'any', 'chain', 'double3prog', 'endprogs', 'generate_tokens', 'group', 'main', 'maybe', 'printtoken', 'pseudoprog', 're', 'single3prog', 'single_quoted', 'string', 't', 'tabsize', 'tok_name', 'tokenize', 'tokenize_loop', 'tokenprog', 'triple_quoted', 'untokenize'] 
>>> 
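A quick way to confirm which tokenize was actually imported is to check its __file__ attribute; a path inside the standard library means the built-in tokenize module was picked up rather than the freshly installed one (the path shown is only an example):

>>> import tokenize 
>>> tokenize.__file__ 
'/usr/lib/python2.7/tokenize.pyc'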

tokenize is already an existing package in Python. I changed the name of the package to tokenizer and it does not work. Any hints on what is wrong? – Djokester
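For reference, a layout along the following lines would normally let import tokenizer succeed after installing from Git; the file and module names here are hypothetical and not taken from the actual repository:

tokenizer/              (repository root)
    setup.py
    tokenizer/          (package directory; its name is the import name)
        __init__.py
        core.py

# setup.py
from setuptools import setup, find_packages

setup(
    name='tokenizer',            # distribution name; may differ from the import name
    version='0.1',
    packages=find_packages(),    # picks up the tokenizer/ package directory
)

After renaming, a clean reinstall helps avoid picking up the previously installed build:

pip uninstall Tokenize 
pip install --upgrade --force-reinstall git+https://github.com/djokester/tokenizer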