Integer step size in scipy optimize minimize

I have a computer-vision algorithm that I want to tune using scipy.optimize.minimize. Right now I only want to tune two parameters, but the number of parameters may eventually grow, so I would like a technique that scales to high-dimensional gradient-style searches. The Nelder-Mead implementation in SciPy seemed like a good fit.

I have the code all set up, but it seems the minimize function really wants to use floating-point step sizes smaller than 1. The current parameters are both integers: one has a step size of 1, and the other has a step size of 2 (i.e. the value must be odd; if it isn't, the thing I am optimizing converts it to an odd value). Roughly speaking, one parameter is a window size in pixels, and the other is a threshold (a value between 0 and 255).

For what it's worth, I am using a fresh build of SciPy from the git repo. Does anyone know how to tell SciPy to use a specific step size for each parameter? Is there some way to roll my own gradient function? Is there a SciPy flag that could help me? I know this could be done with a simple parameter sweep, but I would eventually like to apply this code to much larger sets of parameters.
The code itself is dead simple:
import numpy as np
from scipy.optimize import minimize
from ScannerUtil import straightenImg
import bson

def doSingleIteration(parameters):
    # do some machine vision magic
    # return the difference between my value and the truth value
    ...

parameters = np.array([11, 10])
res = minimize(doSingleIteration, parameters, method='Nelder-Mead',
               options={'xtol': 1e-2, 'ftol': 1.0, 'disp': True})  # not sure if these options do anything
print("~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~")
print(res)
This is what my output looks like. As you can see, we are repeating a lot of runs and not getting anywhere with the minimization:
+++++++++++++++++++++++++++++++++++++++++
[ 11. 10.] <-- Output from scipy minimize
{'block_size': 11, 'degree': 10} <-- input to my algorithm rounded and made int
+++++++++++++++++++++++++++++++++++++++++
120 <-- output of the function I am trying to minimize
+++++++++++++++++++++++++++++++++++++++++
[ 11.55 10. ]
{'block_size': 11, 'degree': 10}
+++++++++++++++++++++++++++++++++++++++++
120
+++++++++++++++++++++++++++++++++++++++++
[ 11. 10.5]
{'block_size': 11, 'degree': 10}
+++++++++++++++++++++++++++++++++++++++++
120
+++++++++++++++++++++++++++++++++++++++++
[ 11.55 9.5 ]
{'block_size': 11, 'degree': 9}
+++++++++++++++++++++++++++++++++++++++++
120
+++++++++++++++++++++++++++++++++++++++++
[ 11.1375 10.25 ]
{'block_size': 11, 'degree': 10}
+++++++++++++++++++++++++++++++++++++++++
120
+++++++++++++++++++++++++++++++++++++++++
[ 11.275 10. ]
{'block_size': 11, 'degree': 10}
+++++++++++++++++++++++++++++++++++++++++
120
+++++++++++++++++++++++++++++++++++++++++
[ 11. 10.25]
{'block_size': 11, 'degree': 10}
+++++++++++++++++++++++++++++++++++++++++
120
+++++++++++++++++++++++++++++++++++++++++
[ 11.275 9.75 ]
{'block_size': 11, 'degree': 9}
+++++++++++++++++++++++++++++++++++++++++
120
+++++++++++++++++++++++++++++++++++++++++
~~~
SNIP
~~~
+++++++++++++++++++++++++++++++++++++++++
[ 11. 10.0078125]
{'block_size': 11, 'degree': 10}
+++++++++++++++++++++++++++++++++++++++++
120
Optimization terminated successfully.
Current function value: 120.000000
Iterations: 7
Function evaluations: 27
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
status: 0
nfev: 27
success: True
fun: 120.0
x: array([ 11., 10.])
message: 'Optimization terminated successfully.'
    nit: 7
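One common workaround (a sketch, not from the original post) is to rescale the parameter vector so that one unit of Nelder-Mead movement corresponds to the desired integer step, and snap to valid integers inside the objective. The `snap` helper, the `STEPS` scaling, and the quadratic objective below are all hypothetical stand-ins for the poster's vision code; `xatol`, `fatol`, and `initial_simplex` are the option names in recent SciPy versions.

```python
import numpy as np
from scipy.optimize import minimize

# Desired integer step per parameter: block_size moves in steps of 2
# (it must stay odd) and degree moves in steps of 1. Names mirror the
# question; the objective is a made-up stand-in for the real error.
STEPS = np.array([2.0, 1.0])

def snap(x):
    """Map the optimizer's float vector onto valid integer parameters."""
    p = np.round(np.asarray(x) * STEPS).astype(int)
    if p[0] % 2 == 0:            # force block_size to be odd
        p[0] += 1
    return p

def objective(x):
    block_size, degree = snap(x)
    # stand-in error with a minimum at block_size=15, degree=7
    return (block_size - 15) ** 2 + (degree - 7) ** 2

x0 = np.array([11, 10]) / STEPS  # scale the starting point the same way
# Unit-sized initial simplex so the first moves are whole integer steps,
# avoiding the sub-integer plateau the question ran into.
sim = np.array([x0, x0 + [1.0, 0.0], x0 + [0.0, 1.0]])
res = minimize(objective, x0, method='Nelder-Mead',
               options={'initial_simplex': sim,
                        'xatol': 0.4, 'fatol': 0.5})
print(snap(res.x), res.fun)
```

Because the snapping creates flat plateaus in the objective, loose `xatol`/`fatol` tolerances (on the order of the integer step) keep the simplex from shrinking far below the resolution the parameters can actually express.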
According to the docs, SciPy's Nelder-Mead method uses the simplex linear-programming algorithm. It relies on using non-integer points/step sizes to optimize the function. I'm not very familiar with SciPy in general, so there may be a configuration option to make it do what you want. You may also want to look at integer programming (http://en.wikipedia.org/wiki/Integer_programming), since that sounds like what you are trying to accomplish. –
@EricG Actually I think that's just a name mix-up: the "simplex" in Nelder-Mead refers to the geometric simplex shape. It has nothing to do with the simplex algorithm from linear programming, and this is nonlinear optimization. – seberg
Because of issues like this, parameter tuning for ML algorithms is usually done with a plain grid search (often on a logarithmic grid, though that seems unnecessary for your parameters). You can do a coarse grid search first to find a good region, then a finer-grained grid search within that region. – Dougal
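The coarse-then-fine grid search from the comment above can be sketched as follows, using a hypothetical `error` function in place of the real vision pipeline; the block-size range (odd values only) and threshold range (0–255) follow the question's parameter descriptions:

```python
def error(block_size, degree):
    # hypothetical stand-in for the vision pipeline's error metric,
    # with its minimum at block_size=15, degree=7
    return abs(block_size - 15) + abs(degree - 7)

# Coarse pass: a sparse grid over the full range (block_size odd only,
# threshold anywhere in 0-255).
coarse = [(b, d) for b in range(3, 52, 8) for d in range(0, 256, 32)]
b0, d0 = min(coarse, key=lambda p: error(*p))

# Fine pass: a dense grid around the coarse winner, keeping block_size odd.
fine = [(b, d)
        for b in range(max(3, b0 - 8), b0 + 9, 2)
        for d in range(max(0, d0 - 32), min(255, d0 + 32) + 1, 4)]
best = min(fine, key=lambda p: error(*p))
print(best)  # → (15, 8)
```

The fine pass only gets within its own grid spacing of the true optimum, so the spacing can be tightened in further passes; unlike Nelder-Mead, every point evaluated is a valid integer setting, and the passes are trivially parallelizable.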