2017-05-27

sklearn StandardScaler results differ from manual results

I used sklearn's StandardScaler (mean removal and variance scaling) to scale a DataFrame, and compared it with a DataFrame that I standardized "manually" by subtracting the mean and dividing by the standard deviation. The comparison shows consistent small differences. Can someone explain why? (The dataset I used is this one: http://archive.ics.uci.edu/ml/datasets/Wine)

import pandas as pd 
from sklearn.preprocessing import StandardScaler 

df = pd.read_csv("~/DataSets/WineDataSetItaly/wine.data.txt", names=["Class", "Alcohol", "Malic acid", "Ash", "Alcalinity of ash", "Magnesium", "Total phenols", "Flavanoids", "Nonflavanoid phenols", "Proanthocyanins", "Color intensity", "Hue", "OD280/OD315 of diluted wines", "Proline"]) 

cols = list(df.columns)[1:] # I didn't want to scale the "Class" column 
std_scal = StandardScaler() 
standardized = std_scal.fit_transform(df[cols]) 
df_standardized_fit = pd.DataFrame(standardized, index=df.index, columns=df.columns[1:]) 

df_standardized_manual = (df - df.mean())/df.std() 
df_standardized_manual.drop("Class", axis=1, inplace=True) 

df_differences = df_standardized_fit - df_standardized_manual 
df_differences.iloc[:,:5] 


    Alcohol  Malic acid       Ash  Alcalinity of ash  Magnesium 
0  0.004272   -0.001582  0.000653          -0.003290   0.005384 
1  0.000693   -0.001405 -0.002329          -0.007007   0.000051 
2  0.000554    0.000060  0.003120          -0.000756   0.000249 
3  0.004758   -0.000976  0.001373          -0.002276   0.002619 
4  0.000832    0.000640  0.005177           0.001271   0.003606 
5  0.004168   -0.001455  0.000858          -0.003628   0.002421 

Answer

scikit-learn uses np.std, which defaults to the population standard deviation (where the sum of squared deviations is divided by the number of observations), while pandas uses the sample standard deviation (where the denominator is the number of observations minus 1; see Wikipedia's standard deviation article). Dividing by N-1 is a correction factor that gives an unbiased estimate of the population variance, and it is controlled by the degrees of freedom (ddof). So by default numpy and scikit-learn compute with ddof=0, while pandas uses ddof=1 (docs):

DataFrame.std(axis=None, skipna=None, level=None, ddof=1, numeric_only=None, **kwargs)

Return sample standard deviation over requested axis.

Normalized by N-1 by default. This can be changed using the ddof argument.
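The ddof difference is easy to see on a small hand-picked sample (the numbers below are purely illustrative, not from the wine dataset):

```python
import numpy as np
import pandas as pd

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
s = pd.Series(data)

# numpy (and therefore scikit-learn) defaults to ddof=0: divide by N
print(np.std(data))              # 2.0

# pandas defaults to ddof=1: divide by N - 1
print(s.std())                   # ~2.138

# once the same ddof is used, the two agree
print(np.isclose(np.std(data, ddof=1), s.std()))  # True
```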

If you change your pandas line to:

df_standardized_manual = (df - df.mean())/df.std(ddof=0) 

the differences will be almost zero:

        Alcohol    Malic acid           Ash  Alcalinity of ash     Magnesium 
0 -8.215650e-15 -5.551115e-16  3.191891e-15       0.000000e+00  2.220446e-16 
1 -8.715251e-15 -4.996004e-16  3.441691e-15       0.000000e+00  0.000000e+00 
2 -8.715251e-15 -3.955170e-16  2.886580e-15      -5.551115e-17  1.387779e-17 
3 -8.437695e-15 -4.440892e-16  3.164136e-15      -1.110223e-16  1.110223e-16 
4 -8.659740e-15 -3.330669e-16  2.886580e-15       5.551115e-17  2.220446e-16
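This also explains why the original differences were only on the order of 1e-3: the two estimators differ by a factor of sqrt(N / (N - 1)), which approaches 1 as N grows. A minimal sketch with synthetic data of the same length as the wine dataset (178 rows):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=178)  # stand-in for one column of the wine dataset

# sample std is the population std scaled by sqrt(N / (N - 1))
n = len(x)
ratio = x.std(ddof=1) / x.std(ddof=0)
print(ratio, np.sqrt(n / (n - 1)))  # both ~1.00282 for n = 178
```

For 178 observations the two standard deviations differ by less than 0.3%, so the standardized values differ only in the third decimal place, exactly as in the question's table.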