2016-01-02 86 views

I am trying to fit a curve (a damped sine wave) with a neural network using Python and numpy, but my program cannot fit it. I think I am missing something in the backpropagation. Related question: neural network polynomial regression cannot fit a curve.

I also have a question: if I use a neural network for regression, should I still apply a sigmoid after the forward pass?

import numpy 
import matplotlib.pyplot as plt 
from sklearn import preprocessing 



def sigmoid(x): 
    return 1/(1 + numpy.exp(-x)) 

def function(x): 
    #return x**3 +10 
    #return numpy.sin(x) 
    return numpy.exp(-x) * numpy.sin(3 * x) 

rng = numpy.random.RandomState(12345) 

#data generate 
x_1 = numpy.arange(0,4.2,0.2) # one order x 
x = preprocessing.scale(numpy.array([x_1,x_1**2,x_1**3,x_1**4]).T) 
y = function(x_1) 

n_in = x.shape[1] # feature 
n_out = 4 # hidden units 

w_1 = numpy.asarray(
    rng.uniform(
     low = -numpy.sqrt(0.5/(n_in)), 
     high = numpy.sqrt(0.5/(n_in)), 
     size = (n_in, n_out))) 
w_2 = numpy.asarray(
    rng.uniform(
     low = -numpy.sqrt(0.5/ (n_in)), 
     high = numpy.sqrt(0.5 /(n_in)), 
     size = (n_out,1))) 
b_1 = numpy.asarray(
    rng.uniform(
     low = -numpy.sqrt(0.5/ (n_in)), 
     high = numpy.sqrt(0.5 /(n_in)), 
     size = (n_out))) 
b_2 = numpy.asarray(
    rng.uniform(
     low = -numpy.sqrt(0.5/ (n_in)), 
     high = numpy.sqrt(0.5 /(n_in)), 
     size = (1))) 

lr = 0.0001 

for step in range(1000): 
    activate_hidden = numpy.dot(x,w_1) + b_1 #forward 
    activate_output = numpy.dot(activate_hidden,w_2) + b_2 #forward 
    delta_output = -(activate_output - numpy.reshape(y,(x_1.shape[0],1))) 
    w_2 = w_2 + lr * (numpy.dot(activate_hidden.T, delta_output).mean()) 
    b_2 = b_2 + lr * delta_output.mean() 
    delta_hidden = numpy.dot(delta_output, w_2.T) 
    w_1 = w_1 + lr * (numpy.dot(x.T,delta_hidden).mean()) 
    b_1 = b_1 + lr * delta_hidden.mean() 

activate_hidden = numpy.dot(x,w_1) + b_1 
activate_output = numpy.dot(activate_hidden,w_2) + b_2 

plt.subplot(121) 
plt.plot(x_1,y) 

plt.subplot(122) 
plt.plot(x_1,activate_output) 

plt.show() 

Answer


You are computing the network activations by multiplying x by w_1 and then by w_2. You need an activation function on the hidden layer; otherwise this is just a linear transformation:

activate_hidden = numpy.dot(x,w_1) # forward 
activate_output = numpy.dot(activate_hidden,w_2) # forward 
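That two stacked linear layers collapse into a single one can be checked directly; a quick sketch with arbitrary random matrices (the shapes roughly match the question's network, but the values are made up for illustration):

```python
import numpy

rng = numpy.random.RandomState(0)
x = rng.randn(5, 4)    # 5 samples, 4 polynomial features
w_1 = rng.randn(4, 4)  # input -> hidden weights
w_2 = rng.randn(4, 1)  # hidden -> output weights

# Two linear layers applied in sequence ...
two_layers = numpy.dot(numpy.dot(x, w_1), w_2)
# ... equal one linear layer whose weight is the product of the two.
one_layer = numpy.dot(x, numpy.dot(w_1, w_2))

print(numpy.allclose(two_layers, one_layer))  # matrix multiplication is associative
```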

You might as well compute x * (w_1 * w_2). You should pass the hidden values through an activation function between the layers; sigmoid, tanh, or relu are all worth considering.
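Putting these suggestions together, here is a minimal sketch of a working version of the question's network: a tanh hidden layer (any of the suggested nonlinearities would do), a linear output for regression, and full gradient matrices instead of collapsing them with `.mean()`. The hidden size of 16, the learning rate, and the step count are arbitrary choices for illustration, not values from the question:

```python
import numpy
from sklearn import preprocessing

def function(x):
    return numpy.exp(-x) * numpy.sin(3 * x)

rng = numpy.random.RandomState(12345)

# Same data setup as the question: scaled polynomial features.
x_1 = numpy.arange(0, 4.2, 0.2)
x = preprocessing.scale(numpy.array([x_1, x_1**2, x_1**3, x_1**4]).T)
y = function(x_1).reshape(-1, 1)
n = x.shape[0]

n_in, n_hidden = x.shape[1], 16
w_1 = rng.uniform(-0.5, 0.5, size=(n_in, n_hidden))
b_1 = numpy.zeros(n_hidden)
w_2 = rng.uniform(-0.5, 0.5, size=(n_hidden, 1))
b_2 = numpy.zeros(1)

lr = 0.1
for step in range(20000):
    # Forward pass: nonlinear hidden layer, linear output (regression,
    # so no sigmoid on the output).
    hidden = numpy.tanh(numpy.dot(x, w_1) + b_1)
    output = numpy.dot(hidden, w_2) + b_2

    # Backward pass for mean squared error, averaged over the batch.
    delta_output = (output - y) / n
    grad_w_2 = numpy.dot(hidden.T, delta_output)   # full matrix, no .mean()
    grad_b_2 = delta_output.sum(axis=0)
    delta_hidden = numpy.dot(delta_output, w_2.T) * (1 - hidden**2)  # tanh'
    grad_w_1 = numpy.dot(x.T, delta_hidden)
    grad_b_1 = delta_hidden.sum(axis=0)

    # Update all parameters after computing all gradients.
    w_2 -= lr * grad_w_2
    b_2 -= lr * grad_b_2
    w_1 -= lr * grad_w_1
    b_1 -= lr * grad_b_1

pred = numpy.dot(numpy.tanh(numpy.dot(x, w_1) + b_1), w_2) + b_2
print("final MSE:", ((pred - y) ** 2).mean())
```

This also answers the side question in the post: for regression the output layer stays linear, and the nonlinearity goes between the layers.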