UPDATE: Thanks to the Q&A here, I was able to build a working step function in TensorFlow (see the code below). How can I write step_function as an activation function in Keras?
Now my question has evolved into: how do I make tf_stepy, the activation function created in TensorFlow, work in Keras?
I tried the following code to use tf_stepy in Keras, but it does not work:
from tensorflow_step_function import tf_stepy
from keras.layers import Activation
from keras.utils.generic_utils import get_custom_objects

def buy_hold_sell(x):
    return tf_stepy(x)

get_custom_objects().update({'custom_activation': Activation(buy_hold_sell)})
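For context, here is a minimal sketch of how such a registered activation is usually attached to a layer. The Dense layer, its sizes, and the compile settings are illustrative assumptions, not part of my original code:

# Minimal sketch (assumed model) using the activation registered above by name.
from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential()
model.add(Dense(1, input_dim=4))            # illustrative layer, not from the original code
model.add(Activation('custom_activation'))  # looks up the entry added via get_custom_objects()
model.compile(optimizer='sgd', loss='mse')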
Below is the TensorFlow code for the step activation function:
# tensorflow_step_function.py
import tensorflow as tf
import keras.backend as K
from keras.backend.tensorflow_backend import _to_tensor
from tensorflow.python.framework import ops
import numpy as np

def stepy(x):
    if x < 0.33:
        return 0.0
    elif x > 0.66:
        return 1.0
    else:
        return 0.5

np_stepy = np.vectorize(stepy)

def d_stepy(x):  # derivative
    if x < 0.33:
        return 0.0
    elif x > 0.66:
        return 1.0
    else:
        return 0.5

np_d_stepy = np.vectorize(d_stepy)

np_d_stepy_32 = lambda x: np_d_stepy(x).astype(np.float32)

def py_func(func, inp, Tout, stateful=True, name=None, grad=None):
    # Need to generate a unique name to avoid duplicates:
    rnd_name = 'PyFuncGrad' + str(np.random.randint(0, 1E+8))
    tf.RegisterGradient(rnd_name)(grad)  # see _MySquareGrad for grad example
    g = tf.get_default_graph()
    with g.gradient_override_map({"PyFunc": rnd_name}):
        return tf.py_func(func, inp, Tout, stateful=stateful, name=name)

def tf_d_stepy(x, name=None):
    with ops.op_scope([x], name, "d_stepy") as name:
        y = tf.py_func(np_d_stepy_32,
                       [x],
                       [tf.float32],
                       name=name,
                       stateful=False)
        return y[0]

def stepygrad(op, grad):
    x = op.inputs[0]
    n_gr = tf_d_stepy(x)
    return grad * n_gr

np_stepy_32 = lambda x: np_stepy(x).astype(np.float32)

def tf_stepy(x, name=None):
    with ops.op_scope([x], name, "stepy") as name:
        y = py_func(np_stepy_32,
                    [x],
                    [tf.float32],
                    name=name,
                    grad=stepygrad)  # <-- here's the call to the gradient
        return y[0]

with tf.Session() as sess:
    x = tf.constant([0.2, 0.7, 0.4, 0.6])
    y = tf_stepy(x)
    tf.initialize_all_variables().run()
    print(x.eval(), y.eval(), tf.gradients(y, [x])[0].eval())
Original question: creating a step activation function
In numpy form, a step activation function like this should behave as follows:
def step_func(x, lower_threshold=0.33, higher_threshold=0.66):
    # x is an array, and return an array
    for index in range(len(x)):
        if x[index] < lower_threshold:
            x[index] = 0.0
        elif x[index] > higher_threshold:
            x[index] = 1.0
        else:
            x[index] = 0.5
    return x
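For example, a quick illustrative check (the input values here are just for demonstration):

import numpy as np

x = np.array([0.2, 0.7, 0.4, 0.6])
print(step_func(x))  # expected: [0.  1.  0.5 0.5]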
I managed to convert the step function from the numpy version to a keras.tensor version. It works as follows:
import tensorflow as tf
import keras.backend as K
from keras.backend.tensorflow_backend import _to_tensor
import numpy as np

def high_med_low(x, lower_threshold=0.33, higher_threshold=0.66):
    """
    x: tensor
    return a tensor
    """
    # x_shape = K.get_variable_shape(x)
    # x_flat = K.flatten(x)
    x_array = K.get_value(x)
    for index in range(x_array.shape[0]):
        if x_array[index, 0] < lower_threshold:
            x_array[index, 0] = 0.0
        elif x_array[index, 0] > higher_threshold:
            x_array[index, 0] = 1.0
        else:
            x_array[index, 0] = 0.5
    # x_return = x_array.reshape(x_shape)
    return _to_tensor(x_array, x.dtype.base_dtype)
x = K.ones((10,1)) * 0.7
print(high_med_low(x))
# the following line of code is used in building a model with keras
get_custom_objects().update({'custom_activation': Activation(high_med_low)})
Although this function works on its own, it raises an error when applied in a model. My suspicion is that, as an activation layer, it should not be accessing the individual element values of a tensor.
If so, what is the correct way to write this step activation function?
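For comparison, here is a rough sketch of the same thresholding written purely with Keras backend tensor ops, i.e. without reading individual element values. This is only my guess at the element-wise style (not verified inside a trained model, and the gradient of such a hard step is zero almost everywhere):

import keras.backend as K

def high_med_low_tensor(x, lower_threshold=0.33, higher_threshold=0.66):
    # Element-wise: 0.0 below the lower threshold, 1.0 above the higher one, 0.5 in between.
    low = K.cast(K.less(x, lower_threshold), K.floatx())       # 1.0 where x < 0.33
    high = K.cast(K.greater(x, higher_threshold), K.floatx())  # 1.0 where x > 0.66
    mid = 1.0 - low - high                                     # 1.0 in the middle band
    return 1.0 * high + 0.5 * mid                              # the low band contributes 0.0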
Thanks!
https://stackoverflow.com/questions/43915482/how-do-you-create-a-custom-activation-function-with-keras – desertnaut