If I understand what you want, you can do this using the glm submodule in PyMC (PyMC3). For example:
import numpy as np
import pymc as pm
import matplotlib.pyplot as plt
from pymc import glm
## Make some data
x = np.array(range(0,50))
y = np.random.uniform(low=0.0, high=40.0, size=50)
y = 2*x+y
## plt.scatter(x,y)
data = dict(x=x, y=y)
with pm.Model() as model:
    # Specify the GLM and pass in the data. The resulting linear model,
    # its likelihood and all its parameters are automatically added to our model.
    pm.glm.glm('y ~ x', data)
    step = pm.NUTS()  # Instantiate the MCMC sampling algorithm
    trace = pm.sample(2000, step)

## fig = pm.traceplot(trace, lines={'alpha': 1, 'beta': 2, 'sigma': .5})  ## traces
fig = plt.figure()
ax = fig.add_subplot(111)
plt.scatter(x, y, label='data')
glm.plot_posterior_predictive(trace, samples=50, eval=x,
                              label='posterior predictive regression lines')
to get something like this: ![posterior predictive](https://i.stack.imgur.com/ntoxg.png)
You may find these blog posts interesting: I took the ideas from 1 and 2.
**Edit** To get the y values at each x, try this, which I got by digging into the glm source:
lm = lambda x, sample: sample['Intercept'] + sample['x'] * x  ## linear model
samples = 50  ## choose to be the same as in the plot call
trace_det = np.empty([samples, len(x)])  ## initialise
for i, rand_loc in enumerate(np.random.randint(0, len(trace), samples)):
    rand_sample = trace[rand_loc]
    trace_det[i] = lm(x, rand_sample)
y = trace_det.T
y[0]
Apologies if it's not the most elegant; hopefully you can follow the logic.
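Once you have `trace_det` (rows = posterior draws, columns = x positions), you can summarise the predictive lines per x value, e.g. with a mean and a 95% band. A minimal sketch, using a synthetic stand-in for `trace_det` built from made-up intercept/slope draws rather than a real PyMC trace:

```python
import numpy as np

# Stand-in for trace_det: 50 hypothetical posterior draws of the line over 50 x values
rng = np.random.default_rng(0)
x = np.arange(50)
intercepts = rng.normal(20.0, 2.0, size=50)  # assumed posterior samples of 'Intercept'
slopes = rng.normal(2.0, 0.1, size=50)       # assumed posterior samples of 'x'
trace_det = intercepts[:, None] + slopes[:, None] * x[None, :]  # shape (draws, len(x))

# Summarise across draws (axis 0) for each x
y_mean = trace_det.mean(axis=0)
y_lo, y_hi = np.percentile(trace_det, [2.5, 97.5], axis=0)
print(y_mean.shape, y_lo.shape, y_hi.shape)
```

Each summary array has one entry per x, so they can be passed straight to `plt.plot` / `plt.fill_between` alongside the scatter of the data.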
Do you know how to do this by hand? – User