Linear regression with pymc3 and belief

I am trying to understand Bayesian statistics with pymc3.

I ran this code for a simple linear regression:

# Generate data: y = alpha + beta*x + Gaussian noise
import pymc3
import numpy as np

N = 1000
alpha, beta, sigma = 2.0, 0.5, 1.0
np.random.seed(47)
X = np.linspace(0, 1, N)
Y = alpha + beta*X + np.random.randn(N)*sigma

# Fit the model
linear_model = pymc3.Model()
with linear_model:
    # Priors for the unknown parameters
    alpha = pymc3.Normal('alpha', mu=0, sd=10)
    beta = pymc3.Normal('beta', mu=0, sd=10)
    sigma = pymc3.HalfNormal('sigma', sd=1)

    # Expected value of the outcome
    mu = alpha + beta*X

    # Likelihood of the observations
    Y_obs = pymc3.Normal('Y_obs', mu=mu, sd=sigma, observed=Y)

    start = pymc3.find_MAP(fmin=optimize.fmin_powell)
    step = pymc3.NUTS(scaling=start)
    trace = pymc3.sample(500, step, start=start)

I don't understand what the trace means.

If I understand Bayesian theory correctly, the result should be a function belief that takes alpha, beta and sigma, and outputs the probability of that combination of parameters.

How can I get this belief structure from the trace variable?

1 answer

trace is the result of a Markov chain Monte Carlo (MCMC) process. The samples it contains converge to the distribution (i.e. the belief) of your parameters, given the data.
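Since the trace is just a collection of posterior samples, one way to turn it into an approximate density over (alpha, beta, sigma), i.e. the belief function you describe, is a kernel density estimate. A minimal sketch, assuming scipy is available (the name belief is only illustrative):

from scipy.stats import gaussian_kde

# Stack the posterior samples into a (3, n_samples) array:
# one row each for alpha, beta and sigma.
samples = np.vstack([trace['alpha'], trace['beta'], trace['sigma']])

# Kernel density estimate of the joint posterior.
kde = gaussian_kde(samples)

def belief(alpha, beta, sigma):
    # Approximate posterior density of the combination (alpha, beta, sigma).
    return kde(np.array([[alpha], [beta], [sigma]]))[0]

# For example, evaluate the density near the values used to generate the data.
print(belief(2.0, 0.5, 1.0))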

You can view the trace using:

pymc3.traceplot(trace, vars=['alpha', 'beta', 'sigma'])

Trace plot
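pymc3 also has a summary helper (pymc3.summary(trace)), but you can just as easily compute basic posterior statistics directly from the trace arrays; a small sketch:

# Posterior mean, standard deviation and a central 95% interval per parameter.
for name in ['alpha', 'beta', 'sigma']:
    samples = trace[name]
    print('{}: mean={:.3f}, sd={:.3f}, 95% interval=({:.3f}, {:.3f})'.format(
        name, samples.mean(), samples.std(),
        np.percentile(samples, 2.5), np.percentile(samples, 97.5)))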

To see what the individual realizations look like, you can plot the regression line implied by each posterior sample:

import matplotlib.pyplot as plt

a = trace['alpha']
b = trace['beta']
x = np.linspace(0, 1, N)

fig = plt.figure(figsize=(12, 4))

# Left panel: the data with one faint regression line per posterior sample
ax = fig.add_subplot(1, 2, 1)
plt.scatter(X, Y, color='g', alpha=0.3)
for i in range(len(a)):
    y = a[i] + b[i] * x
    plt.plot(x, y, 'b', alpha=0.02)

# Right panel: the sampled regression lines on their own
ax = fig.add_subplot(1, 2, 2)
for i in range(len(a)):
    y = a[i] + b[i] * x
    plt.plot(x, y, 'b', alpha=0.02)
plt.show()

Individual Realizations
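If you prefer a single summary plot to many overlaid lines, you can also compute the posterior mean line and a pointwise 95% credible band from the same a, b and x as above; a sketch:

# One regression line per posterior sample, as rows of a (n_samples, N) array.
lines = a[:, None] + b[:, None] * x[None, :]

# Posterior mean line and pointwise 95% credible band.
mean_line = lines.mean(axis=0)
lower, upper = np.percentile(lines, [2.5, 97.5], axis=0)

plt.scatter(X, Y, color='g', alpha=0.3)
plt.plot(x, mean_line, 'b')
plt.fill_between(x, lower, upper, color='b', alpha=0.2)
plt.show()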

By the way, note that find_MAP(fmin=optimize.fmin_powell) also requires: from scipy import optimize


Source: https://habr.com/ru/post/1598438/

