Author: paschermayr (package started January 2022)



BaytesMCMC.jl is a library to perform MCMC proposal steps on ModelWrapper structs, see ModelWrappers.jl.

First steps

Let us use the model initially defined in the ModelWrappers.jl introduction:

using ModelWrappers, BaytesMCMC
using Distributions, Random, UnPack
_rng = Random.GLOBAL_RNG
#Create Model and data
myparameter = (μ = Param(Normal(), 0.0), σ = Param(Gamma(), 1.0))
mymodel = ModelWrapper(myparameter)
data = randn(1000)
#Create objective for both μ and σ and define a target function for it
myobjective = Objective(mymodel, data, (:μ, :σ))
function (objective::Objective{<:ModelWrapper{BaseModel}})(θ::NamedTuple)
    @unpack data = objective
    lprior = Distributions.logpdf(Distributions.Normal(), θ.μ) +
        Distributions.logpdf(Distributions.Exponential(), θ.σ)
    llik = sum(Distributions.logpdf(Distributions.Normal(θ.μ, θ.σ), data[iter]) for iter in eachindex(data))
    return lprior + llik
end
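As a quick sanity check, the log-posterior that this functor computes can also be evaluated by hand with Distributions alone. The following is a minimal sketch, assuming the same standard Normal and Exponential priors as above; the data and candidate parameter tuple are illustrative:

```julia
using Distributions

# Illustrative data and a candidate parameter tuple
data = randn(1000)
θ = (μ = 0.0, σ = 1.0)

# Log-prior: standard Normal on μ, Exponential on σ (matching the functor above)
lprior = logpdf(Normal(), θ.μ) + logpdf(Exponential(), θ.σ)
# Log-likelihood: iid Normal(μ, σ) observations
llik = sum(logpdf(Normal(θ.μ, θ.σ), x) for x in data)
lp = lprior + llik   # unnormalized log-posterior
```

This is the quantity the MCMC kernels explore; the functor version above simply packages the same computation so that BaytesMCMC can call it with any parameter tuple.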

We can assign an MCMC Kernel to this model by simply calling

mcmc_nuts = MCMC(_rng, NUTS, myobjective)
mcmc_hmc = MCMC(_rng, HMC, myobjective)
mcmc_mala = MCMC(_rng, MALA, myobjective)
mcmc_metropolis = MCMC(_rng, Metropolis, myobjective)

There are two standard ways to make proposal steps in BaytesMCMC.

  1. The first is the standard MCMC proposal step:
_val, _diagnostics = propose(_rng, mcmc_nuts, myobjective)

This will return the proposed model parameters (if accepted), along with MCMC summary diagnostics.

  2. The second way can be used if the MCMC kernel runs alongside other algorithms, or if the data changes between proposal steps. In this case, the MCMC kernel is updated with the parameters given in mymodel and the new data before a proposal step is performed. This takes more time than the first way, but is more flexible and can be used in scenarios where, for instance, you want to combine MCMC with other inference algorithms.
_val, _diagnostics = propose!(_rng, mcmc_nuts, mymodel, data)

This will update mymodel with the proposed parameters (if accepted), and return the mymodel parameters along with MCMC summary diagnostics.
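The propose! call can be repeated to build up a simple chain. A minimal sketch, assuming _val is the NamedTuple of current parameter values as described above (the trace container and iteration count are illustrative, not part of the BaytesMCMC API):

```julia
# Collect parameter draws from repeated propose! calls (sketch)
trace = Vector{typeof(mymodel.val)}(undef, 0)
for _ in 1:1000
    _val, _diagnostics = propose!(_rng, mcmc_nuts, mymodel, data)
    push!(trace, _val)
end
# trace now holds 1000 (possibly repeated) parameter draws,
# e.g. [t.μ for t in trace] gives the μ chain
```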


All MCMC kernels are initialized with sane default tuning parameters, but each field in the config.jl file of each sampler in src/Kernels can be fully customized. For instance, the following settings initialize an HMC kernel with a fixed stepsize and dense mass-matrix adaption. Moreover, the ReverseDiff package is used for obtaining derivative information.

mcmcdefault = MCMCDefault(;
	kernel = (; stepnumber = ConfigStepnumber(; steps = 10)),
	stepsize = ConfigStepsize(; ϵ = 1.0, stepsizeadaption = UpdateFalse()),
	proposal = ConfigProposal(; metric = MDense()),
	GradientBackend = :ReverseDiff,
)

mcmc_customized = MCMC(_rng, HMC, myobjective, mcmcdefault)
_val, _diagnostics = propose(_rng, mcmc_customized, myobjective)
_val, _diagnostics = propose!(_rng, mcmc_customized, mymodel, data)

There are many customization options for each kernel, which can be seen in the corresponding config.jl files.

Generated data and prediction

You can return generated quantities and predictions of your model for each proposal step. Just like with the objective functor, you need to add methods for your model for the corresponding functions:

function ModelWrappers.predict(_rng::Random.AbstractRNG, objective::Objective{<:ModelWrapper{BaseModel}})
    @unpack μ, σ = objective.model.val
    return rand(_rng, Normal(μ, σ))
end

function ModelWrappers.generate(_rng::Random.AbstractRNG, objective::Objective{<:ModelWrapper{BaseModel}})
    @unpack model, data = objective
    # Return some interesting stuff
    return model.val.μ + randn(_rng)
end

predict(_rng, myobjective)
generate(_rng, myobjective)

mcmc_nuts = MCMC(_rng, NUTS, myobjective, MCMCDefault(generated = UpdateTrue()))
_val, _diagnostics = propose(_rng, mcmc_nuts, myobjective)

Going Forward

This package is still highly experimental - suggestions and comments are always welcome!

License Notice

Note that this package heavily uses and adapts code from the DynamicHMC.jl package (v3.1.0), licensed under the MIT License.