
MetropolisWithinGibbs Module

Functions and values

Function or value Description

core isAdaptive writeOut random domain f results batchLength batchNumber theta sigmas

Full Usage: core isAdaptive writeOut random domain f results batchLength batchNumber theta sigmas

Parameters:
    isAdaptive : bool
    writeOut : LogEvent -> unit
    random : Random
    domain : ('a * 'b * Constraint)[]
    f : float[] -> float
    results : (float * float[]) list
    batchLength : int
    batchNumber : int
    theta : float[]
    sigmas : float[]

Returns: int * (float * float[]) list * float[]

An adaptive Metropolis-within-Gibbs algorithm, which can run in either adaptive or fixed (non-adaptive) mode.

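A minimal sketch of a single-batch call is shown below, assuming the enclosing Bristlecone namespaces have been opened. The Gaussian objective, the (-5.0, 5.0) bounds, the Unconstrained constraint case, the use of ignore as the writeOut logger, and the interpretation of the returned tuple as (batch number, accumulated results, updated sigmas) are illustrative assumptions rather than documented behaviour.

    open System

    // Illustrative objective: any float[] -> float function over the parameter vector.
    let f (theta: float[]) =
        0.5 * (theta.[0] ** 2.0 + theta.[1] ** 2.0)

    // One (lower, upper, constraint) triple per parameter; `Unconstrained` is an
    // assumed case of the library's Constraint type.
    let domain = [| (-5.0, 5.0, Unconstrained); (-5.0, 5.0, Unconstrained) |]

    let random = Random(1000)
    let theta0  = [| 0.5; -0.5 |]   // starting parameter vector
    let sigmas0 = [| 0.0; 0.0 |]    // one step-size value per parameter (assumed convention)

    // Run one batch of 100 iterations in adaptive mode, discarding log events.
    // The returned tuple is assumed to be (batch number, (likelihood, theta) results, updated sigmas).
    let batchNumber, results, sigmas =
        MetropolisWithinGibbs.core true ignore random domain f [] 100 1 theta0 sigmas0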

propose theta j lsj domain random

Full Usage: propose theta j lsj domain random

Parameters:
    theta : float[]
    j : int
    lsj : float
    domain : ('a * 'b * Constraint)[]
    random : Random

Returns: float[]

Proposes a jump for the parameter at index j, leaving all other parameter values fixed.

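For example, to perturb only the second element of a three-parameter vector (the bounds, the Unconstrained constraint case, and the 0.0 value passed for lsj are illustrative assumptions; lsj is taken here to be the step-size term controlling the jump width for the chosen parameter):

    // An assumed domain of three identical (lower, upper, constraint) triples.
    let domain = Array.create 3 (0.0, 10.0, Unconstrained)

    let theta = [| 1.0; 2.0; 3.0 |]
    let random = System.Random(42)

    // Returns a copy of theta in which only the value at index 1 has been jumped.
    let proposal = MetropolisWithinGibbs.propose theta 1 0.0 domain random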

tune sigma acceptanceRate batchNumber

Full Usage: tune sigma acceptanceRate batchNumber

Parameters:
    sigma : float
    acceptanceRate : float
    batchNumber : int

Returns: float

Tunes the proposal variance of a parameter based on its acceptance rate. The magnitude of the adjustment decreases as more batches are completed.

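For instance, tuning every parameter's step size after a batch using per-parameter acceptance rates (the step-size values and rates below are illustrative; the exact tuning schedule and target acceptance rate are internal to the function):

    let sigmas = [| 0.0; 0.0 |]             // current per-parameter step sizes (illustrative)
    let acceptanceRates = [| 0.60; 0.30 |]  // acceptance rates observed in the last batch (illustrative)
    let batchNumber = 5

    // The size of the adjustment made by `tune` shrinks as batchNumber grows.
    let tunedSigmas =
        Array.map2
            (fun sigma rate -> MetropolisWithinGibbs.tune sigma rate batchNumber)
            sigmas
            acceptanceRates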
