Department of Biomedical Engineering and Computational Science

FBM tools for Matlab

Copyright (C) 1999 Simo Särkkä
Copyright (C) 2000-2003 Aki Vehtari
Maintainer: Aki Vehtari

Introduction

FBM is a "Software for Flexible Bayesian Modeling and Markov Chain Sampling" written by Radford Neal. First goal of the FBM tools was to make convergence diagnostics easier by reading data from the FBM log files directly to Matlab. Later simple prediction functions and conversion function for Mathworks Neural Network toolbox were added. Note, that only some of the fields from the FBM log files for MLP and GP models are currently read but changing this is relatively easy.

In 1999 Simo Särkkä implemented the first parts of FBM tools in Matlab at the Laboratory of Computational Engineering. Later Aki Vehtari added more functions, fixed many bugs, and improved the documentation.

License

This software is distributed under the GNU General Public Licence (version 2 or later); please refer to the file Licence.txt, included with the software, for details.

Download

Last updated 2003-08-18
Fixed fbmmlpwrite.m: Use use_ard in hyperest.

fbmtools.tar.gz
fbmtools.zip

Contents

Bayesian MLP networks:
FBMMLPREAD  - Read MLP networks from FBM log file
FBMMLPWRITE - Write network parameters into FBM logfile
FBMMLPPRED  - Compute predictions of Bayesian MLP
FBMMLP2NNET - Convert FBM-generated MLPs into NNET-toolbox 3.x/4.x format

Gaussian processes:
FBMGPREAD   - Read GP models from FBM log file
FBMGPPRED   - Compute predictions of Gaussian Process

Utilities:
THIN        - Delete burn-in and thin MCMC chains
JOIN        - Join similar structures of arrays into one structure of arrays
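For instance, a typical post-sampling workflow reads the sampled networks from the log file, removes the burn-in, and computes Monte Carlo predictions. The following is only a hedged sketch: the argument lists shown are assumptions based on the one-line summaries above, so consult the help text of each function for the actual interfaces.

nets = fbmmlpread('mlp.log'); % read sampled MLP networks from an FBM log file
nets = thin(nets,100,2);      % assumed interface: drop 100 burn-in samples, keep every 2nd
yt   = fbmmlppred(nets,xt);   % assumed interface: predictions for test inputs xt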

Example

Here is an example of using early-stop initialization for the weights and hyperparameters in order to shorten the burn-in time. A related reference is:

  • Aki Vehtari, Simo Särkkä, and Jouko Lampinen (2000). On MCMC sampling in Bayesian MLP neural networks. In Shun-Ichi Amari, C. Lee Giles, Marco Gori, and Vincenzo Piuri, editors, IJCNN'2000: Proceedings of the 2000 International Joint Conference on Neural Networks, volume I, pages 317-322. IEEE.

First create an MLP network using the FBM 'net' programs, but do not sample anything; that is, stop after specifying the main sampling specification. Then, using Matlab and the Neural Network Toolbox, do early-stop training with, for example, the efficient scaled conjugate gradient (SCG) optimization method. The code below handles a regression problem where x is the input data, y is the target data, there are 100 data points, and a 40-hidden-unit MLP has been specified in the FBM log file 'mlp.log'.

VV.P=x(91:end,:)'; VV.T=y(91:end,:)';   % last 10 points for early-stop validation
p=x(1:90,:)'; t=y(1:90,:)';             % first 90 points for training
net = newff(minmax(p),[40 1], {'tansig' 'purelin'},'trainscg');
net.trainParam.epochs=5000;             % maximum number of training epochs
net.trainParam.max_fail=100;            % validation failures allowed before stopping
net.trainParam.show=NaN;                % suppress progress output
[net, tr] = train(net,p,t,[],[],VV);    % early-stop training against VV
tg = sim(net,p);
noise = mean(std(tg-t));                % noise estimate from training residuals
fbmmlpwrite('mlp.log',net,noise,noise); % write weights and noise into the log file

After that, run net-mc for the desired number of iterations.
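For example, to continue sampling until iteration 1000 has been reached (the iteration count here is only an illustrative choice):

net-mc mlp.log 1000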

See also