Department of Biomedical Engineering and Computational Science

# A Gaussian process, `demo_2ingp`

The demonstration program solves a simple two-dimensional regression problem using a Gaussian process. The demonstration is discussed in full in the documentation. The data have been used in the work by Christopher J. Paciorek and Mark J. Schervish (2004). The training data contain Gaussian noise of unknown variance and are shown in figure 1.

We use the same data as in the regression model demonstration demo_2input. The efficiency of the Gaussian process shows in the number of samples needed: here only 350 samples were required to reach the "right" distribution, whereas an MLP network with Bayesian learning needed 2,500. Sampling the 350 samples on a 2.4 GHz Intel Pentium workstation took approximately 30 minutes of CPU time.

```
function demo_2ingp

% The data matrix 'data' (inputs in the first two columns, target in the
% third) is assumed to be loaded beforehand.
x = [data(:,1) data(:,2)];
y = data(:,3);

% Draw the data. The points are not on a regular grid, so interpolate
% them onto one before plotting. The title is set after mesh, which
% would otherwise reset the axes and erase it.
figure
[xi,yi,zi]=griddata(data(:,1),data(:,2),data(:,3),-1.8:0.01:1.8,(-1.8:0.01:1.8)');
mesh(xi,yi,zi)
title('The noisy teaching data');
rot
pop

% Create a Gaussian process regression model.
[n, nin] = size(x);
gp=gp2('gp2r',nin,'exp');
gp.jitterSigmas=0.01;
gp.expScale=0.2;
gp.expSigmas=repmat(0.1,1,gp.nin);
gp.noiseSigmas=0.2;
gp.f='norm';

gp.p.expSigmas=invgam_p({0.1 0.5 0.05 1});
gp.p.expScale=invgam_p({0.05 0.5});
gp.p.noiseSigmas=invgam_p({0.05 0.5});

% Set the sampling options and seed the hybrid Monte Carlo sampler.
opt=gp2r_mcopt;
opt.repeat=20;
opt.nsamples=1;
opt.hmc_opt.steps=20;
opt.hmc_opt.nsamples=1;
hmc2('state', sum(100*clock));

[r1,g1,rstate1]=gp2r_mc(opt, gp, x, y);

% Continue sampling from the previous state with adjusted options.
opt.repeat=10;
opt.hmc_opt.steps=30;
[r2,g2,rstate2]=gp2r_mc(opt, g1, x, y, [], [], r1, rstate1);

% Draw the main chain of 350 samples, using persistent momentum and a
% rejection window in the hybrid Monte Carlo sampler.
opt.nsamples=350;
opt.hmc_opt.persistence=1;
opt.sample_variances=0;
opt.hmc_opt.window=5;
r=r2; g=g2; rstate=rstate2;
opt.hmc_opt.steps=10;
[r,g,rstate]=gp2r_mc(opt, g, x, y, [], [], r, rstate);

% Predict on a regular grid over the input range. Thin the sample
% record to reduce autocorrelation before computing the predictions.
figure
[p1,p2]=meshgrid(-1.8:0.05:1.8,-1.8:0.05:1.8);
p=[p1(:) p2(:)];
gp=thin(r,50,2);
out=gp2fwds(gp, x, y, p);
mout = mean(squeeze(out)');

% Plot the predicted surface (use a separate variable for the grid so
% the thinned sample record in gp is not overwritten).
zp=zeros(size(p1));
zp(:)=mout;
mesh(p1,p2,zp);
axis on;
rot;
pop;
```
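For readers less familiar with the toolbox, the prediction step that `gp2fwds` performs for a single hyperparameter sample is the standard Gaussian-process conditional mean under a squared-exponential ('exp') covariance. A minimal sketch in Python/NumPy, where the hyperparameter values and synthetic data are illustrative assumptions (not the demo's data), and the exact toolbox parameterization, e.g. whether `expScale` enters squared, may differ:

```python
import numpy as np

# Assumed hyperparameters, loosely mirroring gp.expSigmas, gp.expScale,
# gp.noiseSigmas and gp.jitterSigmas in the demo.
length_scales = np.array([0.1, 0.1])
signal_scale = 0.2
noise_sigma = 0.2
jitter = 0.01

def exp_cov(X1, X2):
    """Squared-exponential covariance with per-input length scales."""
    d = (X1[:, None, :] - X2[None, :, :]) / length_scales
    return signal_scale**2 * np.exp(-0.5 * np.sum(d**2, axis=-1))

# Synthetic stand-in for the demo's training data.
rng = np.random.default_rng(0)
X = rng.uniform(-1.8, 1.8, size=(50, 2))
y = np.sin(3 * X[:, 0]) * np.cos(3 * X[:, 1]) \
    + noise_sigma * rng.standard_normal(50)

# Test inputs (a few grid points would be used in the demo).
Xs = rng.uniform(-1.8, 1.8, size=(5, 2))

# GP posterior mean: K_* (K + sigma_n^2 I)^{-1} y, with jitter for
# numerical stability.
K = exp_cov(X, X) + (noise_sigma**2 + jitter**2) * np.eye(len(X))
Ks = exp_cov(Xs, X)
mean = Ks @ np.linalg.solve(K, y)
```

In the demo this mean is computed for each thinned hyperparameter sample and then averaged over samples, which is what the `mean(squeeze(out)')` line does.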

Figure 1. The training data with Gaussian noise.

Figure 3. The prediction from the Gaussian process.