3. Low-level interface
The class SVGPFAModelFactory creates an svGPFA model, and an instance of the class SVEM optimises its parameters. Please refer to the svGPFA class and interaction diagrams.
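Schematically, the flow is build-then-optimise: the factory constructs the model object, and SVEM drives the optimisation of its parameters. The sketch below illustrates only this division of labour; every name in it (ToyModel, build_model, ToySVEM, maximize) is a hypothetical stand-in, not the actual SVGPFAModelFactory or SVEM interface:

```python
# Schematic of the build-then-optimise flow described above. All names are
# hypothetical stand-ins; consult the class diagrams for the actual
# SVGPFAModelFactory and SVEM interfaces.
class ToyModel:
    """Stand-in for the model object returned by the factory."""
    def __init__(self):
        self.theta = 0.0  # placeholder parameter

def build_model():
    """Stand-in for SVGPFAModelFactory: constructs and wires up a model."""
    return ToyModel()

class ToySVEM:
    """Stand-in for SVEM: owns the optimisation loop, not the model."""
    def maximize(self, model, n_iter=10):
        for _ in range(n_iter):
            model.theta += 0.1  # placeholder for one optimisation step
        return model

model = ToySVEM().maximize(build_model())
```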
There is a one-to-one mapping between classes in the svGPFA.stats
package and equations in Duncker and Sahani [DS18].
Class SVLowerBound corresponds to the right-hand side of Eq. 4. This class uses the ExpectedLogLikelihood and KLDivergence classes, described next.

The abstract class ExpectedLogLikelihood corresponds to the first term of the right-hand side of Eq. 4. The abstract subclass PointProcessELL implements the functionality of ExpectedLogLikelihood for point-process observations, and corresponds to Eq. 7. If the link function (i.e., g in Eq. 7) is the exponential function, then the one-dimensional integral in the first term of Eq. 7 can be solved analytically (concrete subclass PointProcessELLExpLink). For other link functions we can solve this integral using Gaussian quadrature (concrete subclass PointProcessELLQuad).

Similarly, the subclasses PoissonELL, PoissonELLExpLink and PoissonELLQuad implement the functionality of ExpectedLogLikelihood for Poisson observations.
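For intuition, here is a minimal sketch of the two strategies, assuming the embedding value h at a single time point is Gaussian with mean mu and variance sigma2; the names are illustrative, not the svGPFA API. With the exponential link, \(E[\exp(h)] = \exp(\mu + \sigma^2/2)\) in closed form; for a general link g, Gauss-Hermite quadrature approximates \(E[g(h)]\):

```python
# Minimal sketch of the two strategies for the one-dimensional integral in
# Eq. 7, for the embedding h ~ N(mu, sigma2) at a single time point.
# Names are illustrative, not the svGPFA API.
import numpy as np

mu, sigma2 = 0.3, 0.5  # example embedding mean and variance

# exponential-link case: E[exp(h)] = exp(mu + sigma2 / 2) in closed form
analytic = np.exp(mu + sigma2 / 2.0)

# general-link case: approximate E[g(h)] by Gauss-Hermite quadrature
def expected_link(g, mu, sigma2, n_points=50):
    x, w = np.polynomial.hermite.hermgauss(n_points)
    # change of variables h = mu + sqrt(2 * sigma2) * x
    return (w * g(mu + np.sqrt(2.0 * sigma2) * x)).sum() / np.sqrt(np.pi)

quadrature = expected_link(np.exp, mu, sigma2)
print(analytic, quadrature)  # the two values agree to quadrature accuracy
```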
The concrete class KLDivergence corresponds to the second term of the right-hand side of Eq. 4 and implements the KL divergence between the posterior on inducing points, \(q(\mathbf{u}_k^{(r)})\), and the prior on inducing points, \(p(\mathbf{u}_k^{(r)})\).
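This term has the standard closed form for the KL divergence between two multivariate Gaussians. A minimal sketch, assuming a zero-mean prior \(p(\mathbf{u}) = \mathcal{N}(0, K_{zz})\) and posterior \(q(\mathbf{u}) = \mathcal{N}(\mathbf{m}, S)\); the function name and conventions are illustrative, not the svGPFA API:

```python
# Sketch of the Gaussian KL divergence that KLDivergence represents, per
# latent k and trial r: KL(q || p) with q = N(m, S) and p = N(0, Kzz).
# Illustrative code, not the svGPFA API.
import numpy as np

def gaussian_kl(m, S, Kzz):
    """KL(N(m, S) || N(0, Kzz)) for one latent's inducing points."""
    M = len(m)
    L = np.linalg.cholesky(Kzz)              # Kzz = L @ L.T
    alpha = np.linalg.solve(L, m)            # L^{-1} m
    Kzz_inv_S = np.linalg.solve(Kzz, S)
    logdet_Kzz = 2.0 * np.log(np.diag(L)).sum()
    logdet_S = np.linalg.slogdet(S)[1]
    return 0.5 * (np.trace(Kzz_inv_S) + alpha @ alpha - M
                  + logdet_Kzz - logdet_S)

print(gaussian_kl(np.zeros(5), np.eye(5), np.eye(5)))  # 0.0: q equals p
```

SVLowerBound then combines this term with the expected log-likelihood term to assemble the bound in Eq. 4.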
ExpectedLogLikelihood uses SVEmbedding, which calculates the mean and variance of the svGPFA embedding (\(h_n^{(r)}\) in Eq. 1), given in Eq. 5. SVEmbedding is an abstract class, which has LinearSVEmbedding as an abstract subclass. Two concrete subclasses of LinearSVEmbedding are provided, which optimise the calculation of the embedding for two different uses in Eq. 7.

The first term in the right-hand side of Eq. 7 requires the calculation of the embedding at sample times in a grid, which are the same for all neurons. This calculation is implemented in LinearSVEmbeddingAllTimes.

The second term in the right-hand side of Eq. 7 requires the calculation of the embedding at spike times, which are different for each neuron. This calculation is implemented in LinearSVEmbeddingAssocTimes.
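Because the embedding is linear in the latents, its moments follow directly from the latent moments. A minimal sketch of this computation, assuming the linear form \(h_n^{(r)}(t) = \sum_k c_{nk} x_k^{(r)}(t) + d_n\) with independent latents; the function name and array layout are illustrative, not the svGPFA API:

```python
# Sketch of the embedding moments of Eq. 5 under a linear embedding,
# h_n = sum_k c_nk x_k + d_n, with independent latents. Illustrative
# code, not the svGPFA API.
import numpy as np

def embedding_moments(C, d, latent_means, latent_vars):
    """C: (n_neurons, n_latents); d: (n_neurons,);
    latent_means, latent_vars: (n_latents, n_times)."""
    emb_mean = C @ latent_means + d[:, None]  # E[h_n(t)]
    emb_var = (C ** 2) @ latent_vars          # Var[h_n(t)], latents independent
    return emb_mean, emb_var
```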
SVEmbedding uses SVPosteriorOnLatents, which calculates the mean and variance of the latent variables, \(x_k^{(r)}\) in Eq. 1. These means and variances are not described by their own equations in Duncker and Sahani [DS18], but are embedded in Eq. 5. They are

\[\begin{split}\nu_k^{(r)}(t) &= \mathbf{\kappa}_k(t,\mathbf{z}_k)K_{zz}^{(k)^{-1}}m_k^{(r)}\\
\sigma_k^{(r)}(t) &= \kappa_k(t,t)+\mathbf{\kappa}_k(t,\mathbf{z}_k)\left(K_{zz}^{(k)^{-1}}S_k^{(r)}K_{zz}^{(k)^{-1}}-K_{zz}^{(k)^{-1}}\right)\mathbf{\kappa}_k(\mathbf{z}_k,t)\end{split}\]

SVPosteriorOnLatents is an abstract class. As above, two concrete subclasses are provided: SVPosteriorOnLatentsAllTimes computes the means and variances on a grid of time points, and SVPosteriorOnLatentsAssocTimes calculates these statistics at spike times.
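These two formulas translate almost line-for-line into linear algebra. A minimal sketch, assuming the diagonal of \(K_{tt}\), plus \(K_{tz}\) and \(K_{zz}\), are available (they come from the kernel-matrix store described next), and using plain solves against \(K_{zz}\) where the library caches Cholesky factors; names are illustrative, not the svGPFA API:

```python
# Sketch of the displayed moments: nu(t) = Ktz Kzz^{-1} m and
# sigma2(t) = diag(Ktt) + diag(Ktz (Kzz^{-1} S Kzz^{-1} - Kzz^{-1}) Kzt).
# Illustrative code, not the svGPFA API.
import numpy as np

def latent_moments(Ktt_diag, Ktz, Kzz, m, S):
    """Ktt_diag: (T,); Ktz: (T, M); Kzz: (M, M); m: (M,); S: (M, M)."""
    Kzz_inv_Kzt = np.linalg.solve(Kzz, Ktz.T)                # (M, T)
    nu = Ktz @ np.linalg.solve(Kzz, m)                       # posterior mean
    A = np.linalg.solve(Kzz, S) @ Kzz_inv_Kzt - Kzz_inv_Kzt  # (M, T)
    sigma2 = Ktt_diag + np.einsum("tm,mt->t", Ktz, A)        # posterior variance
    return nu, sigma2
```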
SVPosteriorOnLatents uses KernelsMatricesStore, which stores kernel matrices between inducing points, \(K_{zz}\), between time points, \(K_{tt}\), and between time points and inducing points, \(K_{tz}\). KernelsMatricesStore is an abstract class with two subclasses. IndPointsLocsKMS is a concrete subclass of KernelsMatricesStore that stores kernel matrices between inducing points, along with their Cholesky decompositions. IndPointsLocsAndTimesKMS is an abstract subclass of KernelsMatricesStore that stores covariance matrices between time points and between time points and inducing points. As above, IndPointsLocsAndAllTimesKMS and IndPointsLocsAndAssocTimesKMS are concrete subclasses of IndPointsLocsAndTimesKMS for time points in a grid and for spike times, respectively.

KernelsMatricesStore uses Kernel, an abstract class for constructing kernel matrices. Concrete subclasses construct kernel matrices for specific types of kernels (e.g., ExponentialQuadraticKernel and PeriodicKernel).
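As an illustration of the Kernel contract, here is a sketch of an exponentiated-quadratic kernel building a covariance matrix between two sets of times; the class and parameter names are illustrative, not the actual ExponentialQuadraticKernel API:

```python
# Sketch of the Kernel contract: map two sets of times to a covariance
# matrix, shown for an exponentiated-quadratic kernel. Class and parameter
# names are illustrative, not the svGPFA API.
import numpy as np

class ExpQuadKernel:
    def __init__(self, scale=1.0, lengthscale=1.0):
        self.scale = scale
        self.lengthscale = lengthscale

    def build_kernel_matrix(self, x1, x2):
        """Covariance between time vectors x1 (N,) and x2 (M,)."""
        sq_dists = (x1[:, None] - x2[None, :]) ** 2
        return self.scale**2 * np.exp(-0.5 * sq_dists / self.lengthscale**2)

# e.g., Ktz between 100 grid times and 10 inducing-point locations
Ktz = ExpQuadKernel(lengthscale=0.5).build_kernel_matrix(
    np.linspace(0.0, 1.0, 100), np.linspace(0.0, 1.0, 10))
```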