beebo.utils package

Submodules

beebo.utils.cholesky_inference module

Cholesky-based GP predictions, with support for low-rank updates.

GPyTorch does not support low-rank updates with Cholesky, as far as I can tell.

class beebo.utils.cholesky_inference.GPPosteriorPredictor(covar_module, mean_module, noise_module, train_X, train_y)

Bases: object

A convenience class for computing posterior means and covariances of a GP. It bypasses GPyTorch's default forward pass so that predictions can be made via an explicit Cholesky decomposition, which in turn makes low-rank updates possible.
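A minimal construction sketch, assuming the three modules are standard GPyTorch components (exactly what noise_module expects, a noise model as used here or a full likelihood, is an assumption):

    import torch
    import gpytorch

    from beebo.utils.cholesky_inference import GPPosteriorPredictor

    # Toy training data: 20 points in 3 dimensions.
    train_X = torch.rand(20, 3)
    train_y = torch.sin(train_X.sum(dim=-1))

    covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())
    mean_module = gpytorch.means.ConstantMean()
    # Assumption: a GPyTorch noise model is accepted here.
    noise_module = gpytorch.likelihoods.noise_models.HomoskedasticNoise()

    predictor = GPPosteriorPredictor(
        covar_module, mean_module, noise_module, train_X, train_y
    )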

augmented_covariance(new_X)

Add new_X to the training set, then compute the posterior covariance at new_X. A low-rank update avoids recomputing the entire Cholesky decomposition.
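Continuing the sketch above (the returned (q, q) shape is an assumption, inferred from the shape conventions documented under update_chol below):

    # Posterior covariance of q=4 candidate points, computed as if they
    # had been appended to the training set.
    new_X = torch.rand(4, 3)
    aug_covar = predictor.augmented_covariance(new_X)  # assumed shape: (4, 4)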

forward(X)

Get mean and covariance of the GP at X.
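A hypothetical call, continuing the sketch above (that forward returns a (mean, covariance) tuple is an assumption based on the docstring, not a documented guarantee):

    # Posterior mean and covariance at 5 test points.
    test_X = torch.rand(5, 3)
    mean, covar = predictor.forward(test_X)  # assumed return: (mean, covar)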

predict_covar(X, test_train_covar=None)

Basic code adapted from exact_predictive_covar in GPyTorch.

NOTE: this supports both batch mode (b, q, d) and single mode (q, d) inputs.
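For reference, the quantity being computed is the exact predictive covariance Σ = K(X, X) − K(X, X_train) (K(X_train, X_train) + σ²I)⁻¹ K(X_train, X). Given a lower Cholesky factor L of the noisy train-train covariance, this reduces to triangular solves. A minimal sketch of that computation (the helper name is made up, and this is not necessarily the class's actual code):

    import torch

    def posterior_covar_via_cholesky(K_test_test, K_test_train, L):
        # L is the lower Cholesky factor of (K_train_train + noise * I).
        # Broadcasts over a leading batch dimension, matching the
        # (b, q, d) / (q, d) convention noted above.
        # Solve L A = K_train_test, so A^T A = K_test_train K^{-1} K_train_test.
        A = torch.linalg.solve_triangular(
            L, K_test_train.transpose(-1, -2), upper=False
        )
        return K_test_test - A.transpose(-1, -2) @ A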

predict_mean(X, test_train_covar=None)

static update_chol(L, B, C)

Update the Cholesky decomposition of M to that of M_aug.

Parameters:
  • L (np.ndarray) – Cholesky decomposition of M, shape (n, n) or (b, n, n)

  • B (np.ndarray) – old-new covariance, shape (n, q) or (b, n, q)

  • C (np.ndarray) – new-new covariance, shape (q, q) or (b, q, q)

NOTE: C needs to include the noise on the diagonal.

Returns:

Cholesky decomposition of M_aug, shape (n+q, n+q) or (b, n+q, n+q)

Return type:

L_aug
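The underlying identity is the standard block-Cholesky update: if M = L Lᵀ and M_aug = [[M, B], [Bᵀ, C]], then L_aug = [[L, 0], [Xᵀ, chol(C − Xᵀ X)]], where X solves the triangular system L X = B. A minimal single-mode (unbatched) NumPy sketch of that identity, not necessarily this method's exact implementation:

    import numpy as np
    from scipy.linalg import cholesky, solve_triangular

    def update_chol_sketch(L, B, C):
        # Solve L X = B, so that X^T X = B^T M^{-1} B.
        X = solve_triangular(L, B, lower=True)
        # Schur complement of M in M_aug. C must already include the noise
        # on its diagonal for S to stay positive definite.
        S = C - X.T @ X
        L_S = cholesky(S, lower=True)
        # Assemble the block lower-triangular factor.
        n, q = B.shape
        L_aug = np.zeros((n + q, n + q))
        L_aug[:n, :n] = L
        L_aug[n:, :n] = X.T
        L_aug[n:, n:] = L_S
        return L_aug

This costs O(n²q) for the triangular solve plus O(q³) for the small factorization, rather than the O((n+q)³) of re-factorizing M_aug from scratch.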

beebo.utils.cholesky_inference.update_covar_one_point(covar: Tensor, x_train: Tensor, x_augmented: Tensor, new_x: Tensor, new_x_idx: int, kernel: Module)

Module contents