Custom probability models¶
When to use CPT vs custom models¶
TBNPy represents a Bayesian network as:
- Variables (Variable) that define names and domains
- Probability objects that define conditional relationships for each node
For discrete, tractable variables (small state spaces), you can use Cpt to define probability tables. For continuous variables, non-tabular conditionals, or deterministic relationships, you should implement a custom probability model.
A probability object is any Python class that exposes a small interface used by tbnpy.inference and tbnpy.adaptiveMH.
Minimal interface¶
A custom probability model class should define:
Required attributes

| Attribute | Meaning |
|---|---|
| childs | List of child variables (each a Variable) |
| parents | List of parent variables (each a Variable) |
| device | Torch device (torch.device) |
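Putting the attributes together, a minimal skeleton might look like the following. This is a sketch: the class name MyProb is hypothetical, and a plain stand-in class is used here in place of tbnpy's Variable so the snippet is self-contained.

```python
import torch

class Variable:
    """Stand-in for tbnpy's Variable: a name plus a domain of values."""
    def __init__(self, name, values):
        self.name = name
        self.values = values

class MyProb:
    """Minimal custom probability model exposing the required attributes."""
    def __init__(self, childs, parents, device=torch.device("cpu")):
        self.childs = childs    # list of child Variable objects
        self.parents = parents  # list of parent Variable objects
        self.device = device    # torch.device used for all tensors

c = Variable("C", ["low", "high"])
a = Variable("A", [0, 1])
prob = MyProb(childs=[c], parents=[a])
```

The sample and log_prob methods described below complete the interface.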
Required methods
- prob.sample(Cs_pars: torch.Tensor) → tuple[torch.Tensor, torch.Tensor]¶
Generate samples of the child variable(s) conditional on parent samples.
Parameters¶
- Cs_pars
Parent realisations as a tensor of shape (n_samples, n_parents). For discrete parents, entries are typically integer state indices; for continuous parents, entries are floats.
Returns¶
- (Cs_child, logp)
Cs_child is a tensor of shape (n_samples, n_childs) containing generated child samples. logp is a tensor of shape (n_samples,) containing the log-probability of the generated samples under the model.
Notes¶
The return value logp is strongly recommended because it enables downstream inference routines to track likelihood contributions efficiently.
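As an illustration, a continuous conditional whose mean depends on a single continuous parent could implement sample as below. This is a hedged sketch: the class name GaussianProb and the parameters w and sigma are hypothetical, not part of the TBNPy API.

```python
import torch

class GaussianProb:
    """Hypothetical conditional: child | parent ~ Normal(w * parent, sigma)."""
    def __init__(self, w=2.0, sigma=0.5, device=torch.device("cpu")):
        self.w = w
        self.sigma = sigma
        self.device = device

    def sample(self, Cs_pars):
        # Cs_pars: (n_samples, n_parents); here a single continuous parent.
        mean = self.w * Cs_pars[:, 0]
        dist = torch.distributions.Normal(mean, self.sigma)
        Cs_child = dist.sample().unsqueeze(1)   # (n_samples, 1)
        logp = dist.log_prob(Cs_child[:, 0])    # (n_samples,)
        return Cs_child, logp

prob = GaussianProb()
pars = torch.randn(100, 1)
child, logp = prob.sample(pars)
```

Note that both return values are batched: one row per input sample, as the interface requires.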
- prob.log_prob(Cs: torch.Tensor) → torch.Tensor¶
Evaluate log-probability for provided samples.
Parameters¶
- Cs
Tensor of shape (n_samples, n_childs + n_parents) storing realisations of: child variables first, then parent variables (in the same order as parents).
Returns¶
- torch.Tensor
Log-probabilities of shape (n_samples,).
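For the same hypothetical Gaussian conditional sketched above, log_prob splits the columns of Cs into the child part (first) and the parent part, then evaluates the density. Again, the class and its parameters are illustrative assumptions, not TBNPy API.

```python
import torch

class GaussianProb:
    """Hypothetical conditional: child | parent ~ Normal(w * parent, sigma)."""
    def __init__(self, w=2.0, sigma=0.5):
        self.w = w
        self.sigma = sigma

    def log_prob(self, Cs):
        # Cs: (n_samples, n_childs + n_parents), child columns first.
        child = Cs[:, 0]    # child realisations (n_childs = 1)
        parent = Cs[:, 1]   # parent realisations (n_parents = 1)
        dist = torch.distributions.Normal(self.w * parent, self.sigma)
        return dist.log_prob(child)  # (n_samples,)

prob = GaussianProb()
Cs = torch.tensor([[2.0, 1.0],   # child = 2.0, parent = 1.0 (on the mean)
                   [0.0, 0.0]])  # child = 0.0, parent = 0.0 (on the mean)
lp = prob.log_prob(Cs)
```

Keeping the column convention (children first, then parents in the order of the parents attribute) is essential so inference routines can pass realisations consistently.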
Device and performance notes¶
Implement methods using PyTorch tensor operations so computations can run on CPU or GPU.
Keep shapes consistent: TBNPy relies heavily on batched tensor operations.
Deterministic relationships can be represented by returning:
- child samples computed by a function of parent samples, and
- a log-probability tensor (often zeros if treated as deterministic in your factorisation)
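For instance, a deterministic child computed as the sum of two parents (a hypothetical relationship, in the spirit of E | C, D in the ABCDE example) could be sketched as:

```python
import torch

class SumProb:
    """Hypothetical deterministic child: E = C + D."""
    def sample(self, Cs_pars):
        # Cs_pars: (n_samples, 2) holding realisations of the two parents.
        Cs_child = (Cs_pars[:, 0] + Cs_pars[:, 1]).unsqueeze(1)
        logp = torch.zeros(Cs_pars.shape[0])  # deterministic: contributes 0
        return Cs_child, logp

prob = SumProb()
pars = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
child, logp = prob.sample(pars)
```

Returning zeros for logp reflects treating the relationship as deterministic in the factorisation, so it adds nothing to the joint log-likelihood.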
Examples in the ABCDE tutorial¶
The ABCDE example demonstrates three common cases:
- C | A, B: continuous conditional (Gaussian with parameters depending on discrete parents)
- OC | C: noisy observation model
- E | C, D: deterministic relationship
See the ABCDE example page for working code snippets.