7. Losses
7.1 elbo
This module contains the loss implementations.
7.1.1 ELBOLoss(loss_function, num_samples=1, kl_weight=0.001, **kwargs)
Computes the Evidence Lower Bound (ELBO) loss, combining a likelihood loss and KL divergence.
Initializes the ELBO loss with the specified likelihood loss function, sample count, and KL weight.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
`loss_function` | `Callable[[Tensor, Tensor], Tensor]` | Loss function for computing the likelihood loss. | required |
`num_samples` | `int` | Number of samples for the Monte Carlo approximation. | `1` |
`kl_weight` | `float` | Scaling factor for the KL divergence component. | `0.001` |
`**kwargs` | `Any` | Additional keyword arguments. | `{}` |
Source code in illia/losses/tf/elbo.py
7.1.1.1 __call__(y_true, y_pred, model)
Computes the ELBO loss, averaging over multiple samples.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
`y_true` | `Tensor` | True labels. | required |
`y_pred` | `Tensor` | Predictions from the model. | required |
`model` | `Model` | TensorFlow model containing Bayesian layers. | required |
Returns:
Type | Description |
---|---|
`Tensor` | Average ELBO loss across samples. |
Source code in illia/losses/tf/elbo.py
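The call above averages, over Monte Carlo samples, the likelihood loss plus the weighted KL divergence of the model's Bayesian layers. A minimal pure-Python sketch of that computation (illustrative only, not illia's implementation; the mean-squared-error likelihood and all names here are assumptions):

```python
def elbo_loss(y_true, y_preds, kl_divergence, kl_weight=0.001):
    """Average the ELBO loss over Monte Carlo samples.

    y_preds: one prediction per stochastic forward pass (num_samples entries).
    kl_divergence: summed KL of the model's Bayesian layers (assumed given).
    Illustrative sketch only; not illia's implementation.
    """

    def likelihood_loss(t, p):
        # Placeholder likelihood term: mean squared error.
        return sum((ti - pi) ** 2 for ti, pi in zip(t, p)) / len(t)

    total = 0.0
    for y_pred in y_preds:
        # Each sample contributes likelihood loss plus the weighted KL penalty.
        total += likelihood_loss(y_true, y_pred) + kl_weight * kl_divergence
    return total / len(y_preds)
```

In the real class, `num_samples` stochastic forward passes through the Bayesian model produce the predictions, and the KL term comes from the model's layers rather than being passed in.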
7.1.2 KLDivergenceLoss(reduction='mean', weight=1.0, **kwargs)
Computes the KL divergence loss for Bayesian modules within a model.
Initializes the KL divergence loss with the specified reduction method and weight.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
`reduction` | `Literal['mean']` | Method to reduce the loss; currently only `"mean"` is supported. | `'mean'` |
`weight` | `float` | Scaling factor for the KL divergence loss. | `1.0` |
`**kwargs` | `Any` | Additional keyword arguments. | `{}` |
Source code in illia/losses/tf/elbo.py
7.1.2.1 __call__(model)
Computes the KL divergence loss across all Bayesian layers in the model.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
`model` | `Model` | TensorFlow model containing Bayesian layers. | required |
Returns:
Type | Description |
---|---|
`Tensor` | KL divergence cost scaled by the specified weight. |
Source code in illia/losses/tf/elbo.py
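The KL cost is gathered from every Bayesian layer in the model and scaled by `weight`. A hedged pure-Python sketch of the aggregation under the `"mean"` reduction (all names are illustrative; averaging over layers is an assumption, not confirmed by the source):

```python
def kl_divergence_loss(layer_kls, reduction="mean", weight=1.0):
    """Aggregate per-layer KL divergences into one scalar cost.

    layer_kls: KL divergence contributed by each Bayesian layer.
    Illustrative sketch only; not illia's implementation.
    """
    if reduction != "mean":
        # Mirrors the documented restriction to the "mean" reduction.
        raise ValueError("only 'mean' reduction is supported")
    if not layer_kls:
        return 0.0  # no Bayesian layers means no KL penalty
    # Average across layers, then apply the scaling weight.
    return weight * sum(layer_kls) / len(layer_kls)
```

In the real class, the per-layer KL values would come from traversing the TensorFlow model's Bayesian layers rather than being passed in as a list.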