8. Losses
This module implements the Kullback-Leibler (KL) divergence loss for Bayesian neural networks in TensorFlow.
8.1 `KLDivergenceLoss(reduction='mean', weight=1.0, **kwargs)`
Computes the Kullback-Leibler divergence loss across all Bayesian modules.
Initializes the Kullback-Leibler divergence loss computation.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
`reduction` | `Literal['mean']` | Reduction method for the loss. | `'mean'` |
`weight` | `float` | Scaling factor applied to the total KL loss. | `1.0` |
`**kwargs` | `Any` | Additional keyword arguments. | `{}` |
Returns:

Type | Description |
---|---|
`None` | None. |
Source code in illia/losses/tf/kl.py
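Conceptually, the loss gathers the KL contribution of every Bayesian module in a model and scales the result by `weight`. The sketch below illustrates that aggregation idea only; the `kl_cost()` method and the layer filter are hypothetical placeholders, not illia's actual internals, and the handling of the `'mean'` reduction is omitted.

```python
import tensorflow as tf


def scaled_kl(model: tf.keras.Model, weight: float = 1.0) -> tf.Tensor:
    """Illustration only: accumulate a per-layer KL term and scale it."""
    kl_total = tf.constant(0.0)
    for layer in model.layers:
        # Hypothetical marker for Bayesian layers; illia's real check may differ.
        if hasattr(layer, "kl_cost"):
            kl_total += layer.kl_cost()
    return weight * kl_total
```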
8.1.1 `__call__(*args, **kwargs)`
Computes Kullback-Leibler divergence for all Bayesian modules in the model.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
`*args` | `Any` | Unused positional arguments. | `()` |
`**kwargs` | `Any` | Must include 'model' as a keyword argument. | `{}` |
Returns:

Type | Description |
---|---|
`Tensor` | Scaled Kullback-Leibler divergence loss as a scalar tensor. |
Source code in illia/losses/tf/kl.py
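A minimal usage sketch, assuming a Keras model built from illia's Bayesian TensorFlow layers. The import path is inferred from the source location above, and `build_bayesian_model()` is a hypothetical placeholder for whatever model you construct; only the loss construction and the `model` keyword argument follow the documented signature.

```python
import tensorflow as tf

from illia.losses.tf.kl import KLDivergenceLoss  # path inferred from the source location above

inputs = tf.random.normal((32, 16))
labels = tf.random.uniform((32,), maxval=10, dtype=tf.int32)
model = build_bayesian_model()  # hypothetical helper, not part of illia

kl_loss_fn = KLDivergenceLoss(weight=0.01)

with tf.GradientTape() as tape:
    logits = model(inputs, training=True)
    data_loss = tf.reduce_mean(
        tf.keras.losses.sparse_categorical_crossentropy(labels, logits, from_logits=True)
    )
    # The KL term is collected from the model itself via the 'model' keyword.
    loss = data_loss + kl_loss_fn(model=model)

grads = tape.gradient(loss, model.trainable_variables)
```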
This module implements the Evidence Lower Bound (ELBO) loss for Bayesian neural networks in TensorFlow.
8.2 `ELBOLoss(loss_function, num_samples=1, kl_weight=0.001, **kwargs)`
Computes the Evidence Lower Bound (ELBO) loss function for Bayesian neural networks.
This combines a reconstruction loss and a KL divergence term, estimated using Monte Carlo sampling.
Initializes the ELBO loss with sampling and KL scaling.
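The quantity being estimated is the standard Monte Carlo ELBO objective (a sketch of the usual formulation; the notation below is not taken from illia itself):

$$
\mathcal{L}_{\text{ELBO}} \approx \frac{1}{S} \sum_{s=1}^{S} \Big[ \ell\big(y, \hat{y}^{(s)}\big) + \lambda \,\mathrm{KL}\big(q(\theta)\,\|\,p(\theta)\big) \Big]
$$

Here $S$ corresponds to `num_samples`, $\lambda$ to `kl_weight`, $\ell$ to the supplied `loss_function`, and $\hat{y}^{(s)}$ is the prediction obtained with the $s$-th set of sampled weights.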
Parameters:

Name | Type | Description | Default |
---|---|---|---|
`loss_function` | `Callable[[Tensor, Tensor], Tensor]` | Callable that computes the reconstruction loss. | required |
`num_samples` | `int` | Number of MC samples for estimation. | `1` |
`kl_weight` | `float` | Weight applied to the KL loss. | `0.001` |
Returns:

Type | Description |
---|---|
`None` | None. |
Source code in illia/losses/tf/elbo.py
8.2.1 `__call__(y_true, y_pred, *args, **kwargs)`
Computes the ELBO loss using KL regularization and reconstruction error.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
`y_true` | `Tensor` | Ground truth targets. | required |
`y_pred` | `Tensor` | Model predictions. | required |
`*args` | `Any` | Unused positional arguments. | `()` |
`**kwargs` | `Any` | Must include 'model' containing Bayesian layers. | `{}` |
Returns:

Type | Description |
---|---|
`Tensor` | Scalar tensor representing the total ELBO loss. |
Source code in illia/losses/tf/elbo.py
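A minimal training-step sketch, again assuming a Keras model built from illia's Bayesian TensorFlow layers. The import path is inferred from the source location above and `build_bayesian_model()` is a hypothetical placeholder; the `ELBOLoss` arguments and the `model` keyword follow the signatures documented in this section.

```python
import tensorflow as tf

from illia.losses.tf.elbo import ELBOLoss  # path inferred from the source location above

inputs = tf.random.normal((32, 16))
targets = tf.random.normal((32, 1))
model = build_bayesian_model()  # hypothetical helper, not part of illia
optimizer = tf.keras.optimizers.Adam(1e-3)

elbo_loss_fn = ELBOLoss(
    loss_function=tf.keras.losses.MeanSquaredError(),  # reconstruction term
    num_samples=5,   # Monte Carlo samples used for the estimate
    kl_weight=1e-3,  # scaling applied to the KL term
)

with tf.GradientTape() as tape:
    predictions = model(inputs, training=True)
    # 'model' must be passed so the KL term can be collected from its Bayesian layers.
    loss = elbo_loss_fn(targets, predictions, model=model)

grads = tape.gradient(loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))
```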