%% Cell type:markdown id:1bba0128 tags:
# Supervised regression
This is an example of how to build and optimize neural networks in PyTorch with the objective of predicting a known feature of the training dataset.
The logic is very similar to the classification problem and the code is also very close, but here the final objective is to minimize the error in the prediction of the missing feature. This is achieved by minimizing the mean squared error between the prediction and the target feature. As noted during the presentation, by minimizing the mean squared error we assume that the underlying error distribution is Gaussian. One could instead use the mean absolute error, which assumes a Laplacian error distribution.
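Swapping one assumption for the other is a one-line change in PyTorch. A minimal sketch (the tensors here are made up for illustration):
``` python
import torch
import torch.nn.functional as F

prediction = torch.tensor([1.0, 2.0, 3.0])
target = torch.tensor([1.5, 2.0, 2.0])

mse = F.mse_loss(prediction, target)  # Gaussian error assumption
mae = F.l1_loss(prediction, target)   # Laplacian error assumption
print(mse.item(), mae.item())
```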
%% Cell type:code id:d0681795 tags:
``` python
!pip install torchvision torch pandas numpy matplotlib ipympl torchbnn
```
%% Output
Requirement already satisfied: torchvision in /home/danilo/miniconda3/envs/mlmkl/lib/python3.7/site-packages (0.10.0)
Requirement already satisfied: torch in /home/danilo/miniconda3/envs/mlmkl/lib/python3.7/site-packages (1.9.0)
Requirement already satisfied: pandas in /home/danilo/miniconda3/envs/mlmkl/lib/python3.7/site-packages (1.3.0)
Requirement already satisfied: numpy in /home/danilo/miniconda3/envs/mlmkl/lib/python3.7/site-packages (1.19.2)
Requirement already satisfied: matplotlib in /home/danilo/miniconda3/envs/mlmkl/lib/python3.7/site-packages (3.4.2)
Requirement already satisfied: ipympl in /home/danilo/miniconda3/envs/mlmkl/lib/python3.7/site-packages (0.8.2)
Collecting torchbnn
Downloading torchbnn-1.2-py3-none-any.whl (12 kB)
...
Installing collected packages: torchbnn
Successfully installed torchbnn-1.2
%% Cell type:code id:23feddde tags:
``` python
%matplotlib notebook
from typing import Tuple
# import standard PyTorch modules
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchbnn as bnn
# import torchvision module to handle image manipulation
import torchvision
import torchvision.transforms as transforms
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import axes3d
```
%% Cell type:markdown id:bb1286f0 tags:
We start by generating a toy dataset, simple enough that we can visualize the results easily. For this reason, the dataset will contain a single independent variable and one target variable which we want to determine in test data.
The simulated example data will be $f(x) = (3 + \kappa) x^2 + \epsilon$, where $\epsilon \sim \mathcal{N}(\mu=0, \sigma=10)$ and $\kappa \sim \mathcal{N}(\mu=0, \sigma=0.03)$.
In this case we know the true model, so it is worth taking some time to pinpoint the roles of $\kappa$ and $\epsilon$. Both variables add fluctuations to the results. $\epsilon$ adds Gaussian noise that is completely independent of $x$ and cannot be traced back to a particular functional dependence. $\kappa$ changes a specific parameter of the model, in this case the coefficient 3, by around 1%.
When fitting a model, the term *epistemic uncertainty* is often used to refer to uncertainties coming from effects related to different functional models. That is, one can imagine different functions that may fit the data due to the effect of $\kappa$, such as $g(x) = 3x^2$ or $h(x) = 2.95x^2$.
The term *aleatoric uncertainty* refers to any uncertainty that cannot be traced back to a given model dependence. In this example, different constant offsets could be added to the model $g$ to account for the fluctuations in $\epsilon$.
We will revisit these two effects later, when we discuss Bayesian neural networks, which let us estimate the size of these uncertainties.
%% Cell type:code id:5d457cd8 tags:
``` python
def generate_data(N: int) -> np.ndarray:
    x = 2*np.random.randn(N, 1)
    epsilon = 10*np.random.randn(N, 1)
    kappa = 0.03*np.random.randn(N, 1)
    z = (3 + kappa)*x**2 + epsilon
    return np.concatenate((x, z), axis=1).astype(np.float32)

train_data = generate_data(N=1000)
```
%% Cell type:markdown id:48433f6f tags:
PyTorch allows you to create a class that outputs a single data entry and use it to feed input to your neural network. We will use the following class to feed the data to the network. This may look superfluous when all the data fits in a NumPy array, but if you have a lot of data that cannot all be loaded into memory, this interface lets you read samples on demand, so that only the needed samples are held in memory at any one time.
%% Cell type:code id:30205402 tags:
``` python
class MyDataset(torch.utils.data.Dataset):
    def __init__(self, data: np.ndarray):
        self.data = data

    def __len__(self) -> int:
        """How many samples do I have?"""
        return len(self.data)

    def __getitem__(self, idx):
        # give me the item with index idx
        return {"data": self.data[idx, 0:1], "target": self.data[idx, 1:]}
```
%% Cell type:code id:cc0b0774 tags:
``` python
my_dataset = MyDataset(train_data)
print(len(my_dataset))
```
%% Output
1000
%% Cell type:code id:6dccfac6 tags:
``` python
print(my_dataset[1])
```
%% Output
{'data': array([-0.5583114], dtype=float32), 'target': array([-11.577045], dtype=float32)}
%% Cell type:markdown id:527089bd tags:
Plot some of the data:
%% Cell type:code id:067b8105 tags:
``` python
fig = plt.figure()
ax = fig.add_subplot(111)
ax.scatter(train_data[:, 0], train_data[:, 1])
plt.show()
```
%% Output
%% Cell type:markdown id:e517975c tags:
And now let us define the neural network. In PyTorch, neural networks always extend `nn.Module`. They define their sub-parts in their constructor, fully connected linear layers in this case, and the method `forward` is expected to receive the input features and output the network's prediction of the target.
The network parameters are the weights of the `Linear` layers, which are conveniently hidden here, but can be accessed through their `weight` (and `bias`) attributes.
Unlike in the classification example, where the network outputs logits rather than label probabilities, here the network directly outputs the predicted value of the target.
%% Cell type:code id:d908ef86 tags:
``` python
class Network(nn.Module):
    """
    This is our parametrized function.
    It stores all the parametrized weights theta inside the model object.

    For such simple example data, it was not necessary to have such a complex model:
    this was only done here to show the interface provided by PyTorch.

    The forward function receives the x values and outputs an estimate of the target.

    The nn.Sequential object applies each step in the given list of modules
    in sequence. An alternative would be to create each of these layers manually
    and apply them one after the other in the forward method.
    """
    def __init__(self, input_dimension: int=1, output_dimension: int=1):
        """
        Constructor. Here we initialize the weights.
        """
        super().__init__()
        hidden_layer = 100
        self.model = nn.Sequential(
            nn.Linear(input_dimension, hidden_layer),
            nn.ReLU(),
            nn.Linear(hidden_layer, output_dimension)
        )

    def forward(self, x):
        """
        This function is called when one does my_network(x) and it represents the action
        of our parametrized function on the input.
        """
        return self.model(x)
```
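%% Cell type:markdown id:weights-note tags:
To illustrate the point about parameters above, here is a small sketch (our own illustration, not part of the original notebook) showing how the weights of the `Linear` layers can be inspected:
``` python
# a minimal sketch: inspecting the parameters of the Network defined above
net = Network()
first_linear = net.model[0]        # the first nn.Linear layer
print(first_linear.weight.shape)   # torch.Size([100, 1])
print(first_linear.bias.shape)     # torch.Size([100])
# total number of trainable parameters: 100 + 100 + 100 + 1 = 301
print(sum(p.numel() for p in net.parameters()))
```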
%% Cell type:markdown id:9c5620dc tags:
Let us create one instance of this network. We also create an instance of PyTorch's `DataLoader`, which has the task of taking a given number of data elements and outputting them as a single object. This "mini-batch" of data is used during training, so that we do not need to load the entire dataset into memory during the optimization procedure.
We also create an instance of the Adam optimizer, which is used to tune the parameters of the network.
%% Cell type:code id:988e1979 tags:
``` python
network = Network()
B = 10
loader = torch.utils.data.DataLoader(my_dataset, batch_size=B)
optimizer = torch.optim.Adam(network.parameters(), lr=1e-3)
```
%% Cell type:markdown id:3ee54520 tags:
Now we repeatedly optimize the network parameters. Each pass through all the data we have is called one "epoch". In each epoch, we take several "mini-batches" of data (given by the `DataLoader` in `loader`) and use each one to make one training step.
%% Cell type:code id:d15d655d tags:
``` python
epochs = 100
# for each epoch
for epoch in range(epochs):
    losses = list()
    # for each mini-batch given by the loader:
    for batch in loader:
        # get the input in the mini-batch
        # this has size (B, C)
        # where B is the mini-batch size
        # and C is the number of features (1 in this case)
        features = batch["data"]
        # get the targets in the mini-batch (there shall be B of them)
        target = batch["target"]
        # get the output of the neural network:
        prediction = network(features)
        # calculate the loss function being minimized
        # in this case, the mean squared error between the prediction and the target values
        loss = F.mse_loss(prediction, target)
        # exactly equivalent to:
        #loss = ((prediction - target)**2).mean()
        # clean the optimizer's temporary gradient storage
        optimizer.zero_grad()
        # calculate the gradient of the loss function with respect to the parameters
        loss.backward()
        # ask the Adam optimizer to change the parameters in the direction of -gradient
        # Adam scales the gradient by a constant which is adaptively tuned
        # take a look at the Adam paper for more details: https://arxiv.org/abs/1412.6980
        optimizer.step()
        losses.append(loss.detach().cpu().item())
    avg_loss = np.mean(np.array(losses))
    print(f"Epoch {epoch}/{epochs}: average loss {avg_loss:.5f}")
```
%% Output
Epoch 0/100: average loss 378.81588
Epoch 1/100: average loss 275.14966
Epoch 2/100: average loss 209.67680
Epoch 3/100: average loss 177.45487
Epoch 4/100: average loss 161.36641
...
Epoch 97/100: average loss 94.59652
Epoch 98/100: average loss 94.57301
Epoch 99/100: average loss 94.55085
%% Cell type:markdown id:a4980bf4 tags:
Let us check what the network says about some new data it has never seen before.
%% Cell type:code id:09646d29 tags:
``` python
test_data = generate_data(N=1000)
```
%% Cell type:markdown id:e315b5dc tags:
And now we can plot the new data again, this time showing what the network predicts for it.
%% Cell type:code id:7a06a4c0 tags:
``` python
predicted = network(torch.from_numpy(test_data[:,0:1])).detach().numpy()
```
%% Cell type:code id:bab0ce43 tags:
``` python
fig = plt.figure()
ax = fig.add_subplot(111)
ax.scatter(test_data[:, 0], test_data[:, 1], label="Test data")
ax.scatter(test_data[:, 0], predicted, label="Predicted")
ax.set(xlabel="$x$", ylabel="$f(x)$")
#ax.set_yscale('log')
plt.legend(frameon=False)
plt.show()
```
%% Output
%% Cell type:markdown id:6bd7c62d tags:
## What about an uncertainty?!
The method shown before finds the neural network parameters which maximize the log-likelihood of the data. But not all parameters are equally likely, and we can estimate an uncertainty for them.
With an uncertainty for the parameters, we can propagate the uncertainty through the neural network and obtain an uncertainty on the prediction of the regression output.
This can be done by assuming each weight of the network has a given probability distribution and, instead of fitting a single value for the weight, fitting the parameters of this probability distribution. In the example shown here, we assume that the probability distribution of the weights is Gaussian and we aim to obtain the mean and variance of that Gaussian.
We are going to include the epistemic uncertainty through the variation of the weights. That is, the fact that the weights vary and lead to different effective functions allows us to model different $f(x)$ dependence relationships, and this is attributed to the epistemic uncertainty.
We additionally assume that the collected data has some aleatoric uncertainty, which means that every point is uncertain by some fixed unknown amount. To model this effect, we assume that the likelihood function $p(\text{data}|\theta)$ can be modelled by a Gaussian distribution with a certain standard deviation $\sigma_a$. This standard deviation will be used to model the aleatoric uncertainty.
The final loss function to be optimised here is:
$\mathcal{L} = -\mathbb{E}_{\text{data}}\left[\log p(\text{data}|\text{weights})\right] + \frac{1}{M} KL(\text{weights}|\text{prior})$
The first term is assumed to be Gaussian with the standard deviation given by the aleatoric uncertainty (assumed to be the same for every data point, but this could be changed to be data-point specific as well!). The second term corresponds to a penalty for moving the weights away from the prior assumption that the weights are Gaussian with mean zero and standard deviation 0.1. In this equation, $M$ is the number of mini-batches.
It can be shown that by minimizing this loss function, we obtain weight means and standard deviations that approximately optimize the posterior probability given by Bayes' rule: $p(\text{weights}|\text{data}) = \frac{p(\text{data}|\text{weights}) p(\text{weights})}{p(\text{data})}$. The proof follows by algebraically minimizing the Kullback-Leibler divergence between the true posterior given by Bayes' rule and the approximate posterior, in which the weights are assumed to be Gaussian and the likelihood is assumed to be Gaussian.
![elbo.png](attachment:elbo.png)
The details of the derivation can be consulted in the following paper:
https://arxiv.org/pdf/1505.05424.pdf
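For completeness, the key algebraic step is short (our paraphrase of the paper's variational-inference argument, with $q(\text{weights})$ denoting the approximate Gaussian posterior):
$KL\left(q(\text{weights})\,\|\,p(\text{weights}|\text{data})\right) = -\mathbb{E}_{q}\left[\log p(\text{data}|\text{weights})\right] + KL\left(q(\text{weights})\,\|\,p(\text{weights})\right) + \log p(\text{data})$
Since $\log p(\text{data})$ does not depend on the weights' distribution, minimizing $\mathcal{L}$ above also minimizes the divergence between the approximate posterior and the true one.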
%% Cell type:code id:f8d501ff tags:
``` python
class BayesianNetwork(nn.Module):
    """
    A model Bayesian neural network.
    Each weight is represented by a Gaussian with a mean and a standard deviation.
    Each evaluation of forward leads to a different draw of the weights, so by running
    forward several times we can check the effect of the weight variation on the same input.
    The nll method implements the negative log-likelihood, to be used as the first part of the loss
    function (the second shall be the Kullback-Leibler divergence).
    The negative log-likelihood is simply that of a Gaussian
    between the prediction and the true value. The standard deviation of the Gaussian is left as a
    parameter to be fit: sigma.
    """
    def __init__(self, input_dimension: int=1, output_dimension: int=1):
        super(BayesianNetwork, self).__init__()
        hidden_dimension = 100
        # three Bayesian linear layers: each weight has a learnable mean and
        # standard deviation, with a N(0, 0.1^2) prior
        self.model = nn.Sequential(
            bnn.BayesLinear(prior_mu=0,
                            prior_sigma=0.1,
                            in_features=input_dimension,
                            out_features=hidden_dimension),
            nn.ReLU(),
            bnn.BayesLinear(prior_mu=0,
                            prior_sigma=0.1,
                            in_features=hidden_dimension,
                            out_features=hidden_dimension),
            nn.ReLU(),
            bnn.BayesLinear(prior_mu=0,
                            prior_sigma=0.1,
                            in_features=hidden_dimension,
                            out_features=output_dimension)
        )
        # log of the aleatoric variance, fitted together with the weights
        self.log_sigma2 = nn.Parameter(torch.ones(1), requires_grad=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        """
        Calculate the result f(x) applied on the input x.
        """
        return self.model(x)

    def nll(self, prediction: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        """
        Calculate the negative log-likelihood (divided by the batch size, since we take the mean).
        """
        error = prediction - target
        squared_error = error**2
        sigma2 = torch.exp(self.log_sigma2)[0]
        norm_error = 0.5*squared_error/sigma2
        norm_term = 0.5*(np.log(2*np.pi) + self.log_sigma2[0])
        return norm_error.mean() + norm_term

    def aleatoric_uncertainty(self) -> torch.Tensor:
        """
        Get the aleatoric component of the uncertainty.
        """
        return torch.exp(0.5*self.log_sigma2[0])
```
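%% Cell type:markdown tags:
Since the weights are sampled anew on every call to `forward`, repeating the evaluation on the same input gives a spread of predictions that estimates the epistemic uncertainty. The helper below is a minimal sketch of this (our addition, not part of the original notebook; it assumes a trained `b_network` and an input tensor of shape `(N, 1)`):
%% Cell type:code tags:
``` python
def predict_with_uncertainty(network: nn.Module, x: torch.Tensor, n_samples: int=100):
    """
    Run several stochastic forward passes and summarize them.
    Returns the mean prediction and the epistemic uncertainty
    (the standard deviation across weight samples).
    """
    with torch.no_grad():
        # each call to network(x) draws a new set of weights
        samples = torch.stack([network(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)

# hypothetical usage after training:
# x = torch.linspace(-5.0, 5.0, 100).unsqueeze(-1)
# mean, epistemic = predict_with_uncertainty(b_network, x)
```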
%% Cell type:code id:7b9beb21 tags:
``` python
# create the neural network:
b_network = BayesianNetwork()

# create the object to load the data:
B = 10
loader = torch.utils.data.DataLoader(my_dataset, batch_size=B)

# create the optimizer to be used
optimizer = torch.optim.Adam(b_network.parameters(), lr=0.001)

# the Kullback-Leibler divergence should be scaled by 1/number_of_batches
# see https://arxiv.org/abs/1505.05424 for more information on this
number_of_batches = len(my_dataset)/float(B)
weight_kl = 1.0/float(number_of_batches)
```
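%% Cell type:markdown tags:
As a quick sanity check of this scaling (with hypothetical numbers, since `my_dataset` is defined earlier in the notebook): with 1000 samples and `B = 10` there are 100 batches, so each batch carries 1/100 of the KL penalty, and one full pass over the data counts the penalty exactly once.
%% Cell type:code tags:
``` python
# hypothetical illustration of the KL scaling (separate names, so nothing above is overwritten):
example_dataset_size = 1000
example_batches = example_dataset_size / float(B)  # 100 batches of size 10
example_weight_kl = 1.0 / example_batches          # 0.01 of the KL penalty per batch
print(example_batches, example_weight_kl)
```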
%% Cell type:markdown id:c68ba2e2 tags:
The criteria for finding the optimal weights are based on Bayes' theorem, according to which the posterior probability of the weights is proportional to the likelihood of the data given the weights times the prior probability of the weights. We assume the prior probability of the weights is a Gaussian centred at zero with standard deviation 0.1. This prior has a regularizing effect, preventing overtraining.
We can translate Bayes' theorem and the assumption that the posterior distribution is also Gaussian into an optimization procedure for the means and variances of the posterior distribution. The function optimized to obtain the means and variances of the weight Gaussians is the sum of the mean-squared error (corresponding to a Gaussian log-likelihood of the data) and the Kullback-Leibler divergence between the weight distribution and the prior Gaussian.
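For a single weight this divergence has a closed form, which is what makes the penalty cheap to evaluate. As a standard identity (not spelled out in the original notebook), for a posterior $\mathcal{N}(\mu, \sigma^2)$ and prior $\mathcal{N}(\mu_0, \sigma_0^2)$ with $\mu_0 = 0$ and $\sigma_0 = 0.1$:
$$\mathrm{KL}\left(\mathcal{N}(\mu,\sigma^2) \,\|\, \mathcal{N}(\mu_0,\sigma_0^2)\right) = \log\frac{\sigma_0}{\sigma} + \frac{\sigma^2 + (\mu-\mu_0)^2}{2\sigma_0^2} - \frac{1}{2},$$
summed over all weights and biases of the network.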
%% Cell type:code id:fbea6b0c tags:
``` python
kl_loss = bnn.BKLLoss(reduction='mean', last_layer_only=False)
```
%% Cell type:code id:b92ed4b0 tags:
``` python
epochs = 500
# for each epoch
for epoch in range(epochs):
    losses = list()
    # for each mini-batch given by the loader:
    for batch in loader:
        # get the input in the mini-batch
        # this has size (B, C)
        # where B is the mini-batch size
        # C is the number of features (1 in this case)
        features = batch["data"]
        # get the targets in the mini-batch (there shall be B of them)
        target = batch["target"]
        # get the output of the neural network:
        prediction = b_network(features)
        # calculate the loss function being minimized
        # in this case, it is the Gaussian negative log-likelihood between the prediction
        # and the target values, added to the Kullback-Leibler divergence between the
        # current weight Gaussians and the prior Gaussian, N(0, 0.1^2)
        nll = b_network.nll(prediction, target)
        prior = kl_loss(b_network)
        loss = nll + weight_kl * prior
        # clean the optimizer temporary gradient storage
        optimizer.zero_grad()
        # calculate the gradient of the loss function with respect to the parameters
        loss.backward()
        # ask the Adam optimizer to change the parameters in the direction of - gradient
        # Adam scales the gradient by a constant which is adaptively tuned
        # take a look at the Adam paper for more details: https://arxiv.org/abs/1412.6980
        optimizer.step()
        ale = b_network.aleatoric_uncertainty().detach().numpy()
        losses.append(loss.detach().cpu().item())
    avg_loss = np.mean(np.array(losses))
    print(f"Epoch {epoch}/{epochs} total: {avg_loss:.5f}, -LL: {nll.item():.5f}, prior: {prior.item():.5f}, aleatoric unc.: {ale:.5f}")
```
%% Output
Epoch 0/500 total: 58.60563, -LL: 29.30297, prior: 0.49401, aleatoric unc.: 1.70738
Epoch 1/500 total: 29.89883, -LL: 22.77377, prior: 0.51578, aleatoric unc.: 1.74675
Epoch 2/500 total: 24.01581, -LL: 19.56905, prior: 0.54058, aleatoric unc.: 1.78531
Epoch 3/500 total: 20.88541, -LL: 15.23512, prior: 0.56686, aleatoric unc.: 1.82458
Epoch 4/500 total: 18.42086, -LL: 12.56114, prior: 0.59138, aleatoric unc.: 1.86375
Epoch 5/500 total: 16.78540, -LL: 11.47957, prior: 0.61184, aleatoric unc.: 1.90320
Epoch 6/500 total: 15.74639, -LL: 10.39981, prior: 0.63168, aleatoric unc.: 1.94387
Epoch 7/500 total: 15.23265, -LL: 10.92122, prior: 0.64618, aleatoric unc.: 1.98676
Epoch 8/500 total: 14.43628, -LL: 14.04801, prior: 0.65764, aleatoric unc.: 2.03084
Epoch 9/500 total: 13.75567, -LL: 9.28419, prior: 0.66495, aleatoric unc.: 2.07617
Epoch 10/500 total: 13.46332, -LL: 9.24567, prior: 0.67066, aleatoric unc.: 2.12401
Epoch 11/500 total: 12.68328, -LL: 8.02405, prior: 0.68236, aleatoric unc.: 2.17244
Epoch 12/500 total: 12.16965, -LL: 10.14399, prior: 0.69100, aleatoric unc.: 2.22149
Epoch 13/500 total: 11.89046, -LL: 10.02692, prior: 0.69707, aleatoric unc.: 2.27265
Epoch 14/500 total: 11.31826, -LL: 8.22698, prior: 0.70348, aleatoric unc.: 2.32503
Epoch 15/500 total: 11.15097, -LL: 8.46218, prior: 0.70463, aleatoric unc.: 2.37917
Epoch 16/500 total: 10.78811, -LL: 6.91465, prior: 0.70900, aleatoric unc.: 2.43505
Epoch 17/500 total: 10.27545, -LL: 7.55249, prior: 0.70822, aleatoric unc.: 2.49083
Epoch 18/500 total: 9.88403, -LL: 7.78374, prior: 0.70896, aleatoric unc.: 2.54750
Epoch 19/500 total: 9.49734, -LL: 6.96424, prior: 0.71104, aleatoric unc.: 2.60481
Epoch 20/500 total: 9.18567, -LL: 6.83609, prior: 0.71782, aleatoric unc.: 2.66279
Epoch 21/500 total: 8.77549, -LL: 6.85847, prior: 0.72051, aleatoric unc.: 2.72092
Epoch 22/500 total: 8.64839, -LL: 6.52611, prior: 0.71736, aleatoric unc.: 2.78174
Epoch 23/500 total: 8.43415, -LL: 7.78695, prior: 0.71795, aleatoric unc.: 2.84376
Epoch 24/500 total: 8.17714, -LL: 6.13595, prior: 0.72104, aleatoric unc.: 2.90688
Epoch 25/500 total: 7.70156, -LL: 6.61542, prior: 0.72217, aleatoric unc.: 2.96854
Epoch 26/500 total: 7.61942, -LL: 6.63054, prior: 0.72667, aleatoric unc.: 3.03244
Epoch 27/500 total: 7.42529, -LL: 5.76781, prior: 0.72993, aleatoric unc.: 3.09773
Epoch 28/500 total: 7.29471, -LL: 6.51807, prior: 0.72927, aleatoric unc.: 3.16509
Epoch 29/500 total: 7.12159, -LL: 5.48303, prior: 0.73176, aleatoric unc.: 3.23374
Epoch 30/500 total: 6.79870, -LL: 6.10557, prior: 0.73237, aleatoric unc.: 3.30124
Epoch 31/500 total: 6.60342, -LL: 5.65072, prior: 0.73511, aleatoric unc.: 3.36955
Epoch 32/500 total: 6.63333, -LL: 6.37231, prior: 0.73616, aleatoric unc.: 3.44177
Epoch 33/500 total: 6.25342, -LL: 5.67089, prior: 0.73999, aleatoric unc.: 3.51138
Epoch 34/500 total: 6.25324, -LL: 5.80523, prior: 0.74006, aleatoric unc.: 3.58479
Epoch 35/500 total: 6.06724, -LL: 4.63822, prior: 0.73928, aleatoric unc.: 3.65814
Epoch 36/500 total: 5.97354, -LL: 4.87066, prior: 0.73962, aleatoric unc.: 3.73306
Epoch 37/500 total: 5.75250, -LL: 5.21252, prior: 0.74361, aleatoric unc.: 3.80782
Epoch 38/500 total: 5.70047, -LL: 4.60235, prior: 0.74538, aleatoric unc.: 3.88453
Epoch 39/500 total: 5.65181, -LL: 5.21456, prior: 0.74527, aleatoric unc.: 3.96423
Epoch 40/500 total: 5.43323, -LL: 4.63033, prior: 0.74861, aleatoric unc.: 4.04155
Epoch 41/500 total: 5.28977, -LL: 5.04850, prior: 0.74891, aleatoric unc.: 4.11915
Epoch 42/500 total: 5.24329, -LL: 5.24680, prior: 0.74535, aleatoric unc.: 4.19924
Epoch 43/500 total: 5.17848, -LL: 4.70368, prior: 0.75037, aleatoric unc.: 4.28055
Epoch 44/500 total: 5.16991, -LL: 4.86729, prior: 0.74977, aleatoric unc.: 4.36667
Epoch 45/500 total: 5.00426, -LL: 4.76213, prior: 0.74627, aleatoric unc.: 4.45042
Epoch 46/500 total: 4.96309, -LL: 4.31775, prior: 0.75200, aleatoric unc.: 4.53622
Epoch 47/500 total: 4.89156, -LL: 4.62450, prior: 0.75400, aleatoric unc.: 4.62413
Epoch 48/500 total: 4.77733, -LL: 4.33423, prior: 0.75599, aleatoric unc.: 4.71083
Epoch 49/500 total: 4.70859, -LL: 4.23383, prior: 0.75542, aleatoric unc.: 4.79768
Epoch 50/500 total: 4.61107, -LL: 4.23617, prior: 0.75896, aleatoric unc.: 4.88461
Epoch 51/500 total: 4.53410, -LL: 4.05732, prior: 0.76156, aleatoric unc.: 4.97112
Epoch 52/500 total: 4.53218, -LL: 4.03225, prior: 0.76646, aleatoric unc.: 5.06118
Epoch 53/500 total: 4.47935, -LL: 4.09978, prior: 0.76704, aleatoric unc.: 5.15256
Epoch 54/500 total: 4.39028, -LL: 3.94386, prior: 0.76755, aleatoric unc.: 5.24289
Epoch 55/500 total: 4.38337, -LL: 4.36773, prior: 0.76699, aleatoric unc.: 5.33596
Epoch 56/500 total: 4.35281, -LL: 4.11247, prior: 0.77196, aleatoric unc.: 5.43080
Epoch 57/500 total: 4.29956, -LL: 3.87602, prior: 0.77093, aleatoric unc.: 5.52630
Epoch 58/500 total: 4.23197, -LL: 4.01298, prior: 0.77028, aleatoric unc.: 5.62054
Epoch 59/500 total: 4.24475, -LL: 3.99289, prior: 0.77126, aleatoric unc.: 5.71866
Epoch 60/500 total: 4.16691, -LL: 3.89405, prior: 0.77055, aleatoric unc.: 5.81574
Epoch 61/500 total: 4.16522, -LL: 3.96328, prior: 0.76699, aleatoric unc.: 5.91432
Epoch 62/500 total: 4.11586, -LL: 3.69259, prior: 0.76921, aleatoric unc.: 6.01400
Epoch 63/500 total: 4.07180, -LL: 3.89622, prior: 0.77077, aleatoric unc.: 6.11181
Epoch 64/500 total: 4.07462, -LL: 3.86786, prior: 0.77294, aleatoric unc.: 6.21290
Epoch 65/500 total: 4.01141, -LL: 3.77488, prior: 0.77207, aleatoric unc.: 6.31128
Epoch 66/500 total: 3.99999, -LL: 3.65617, prior: 0.77586, aleatoric unc.: 6.41138
Epoch 67/500 total: 4.02086, -LL: 3.71232, prior: 0.77457, aleatoric unc.: 6.51647
Epoch 68/500 total: 3.96727, -LL: 3.82445, prior: 0.77131, aleatoric unc.: 6.61852
Epoch 69/500 total: 3.91331, -LL: 3.59059, prior: 0.77492, aleatoric unc.: 6.71576
Epoch 70/500 total: 3.92368, -LL: 3.87038, prior: 0.77663, aleatoric unc.: 6.81692
Epoch 71/500 total: 3.91091, -LL: 3.71611, prior: 0.77434, aleatoric unc.: 6.91915
Epoch 72/500 total: 3.90254, -LL: 3.84058, prior: 0.77568, aleatoric unc.: 7.02362
Epoch 73/500 total: 3.86610, -LL: 3.79804, prior: 0.77613, aleatoric unc.: 7.12259
Epoch 74/500 total: 3.86877, -LL: 3.67038, prior: 0.77967, aleatoric unc.: 7.22509
Epoch 75/500 total: 3.83692, -LL: 3.71282, prior: 0.78043, aleatoric unc.: 7.32391
Epoch 76/500 total: 3.84445, -LL: 3.81955, prior: 0.77407, aleatoric unc.: 7.42552
Epoch 77/500 total: 3.82880, -LL: 3.67313, prior: 0.77377, aleatoric unc.: 7.52656
Epoch 78/500 total: 3.81437, -LL: 3.66513, prior: 0.77432, aleatoric unc.: 7.62591
Epoch 79/500 total: 3.78821, -LL: 3.75625, prior: 0.77875, aleatoric unc.: 7.71907
Epoch 80/500 total: 3.83347, -LL: 3.61354, prior: 0.77968, aleatoric unc.: 7.82734
Epoch 81/500 total: 3.78359, -LL: 3.59515, prior: 0.77631, aleatoric unc.: 7.92300
Epoch 82/500 total: 3.78367, -LL: 3.67180, prior: 0.77953, aleatoric unc.: 8.01834
Epoch 83/500 total: 3.75569, -LL: 3.61895, prior: 0.78101, aleatoric unc.: 8.10435
Epoch 84/500 total: 3.76833, -LL: 3.55765, prior: 0.78393, aleatoric unc.: 8.19697
Epoch 85/500 total: 3.76053, -LL: 3.68451, prior: 0.78236, aleatoric unc.: 8.28731
Epoch 86/500 total: 3.76217, -LL: 3.74953, prior: 0.78270, aleatoric unc.: 8.37660
Epoch 87/500 total: 3.73655, -LL: 3.64136, prior: 0.77914, aleatoric unc.: 8.45761
Epoch 88/500 total: 3.75396, -LL: 3.62085, prior: 0.77929, aleatoric unc.: 8.54510
Epoch 89/500 total: 3.74527, -LL: 3.68434, prior: 0.77997, aleatoric unc.: 8.62806
Epoch 90/500 total: 3.75304, -LL: 3.66249, prior: 0.77914, aleatoric unc.: 8.71503
Epoch 91/500 total: 3.72325, -LL: 3.60384, prior: 0.77980, aleatoric unc.: 8.78583
Epoch 92/500 total: 3.74192, -LL: 3.59396, prior: 0.78044, aleatoric unc.: 8.86492
Epoch 93/500 total: 3.71866, -LL: 3.61358, prior: 0.78299, aleatoric unc.: 8.93002
Epoch 94/500 total: 3.72444, -LL: 3.58171, prior: 0.78235, aleatoric unc.: 8.99536
Epoch 95/500 total: 3.72439, -LL: 3.60700, prior: 0.77858, aleatoric unc.: 9.05974
Epoch 96/500 total: 3.73727, -LL: 3.67353, prior: 0.77897, aleatoric unc.: 9.13145
Epoch 97/500 total: 3.73587, -LL: 3.65088, prior: 0.78041, aleatoric unc.: 9.19960
Epoch 98/500 total: 3.72790, -LL: 3.58224, prior: 0.78028, aleatoric unc.: 9.25964
Epoch 99/500 total: 3.71771, -LL: 3.66020, prior: 0.77745, aleatoric unc.: 9.31000
Epoch 100/500 total: 3.72549, -LL: 3.62102, prior: 0.77761, aleatoric unc.: 9.36518
Epoch 101/500 total: 3.71956, -LL: 3.57595, prior: 0.77925, aleatoric unc.: 9.41201
Epoch 102/500 total: 3.72677, -LL: 3.57028, prior: 0.77652, aleatoric unc.: 9.46463
Epoch 103/500 total: 3.72387, -LL: 3.65744, prior: 0.77754, aleatoric unc.: 9.50802
Epoch 104/500 total: 3.71402, -LL: 3.60348, prior: 0.77234, aleatoric unc.: 9.54170
Epoch 105/500 total: 3.72198, -LL: 3.63285, prior: 0.77360, aleatoric unc.: 9.58104
Epoch 106/500 total: 3.71737, -LL: 3.64993, prior: 0.77597, aleatoric unc.: 9.61322
Epoch 107/500 total: 3.71005, -LL: 3.56981, prior: 0.77387, aleatoric unc.: 9.63459
Epoch 108/500 total: 3.71906, -LL: 3.58226, prior: 0.77644, aleatoric unc.: 9.66549
Epoch 109/500 total: 3.72139, -LL: 3.63789, prior: 0.77184, aleatoric unc.: 9.69481
Epoch 110/500 total: 3.70912, -LL: 3.60181, prior: 0.77226, aleatoric unc.: 9.71032
Epoch 111/500 total: 3.71838, -LL: 3.63031, prior: 0.77578, aleatoric unc.: 9.73120
Epoch 112/500 total: 3.71506, -LL: 3.61052, prior: 0.77374, aleatoric unc.: 9.74926
Epoch 113/500 total: 3.71562, -LL: 3.72471, prior: 0.77285, aleatoric unc.: 9.76058
Epoch 114/500 total: 3.71990, -LL: 3.74557, prior: 0.76846, aleatoric unc.: 9.78141
Epoch 115/500 total: 3.72581, -LL: 3.67777, prior: 0.76765, aleatoric unc.: 9.80824
Epoch 116/500 total: 3.72395, -LL: 3.66619, prior: 0.76687, aleatoric unc.: 9.82927
Epoch 117/500 total: 3.72196, -LL: 3.67000, prior: 0.76839, aleatoric unc.: 9.84076
Epoch 118/500 total: 3.70945, -LL: 3.61228, prior: 0.76902, aleatoric unc.: 9.84046
Epoch 119/500 total: 3.71781, -LL: 3.57591, prior: 0.77380, aleatoric unc.: 9.84767
Epoch 120/500 total: 3.70874, -LL: 3.66554, prior: 0.77551, aleatoric unc.: 9.83665
Epoch 121/500 total: 3.72752, -LL: 3.65136, prior: 0.77845, aleatoric unc.: 9.86153
Epoch 122/500 total: 3.73040, -LL: 3.62794, prior: 0.77837, aleatoric unc.: 9.88645
Epoch 123/500 total: 3.71462, -LL: 3.68275, prior: 0.78350, aleatoric unc.: 9.88238
Epoch 124/500 total: 3.71511, -LL: 3.59032, prior: 0.78381, aleatoric unc.: 9.87697
Epoch 125/500 total: 3.72409, -LL: 3.58439, prior: 0.78439, aleatoric unc.: 9.89114
Epoch 126/500 total: 3.71711, -LL: 3.66987, prior: 0.78606, aleatoric unc.: 9.88819
Epoch 127/500 total: 3.71188, -LL: 3.63467, prior: 0.78302, aleatoric unc.: 9.87889
Epoch 128/500 total: 3.72376, -LL: 3.58284, prior: 0.78485, aleatoric unc.: 9.88960
Epoch 129/500 total: 3.71365, -LL: 3.66670, prior: 0.78087, aleatoric unc.: 9.88381
Epoch 130/500 total: 3.72456, -LL: 3.66885, prior: 0.78370, aleatoric unc.: 9.89316
Epoch 131/500 total: 3.71323, -LL: 3.62470, prior: 0.78227, aleatoric unc.: 9.88930
Epoch 132/500 total: 3.71856, -LL: 3.66937, prior: 0.78154, aleatoric unc.: 9.88618
Epoch 133/500 total: 3.71670, -LL: 3.64594, prior: 0.77825, aleatoric unc.: 9.88424
Epoch 134/500 total: 3.72210, -LL: 3.60950, prior: 0.77902, aleatoric unc.: 9.89171
Epoch 135/500 total: 3.71507, -LL: 3.59651, prior: 0.77877, aleatoric unc.: 9.88918
Epoch 136/500 total: 3.71900, -LL: 3.65212, prior: 0.78622, aleatoric unc.: 9.89051
Epoch 137/500 total: 3.71307, -LL: 3.61388, prior: 0.78563, aleatoric unc.: 9.88130
Epoch 138/500 total: 3.71602, -LL: 3.60536, prior: 0.78637, aleatoric unc.: 9.88007
Epoch 139/500 total: 3.72967, -LL: 3.64175, prior: 0.78606, aleatoric unc.: 9.89858
Epoch 140/500 total: 3.71805, -LL: 3.57543, prior: 0.78286, aleatoric unc.: 9.90221
Epoch 141/500 total: 3.71457, -LL: 3.60350, prior: 0.78505, aleatoric unc.: 9.89543
Epoch 142/500 total: 3.71818, -LL: 3.66506, prior: 0.79071, aleatoric unc.: 9.89198
Epoch 143/500 total: 3.70745, -LL: 3.66879, prior: 0.79167, aleatoric unc.: 9.87098
Epoch 144/500 total: 3.71752, -LL: 3.59973, prior: 0.79228, aleatoric unc.: 9.87335
Epoch 145/500 total: 3.71323, -LL: 3.68346, prior: 0.78821, aleatoric unc.: 9.86594
Epoch 146/500 total: 3.69810, -LL: 3.62257, prior: 0.79184, aleatoric unc.: 9.83553
Epoch 147/500 total: 3.70289, -LL: 3.60558, prior: 0.79325, aleatoric unc.: 9.81485
Epoch 148/500 total: 3.71252, -LL: 3.67221, prior: 0.79652, aleatoric unc.: 9.81311
Epoch 149/500 total: 3.73219, -LL: 3.57769, prior: 0.79471, aleatoric unc.: 9.85562
Epoch 150/500 total: 3.72208, -LL: 3.67267, prior: 0.78948, aleatoric unc.: 9.87345
Epoch 151/500 total: 3.71389, -LL: 3.65466, prior: 0.79006, aleatoric unc.: 9.87153
Epoch 152/500 total: 3.72419, -LL: 3.67915, prior: 0.79090, aleatoric unc.: 9.87917
Epoch 153/500 total: 3.71665, -LL: 3.62692, prior: 0.79997, aleatoric unc.: 9.88276
Epoch 154/500 total: 3.72019, -LL: 3.58589, prior: 0.79751, aleatoric unc.: 9.88690
Epoch 155/500 total: 3.71445, -LL: 3.61408, prior: 0.80268, aleatoric unc.: 9.88311
Epoch 156/500 total: 3.70221, -LL: 3.68263, prior: 0.80404, aleatoric unc.: 9.85331
Epoch 157/500 total: 3.70698, -LL: 3.70702, prior: 0.80339, aleatoric unc.: 9.83723
Epoch 158/500 total: 3.72310, -LL: 3.59278, prior: 0.80445, aleatoric unc.: 9.85699
Epoch 159/500 total: 3.71378, -LL: 3.73156, prior: 0.80416, aleatoric unc.: 9.85512
Epoch 160/500 total: 3.70706, -LL: 3.63655, prior: 0.80595, aleatoric unc.: 9.84014
Epoch 161/500 total: 3.71271, -LL: 3.57877, prior: 0.80410, aleatoric unc.: 9.84335
Epoch 162/500 total: 3.71768, -LL: 3.63120, prior: 0.80659, aleatoric unc.: 9.85032
Epoch 163/500 total: 3.71903, -LL: 3.65138, prior: 0.80638, aleatoric unc.: 9.85763
Epoch 164/500 total: 3.71713, -LL: 3.64704, prior: 0.80705, aleatoric unc.: 9.86186
Epoch 165/500 total: 3.71419, -LL: 3.64462, prior: 0.80874, aleatoric unc.: 9.86155
Epoch 166/500 total: 3.70272, -LL: 3.60857, prior: 0.80695, aleatoric unc.: 9.83837
Epoch 167/500 total: 3.71688, -LL: 3.60935, prior: 0.81090, aleatoric unc.: 9.84529
Epoch 168/500 total: 3.70292, -LL: 3.60577, prior: 0.80637, aleatoric unc.: 9.82553
Epoch 169/500 total: 3.71970, -LL: 3.63841, prior: 0.80866, aleatoric unc.: 9.83715
Epoch 170/500 total: 3.71758, -LL: 3.67275, prior: 0.81033, aleatoric unc.: 9.84604
Epoch 171/500 total: 3.72500, -LL: 3.59006, prior: 0.81392, aleatoric unc.: 9.86759
Epoch 172/500 total: 3.70126, -LL: 3.64450, prior: 0.81116, aleatoric unc.: 9.84247
Epoch 173/500 total: 3.72101, -LL: 3.64824, prior: 0.81156, aleatoric unc.: 9.84976
Epoch 174/500 total: 3.71152, -LL: 3.69934, prior: 0.81262, aleatoric unc.: 9.84919
Epoch 175/500 total: 3.71467, -LL: 3.68428, prior: 0.81517, aleatoric unc.: 9.85176
Epoch 176/500 total: 3.70291, -LL: 3.63773, prior: 0.81583, aleatoric unc.: 9.83082
Epoch 177/500 total: 3.70896, -LL: 3.66547, prior: 0.81045, aleatoric unc.: 9.82290
Epoch 178/500 total: 3.71117, -LL: 3.68764, prior: 0.81337, aleatoric unc.: 9.82000
Epoch 179/500 total: 3.70928, -LL: 3.61341, prior: 0.81607, aleatoric unc.: 9.81955
Epoch 180/500 total: 3.70466, -LL: 3.65554, prior: 0.81575, aleatoric unc.: 9.80751
Epoch 181/500 total: 3.70420, -LL: 3.64272, prior: 0.81693, aleatoric unc.: 9.79344
Epoch 182/500 total: 3.71046, -LL: 3.64984, prior: 0.81584, aleatoric unc.: 9.79500
Epoch 183/500 total: 3.71474, -LL: 3.66882, prior: 0.81325, aleatoric unc.: 9.80760
Epoch 184/500 total: 3.71527, -LL: 3.61031, prior: 0.81348, aleatoric unc.: 9.82014
Epoch 185/500 total: 3.70042, -LL: 3.66272, prior: 0.81257, aleatoric unc.: 9.80154
Epoch 186/500 total: 3.71282, -LL: 3.61934, prior: 0.81781, aleatoric unc.: 9.80770
Epoch 187/500 total: 3.71327, -LL: 3.67081, prior: 0.81615, aleatoric unc.: 9.81168
Epoch 188/500 total: 3.70315, -LL: 3.66320, prior: 0.81042, aleatoric unc.: 9.80011
Epoch 189/500 total: 3.69962, -LL: 3.63048, prior: 0.81293, aleatoric unc.: 9.78159
Epoch 190/500 total: 3.71830, -LL: 3.68563, prior: 0.81299, aleatoric unc.: 9.80476
Epoch 191/500 total: 3.71706, -LL: 3.64913, prior: 0.81458, aleatoric unc.: 9.81613
Epoch 192/500 total: 3.70988, -LL: 3.62455, prior: 0.81865, aleatoric unc.: 9.81723
Epoch 193/500 total: 3.70870, -LL: 3.63595, prior: 0.81586, aleatoric unc.: 9.81325
Epoch 194/500 total: 3.71062, -LL: 3.63609, prior: 0.81832, aleatoric unc.: 9.81209
Epoch 195/500 total: 3.72410, -LL: 3.62442, prior: 0.81648, aleatoric unc.: 9.83962
Epoch 196/500 total: 3.71915, -LL: 3.66803, prior: 0.81624, aleatoric unc.: 9.84151
Epoch 197/500 total: 3.71432, -LL: 3.64483, prior: 0.81001, aleatoric unc.: 9.85190
Epoch 198/500 total: 3.71493, -LL: 3.59165, prior: 0.81170, aleatoric unc.: 9.85324
Epoch 199/500 total: 3.70331, -LL: 3.60793, prior: 0.80896, aleatoric unc.: 9.83340
Epoch 200/500 total: 3.70812, -LL: 3.59062, prior: 0.81499, aleatoric unc.: 9.82334
Epoch 201/500 total: 3.71478, -LL: 3.62452, prior: 0.81519, aleatoric unc.: 9.82977
Epoch 202/500 total: 3.71517, -LL: 3.56803, prior: 0.81508, aleatoric unc.: 9.83826
Epoch 203/500 total: 3.69960, -LL: 3.63084, prior: 0.81742, aleatoric unc.: 9.81023
Epoch 204/500 total: 3.70639, -LL: 3.66974, prior: 0.81871, aleatoric unc.: 9.80200
Epoch 205/500 total: 3.70826, -LL: 3.62572, prior: 0.81504, aleatoric unc.: 9.79858
Epoch 206/500 total: 3.71623, -LL: 3.64392, prior: 0.81659, aleatoric unc.: 9.81016
Epoch 207/500 total: 3.71774, -LL: 3.65070, prior: 0.81498, aleatoric unc.: 9.82834
Epoch 208/500 total: 3.69666, -LL: 3.65249, prior: 0.81455, aleatoric unc.: 9.79752
Epoch 209/500 total: 3.70594, -LL: 3.67450, prior: 0.81606, aleatoric unc.: 9.79126
Epoch 210/500 total: 3.69925, -LL: 3.68792, prior: 0.81400, aleatoric unc.: 9.77435
Epoch 211/500 total: 3.70082, -LL: 3.60530, prior: 0.81230, aleatoric unc.: 9.76606
Epoch 212/500 total: 3.71195, -LL: 3.64706, prior: 0.81609, aleatoric unc.: 9.77611
Epoch 213/500 total: 3.70414, -LL: 3.64134, prior: 0.81544, aleatoric unc.: 9.77201
Epoch 214/500 total: 3.71389, -LL: 3.63395, prior: 0.81622, aleatoric unc.: 9.78563
Epoch 215/500 total: 3.71683, -LL: 3.66099, prior: 0.81664, aleatoric unc.: 9.80171
Epoch 216/500 total: 3.72204, -LL: 3.66836, prior: 0.81515, aleatoric unc.: 9.82727
Epoch 217/500 total: 3.70703, -LL: 3.63465, prior: 0.81890, aleatoric unc.: 9.82033
Epoch 218/500 total: 3.70247, -LL: 3.71451, prior: 0.81721, aleatoric unc.: 9.80064
Epoch 219/500 total: 3.71433, -LL: 3.63672, prior: 0.82165, aleatoric unc.: 9.80686
Epoch 220/500 total: 3.70779, -LL: 3.61201, prior: 0.82151, aleatoric unc.: 9.80850
Epoch 221/500 total: 3.72726, -LL: 3.65804, prior: 0.82183, aleatoric unc.: 9.83772
Epoch 222/500 total: 3.71228, -LL: 3.65686, prior: 0.81864, aleatoric unc.: 9.84025
Epoch 223/500 total: 3.72339, -LL: 3.64982, prior: 0.82202, aleatoric unc.: 9.85690
Epoch 224/500 total: 3.70915, -LL: 3.63755, prior: 0.81988, aleatoric unc.: 9.85046
Epoch 225/500 total: 3.70552, -LL: 3.63990, prior: 0.82102, aleatoric unc.: 9.82943
Epoch 226/500 total: 3.70323, -LL: 3.64443, prior: 0.82060, aleatoric unc.: 9.81781
Epoch 227/500 total: 3.70019, -LL: 3.65555, prior: 0.82249, aleatoric unc.: 9.79478
Epoch 228/500 total: 3.71216, -LL: 3.69579, prior: 0.82333, aleatoric unc.: 9.79841
Epoch 229/500 total: 3.71140, -LL: 3.62137, prior: 0.82447, aleatoric unc.: 9.80692
Epoch 230/500 total: 3.70575, -LL: 3.64446, prior: 0.82529, aleatoric unc.: 9.79880
Epoch 231/500 total: 3.71456, -LL: 3.75829, prior: 0.82439, aleatoric unc.: 9.80728
Epoch 232/500 total: 3.70966, -LL: 3.63882, prior: 0.82492, aleatoric unc.: 9.80583
Epoch 233/500 total: 3.71351, -LL: 3.61270, prior: 0.82613, aleatoric unc.: 9.81784
Epoch 234/500 total: 3.70172, -LL: 3.62794, prior: 0.81822, aleatoric unc.: 9.79805
Epoch 235/500 total: 3.71109, -LL: 3.64488, prior: 0.82053, aleatoric unc.: 9.80066
Epoch 236/500 total: 3.70431, -LL: 3.62616, prior: 0.82161, aleatoric unc.: 9.78883
Epoch 237/500 total: 3.70279, -LL: 3.68337, prior: 0.82063, aleatoric unc.: 9.77969
Epoch 238/500 total: 3.71613, -LL: 3.67027, prior: 0.82166, aleatoric unc.: 9.79706
Epoch 239/500 total: 3.70064, -LL: 3.66745, prior: 0.82222, aleatoric unc.: 9.78118
Epoch 240/500 total: 3.70537, -LL: 3.60819, prior: 0.82710, aleatoric unc.: 9.77992
Epoch 241/500 total: 3.71476, -LL: 3.61006, prior: 0.82356, aleatoric unc.: 9.79421
Epoch 242/500 total: 3.71145, -LL: 3.64876, prior: 0.82434, aleatoric unc.: 9.79779
Epoch 243/500 total: 3.70802, -LL: 3.58574, prior: 0.82809, aleatoric unc.: 9.79796
Epoch 244/500 total: 3.70571, -LL: 3.65710, prior: 0.82588, aleatoric unc.: 9.79101
Epoch 245/500 total: 3.70188, -LL: 3.62964, prior: 0.82486, aleatoric unc.: 9.77658
Epoch 246/500 total: 3.72030, -LL: 3.63991, prior: 0.82122, aleatoric unc.: 9.80154
Epoch 247/500 total: 3.71523, -LL: 3.61676, prior: 0.82447, aleatoric unc.: 9.81099
Epoch 248/500 total: 3.71168, -LL: 3.59765, prior: 0.82207, aleatoric unc.: 9.81821
Epoch 249/500 total: 3.71399, -LL: 3.63747, prior: 0.82239, aleatoric unc.: 9.82253
Epoch 250/500 total: 3.70465, -LL: 3.64624, prior: 0.82184, aleatoric unc.: 9.81008
Epoch 251/500 total: 3.69921, -LL: 3.60426, prior: 0.82150, aleatoric unc.: 9.78902
Epoch 252/500 total: 3.71118, -LL: 3.67592, prior: 0.82288, aleatoric unc.: 9.79504
Epoch 253/500 total: 3.70237, -LL: 3.62898, prior: 0.82680, aleatoric unc.: 9.78291
Epoch 254/500 total: 3.70794, -LL: 3.64745, prior: 0.82541, aleatoric unc.: 9.78084
Epoch 255/500 total: 3.70302, -LL: 3.62780, prior: 0.82535, aleatoric unc.: 9.77530
Epoch 256/500 total: 3.71729, -LL: 3.63795, prior: 0.82601, aleatoric unc.: 9.79276
Epoch 257/500 total: 3.70993, -LL: 3.63274, prior: 0.82741, aleatoric unc.: 9.79693
Epoch 258/500 total: 3.71243, -LL: 3.65305, prior: 0.82601, aleatoric unc.: 9.80389
Epoch 259/500 total: 3.70296, -LL: 3.64940, prior: 0.82313, aleatoric unc.: 9.79133
Epoch 260/500 total: 3.71093, -LL: 3.59715, prior: 0.82558, aleatoric unc.: 9.79233
Epoch 261/500 total: 3.71045, -LL: 3.63395, prior: 0.82771, aleatoric unc.: 9.79831
Epoch 262/500 total: 3.69963, -LL: 3.67210, prior: 0.82712, aleatoric unc.: 9.77961
Epoch 263/500 total: 3.71113, -LL: 3.65016, prior: 0.82726, aleatoric unc.: 9.78695
Epoch 264/500 total: 3.70665, -LL: 3.60633, prior: 0.83064, aleatoric unc.: 9.78443
Epoch 265/500 total: 3.71054, -LL: 3.65116, prior: 0.83137, aleatoric unc.: 9.79059
Epoch 266/500 total: 3.69886, -LL: 3.60721, prior: 0.83100, aleatoric unc.: 9.77077
Epoch 267/500 total: 3.70686, -LL: 3.65023, prior: 0.83398, aleatoric unc.: 9.77222
Epoch 268/500 total: 3.69995, -LL: 3.71522, prior: 0.83159, aleatoric unc.: 9.75908
Epoch 269/500 total: 3.70539, -LL: 3.64799, prior: 0.83221, aleatoric unc.: 9.75858
Epoch 270/500 total: 3.70420, -LL: 3.63578, prior: 0.83041, aleatoric unc.: 9.75445
Epoch 271/500 total: 3.70999, -LL: 3.68182, prior: 0.82908, aleatoric unc.: 9.76730
Epoch 272/500 total: 3.70793, -LL: 3.63381, prior: 0.83376, aleatoric unc.: 9.76915
Epoch 273/500 total: 3.70842, -LL: 3.64926, prior: 0.83353, aleatoric unc.: 9.77610
Epoch 274/500 total: 3.70973, -LL: 3.62482, prior: 0.83395, aleatoric unc.: 9.77950
Epoch 275/500 total: 3.71090, -LL: 3.60236, prior: 0.83533, aleatoric unc.: 9.78900
Epoch 276/500 total: 3.70221, -LL: 3.67710, prior: 0.83294, aleatoric unc.: 9.77734
Epoch 277/500 total: 3.70494, -LL: 3.64316, prior: 0.83875, aleatoric unc.: 9.77393
Epoch 278/500 total: 3.71155, -LL: 3.61530, prior: 0.83769, aleatoric unc.: 9.78493
Epoch 279/500 total: 3.72149, -LL: 3.63981, prior: 0.83667, aleatoric unc.: 9.81001
Epoch 280/500 total: 3.70467, -LL: 3.62610, prior: 0.83533, aleatoric unc.: 9.80246
Epoch 281/500 total: 3.70502, -LL: 3.66885, prior: 0.83573, aleatoric unc.: 9.79123
Epoch 282/500 total: 3.70405, -LL: 3.63700, prior: 0.83748, aleatoric unc.: 9.78283
Epoch 283/500 total: 3.70889, -LL: 3.63902, prior: 0.83531, aleatoric unc.: 9.78254
Epoch 284/500 total: 3.71741, -LL: 3.60456, prior: 0.83570, aleatoric unc.: 9.80172
Epoch 285/500 total: 3.71138, -LL: 3.68161, prior: 0.83478, aleatoric unc.: 9.80546
Epoch 286/500 total: 3.70675, -LL: 3.59507, prior: 0.83560, aleatoric unc.: 9.80310
Epoch 287/500 total: 3.70698, -LL: 3.61376, prior: 0.83594, aleatoric unc.: 9.79633
Epoch 288/500 total: 3.70635, -LL: 3.60290, prior: 0.83656, aleatoric unc.: 9.78862
Epoch 289/500 total: 3.71560, -LL: 3.64837, prior: 0.83419, aleatoric unc.: 9.80237
Epoch 290/500 total: 3.70476, -LL: 3.65473, prior: 0.83240, aleatoric unc.: 9.79643
Epoch 291/500 total: 3.70503, -LL: 3.67415, prior: 0.83116, aleatoric unc.: 9.78742
Epoch 292/500 total: 3.69757, -LL: 3.58311, prior: 0.82735, aleatoric unc.: 9.77046
Epoch 293/500 total: 3.70371, -LL: 3.64588, prior: 0.82783, aleatoric unc.: 9.76296
Epoch 294/500 total: 3.70312, -LL: 3.62281, prior: 0.83122, aleatoric unc.: 9.76033
Epoch 295/500 total: 3.70929, -LL: 3.64419, prior: 0.83021, aleatoric unc.: 9.76300
Epoch 296/500 total: 3.70208, -LL: 3.64338, prior: 0.83049, aleatoric unc.: 9.76104
Epoch 297/500 total: 3.70218, -LL: 3.61723, prior: 0.83026, aleatoric unc.: 9.75453
Epoch 298/500 total: 3.70252, -LL: 3.64148, prior: 0.83147, aleatoric unc.: 9.74977
Epoch 299/500 total: 3.70108, -LL: 3.68672, prior: 0.83107, aleatoric unc.: 9.74179
Epoch 300/500 total: 3.70634, -LL: 3.67040, prior: 0.82994, aleatoric unc.: 9.75107
Epoch 301/500 total: 3.69355, -LL: 3.60485, prior: 0.82971, aleatoric unc.: 9.73195
Epoch 302/500 total: 3.70322, -LL: 3.63580, prior: 0.83051, aleatoric unc.: 9.73157
Epoch 303/500 total: 3.70026, -LL: 3.70596, prior: 0.82929, aleatoric unc.: 9.72599
Epoch 304/500 total: 3.70397, -LL: 3.65822, prior: 0.83103, aleatoric unc.: 9.72934
Epoch 305/500 total: 3.70864, -LL: 3.56731, prior: 0.83211, aleatoric unc.: 9.74369
Epoch 306/500 total: 3.70876, -LL: 3.59099, prior: 0.83176, aleatoric unc.: 9.75274
Epoch 307/500 total: 3.70129, -LL: 3.63883, prior: 0.82881, aleatoric unc.: 9.74951
Epoch 308/500 total: 3.70983, -LL: 3.63237, prior: 0.82886, aleatoric unc.: 9.75686
Epoch 309/500 total: 3.70741, -LL: 3.60030, prior: 0.83055, aleatoric unc.: 9.76448
Epoch 310/500 total: 3.70709, -LL: 3.59931, prior: 0.83434, aleatoric unc.: 9.76765
Epoch 311/500 total: 3.69747, -LL: 3.63639, prior: 0.83337, aleatoric unc.: 9.74833
Epoch 312/500 total: 3.71152, -LL: 3.62594, prior: 0.83660, aleatoric unc.: 9.76304 Epoch 312/500 total: 3.71152, -LL: 3.62594, prior: 0.83660, aleatoric unc.: 9.76304
Epoch 313/500 total: 3.70416, -LL: 3.64673, prior: 0.83517, aleatoric unc.: 9.76183 Epoch 313/500 total: 3.70416, -LL: 3.64673, prior: 0.83517, aleatoric unc.: 9.76183
Epoch 314/500 total: 3.69919, -LL: 3.58278, prior: 0.83536, aleatoric unc.: 9.75043 Epoch 314/500 total: 3.69919, -LL: 3.58278, prior: 0.83536, aleatoric unc.: 9.75043
Epoch 315/500 total: 3.70675, -LL: 3.60159, prior: 0.83583, aleatoric unc.: 9.75515 Epoch 315/500 total: 3.70675, -LL: 3.60159, prior: 0.83583, aleatoric unc.: 9.75515
Epoch 316/500 total: 3.70445, -LL: 3.64417, prior: 0.83832, aleatoric unc.: 9.75135 Epoch 316/500 total: 3.70445, -LL: 3.64417, prior: 0.83832, aleatoric unc.: 9.75135
Epoch 317/500 total: 3.70155, -LL: 3.64957, prior: 0.83247, aleatoric unc.: 9.74824 Epoch 317/500 total: 3.70155, -LL: 3.64957, prior: 0.83247, aleatoric unc.: 9.74824
Epoch 318/500 total: 3.70251, -LL: 3.61336, prior: 0.83256, aleatoric unc.: 9.74411 Epoch 318/500 total: 3.70251, -LL: 3.61336, prior: 0.83256, aleatoric unc.: 9.74411
Epoch 319/500 total: 3.70864, -LL: 3.61781, prior: 0.83391, aleatoric unc.: 9.75591 Epoch 319/500 total: 3.70864, -LL: 3.61781, prior: 0.83391, aleatoric unc.: 9.75591
Epoch 320/500 total: 3.69483, -LL: 3.62589, prior: 0.83023, aleatoric unc.: 9.73673 Epoch 320/500 total: 3.69483, -LL: 3.62589, prior: 0.83023, aleatoric unc.: 9.73673
Epoch 321/500 total: 3.70477, -LL: 3.65918, prior: 0.83147, aleatoric unc.: 9.73801 Epoch 321/500 total: 3.70477, -LL: 3.65918, prior: 0.83147, aleatoric unc.: 9.73801
Epoch 322/500 total: 3.71006, -LL: 3.68025, prior: 0.83056, aleatoric unc.: 9.75054 Epoch 322/500 total: 3.71006, -LL: 3.68025, prior: 0.83056, aleatoric unc.: 9.75054
Epoch 323/500 total: 3.70612, -LL: 3.63204, prior: 0.82882, aleatoric unc.: 9.75776 Epoch 323/500 total: 3.70612, -LL: 3.63204, prior: 0.82882, aleatoric unc.: 9.75776
Epoch 324/500 total: 3.70220, -LL: 3.63502, prior: 0.82885, aleatoric unc.: 9.75287 Epoch 324/500 total: 3.70220, -LL: 3.63502, prior: 0.82885, aleatoric unc.: 9.75287
Epoch 325/500 total: 3.70991, -LL: 3.67634, prior: 0.82949, aleatoric unc.: 9.75935 Epoch 325/500 total: 3.70991, -LL: 3.67634, prior: 0.82949, aleatoric unc.: 9.75935
Epoch 326/500 total: 3.70031, -LL: 3.67920, prior: 0.83210, aleatoric unc.: 9.74968 Epoch 326/500 total: 3.70031, -LL: 3.67920, prior: 0.83210, aleatoric unc.: 9.74968
Epoch 327/500 total: 3.71157, -LL: 3.65393, prior: 0.83033, aleatoric unc.: 9.76567 Epoch 327/500 total: 3.71157, -LL: 3.65393, prior: 0.83033, aleatoric unc.: 9.76567
Epoch 328/500 total: 3.70492, -LL: 3.64040, prior: 0.82936, aleatoric unc.: 9.76520 Epoch 328/500 total: 3.70492, -LL: 3.64040, prior: 0.82936, aleatoric unc.: 9.76520
Epoch 329/500 total: 3.70176, -LL: 3.60859, prior: 0.83023, aleatoric unc.: 9.75874 Epoch 329/500 total: 3.70176, -LL: 3.60859, prior: 0.83023, aleatoric unc.: 9.75874
Epoch 330/500 total: 3.70091, -LL: 3.61544, prior: 0.83437, aleatoric unc.: 9.74670 Epoch 330/500 total: 3.70091, -LL: 3.61544, prior: 0.83437, aleatoric unc.: 9.74670
Epoch 331/500 total: 3.70303, -LL: 3.65750, prior: 0.83323, aleatoric unc.: 9.74530 Epoch 331/500 total: 3.70303, -LL: 3.65750, prior: 0.83323, aleatoric unc.: 9.74530
Epoch 332/500 total: 3.70879, -LL: 3.59750, prior: 0.82948, aleatoric unc.: 9.75656 Epoch 332/500 total: 3.70879, -LL: 3.59750, prior: 0.82948, aleatoric unc.: 9.75656
Epoch 333/500 total: 3.70656, -LL: 3.65474, prior: 0.82868, aleatoric unc.: 9.76092 Epoch 333/500 total: 3.70656, -LL: 3.65474, prior: 0.82868, aleatoric unc.: 9.76092
Epoch 334/500 total: 3.69965, -LL: 3.63443, prior: 0.83218, aleatoric unc.: 9.74758 Epoch 334/500 total: 3.69965, -LL: 3.63443, prior: 0.83218, aleatoric unc.: 9.74758
Epoch 335/500 total: 3.70277, -LL: 3.63288, prior: 0.83110, aleatoric unc.: 9.74762 Epoch 335/500 total: 3.70277, -LL: 3.63288, prior: 0.83110, aleatoric unc.: 9.74762
Epoch 336/500 total: 3.70189, -LL: 3.66539, prior: 0.83201, aleatoric unc.: 9.74239 Epoch 336/500 total: 3.70189, -LL: 3.66539, prior: 0.83201, aleatoric unc.: 9.74239
Epoch 337/500 total: 3.69537, -LL: 3.65592, prior: 0.82954, aleatoric unc.: 9.72731 Epoch 337/500 total: 3.69537, -LL: 3.65592, prior: 0.82954, aleatoric unc.: 9.72731
Epoch 338/500 total: 3.69802, -LL: 3.60642, prior: 0.82822, aleatoric unc.: 9.72182 Epoch 338/500 total: 3.69802, -LL: 3.60642, prior: 0.82822, aleatoric unc.: 9.72182
Epoch 339/500 total: 3.71005, -LL: 3.67063, prior: 0.82849, aleatoric unc.: 9.73660 Epoch 339/500 total: 3.71005, -LL: 3.67063, prior: 0.82849, aleatoric unc.: 9.73660
Epoch 340/500 total: 3.70104, -LL: 3.60019, prior: 0.83237, aleatoric unc.: 9.73406 Epoch 340/500 total: 3.70104, -LL: 3.60019, prior: 0.83237, aleatoric unc.: 9.73406
Epoch 341/500 total: 3.70567, -LL: 3.62259, prior: 0.83000, aleatoric unc.: 9.74208 Epoch 341/500 total: 3.70567, -LL: 3.62259, prior: 0.83000, aleatoric unc.: 9.74208
Epoch 342/500 total: 3.70195, -LL: 3.65865, prior: 0.82917, aleatoric unc.: 9.73657 Epoch 342/500 total: 3.70195, -LL: 3.65865, prior: 0.82917, aleatoric unc.: 9.73657
Epoch 343/500 total: 3.70123, -LL: 3.62708, prior: 0.83080, aleatoric unc.: 9.73533 Epoch 343/500 total: 3.70123, -LL: 3.62708, prior: 0.83080, aleatoric unc.: 9.73533
Epoch 344/500 total: 3.70437, -LL: 3.61342, prior: 0.82911, aleatoric unc.: 9.73596 Epoch 344/500 total: 3.70437, -LL: 3.61342, prior: 0.82911, aleatoric unc.: 9.73596
Epoch 345/500 total: 3.70366, -LL: 3.60328, prior: 0.83045, aleatoric unc.: 9.74204 Epoch 345/500 total: 3.70366, -LL: 3.60328, prior: 0.83045, aleatoric unc.: 9.74204
Epoch 346/500 total: 3.70356, -LL: 3.62689, prior: 0.83147, aleatoric unc.: 9.73873 Epoch 346/500 total: 3.70356, -LL: 3.62689, prior: 0.83147, aleatoric unc.: 9.73873
Epoch 347/500 total: 3.69629, -LL: 3.64269, prior: 0.82647, aleatoric unc.: 9.72898 Epoch 347/500 total: 3.69629, -LL: 3.64269, prior: 0.82647, aleatoric unc.: 9.72898
Epoch 348/500 total: 3.69996, -LL: 3.65626, prior: 0.82871, aleatoric unc.: 9.72221 Epoch 348/500 total: 3.69996, -LL: 3.65626, prior: 0.82871, aleatoric unc.: 9.72221
Epoch 349/500 total: 3.70417, -LL: 3.63705, prior: 0.82662, aleatoric unc.: 9.72849 Epoch 349/500 total: 3.70417, -LL: 3.63705, prior: 0.82662, aleatoric unc.: 9.72849
Epoch 350/500 total: 3.70177, -LL: 3.65497, prior: 0.82323, aleatoric unc.: 9.72953 Epoch 350/500 total: 3.70177, -LL: 3.65497, prior: 0.82323, aleatoric unc.: 9.72953
Epoch 351/500 total: 3.70254, -LL: 3.67832, prior: 0.82772, aleatoric unc.: 9.73065 Epoch 351/500 total: 3.70254, -LL: 3.67832, prior: 0.82772, aleatoric unc.: 9.73065
Epoch 352/500 total: 3.70162, -LL: 3.63990, prior: 0.83119, aleatoric unc.: 9.72532 Epoch 352/500 total: 3.70162, -LL: 3.63990, prior: 0.83119, aleatoric unc.: 9.72532
Epoch 353/500 total: 3.70578, -LL: 3.60402, prior: 0.83188, aleatoric unc.: 9.73709 Epoch 353/500 total: 3.70578, -LL: 3.60402, prior: 0.83188, aleatoric unc.: 9.73709
Epoch 354/500 total: 3.70272, -LL: 3.62950, prior: 0.82930, aleatoric unc.: 9.73685 Epoch 354/500 total: 3.70272, -LL: 3.62950, prior: 0.82930, aleatoric unc.: 9.73685
Epoch 355/500 total: 3.70043, -LL: 3.62746, prior: 0.82995, aleatoric unc.: 9.73355 Epoch 355/500 total: 3.70043, -LL: 3.62746, prior: 0.82995, aleatoric unc.: 9.73355
Epoch 356/500 total: 3.69946, -LL: 3.61545, prior: 0.83072, aleatoric unc.: 9.72545 Epoch 356/500 total: 3.69946, -LL: 3.61545, prior: 0.83072, aleatoric unc.: 9.72545
Epoch 357/500 total: 3.70071, -LL: 3.67021, prior: 0.83015, aleatoric unc.: 9.72398 Epoch 357/500 total: 3.70071, -LL: 3.67021, prior: 0.83015, aleatoric unc.: 9.72398
Epoch 358/500 total: 3.70548, -LL: 3.63582, prior: 0.82771, aleatoric unc.: 9.73216 Epoch 358/500 total: 3.70548, -LL: 3.63582, prior: 0.82771, aleatoric unc.: 9.73216
Epoch 359/500 total: 3.69811, -LL: 3.60202, prior: 0.82758, aleatoric unc.: 9.72475 Epoch 359/500 total: 3.69811, -LL: 3.60202, prior: 0.82758, aleatoric unc.: 9.72475
Epoch 360/500 total: 3.69686, -LL: 3.64973, prior: 0.82581, aleatoric unc.: 9.71591 Epoch 360/500 total: 3.69686, -LL: 3.64973, prior: 0.82581, aleatoric unc.: 9.71591
Epoch 361/500 total: 3.70717, -LL: 3.65881, prior: 0.82941, aleatoric unc.: 9.72619 Epoch 361/500 total: 3.70717, -LL: 3.65881, prior: 0.82941, aleatoric unc.: 9.72619
Epoch 362/500 total: 3.70385, -LL: 3.65302, prior: 0.82832, aleatoric unc.: 9.73025 Epoch 362/500 total: 3.70385, -LL: 3.65302, prior: 0.82832, aleatoric unc.: 9.73025
Epoch 363/500 total: 3.70743, -LL: 3.64934, prior: 0.82749, aleatoric unc.: 9.74160 Epoch 363/500 total: 3.70743, -LL: 3.64934, prior: 0.82749, aleatoric unc.: 9.74160
Epoch 364/500 total: 3.70326, -LL: 3.58934, prior: 0.82753, aleatoric unc.: 9.74316 Epoch 364/500 total: 3.70326, -LL: 3.58934, prior: 0.82753, aleatoric unc.: 9.74316
Epoch 365/500 total: 3.70402, -LL: 3.62665, prior: 0.82742, aleatoric unc.: 9.74414 Epoch 365/500 total: 3.70402, -LL: 3.62665, prior: 0.82742, aleatoric unc.: 9.74414
Epoch 366/500 total: 3.70585, -LL: 3.68176, prior: 0.82632, aleatoric unc.: 9.74891 Epoch 366/500 total: 3.70585, -LL: 3.68176, prior: 0.82632, aleatoric unc.: 9.74891
Epoch 367/500 total: 3.70568, -LL: 3.62934, prior: 0.82647, aleatoric unc.: 9.75294 Epoch 367/500 total: 3.70568, -LL: 3.62934, prior: 0.82647, aleatoric unc.: 9.75294
Epoch 368/500 total: 3.69735, -LL: 3.63394, prior: 0.82584, aleatoric unc.: 9.73833 Epoch 368/500 total: 3.69735, -LL: 3.63394, prior: 0.82584, aleatoric unc.: 9.73833
Epoch 369/500 total: 3.70665, -LL: 3.60627, prior: 0.82508, aleatoric unc.: 9.74506 Epoch 369/500 total: 3.70665, -LL: 3.60627, prior: 0.82508, aleatoric unc.: 9.74506
Epoch 370/500 total: 3.70283, -LL: 3.64191, prior: 0.82496, aleatoric unc.: 9.74463 Epoch 370/500 total: 3.70283, -LL: 3.64191, prior: 0.82496, aleatoric unc.: 9.74463
Epoch 371/500 total: 3.69990, -LL: 3.63656, prior: 0.82942, aleatoric unc.: 9.73714 Epoch 371/500 total: 3.69990, -LL: 3.63656, prior: 0.82942, aleatoric unc.: 9.73714
Epoch 372/500 total: 3.70170, -LL: 3.65815, prior: 0.82560, aleatoric unc.: 9.73377 Epoch 372/500 total: 3.70170, -LL: 3.65815, prior: 0.82560, aleatoric unc.: 9.73377
Epoch 373/500 total: 3.70135, -LL: 3.66173, prior: 0.82487, aleatoric unc.: 9.73149 Epoch 373/500 total: 3.70135, -LL: 3.66173, prior: 0.82487, aleatoric unc.: 9.73149
Epoch 374/500 total: 3.70539, -LL: 3.60903, prior: 0.82451, aleatoric unc.: 9.74010 Epoch 374/500 total: 3.70539, -LL: 3.60903, prior: 0.82451, aleatoric unc.: 9.74010
Epoch 375/500 total: 3.69980, -LL: 3.63325, prior: 0.82713, aleatoric unc.: 9.73296 Epoch 375/500 total: 3.69980, -LL: 3.63325, prior: 0.82713, aleatoric unc.: 9.73296
Epoch 376/500 total: 3.69921, -LL: 3.60023, prior: 0.82868, aleatoric unc.: 9.72724 Epoch 376/500 total: 3.69921, -LL: 3.60023, prior: 0.82868, aleatoric unc.: 9.72724
Epoch 377/500 total: 3.69869, -LL: 3.66106, prior: 0.82731, aleatoric unc.: 9.72021 Epoch 377/500 total: 3.69869, -LL: 3.66106, prior: 0.82731, aleatoric unc.: 9.72021
Epoch 378/500 total: 3.69702, -LL: 3.62505, prior: 0.82719, aleatoric unc.: 9.71488 Epoch 378/500 total: 3.69702, -LL: 3.62505, prior: 0.82719, aleatoric unc.: 9.71488
Epoch 379/500 total: 3.69819, -LL: 3.60678, prior: 0.82717, aleatoric unc.: 9.71085 Epoch 379/500 total: 3.69819, -LL: 3.60678, prior: 0.82717, aleatoric unc.: 9.71085
Epoch 380/500 total: 3.69677, -LL: 3.62688, prior: 0.82746, aleatoric unc.: 9.70245 Epoch 380/500 total: 3.69677, -LL: 3.62688, prior: 0.82746, aleatoric unc.: 9.70245
Epoch 381/500 total: 3.70501, -LL: 3.60229, prior: 0.82751, aleatoric unc.: 9.71364 Epoch 381/500 total: 3.70501, -LL: 3.60229, prior: 0.82751, aleatoric unc.: 9.71364
Epoch 382/500 total: 3.69922, -LL: 3.57855, prior: 0.83100, aleatoric unc.: 9.71044 Epoch 382/500 total: 3.69922, -LL: 3.57855, prior: 0.83100, aleatoric unc.: 9.71044
Epoch 383/500 total: 3.69865, -LL: 3.63590, prior: 0.82836, aleatoric unc.: 9.70802 Epoch 383/500 total: 3.69865, -LL: 3.63590, prior: 0.82836, aleatoric unc.: 9.70802
Epoch 384/500 total: 3.70344, -LL: 3.59795, prior: 0.82755, aleatoric unc.: 9.71421 Epoch 384/500 total: 3.70344, -LL: 3.59795, prior: 0.82755, aleatoric unc.: 9.71421
Epoch 385/500 total: 3.70533, -LL: 3.64661, prior: 0.82573, aleatoric unc.: 9.72334 Epoch 385/500 total: 3.70533, -LL: 3.64661, prior: 0.82573, aleatoric unc.: 9.72334
Epoch 386/500 total: 3.70198, -LL: 3.61014, prior: 0.82611, aleatoric unc.: 9.72450 Epoch 386/500 total: 3.70198, -LL: 3.61014, prior: 0.82611, aleatoric unc.: 9.72450
Epoch 387/500 total: 3.70406, -LL: 3.64215, prior: 0.82555, aleatoric unc.: 9.73054 Epoch 387/500 total: 3.70406, -LL: 3.64215, prior: 0.82555, aleatoric unc.: 9.73054
Epoch 388/500 total: 3.70490, -LL: 3.63847, prior: 0.82456, aleatoric unc.: 9.73703 Epoch 388/500 total: 3.70490, -LL: 3.63847, prior: 0.82456, aleatoric unc.: 9.73703
Epoch 389/500 total: 3.69739, -LL: 3.60615, prior: 0.82374, aleatoric unc.: 9.72495 Epoch 389/500 total: 3.69739, -LL: 3.60615, prior: 0.82374, aleatoric unc.: 9.72495
Epoch 390/500 total: 3.70646, -LL: 3.59768, prior: 0.82653, aleatoric unc.: 9.73402 Epoch 390/500 total: 3.70646, -LL: 3.59768, prior: 0.82653, aleatoric unc.: 9.73402
Epoch 391/500 total: 3.69914, -LL: 3.65293, prior: 0.82449, aleatoric unc.: 9.72576 Epoch 391/500 total: 3.69914, -LL: 3.65293, prior: 0.82449, aleatoric unc.: 9.72576
Epoch 392/500 total: 3.70214, -LL: 3.62112, prior: 0.82444, aleatoric unc.: 9.72820 Epoch 392/500 total: 3.70214, -LL: 3.62112, prior: 0.82444, aleatoric unc.: 9.72820
Epoch 393/500 total: 3.70022, -LL: 3.57499, prior: 0.82906, aleatoric unc.: 9.72694 Epoch 393/500 total: 3.70022, -LL: 3.57499, prior: 0.82906, aleatoric unc.: 9.72694
Epoch 394/500 total: 3.69050, -LL: 3.60771, prior: 0.82554, aleatoric unc.: 9.70561 Epoch 394/500 total: 3.69050, -LL: 3.60771, prior: 0.82554, aleatoric unc.: 9.70561
Epoch 395/500 total: 3.69285, -LL: 3.68116, prior: 0.82755, aleatoric unc.: 9.69149 Epoch 395/500 total: 3.69285, -LL: 3.68116, prior: 0.82755, aleatoric unc.: 9.69149
Epoch 396/500 total: 3.70537, -LL: 3.61828, prior: 0.82465, aleatoric unc.: 9.70394 Epoch 396/500 total: 3.70537, -LL: 3.61828, prior: 0.82465, aleatoric unc.: 9.70394
Epoch 397/500 total: 3.69655, -LL: 3.61455, prior: 0.82317, aleatoric unc.: 9.70052 Epoch 397/500 total: 3.69655, -LL: 3.61455, prior: 0.82317, aleatoric unc.: 9.70052
Epoch 398/500 total: 3.70530, -LL: 3.62109, prior: 0.82612, aleatoric unc.: 9.71183 Epoch 398/500 total: 3.70530, -LL: 3.62109, prior: 0.82612, aleatoric unc.: 9.71183
Epoch 399/500 total: 3.70086, -LL: 3.61040, prior: 0.82481, aleatoric unc.: 9.71438 Epoch 399/500 total: 3.70086, -LL: 3.61040, prior: 0.82481, aleatoric unc.: 9.71438
Epoch 400/500 total: 3.70211, -LL: 3.64791, prior: 0.82474, aleatoric unc.: 9.71751 Epoch 400/500 total: 3.70211, -LL: 3.64791, prior: 0.82474, aleatoric unc.: 9.71751
Epoch 401/500 total: 3.70449, -LL: 3.66220, prior: 0.82469, aleatoric unc.: 9.72471 Epoch 401/500 total: 3.70449, -LL: 3.66220, prior: 0.82469, aleatoric unc.: 9.72471
Epoch 402/500 total: 3.69559, -LL: 3.68538, prior: 0.82521, aleatoric unc.: 9.71284 Epoch 402/500 total: 3.69559, -LL: 3.68538, prior: 0.82521, aleatoric unc.: 9.71284
Epoch 403/500 total: 3.70262, -LL: 3.61727, prior: 0.82641, aleatoric unc.: 9.71845 Epoch 403/500 total: 3.70262, -LL: 3.61727, prior: 0.82641, aleatoric unc.: 9.71845
Epoch 404/500 total: 3.70541, -LL: 3.65800, prior: 0.82866, aleatoric unc.: 9.72628 Epoch 404/500 total: 3.70541, -LL: 3.65800, prior: 0.82866, aleatoric unc.: 9.72628
Epoch 405/500 total: 3.70215, -LL: 3.66135, prior: 0.82954, aleatoric unc.: 9.72948 Epoch 405/500 total: 3.70215, -LL: 3.66135, prior: 0.82954, aleatoric unc.: 9.72948
Epoch 406/500 total: 3.69729, -LL: 3.64724, prior: 0.82800, aleatoric unc.: 9.72026 Epoch 406/500 total: 3.69729, -LL: 3.64724, prior: 0.82800, aleatoric unc.: 9.72026
Epoch 407/500 total: 3.69719, -LL: 3.63176, prior: 0.82775, aleatoric unc.: 9.71121 Epoch 407/500 total: 3.69719, -LL: 3.63176, prior: 0.82775, aleatoric unc.: 9.71121
Epoch 408/500 total: 3.70596, -LL: 3.66872, prior: 0.82940, aleatoric unc.: 9.72126 Epoch 408/500 total: 3.70596, -LL: 3.66872, prior: 0.82940, aleatoric unc.: 9.72126
Epoch 409/500 total: 3.70204, -LL: 3.63414, prior: 0.83020, aleatoric unc.: 9.72397 Epoch 409/500 total: 3.70204, -LL: 3.63414, prior: 0.83020, aleatoric unc.: 9.72397
Epoch 410/500 total: 3.70021, -LL: 3.64946, prior: 0.83012, aleatoric unc.: 9.72245 Epoch 410/500 total: 3.70021, -LL: 3.64946, prior: 0.83012, aleatoric unc.: 9.72245
Epoch 411/500 total: 3.69402, -LL: 3.61718, prior: 0.82774, aleatoric unc.: 9.70823 Epoch 411/500 total: 3.69402, -LL: 3.61718, prior: 0.82774, aleatoric unc.: 9.70823
Epoch 412/500 total: 3.70567, -LL: 3.61743, prior: 0.82851, aleatoric unc.: 9.71849 Epoch 412/500 total: 3.70567, -LL: 3.61743, prior: 0.82851, aleatoric unc.: 9.71849
Epoch 413/500 total: 3.70361, -LL: 3.64554, prior: 0.82617, aleatoric unc.: 9.72331 Epoch 413/500 total: 3.70361, -LL: 3.64554, prior: 0.82617, aleatoric unc.: 9.72331
Epoch 414/500 total: 3.69468, -LL: 3.60278, prior: 0.82620, aleatoric unc.: 9.71184 Epoch 414/500 total: 3.69468, -LL: 3.60278, prior: 0.82620, aleatoric unc.: 9.71184
Epoch 415/500 total: 3.70510, -LL: 3.64241, prior: 0.82747, aleatoric unc.: 9.72071 Epoch 415/500 total: 3.70510, -LL: 3.64241, prior: 0.82747, aleatoric unc.: 9.72071
Epoch 416/500 total: 3.70178, -LL: 3.59014, prior: 0.83144, aleatoric unc.: 9.72354 Epoch 416/500 total: 3.70178, -LL: 3.59014, prior: 0.83144, aleatoric unc.: 9.72354
Epoch 417/500 total: 3.69967, -LL: 3.63795, prior: 0.83206, aleatoric unc.: 9.71923 Epoch 417/500 total: 3.69967, -LL: 3.63795, prior: 0.83206, aleatoric unc.: 9.71923
Epoch 418/500 total: 3.69828, -LL: 3.60308, prior: 0.82796, aleatoric unc.: 9.71243 Epoch 418/500 total: 3.69828, -LL: 3.60308, prior: 0.82796, aleatoric unc.: 9.71243
Epoch 419/500 total: 3.70307, -LL: 3.66544, prior: 0.82524, aleatoric unc.: 9.71768 Epoch 419/500 total: 3.70307, -LL: 3.66544, prior: 0.82524, aleatoric unc.: 9.71768
Epoch 420/500 total: 3.70183, -LL: 3.61180, prior: 0.82254, aleatoric unc.: 9.72192 Epoch 420/500 total: 3.70183, -LL: 3.61180, prior: 0.82254, aleatoric unc.: 9.72192
Epoch 421/500 total: 3.69221, -LL: 3.63945, prior: 0.82239, aleatoric unc.: 9.70423 Epoch 421/500 total: 3.69221, -LL: 3.63945, prior: 0.82239, aleatoric unc.: 9.70423
Epoch 422/500 total: 3.69980, -LL: 3.61604, prior: 0.82302, aleatoric unc.: 9.70583 Epoch 422/500 total: 3.69980, -LL: 3.61604, prior: 0.82302, aleatoric unc.: 9.70583
Epoch 423/500 total: 3.69928, -LL: 3.63499, prior: 0.81947, aleatoric unc.: 9.70428 Epoch 423/500 total: 3.69928, -LL: 3.63499, prior: 0.81947, aleatoric unc.: 9.70428
Epoch 424/500 total: 3.70064, -LL: 3.64438, prior: 0.82250, aleatoric unc.: 9.70499 Epoch 424/500 total: 3.70064, -LL: 3.64438, prior: 0.82250, aleatoric unc.: 9.70499
Epoch 425/500 total: 3.69506, -LL: 3.61971, prior: 0.82166, aleatoric unc.: 9.69757 Epoch 425/500 total: 3.69506, -LL: 3.61971, prior: 0.82166, aleatoric unc.: 9.69757
Epoch 426/500 total: 3.70398, -LL: 3.62597, prior: 0.82278, aleatoric unc.: 9.70802 Epoch 426/500 total: 3.70398, -LL: 3.62597, prior: 0.82278, aleatoric unc.: 9.70802
Epoch 427/500 total: 3.70216, -LL: 3.65498, prior: 0.81911, aleatoric unc.: 9.71137 Epoch 427/500 total: 3.70216, -LL: 3.65498, prior: 0.81911, aleatoric unc.: 9.71137
Epoch 428/500 total: 3.70190, -LL: 3.64064, prior: 0.81833, aleatoric unc.: 9.71579 Epoch 428/500 total: 3.70190, -LL: 3.64064, prior: 0.81833, aleatoric unc.: 9.71579
Epoch 429/500 total: 3.70573, -LL: 3.66083, prior: 0.81893, aleatoric unc.: 9.72434 Epoch 429/500 total: 3.70573, -LL: 3.66083, prior: 0.81893, aleatoric unc.: 9.72434
Epoch 430/500 total: 3.69885, -LL: 3.65177, prior: 0.82123, aleatoric unc.: 9.72031 Epoch 430/500 total: 3.69885, -LL: 3.65177, prior: 0.82123, aleatoric unc.: 9.72031
Epoch 431/500 total: 3.70198, -LL: 3.67658, prior: 0.81868, aleatoric unc.: 9.72134 Epoch 431/500 total: 3.70198, -LL: 3.67658, prior: 0.81868, aleatoric unc.: 9.72134
Epoch 432/500 total: 3.70692, -LL: 3.63311, prior: 0.81943, aleatoric unc.: 9.73138 Epoch 432/500 total: 3.70692, -LL: 3.63311, prior: 0.81943, aleatoric unc.: 9.73138
Epoch 433/500 total: 3.70009, -LL: 3.59875, prior: 0.81951, aleatoric unc.: 9.72938 Epoch 433/500 total: 3.70009, -LL: 3.59875, prior: 0.81951, aleatoric unc.: 9.72938
Epoch 434/500 total: 3.69864, -LL: 3.63859, prior: 0.82302, aleatoric unc.: 9.72290 Epoch 434/500 total: 3.69864, -LL: 3.63859, prior: 0.82302, aleatoric unc.: 9.72290
Epoch 435/500 total: 3.69655, -LL: 3.66607, prior: 0.82137, aleatoric unc.: 9.71333 Epoch 435/500 total: 3.69655, -LL: 3.66607, prior: 0.82137, aleatoric unc.: 9.71333
Epoch 436/500 total: 3.69973, -LL: 3.65163, prior: 0.81886, aleatoric unc.: 9.71366 Epoch 436/500 total: 3.69973, -LL: 3.65163, prior: 0.81886, aleatoric unc.: 9.71366
Epoch 437/500 total: 3.69769, -LL: 3.68498, prior: 0.81805, aleatoric unc.: 9.70730 Epoch 437/500 total: 3.69769, -LL: 3.68498, prior: 0.81805, aleatoric unc.: 9.70730
Epoch 438/500 total: 3.69672, -LL: 3.64131, prior: 0.81733, aleatoric unc.: 9.70155 Epoch 438/500 total: 3.69672, -LL: 3.64131, prior: 0.81733, aleatoric unc.: 9.70155
Epoch 439/500 total: 3.69554, -LL: 3.62049, prior: 0.82319, aleatoric unc.: 9.69490 Epoch 439/500 total: 3.69554, -LL: 3.62049, prior: 0.82319, aleatoric unc.: 9.69490
Epoch 440/500 total: 3.69749, -LL: 3.62081, prior: 0.82234, aleatoric unc.: 9.69433 Epoch 440/500 total: 3.69749, -LL: 3.62081, prior: 0.82234, aleatoric unc.: 9.69433
Epoch 441/500 total: 3.70329, -LL: 3.59532, prior: 0.82169, aleatoric unc.: 9.70249 Epoch 441/500 total: 3.70329, -LL: 3.59532, prior: 0.82169, aleatoric unc.: 9.70249
Epoch 442/500 total: 3.69808, -LL: 3.62759, prior: 0.82254, aleatoric unc.: 9.70274 Epoch 442/500 total: 3.69808, -LL: 3.62759, prior: 0.82254, aleatoric unc.: 9.70274
Epoch 443/500 total: 3.70494, -LL: 3.65422, prior: 0.82270, aleatoric unc.: 9.71221 Epoch 443/500 total: 3.70494, -LL: 3.65422, prior: 0.82270, aleatoric unc.: 9.71221
Epoch 444/500 total: 3.69976, -LL: 3.65673, prior: 0.82426, aleatoric unc.: 9.71188 Epoch 444/500 total: 3.69976, -LL: 3.65673, prior: 0.82426, aleatoric unc.: 9.71188
Epoch 445/500 total: 3.70325, -LL: 3.64046, prior: 0.82422, aleatoric unc.: 9.71596 Epoch 445/500 total: 3.70325, -LL: 3.64046, prior: 0.82422, aleatoric unc.: 9.71596
Epoch 446/500 total: 3.70594, -LL: 3.65127, prior: 0.82255, aleatoric unc.: 9.72795 Epoch 446/500 total: 3.70594, -LL: 3.65127, prior: 0.82255, aleatoric unc.: 9.72795
Epoch 447/500 total: 3.70256, -LL: 3.65205, prior: 0.82299, aleatoric unc.: 9.72941 Epoch 447/500 total: 3.70256, -LL: 3.65205, prior: 0.82299, aleatoric unc.: 9.72941
Epoch 448/500 total: 3.70255, -LL: 3.66160, prior: 0.82127, aleatoric unc.: 9.73019 Epoch 448/500 total: 3.70255, -LL: 3.66160, prior: 0.82127, aleatoric unc.: 9.73019
Epoch 449/500 total: 3.70477, -LL: 3.65332, prior: 0.82186, aleatoric unc.: 9.73749 Epoch 449/500 total: 3.70477, -LL: 3.65332, prior: 0.82186, aleatoric unc.: 9.73749
Epoch 450/500 total: 3.69821, -LL: 3.62343, prior: 0.82277, aleatoric unc.: 9.72872 Epoch 450/500 total: 3.69821, -LL: 3.62343, prior: 0.82277, aleatoric unc.: 9.72872
Epoch 451/500 total: 3.69862, -LL: 3.61988, prior: 0.81971, aleatoric unc.: 9.72094 Epoch 451/500 total: 3.69862, -LL: 3.61988, prior: 0.81971, aleatoric unc.: 9.72094
Epoch 452/500 total: 3.69956, -LL: 3.64692, prior: 0.82058, aleatoric unc.: 9.71754 Epoch 452/500 total: 3.69956, -LL: 3.64692, prior: 0.82058, aleatoric unc.: 9.71754
Epoch 453/500 total: 3.68746, -LL: 3.61281, prior: 0.81960, aleatoric unc.: 9.69424 Epoch 453/500 total: 3.68746, -LL: 3.61281, prior: 0.81960, aleatoric unc.: 9.69424
Epoch 454/500 total: 3.69136, -LL: 3.61534, prior: 0.82130, aleatoric unc.: 9.67839 Epoch 454/500 total: 3.69136, -LL: 3.61534, prior: 0.82130, aleatoric unc.: 9.67839
Epoch 455/500 total: 3.70947, -LL: 3.63629, prior: 0.82276, aleatoric unc.: 9.70095 Epoch 455/500 total: 3.70947, -LL: 3.63629, prior: 0.82276, aleatoric unc.: 9.70095
Epoch 456/500 total: 3.70455, -LL: 3.62145, prior: 0.82456, aleatoric unc.: 9.71287 Epoch 456/500 total: 3.70455, -LL: 3.62145, prior: 0.82456, aleatoric unc.: 9.71287
Epoch 457/500 total: 3.69434, -LL: 3.64094, prior: 0.82408, aleatoric unc.: 9.70524 Epoch 457/500 total: 3.69434, -LL: 3.64094, prior: 0.82408, aleatoric unc.: 9.70524
Epoch 458/500 total: 3.69588, -LL: 3.62607, prior: 0.82356, aleatoric unc.: 9.69614 Epoch 458/500 total: 3.69588, -LL: 3.62607, prior: 0.82356, aleatoric unc.: 9.69614
Epoch 459/500 total: 3.69965, -LL: 3.67129, prior: 0.82353, aleatoric unc.: 9.69692 Epoch 459/500 total: 3.69965, -LL: 3.67129, prior: 0.82353, aleatoric unc.: 9.69692
Epoch 460/500 total: 3.69278, -LL: 3.64340, prior: 0.82238, aleatoric unc.: 9.68522 Epoch 460/500 total: 3.69278, -LL: 3.64340, prior: 0.82238, aleatoric unc.: 9.68522
Epoch 461/500 total: 3.69769, -LL: 3.66663, prior: 0.82365, aleatoric unc.: 9.68477 Epoch 461/500 total: 3.69769, -LL: 3.66663, prior: 0.82365, aleatoric unc.: 9.68477
Epoch 462/500 total: 3.70174, -LL: 3.64461, prior: 0.82374, aleatoric unc.: 9.69469 Epoch 462/500 total: 3.70174, -LL: 3.64461, prior: 0.82374, aleatoric unc.: 9.69469
Epoch 463/500 total: 3.69422, -LL: 3.62371, prior: 0.82634, aleatoric unc.: 9.68775 Epoch 463/500 total: 3.69422, -LL: 3.62371, prior: 0.82634, aleatoric unc.: 9.68775
Epoch 464/500 total: 3.70086, -LL: 3.65318, prior: 0.82430, aleatoric unc.: 9.69298 Epoch 464/500 total: 3.70086, -LL: 3.65318, prior: 0.82430, aleatoric unc.: 9.69298
Epoch 465/500 total: 3.69787, -LL: 3.66150, prior: 0.81959, aleatoric unc.: 9.69327 Epoch 465/500 total: 3.69787, -LL: 3.66150, prior: 0.81959, aleatoric unc.: 9.69327
Epoch 466/500 total: 3.69348, -LL: 3.64051, prior: 0.81754, aleatoric unc.: 9.68517 Epoch 466/500 total: 3.69348, -LL: 3.64051, prior: 0.81754, aleatoric unc.: 9.68517
Epoch 467/500 total: 3.70298, -LL: 3.65390, prior: 0.81789, aleatoric unc.: 9.69441 Epoch 467/500 total: 3.70298, -LL: 3.65390, prior: 0.81789, aleatoric unc.: 9.69441
Epoch 468/500 total: 3.69817, -LL: 3.65901, prior: 0.81541, aleatoric unc.: 9.69768 Epoch 468/500 total: 3.69817, -LL: 3.65901, prior: 0.81541, aleatoric unc.: 9.69768
Epoch 469/500 total: 3.69301, -LL: 3.63411, prior: 0.81612, aleatoric unc.: 9.68669 Epoch 469/500 total: 3.69301, -LL: 3.63411, prior: 0.81612, aleatoric unc.: 9.68669
Epoch 470/500 total: 3.70037, -LL: 3.61972, prior: 0.81542, aleatoric unc.: 9.69086 Epoch 470/500 total: 3.70037, -LL: 3.61972, prior: 0.81542, aleatoric unc.: 9.69086
Epoch 471/500 total: 3.70139, -LL: 3.60899, prior: 0.81560, aleatoric unc.: 9.69815 Epoch 471/500 total: 3.70139, -LL: 3.60899, prior: 0.81560, aleatoric unc.: 9.69815
Epoch 472/500 total: 3.70118, -LL: 3.68662, prior: 0.81407, aleatoric unc.: 9.70126 Epoch 472/500 total: 3.70118, -LL: 3.68662, prior: 0.81407, aleatoric unc.: 9.70126
Epoch 473/500 total: 3.70042, -LL: 3.64964, prior: 0.81481, aleatoric unc.: 9.70599 Epoch 473/500 total: 3.70042, -LL: 3.64964, prior: 0.81481, aleatoric unc.: 9.70599
Epoch 474/500 total: 3.70516, -LL: 3.62561, prior: 0.81684, aleatoric unc.: 9.71484 Epoch 474/500 total: 3.70516, -LL: 3.62561, prior: 0.81684, aleatoric unc.: 9.71484
Epoch 475/500 total: 3.69569, -LL: 3.62760, prior: 0.81538, aleatoric unc.: 9.70852 Epoch 475/500 total: 3.69569, -LL: 3.62760, prior: 0.81538, aleatoric unc.: 9.70852
Epoch 476/500 total: 3.69344, -LL: 3.63575, prior: 0.81317, aleatoric unc.: 9.69601 Epoch 476/500 total: 3.69344, -LL: 3.63575, prior: 0.81317, aleatoric unc.: 9.69601
Epoch 477/500 total: 3.69725, -LL: 3.66014, prior: 0.81203, aleatoric unc.: 9.69300 Epoch 477/500 total: 3.69725, -LL: 3.66014, prior: 0.81203, aleatoric unc.: 9.69300
Epoch 478/500 total: 3.70594, -LL: 3.65619, prior: 0.81437, aleatoric unc.: 9.70621 Epoch 478/500 total: 3.70594, -LL: 3.65619, prior: 0.81437, aleatoric unc.: 9.70621
Epoch 479/500 total: 3.69992, -LL: 3.61516, prior: 0.81284, aleatoric unc.: 9.70868 Epoch 479/500 total: 3.69992, -LL: 3.61516, prior: 0.81284, aleatoric unc.: 9.70868
Epoch 480/500 total: 3.69582, -LL: 3.64135, prior: 0.81684, aleatoric unc.: 9.70021 Epoch 480/500 total: 3.69582, -LL: 3.64135, prior: 0.81684, aleatoric unc.: 9.70021
Epoch 481/500 total: 3.69739, -LL: 3.60619, prior: 0.81517, aleatoric unc.: 9.70025 Epoch 481/500 total: 3.69739, -LL: 3.60619, prior: 0.81517, aleatoric unc.: 9.70025
Epoch 482/500 total: 3.70488, -LL: 3.63401, prior: 0.81520, aleatoric unc.: 9.70766 Epoch 482/500 total: 3.70488, -LL: 3.63401, prior: 0.81520, aleatoric unc.: 9.70766
Epoch 483/500 total: 3.69736, -LL: 3.61727, prior: 0.81478, aleatoric unc.: 9.70594 Epoch 483/500 total: 3.69736, -LL: 3.61727, prior: 0.81478, aleatoric unc.: 9.70594
Epoch 484/500 total: 3.70235, -LL: 3.63803, prior: 0.81578, aleatoric unc.: 9.71074 Epoch 484/500 total: 3.70235, -LL: 3.63803, prior: 0.81578, aleatoric unc.: 9.71074
Epoch 485/500 total: 3.70031, -LL: 3.64106, prior: 0.81726, aleatoric unc.: 9.71344 Epoch 485/500 total: 3.70031, -LL: 3.64106, prior: 0.81726, aleatoric unc.: 9.71344
Epoch 486/500 total: 3.70281, -LL: 3.65334, prior: 0.81765, aleatoric unc.: 9.71675 Epoch 486/500 total: 3.70281, -LL: 3.65334, prior: 0.81765, aleatoric unc.: 9.71675
Epoch 487/500 total: 3.69665, -LL: 3.62427, prior: 0.81798, aleatoric unc.: 9.71068 Epoch 487/500 total: 3.69665, -LL: 3.62427, prior: 0.81798, aleatoric unc.: 9.71068
Epoch 488/500 total: 3.69354, -LL: 3.60938, prior: 0.81778, aleatoric unc.: 9.69865 Epoch 488/500 total: 3.69354, -LL: 3.60938, prior: 0.81778, aleatoric unc.: 9.69865
Epoch 489/500 total: 3.69664, -LL: 3.66371, prior: 0.81562, aleatoric unc.: 9.69401 Epoch 489/500 total: 3.69664, -LL: 3.66371, prior: 0.81562, aleatoric unc.: 9.69401
Epoch 490/500 total: 3.70017, -LL: 3.61261, prior: 0.81859, aleatoric unc.: 9.69709 Epoch 490/500 total: 3.70017, -LL: 3.61261, prior: 0.81859, aleatoric unc.: 9.69709
Epoch 491/500 total: 3.70181, -LL: 3.64116, prior: 0.81858, aleatoric unc.: 9.70037 Epoch 491/500 total: 3.70181, -LL: 3.64116, prior: 0.81858, aleatoric unc.: 9.70037
Epoch 492/500 total: 3.69507, -LL: 3.62122, prior: 0.81482, aleatoric unc.: 9.69759 Epoch 492/500 total: 3.69507, -LL: 3.62122, prior: 0.81482, aleatoric unc.: 9.69759
Epoch 493/500 total: 3.69890, -LL: 3.65301, prior: 0.81908, aleatoric unc.: 9.69581 Epoch 493/500 total: 3.69890, -LL: 3.65301, prior: 0.81908, aleatoric unc.: 9.69581
Epoch 494/500 total: 3.70058, -LL: 3.65166, prior: 0.81909, aleatoric unc.: 9.69850 Epoch 494/500 total: 3.70058, -LL: 3.65166, prior: 0.81909, aleatoric unc.: 9.69850
Epoch 495/500 total: 3.69788, -LL: 3.62267, prior: 0.81817, aleatoric unc.: 9.70015 Epoch 495/500 total: 3.69788, -LL: 3.62267, prior: 0.81817, aleatoric unc.: 9.70015
Epoch 496/500 total: 3.69267, -LL: 3.64171, prior: 0.81855, aleatoric unc.: 9.68762 Epoch 496/500 total: 3.69267, -LL: 3.64171, prior: 0.81855, aleatoric unc.: 9.68762
Epoch 497/500 total: 3.70465, -LL: 3.66362, prior: 0.81682, aleatoric unc.: 9.70045 Epoch 497/500 total: 3.70465, -LL: 3.66362, prior: 0.81682, aleatoric unc.: 9.70045
Epoch 498/500 total: 3.70106, -LL: 3.66015, prior: 0.81733, aleatoric unc.: 9.70227 Epoch 498/500 total: 3.70106, -LL: 3.66015, prior: 0.81733, aleatoric unc.: 9.70227
Epoch 499/500 total: 3.69741, -LL: 3.64281, prior: 0.81737, aleatoric unc.: 9.70153 Epoch 499/500 total: 3.69741, -LL: 3.64281, prior: 0.81737, aleatoric unc.: 9.70153
%% Cell type:markdown id:15f3f633 tags: %% Cell type:markdown id:15f3f633 tags:
To evaluate the effect of the uncertainty, we perform the prediction many times for the same data and take the average and standard deviation of the predictions, since each prediction performed with the Bayesian neural network leads to a different result: the weights are drawn anew from the learned Gaussian distributions on every forward pass.
%% Cell type:code id:a3d244e2 tags:

``` python
# Sample the Bayesian network several times: each forward pass draws a new
# set of weights, so the spread of the predictions probes the epistemic uncertainty.
b_predicted = list()
for k in range(10):
    p = b_network(torch.from_numpy(test_data[:, 0:1])).detach().numpy()
    b_predicted.append(p[:, 0])
# Shape (n_samples, n_repetitions): one column per sampled set of weights.
b_predicted = np.stack(b_predicted, axis=1)
```

%% Cell type:markdown id:85fed034 tags:
We can now take the average result for each sample as an estimate of the mean, and the standard deviation of the repeated predictions as an estimate of the epistemic uncertainty of the results.
The aleatoric uncertainty is fitted as an independent parameter. Since we assume the aleatoric noise is independent of the epistemic uncertainty, we can calculate the total uncertainty by adding the two contributions in quadrature.
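Written out, with $\sigma_\mathrm{epi}$ the standard deviation of the sampled predictions and $\sigma_\mathrm{ale}$ the fitted noise scale, this quadrature sum is

$\sigma_\mathrm{total} = \sqrt{\sigma_\mathrm{epi}^2 + \sigma_\mathrm{ale}^2},$

which is exactly what the next cell computes.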
%% Cell type:code id:4a56960d tags:

``` python
# Mean prediction and epistemic uncertainty from the repeated forward passes.
b_mean = np.mean(b_predicted, axis=1)
b_sigma = np.std(b_predicted, axis=1)
# The aleatoric uncertainty is a learned parameter of the network itself.
aleatoric_uncertainty = b_network.aleatoric_uncertainty().detach().numpy()
# Total uncertainty: epistemic and aleatoric contributions added in quadrature.
total_uncertainty = (b_sigma**2 + aleatoric_uncertainty**2)**0.5
```

%% Cell type:markdown id:6ef3d430 tags:
Let's check how large the estimated uncertainties are:
%% Cell type:code id:4d01c41f tags:

``` python
print("Average epistemic uncertainty: ", np.mean(b_sigma))
```

%% Output

Average epistemic uncertainty: 0.83653456

%% Cell type:code id:67d456a1 tags:

``` python
print("Aleatoric uncertainty: ", aleatoric_uncertainty)
```

%% Output

Aleatoric uncertainty: 9.7015295

%% Cell type:markdown id:364bdcd7 tags:
Note that the aleatoric uncertainty is very close to the standard deviation of the $\epsilon$ component of the model we created in the beginning! Clearly the model could fit the uncertainty coming from that component of the noise.
It is not easy to summarize the effect of the epistemic uncertainty, as it is different for every data point (it is scaled by $x^2$), but we can plot it to take a look at its effect.
Note that the uncertainties are the standard deviations of Gaussian models, so they correspond to a $1\sigma$ band, which is a 68% confidence band. The $2\sigma$ band corresponds to a 95% confidence band in a Gaussian model.
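In addition to the visual check below, the calibration can be verified numerically: if the uncertainties are well estimated, about 95% of the test targets should fall within the $2\sigma$ total-uncertainty band. A minimal sketch, reusing `test_data`, `b_mean` and `total_uncertainty` from the cells above:

``` python
# Empirical coverage of the 2-sigma band: for a well-calibrated Gaussian
# model this fraction should be close to 0.95.
inside = np.abs(test_data[:, 1] - b_mean) < 2 * total_uncertainty
print("Fraction of test points inside the 95% band:", inside.mean())
```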
%% Cell type:code id:8b9142e8 tags:

``` python
fig = plt.figure()
ax = fig.add_subplot(111)
ax.scatter(test_data[:, 0], test_data[:, 1], alpha=0.5, label="Test data")
# 2-sigma error bars: approximately 95% confidence bands for a Gaussian model.
ax.errorbar(test_data[:, 0], b_mean, yerr=2*total_uncertainty, alpha=0.5, fmt='or', label="95% band total unc.")
ax.errorbar(test_data[:, 0], b_mean, yerr=2*b_sigma, alpha=0.5, fmt='og', label="95% band epistemic unc.")
ax.set(xlabel="$x$", ylabel="$f(x)$")
#ax.set_yscale('log')
plt.legend(frameon=False)
plt.show()
```

%% Output

%% Cell type:markdown id:ccecaea7 tags:

### Contact us at the EuXFEL Data Analysis group at any time if you need help analysing your data!
#### Danilo Ferreira de Lima: danilo.enoque.ferreira.de.lima@xfel.eu
#### Arman Davtyan: arman.davtyan@xfel.eu

%% Cell type:code id:55635d1f tags:

``` python
```