Geiger Bernhard, Kubin Gernot
2021
This Special Issue aims to investigate the properties of the information bottleneck (IB) functional in its new context in deep learning and to propose learning mechanisms inspired by the IB framework. More specifically, we invited authors to submit manuscripts that provide novel insight into the properties of the IB functional, that apply the IB principle for training deep, i.e., multi-layer, machine learning structures such as neural networks (NNs), and that investigate the learning behavior of NNs using the IB framework. To cover the breadth of the current literature, we also solicited manuscripts that discuss frameworks inspired by the IB principle but that depart from it in a well-motivated manner.
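For orientation, the IB functional referred to above is, in its standard formulation due to Tishby, Pereira, and Bialek, the Lagrangian below; X denotes the input, Y the prediction target, T the learned representation, and β ≥ 0 trades compression against prediction (the individual manuscripts in the issue may use variants):

```latex
% Standard IB Lagrangian, minimized over the stochastic encoder p(t|x):
% I(X;T) penalizes the complexity of the representation T, while
% I(T;Y) rewards the information T retains about the target Y.
\mathcal{L}_{\mathrm{IB}}\bigl[p(t \mid x)\bigr] = I(X;T) - \beta\, I(T;Y)
```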
Chiancone Alessandro, Cuder Gerald, Geiger Bernhard, Harzl Annemarie, Tanzer Thomas, Kern Roman
2019
This paper presents a hybrid model for the prediction of magnetostriction in power transformers, leveraging the strengths of a data-driven approach and of a physics-based model. Specifically, a non-linear physics-based model for magnetostriction as a function of the magnetic field is employed, the parameters of which are estimated as linear combinations of electrical coil measurements and coil dimensions. The model is validated in a practical scenario with coil data from two suppliers, showing that the proposed approach captures the suppliers' different magnetostrictive properties and yields magnetostriction estimates in agreement with the measurement system in place. It is argued that combining a non-linear physics-based model with few parameters and a linear data-driven model to estimate these parameters is attractive both in terms of model accuracy and because the data-driven part can be trained on comparatively small datasets.
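A minimal sketch of this hybrid scheme, assuming a hypothetical two-parameter saturation-type magnetostriction curve and synthetic coil features; the functional form, feature count, and data are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Hypothetical physics-based part: magnetostriction as a non-linear
# function of the magnetic field H with a small parameter vector theta.
# (Illustrative saturation-type curve; the paper's model is not reproduced.)
def magnetostriction(H, theta):
    a, b = theta
    return a * H**2 / (1.0 + b * H**2)

# Data-driven part: estimate theta as a *linear* combination of coil
# features (electrical measurements, coil dimensions): theta = X @ W.
def fit_parameter_map(X, thetas):
    # Least-squares fit of W; one column per physics parameter.
    W, *_ = np.linalg.lstsq(X, thetas, rcond=None)
    return W

# --- toy usage with synthetic data ---
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))                   # 50 coils, 4 features each
true_W = rng.normal(size=(4, 2))
thetas = X @ true_W + 0.01 * rng.normal(size=(50, 2))

W = fit_parameter_map(X, thetas)
theta_new = (np.atleast_2d(X[0]) @ W).ravel()  # parameters for a new coil
H = np.linspace(0.0, 2.0, 5)
print(magnetostriction(H, theta_new))          # predicted magnetostriction
```

The appeal of this split is visible even in the toy version: the physics model carries the non-linearity, so the data-driven part reduces to a linear least-squares problem that can be fit from few coils.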
Geiger Bernhard
2018
This short note presents results on the symmetric Jensen-Shannon divergence between two discrete mixture distributions p₁ and p₂. Specifically, for i = 1, 2, pᵢ is the mixture of a common distribution q and a distribution p̃ᵢ with mixture proportion λᵢ. In general, p̃₁ ≠ p̃₂ and λ₁ ≠ λ₂. We provide experimental and theoretical insight into the behavior of the symmetric Jensen-Shannon divergence between p₁ and p₂ as the mixture proportions or the divergence between p̃₁ and p̃₂ change. We also provide insight into scenarios in which the supports of the distributions p̃₁, p̃₂, and q do not coincide.
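A small numerical sketch of the quantity studied, assuming discrete distributions on a common four-symbol alphabet and taking λᵢ as the weight on p̃ᵢ; the specific distributions and proportions are illustrative, not from the note:

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

# Common component q and two "private" components p~1, p~2.
q   = np.array([0.25, 0.25, 0.25, 0.25])
pt1 = np.array([0.70, 0.10, 0.10, 0.10])
pt2 = np.array([0.10, 0.10, 0.10, 0.70])

# Mixtures p_i = (1 - lam_i) * q + lam_i * p~_i.
lam1, lam2 = 0.3, 0.6
p1 = (1 - lam1) * q + lam1 * pt1
p2 = (1 - lam2) * q + lam2 * pt2

# scipy returns the Jensen-Shannon *distance* (the square root of the
# divergence, here with log base 2), so square it to get the divergence.
jsd = jensenshannon(p1, p2, base=2) ** 2
print(jsd)
```

Varying lam1, lam2, or the gap between pt1 and pt2 in this sketch reproduces the kind of behavior the note analyzes.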
Geiger Bernhard
2018
This entry for the 2018 MDPI English Writing Prize has been published as a chapter of "The Global Benefits of Open Research", edited by Martyn Rittman.