My personal blog

Machine learning, computer vision, languages

Uncertainty estimation in neural networks
14 August 2020
In this blog post, I will implement some common methods for uncertainty estimation. My main focus is on classification and segmentation, so regression-specific methods such as the pinball loss are not covered here.
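To give a flavor of the kind of method involved, here is a minimal sketch of Monte Carlo dropout, one widely used uncertainty estimation technique for classification; whether the post covers this particular method is my assumption, and the names `model` and `n_samples` are illustrative.

```python
import torch

def mc_dropout_predict(model, x, n_samples=20):
    """Average softmax outputs over stochastic forward passes with dropout on."""
    model.train()  # keep dropout layers active at inference time
    with torch.no_grad():
        probs = torch.stack([model(x).softmax(dim=-1) for _ in range(n_samples)])
    mean = probs.mean(dim=0)  # predictive distribution, shape (N, C)
    # Predictive entropy as a simple per-sample uncertainty score
    entropy = -(mean * mean.clamp_min(1e-12).log()).sum(dim=-1)
    return mean, entropy
```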

Metrics for uncertainty estimation
07 August 2020
Predictions are not just about accuracy, but also about probability. In many applications it is important to know how sure a neural network is of a prediction. However, the softmax probabilities of neural networks are not always calibrated and do not necessarily measure uncertainty. In this blog post, I will implement the most common metrics for evaluating the output probabilities of neural networks.
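As a taste of one such metric, below is a minimal sketch of the expected calibration error (ECE), a standard way to quantify the gap between confidence and accuracy; the 15-bin default and the function name are assumptions on my part.

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=15):
    """probs: (N, C) softmax outputs, labels: (N,) integer targets."""
    confidences = probs.max(axis=1)        # predicted probability
    predictions = probs.argmax(axis=1)     # predicted class
    accuracies = (predictions == labels).astype(float)

    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            # |accuracy - confidence| in the bin, weighted by the bin's share of samples
            ece += in_bin.mean() * abs(accuracies[in_bin].mean() - confidences[in_bin].mean())
    return ece
```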

Implementing Poincaré Embeddings in PyTorch
24 July 2020
Having introduced Riemannian SGD in the last blog post, here I will give a concrete application of this optimization method. Poincaré embeddings [1][2] are hierarchical word embeddings which map integer-encoded words into hyperbolic space.
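For reference, the distance these embeddings optimize is the Poincaré distance d(u, v) = arcosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2))) from [1]; below is a minimal PyTorch sketch of it, where the epsilon and the clamp are numerical-stability assumptions on my part.

```python
import torch

def poincare_distance(u, v, eps=1e-5):
    """Distance between points u, v strictly inside the unit ball (last dim = features)."""
    sq_u = u.pow(2).sum(dim=-1)
    sq_v = v.pow(2).sum(dim=-1)
    sq_diff = (u - v).pow(2).sum(dim=-1)
    # d(u, v) = arcosh(1 + 2 ||u - v||^2 / ((1 - ||u||^2)(1 - ||v||^2)))
    x = 1 + 2 * sq_diff / ((1 - sq_u) * (1 - sq_v) + eps)
    return torch.acosh(torch.clamp(x, min=1.0))
```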

Riemannian SGD in PyTorch
23 July 2020
Many recent papers use spaces other than the familiar Euclidean space; this trend is sometimes called geometric deep learning. Interest is growing particularly in the domains of word embeddings and graphs. Since geometric neural networks perform optimization in a different space, it is not possible to simply apply standard stochastic gradient descent.
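To sketch the idea on the Poincaré ball: the Euclidean gradient is rescaled by the inverse metric tensor, and the update is then retracted back into the ball. The following is a minimal sketch of one such step, not the full optimizer from the post; the in-place update and the projection epsilon are my assumptions.

```python
import torch

def rsgd_step(param, lr=0.1, eps=1e-5):
    """One Riemannian SGD update of `param` (rows are points in the unit ball)."""
    with torch.no_grad():
        sq_norm = param.pow(2).sum(dim=-1, keepdim=True)
        # Riemannian gradient: rescale the Euclidean gradient by the
        # inverse Poincare metric, ((1 - ||x||^2) / 2)^2.
        param -= lr * ((1 - sq_norm) / 2).pow(2) * param.grad
        # Retraction: project any point that left the ball back inside it.
        norm = param.norm(dim=-1, keepdim=True).clamp_min(eps)
        param.mul_(torch.where(norm >= 1, (1 - eps) / norm, torch.ones_like(norm)))
```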

Computing Gromov Hyperbolicity
22 July 2020
Gromov hyperbolicity measures the “tree-likeness” of a dataset. This metric indicates how well hierarchical embeddings such as Poincaré embeddings [1] would work on a dataset. Papers which use this metric include [2] and [3]. A Gromov hyperbolicity close to zero means high tree-likeness.
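Concretely, one common way to compute it uses the Gromov product (x|y)_w = (d(x, w) + d(y, w) - d(x, y)) / 2, and takes delta as the largest violation of the four-point condition. Below is a minimal NumPy sketch with a fixed basepoint; the choice of index 0 as basepoint is an assumption, and the naive version materializes an O(n^3) tensor, so real implementations batch or sample this.

```python
import numpy as np

def delta_hyperbolicity(dist):
    """dist: (n, n) symmetric matrix of pairwise distances, zeros on the diagonal."""
    # Gromov product (x|y)_w = (d(x, w) + d(y, w) - d(x, y)) / 2, basepoint w = 0
    row = dist[0][:, None]
    gprod = (row + row.T - dist) / 2
    # delta = max over x, y, z of min((x|z)_w, (z|y)_w) - (x|y)_w
    maxmin = np.minimum(gprod[:, :, None], gprod[None, :, :]).max(axis=1)
    return (maxmin - gprod).max()
```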
