My personal blog

Machine learning, computer vision, languages

Riemannian SGD in PyTorch
23 July 2020
A lot of recent papers use spaces other than the regular Euclidean space. This trend is sometimes called geometric deep learning, and interest is growing particularly in the domains of word embeddings and graphs. Since geometric neural networks perform optimization in a different space, plain stochastic gradient descent cannot be applied directly.
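
As a rough illustration of what changes, below is a minimal sketch of a single Riemannian SGD step on the Poincaré ball: the Euclidean gradient is rescaled by the inverse of the metric, and the updated point is retracted back into the open unit ball. The function name `riemannian_sgd_step` and the details are assumptions made for illustration, not necessarily the implementation discussed in the post.

```python
import torch

def riemannian_sgd_step(param, lr=0.01, eps=1e-5):
    """One illustrative Riemannian SGD step on the Poincaré ball.

    Rescales the Euclidean gradient by the inverse of the Poincaré metric
    and retracts the result back into the open unit ball.
    """
    with torch.no_grad():
        grad = param.grad                           # Euclidean gradient
        sq_norm = param.pow(2).sum(dim=-1, keepdim=True)
        rescale = ((1.0 - sq_norm) ** 2) / 4.0      # inverse metric factor
        param -= lr * rescale * grad                # Riemannian gradient step
        # Retraction: push any point that left the ball back inside it.
        norm = param.norm(dim=-1, keepdim=True).clamp_min(eps)
        outside = norm >= 1.0
        param[:] = torch.where(outside, param / norm * (1.0 - eps), param)
```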

Computing Gromov Hyperbolicity
22 July 2020
Gromov hyperbolicity measures the “tree-likeness” of a dataset. It indicates how well hierarchical embeddings such as Poincaré embeddings [1] would work on that dataset; papers that use this metric include [2] and [3]. A Gromov hyperbolicity of approximately zero means a high degree of tree-likeness.
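
For illustration, here is a minimal sketch of the naive four-point computation of the hyperbolicity δ from a precomputed pairwise distance matrix `D`. It runs in O(n⁴), and the post may well use a faster method, so treat the names and details as assumptions.

```python
import numpy as np
from itertools import combinations

def gromov_hyperbolicity(D):
    """Naive O(n^4) four-point estimate of the Gromov hyperbolicity delta.

    D is a symmetric (n, n) matrix of pairwise distances. For every
    quadruple, sort the three pairwise distance sums and take half the gap
    between the two largest; delta is the maximum over all quadruples.
    """
    n = D.shape[0]
    delta = 0.0
    for x, y, z, w in combinations(range(n), 4):
        sums = sorted([D[x, y] + D[z, w],
                       D[x, z] + D[y, w],
                       D[x, w] + D[y, z]], reverse=True)
        delta = max(delta, (sums[0] - sums[1]) / 2.0)
    return delta
```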

New Blog
21 July 2020
I decided to update my blog and replace minimal-mistakes with my own Jekyll theme. My goal was to increase the space available for content and reduce the amount of personal information the reader sees. I took inspiration from the Bootstrap theme Clean Blog. Some of my older posts need updating, so I have removed them for now.

Loss Functions For Segmentation
27 September 2018
In this post, I will implement some of the most common loss functions for image segmentation in Keras/TensorFlow. I will only consider the two-class (i.e. binary) case.
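
As one example of such a loss, here is a minimal sketch of a soft Dice loss for binary masks in TensorFlow; it is an illustrative assumption, not necessarily one of the exact variants covered in the post.

```python
import tensorflow as tf

def dice_loss(y_true, y_pred, smooth=1.0):
    """Soft Dice loss for binary segmentation (illustrative sketch).

    y_true and y_pred are tensors of the same shape with values in [0, 1];
    the smooth term avoids division by zero on empty masks.
    """
    y_pred = tf.reshape(y_pred, [-1])
    y_true = tf.cast(tf.reshape(y_true, [-1]), y_pred.dtype)
    intersection = tf.reduce_sum(y_true * y_pred)
    dice = (2.0 * intersection + smooth) / (
        tf.reduce_sum(y_true) + tf.reduce_sum(y_pred) + smooth)
    return 1.0 - dice
```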

Portuguese Lemmatizers (2020 update)
08 May 2018
In this post, I will compare some lemmatizers for Portuguese. In order to do the comparison, I downloaded subtitles from various television programs. The sentences are written in European Portuguese (EP).
