The Entropy of Words – Learnability and Expressivity across More than 1000 Languages


A new research article by the DFG Center’s Research Fellow Dr. Christian Bentz has been published in Entropy.
In this article, three parallel corpora encompassing ca. 450 million words in 1916 texts and 1259 languages are analyzed to tackle some of the major conceptual and practical problems of word entropy estimation: dependence on text size, register, style, and estimation method, as well as the non-independence of words in co-text.
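To illustrate the basic quantity under study, here is a minimal sketch of the naive plug-in (maximum-likelihood) estimate of unigram word entropy. This is only an illustration of the concept: the article compares several estimation methods and addresses the small-sample bias and text-size dependence that this naive estimator suffers from, none of which is handled below.

```python
from collections import Counter
from math import log2

def word_entropy(text: str) -> float:
    """Plug-in (maximum-likelihood) estimate of unigram word entropy in bits.

    Illustrative sketch only: this naive estimator is biased for small
    samples and depends strongly on text size, which is one of the
    problems the article addresses.
    """
    tokens = text.lower().split()
    counts = Counter(tokens)
    n = len(tokens)
    # Shannon entropy over relative word frequencies
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Four equiprobable word types yield exactly 2 bits.
print(word_entropy("a b c d"))  # → 2.0
```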

