Universal Estimation of Information Measures for Analog Sources
by Qing Wang, Sanjeev R. Kulkarni, Sergio Verdú
Publisher: Now Publishers Inc
Hardcover
ISBN: 978-1-60198-230-8
Published on 26.05.2009
Language: English
Format: 234 mm (H) x 156 mm (W) x 6 mm (D)
Weight: 172 grams
Extent: 104 pages

Price: €98.80
No shipping costs (domestic)


This title is only printed once it is ordered. It should therefore arrive at our store around 20 November.

Delivery within the city usually takes place on the same day.
Delivery elsewhere via Post/DHL usually takes 1-2 days.

Climate-neutral
According to its own statement, the publisher does not yet produce climate-neutrally or offset the CO2 emissions from its production. We therefore take on this offsetting by financially supporting suitable projects. You can find more details in our climate report.
Blurb

Entropy, mutual information and divergence measure the randomness, dependence and dissimilarity, respectively, of random objects. In addition to their prominent role in information theory, they have found numerous applications in, among others, probability theory, statistics, physics, chemistry, molecular biology, ecology, bioinformatics, neuroscience, machine learning, linguistics, and finance. Many of these applications require a universal estimate of information measures that does not assume knowledge of the statistical properties of the observed data. Over the past few decades, several non-parametric algorithms have been proposed to estimate information measures.
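For reference, the three quantities named above have the following standard definitions for analog sources with densities f_X, f_Y, joint density f_{XY} and a second density g (these are the textbook definitions, not text quoted from the book itself):

```latex
\begin{align*}
  h(X)     &= -\int f_X(x)\,\log f_X(x)\,\mathrm{d}x
             && \text{(differential entropy)} \\
  I(X;Y)   &= \iint f_{XY}(x,y)\,\log\frac{f_{XY}(x,y)}{f_X(x)\,f_Y(y)}\,\mathrm{d}x\,\mathrm{d}y
             && \text{(mutual information)} \\
  D(f\,\|\,g) &= \int f(x)\,\log\frac{f(x)}{g(x)}\,\mathrm{d}x
             && \text{(divergence / relative entropy)}
\end{align*}
```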
Universal Estimation of Information Measures for Analog Sources presents a comprehensive survey of universal estimation of information measures for memoryless analog (real-valued or real vector-valued) sources with an emphasis on the estimation of mutual information and divergence and their applications. The book reviews the consistency of the universal algorithms and the corresponding sufficient conditions as well as their speed of convergence.
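As an illustration of the class of non-parametric algorithms the book surveys, the following is a minimal Python sketch of the well-known Kozachenko-Leonenko k-nearest-neighbour estimator of differential entropy; it is a standard representative of such universal estimators, not necessarily the specific construction analysed in the book.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(samples, k=3):
    """Kozachenko-Leonenko k-NN estimate of differential entropy, in nats.

    samples: (n, d) array of i.i.d. draws from an unknown continuous density.
    k:       number of nearest neighbours; a small constant such as 3 is typical.
    """
    x = np.asarray(samples, dtype=float)
    n, d = x.shape
    # Distance from each point to its k-th nearest neighbour (k+1 queried
    # because the closest "neighbour" of a point is the point itself).
    dist, _ = cKDTree(x).query(x, k=k + 1)
    eps = dist[:, -1]
    # Log volume of the unit ball in d dimensions: pi^(d/2) / Gamma(d/2 + 1).
    log_unit_ball = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1)
    return digamma(n) - digamma(k) + log_unit_ball + d * np.mean(np.log(eps))

# Sanity check against a case with a known answer: a standard Gaussian in 1D
# has differential entropy 0.5 * log(2 * pi * e) ~= 1.42 nats.
rng = np.random.default_rng(0)
print(knn_entropy(rng.normal(size=(5000, 1))))
```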
Universal Estimation of Information Measures for Analog Sources provides a comprehensive review of an increasingly important topic in Information Theory. It will be of interest to students, practitioners and researchers working in Information Theory.