# Bernhard Geiger

# Biography

I was born in Graz, Austria, in April 1984. I studied Electrical Engineering with a focus on Communications at Graz University of Technology between 2004 and 2009. After receiving my Dipl.-Ing. in November 2009, I joined the Signal Processing and Speech Communication Lab (SPSC) as a project assistant, working on a software-defined GPS receiver. In 2010 I started my PhD thesis, taking a position as a research & teaching associate in the lab. After graduating in June 2014, I joined the Institute of Communications Engineering in November 2014.

My *h*-factor (as of August 2015) is 5 (excluding self-citations). My Erdős number is 3, thanks to a joint "publication" with Wolfgang Woess in the Research News of Graz University of Technology.

In my leisure time (or during the commute to and from work) I enjoy reading a good book. My other hobbies are running, Geocaching, and playing Go (~15 kyu; you can challenge me - sliver1984 - at DGS).

# Research

Already during my PhD, in which I investigated the information loss in deterministic systems, I became increasingly interested in state space reduction for Markov chains. Based on results obtained together with various collaborators (Christoph Temmel, Tatjana Petrov, Heinz Koeppl), I would like to continue developing information-theoretic methods for state space reduction of Markov and hidden Markov models during the next few years.

# Links

For my most recent publications, please take a look at the publication list below.

My PhD thesis can be downloaded from the EURASIP database.

If you are interested in my teaching experience, you can either visit my old website at the SPSC Lab (incomplete) or request a detailed CV!

## Theses in Progress

Clemens Bloechl: Master's Thesis - Aggregation of Hidden Markov Models: Theory and Applications

The topic of this thesis is to develop and analyze information-theoretic aggregation methods that reduce the state space and/or the observation space of hidden Markov models. Using the Kullback-Leibler divergence rate as a cost function, we expect connections with the information bottleneck method, lumpability, and spectral aggregation techniques. In the second part of the thesis, the developed methods shall be applied to practical examples, such as speech recognition systems. As a further example, the techniques shall be used to collapse output states of a discrete memoryless channel without affecting the error probability of a Viterbi algorithm decoding a convolutional code.

Supervisor: Bernhard Geiger
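
As a rough illustration of the cost function involved (a toy example of mine, not part of the thesis), the sketch below lumps two states of a small Markov chain, lifts the aggregated chain back to the original state space by spreading mass inside each cluster according to the stationary weights (one common lifting choice, assumed here), and evaluates the Kullback-Leibler divergence rate between the two chains. All function names are made up for this sketch.

```python
import numpy as np

def stationary(P):
    """Stationary distribution of transition matrix P (left eigenvector of eigenvalue 1)."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    return v / v.sum()

def lift(P, partition):
    """Aggregate P over a state partition, then lift the aggregated chain back
    to the original state space, spreading mass inside each cluster according
    to the stationary weights."""
    pi = stationary(P)
    n = P.shape[0]
    Q = np.zeros((n, n))
    for A in partition:
        for B in partition:
            pAB = sum(pi[i] * P[i, j] for i in A for j in B) / pi[A].sum()
            for i in A:
                for j in B:
                    Q[i, j] = pAB * pi[j] / pi[B].sum()
    return Q

def kl_rate(P, Q):
    """KL divergence rate sum_i pi_i sum_j P_ij log(P_ij / Q_ij),
    assuming Q_ij > 0 wherever P_ij > 0."""
    pi = stationary(P)
    ratio = np.where(P > 0, P / np.where(Q > 0, Q, 1.0), 1.0)
    return float(np.sum(pi[:, None] * P * np.log(ratio)))

P = np.array([[0.80, 0.10, 0.10],
              [0.10, 0.80, 0.10],
              [0.05, 0.05, 0.90]])
partition = [[0, 1], [2]]   # lump states 0 and 1 into one super-state
Q = lift(P, partition)
kl = kl_rate(P, Q)
print(kl)  # nonnegative; zero iff nothing is lost by lumping states 0 and 1
```

The divergence rate is zero exactly when the lifted chain reproduces the original one, which connects to the lumpability conditions mentioned above.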

Edward Wall: Forschungspraxis (12 ECTS) - Finite-Precision Gaussian Mixture Models

In practical systems, Gaussian mixture models can only be represented with finite precision. The first goal of this research internship is to survey the literature on how this problem is usually dealt with: Can we trade parameter precision for the number of mixture components? Can we restrict the covariance matrices to be diagonal? To be identity matrices? What kinds of cost functions are used to characterize these trade-offs? As a second goal, relative entropy shall be used as a cost function: for a simple multivariate Gaussian distribution and a given finite precision, the Gaussian distribution with quantized parameters minimizing the relative entropy shall be computed.

Supervisor: Bernhard Geiger
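
To make the second goal concrete, here is a small sketch of my own (not the internship's method): it evaluates the closed-form relative entropy between a multivariate Gaussian and a version of it whose parameters are rounded to a uniform grid, with the grid width standing in crudely for finite precision. The function names and the example numbers are purely illustrative.

```python
import numpy as np

def gauss_kl(mu0, S0, mu1, S1):
    """Closed-form KL divergence D(N(mu0,S0) || N(mu1,S1)) between Gaussians."""
    d = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0)
                  + diff @ S1_inv @ diff
                  - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def quantize(x, step):
    """Round every parameter to a uniform grid of width `step`
    (a crude stand-in for finite-precision storage)."""
    return np.round(x / step) * step

mu = np.array([0.31, -1.27])
S = np.array([[1.10, 0.42],
              [0.42, 0.73]])

# coarser grids cost more relative entropy
for step in (0.5, 0.1, 0.01):
    muq, Sq = quantize(mu, step), quantize(S, step)
    print(step, gauss_kl(mu, S, muq, Sq))
```

Note that simple rounding need not be the divergence-minimizing quantization (and the rounded covariance must stay positive definite, which holds for the numbers above); finding the actual minimizer over the grid is the point of the internship.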

Muhammad Firas Hammosh: Forschungspraxis (12 ECTS) - Is Online PCA Information-Preserving?

In this research internship, an overview of existing online (i.e., iterative, recursive, etc.) algorithms for Principal Component Analysis (PCA) shall be given. We try to find out which (if any) of these algorithms are invertible in the sense that one can reconstruct the original data from the rotated data alone. For those algorithms for which this is not possible, the (relative) information loss shall be computed. This work thus builds a bridge between PCA given knowledge of the covariance matrix (given-statistics) and PCA given only the sample covariance matrix (given-data): while no information is lost in the former, the information loss in the latter was shown to be substantial. We believe that the information loss of online PCA lies somewhere in between.

Supervisor: Bernhard Geiger
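
For intuition on the batch ("given-data") case, the following toy sketch of mine shows why the rotation itself is lossless: when all components are kept, PCA is an orthogonal map and can be undone exactly, provided the rotation matrix is known. The subtlety the internship targets is that an observer of only the rotated data does not know that matrix, since it was computed from the data itself. Variable names and the toy data are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
# toy data: 200 samples in R^3 with correlated coordinates
X = rng.normal(size=(200, 3)) @ np.array([[2.0, 0.3, 0.0],
                                          [0.0, 1.0, 0.5],
                                          [0.0, 0.0, 0.2]])
Xc = X - X.mean(axis=0)

# batch (given-data) PCA: eigendecomposition of the sample covariance
C = Xc.T @ Xc / (len(Xc) - 1)
_, W = np.linalg.eigh(C)      # W is orthogonal: W.T @ W = I
Y = Xc @ W                    # data rotated onto the principal axes

# keeping all components, the rotation is exactly invertible given W
X_rec = Y @ W.T
print(np.allclose(X_rec, Xc))  # True: no information lost in the rotation
```

Whether the iterative updates of an online PCA algorithm preserve this invertibility, when the rotation is learned and applied on the fly, is exactly the question of the internship.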

Muhammad Umer Anwaar: MSCE Internship - Coding Techniques for Natural Language Processing

In this internship, the student will review current state-of-the-art techniques for Natural Language Processing (with a focus on Machine Translation). Specifically, the student will check which of these techniques employ hidden Markov models and whether they have connections to decoding algorithms for channel codes. Finally, the student shall give a recommendation on if, and how, list decoding methods can be applied in machine translation. If you are interested, this topic can also be continued as a Master's thesis.

Supervisors: Ali Amjad, Bernhard Geiger

## Publications

### 2016

- Geiger, B. C.: **Information Theory for Markov Aggregation and Clustering**. *Invited Talk at Signal Processing Group, Universidad Carlos III de Madrid*, 2016
- Geiger, B. C.; Amjad, R. A.: **Hard Clusters Maximize Mutual Information – Some Results and an Open Problem**. *Internal LNT Workshop*, 2016
- Günlü, O.; Belkacem, A.; Geiger, B.: **Quantizer and Code Design for Secret-key Binding to Physical Identifiers with Performance Guarantees**. *LNT Workshop*, Aug 2016
- Geiger, B. C.: **The Fractality of Polar Codes**. *Second LNT & DLR Summer Workshop on Coding*, Jul 2016
- Geiger, B. C.; Hofer-Temmel, C.: **Graph-Based Lossless Markov Lumpings**. *Proc. IEEE Int. Symp. on Information Theory (ISIT)*, Jul 2016
- Geiger, B. C.; Böcherer, G.: **Greedy Algorithms for Optimal Distribution Approximation**. *Entropy*, Vol. 18, Jul 2016, 262
- Geiger, B. C.: **Information-Theoretic Methods for Aggregation of Markov Chains and Hidden Markov Models**. *Remote Sensing Technology Institute, German Aerospace Center*, Jun 2016
- Geiger, B. C.: **The Fractality of Polar Codes**. *Int. Zurich Seminar on Communications (IZS)*, Mar 2016, 160-164
- Timo, R.; Saeedi Bidokhti, S.; Wigger, M.; Geiger, B. C.: **A Rate-Distortion Approach to Caching**. *International Zurich Seminar on Communications (IZS)*, Mar 2016, 125-129
- Geiger, B. C.: **Information-Theoretic Reduction of Markov Chains and Hidden Markov Models** (in German). *Bavarian Academy of Sciences and Humanities*, Jan 2016

### 2015

- Geiger, B. C.: **The Fractality of Polar Codes**. *NEWCOM# Emerging Topics Workshop*, Jun 2015
- Geiger, B. C.; Petrov, T.; Kubin, G.; Koeppl, H.: **Optimal Kullback-Leibler Aggregation via Information Bottleneck**. *IEEE Transactions on Automatic Control*, Vol. 60, No. 4, Apr 2015, 1010-1022
- Geiger, B. C.: **Two Little (?) Problems**. *Joint Conference on Communications and Coding (JCCC)*, Mar 2015
- Geiger, B. C.: **Markov State Space Aggregation via the Information Bottleneck Method**. *Theoretical Foundations of Machine Learning Conference*, Feb 2015

### 2014

- Geiger, B. C.: **Markov State Space Aggregation via the Information Bottleneck Method**. *Schedae Informaticae*, Vol. 23, Dec 2014, 45-56
- Geiger, B. C.; Temmel, C.: **Lumpings of Markov chains, entropy rate preservation, and higher-order lumpability**. *Journal of Applied Probability*, Vol. 51, No. 4, Dec 2014, 1114-1132