# Bernhard Geiger

# Biography

I was born in Graz, Austria, in April 1984. From 2004 to 2009 I studied Electrical Engineering with a focus on Communications at Graz University of Technology. After receiving my Dipl.-Ing. in November 2009, I joined the Signal Processing and Speech Communication Lab (SPSC) as a project assistant, working on a software-defined GPS receiver. In 2010 I started my PhD thesis, taking a position as a research & teaching associate in the lab. After graduating in June 2014, I joined the Institute of Communications Engineering in November 2014.

My *h*-index (as of August 2015) is 5 (excluding self-citations). My Erdős number is 3, thanks to a joint "publication" with Wolfgang Woess in the Research News of Graz University of Technology.

In my leisure time (or during the commute to and from work) I enjoy reading a good book. My other hobbies are running, Geocaching, and playing Go (~15 kyu; you can challenge me - sliver1984 - at DGS).

# Research

Already during my PhD, in which I investigated the information loss in deterministic systems, I became increasingly interested in state space reduction for Markov chains. Building on results obtained together with various collaborators (Christoph Temmel, Tatjana Petrov, Heinz Koeppl), over the next few years I would like to continue developing information-theoretic methods for state space reduction of Markov and hidden Markov models.

# Links

For my most recent publications, please take a look at the list below.

My PhD thesis can be downloaded from the EURASIP database.

If you are interested in my teaching experience, you can either visit my old website at the SPSC Lab (incomplete) or request a detailed CV!

## Theses in Progress

**Clemens Bloechl: Master Thesis - Aggregation of (Hidden) Markov Models - Theory and Applications**

The topic of the thesis is to develop and analyze information-theoretic aggregation methods to reduce the state space and/or the observation space of hidden Markov models. Using the Kullback-Leibler divergence rate as cost function, we expect connections with the information bottleneck method, lumpability, and spectral aggregation techniques. In the second part of the thesis, the developed methods shall be applied to practical examples, such as speech recognition systems. As a further example, the techniques shall be used to collapse output states of a discrete memoryless channel, without affecting the error probability of a Viterbi algorithm decoding a convolutional code.

Supervisors: Bernhard Geiger, Rana Ali Amjad
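The aggregation idea can be made concrete with a toy sketch (my own illustration, not code from the thesis; the chain and the partition below are made up): given a partition of the state space, one can compute the transition matrix of the lumped chain under the stationary distribution.

```python
import numpy as np

# Toy 4-state Markov chain (rows sum to 1); values are illustrative only.
P = np.array([
    [0.70, 0.20, 0.05, 0.05],
    [0.30, 0.60, 0.05, 0.05],
    [0.05, 0.05, 0.70, 0.20],
    [0.05, 0.05, 0.30, 0.60],
])

# Partition of the state space: states 0,1 -> group 0; states 2,3 -> group 1.
partition = [0, 0, 1, 1]
V = np.eye(2)[partition]          # 4x2 membership matrix

# Stationary distribution mu (left eigenvector of P for eigenvalue 1).
w, vecs = np.linalg.eig(P.T)
mu = np.real(vecs[:, np.argmax(np.real(w))])
mu /= mu.sum()

# Lumped transition matrix: Q[i, j] = Pr(next group = j | current group = i),
# with the current state drawn from mu restricted to group i.
mu_g = V.T @ mu                   # group probabilities
Q = (V.T * mu) @ P @ V / mu_g[:, None]

print(Q)                          # a 2x2 stochastic matrix
```

How far the lumped chain is from the original (e.g., in Kullback-Leibler divergence rate) is exactly the kind of cost such aggregation methods try to minimize.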

**Kairen Liu: Master Thesis - Information Theoretic Analysis of Neural Networks**

Various types of neural networks have gained a lot of attention in recent years and have found numerous practical applications with impressive results. Despite their success, their behaviour is not well understood mathematically. The aim of this thesis is to approach the topic from an information-theoretic perspective and to see whether insights from information and coding theory can be used to analyze or design neural networks for specific applications.

Supervisors: Rana Ali Amjad, Bernhard Geiger

**Muhammad Firas Hammosh: Forschungspraxis (12 ECTS) - Is Online PCA Information-Preserving?**

In this research internship, an overview of existing online (i.e., iterative, recursive, etc.) algorithms for Principal Component Analysis (PCA) should be given. We try to find out which (if any) of these algorithms are invertible in the sense that one can reconstruct the original data from the rotated data alone. For those algorithms for which this is not possible, the (relative) information loss should be computed. This work thus builds a bridge between PCA given knowledge of the covariance matrix (given-statistics) and PCA given only the sample covariance matrix (given-data). While no information is lost in the former, the information loss in the latter was shown to be substantial. We believe that the information loss of online PCA lies somewhere in between.

Supervisor: Bernhard Geiger
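The given-statistics case can be illustrated with a minimal sketch (my own example, with a made-up covariance matrix): when the covariance matrix is known, the PCA transform is an orthogonal rotation, so the original data can be reconstructed exactly from the rotated data and no information is lost.

```python
import numpy as np

rng = np.random.default_rng(0)

# Known (given-statistics) covariance matrix of a zero-mean source.
C = np.array([[2.0, 0.8],
              [0.8, 1.0]])

# PCA rotation: eigenvectors of the covariance matrix.
_, W = np.linalg.eigh(C)          # W is orthogonal: W.T @ W = I

x = rng.multivariate_normal(np.zeros(2), C, size=5)
y = x @ W                         # rotated (principal-component) data

# Because W is orthogonal, the rotation is invertible.
x_rec = y @ W.T
print(np.allclose(x, x_rec))      # True
```

In the given-data and online settings, the rotation itself is computed from the (possibly streaming) samples, which is what makes the invertibility question non-trivial.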

**Emna Ben Yacoub: Forschungspraxis (12 ECTS) - M-Type Approximation of Hidden Markov Models**

In this research project, we replace the transition and observation probability matrices of hidden Markov models (HMMs) by matrices in which each entry is an integer multiple of 1/M (i.e., is "M-type"). The problem is an immediate extension of approximating finite-length probability vectors by M-type vectors. The Viterbi algorithm can be used to infer the state sequence from the observation sequence, given that the algorithm has knowledge of the transition and observation matrices. If, instead of the true matrices, the algorithm knows only their M-type approximations, the error probability will increase. We try to find a connection between a divergence measure between the true and the M-type model (e.g., Kullback-Leibler divergence rate, matrix norms, etc.) and this increase in error probability.

Supervisors: Bernhard Geiger, Rana Ali Amjad
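For intuition, here is a small sketch of the underlying vector problem (my own illustration; the largest-remainder rule used in `m_type_approx` is one simple rounding choice, not necessarily the method studied in the project): a probability vector is approximated by a vector whose entries are multiples of 1/M and that still sums to one.

```python
import numpy as np

def m_type_approx(p, M):
    """Approximate a probability vector p by a vector whose entries are
    integer multiples of 1/M and that still sums to 1 (largest-remainder rule)."""
    scaled = np.asarray(p, dtype=float) * M
    k = np.floor(scaled).astype(int)        # provisional integer counts
    remainder = int(M - k.sum())            # units still to distribute
    # Give one extra unit to the entries with the largest fractional parts.
    order = np.argsort(scaled - k)[::-1]
    k[order[:remainder]] += 1
    return k / M

p = np.array([0.12, 0.33, 0.55])
print(m_type_approx(p, M=8))
```

Applying such a rounding row by row to the transition and observation matrices of an HMM yields the M-type model whose effect on the Viterbi error probability the project investigates.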

## Publications

### 2017

- Geiger, B. C.; Amjad, R. A.:
  **Mutual Information-Based Clustering: Hard or Soft?** *Proc. of 11th ITG Conf. on Systems, Communication and Coding (SCC)* (ITG-Fachbericht, Vol. 268, VDE), Feb 2017, 1-6
- Geiger, B. C.; Wu, Y.:
  **Higher-Order Kullback-Leibler Aggregation of Markov Chains.** *Proc. of 11th ITG Conf. on Systems, Communication and Coding (SCC)* (ITG-Fachbericht, Vol. 268, VDE), Feb 2017, 1-6

### 2016

- Geiger, B. C.; Kubin, G.:
  **Information-Theoretic Analysis of Memoryless Deterministic Systems.** *Entropy*, Vol. 18, No. 11, Nov 2016
- Geiger, B. C.:
  **Information Theory for Markov Aggregation and Clustering.** *Invited Talk at Signal Processing Group, Universidad Carlos III de Madrid*, Sep 2016
- Geiger, B. C.; Amjad, R. A.:
  **Hard Clusters Maximize Mutual Information – Some Results and an Open Problem.** *Internal LNT Workshop*, Aug 2016
- Günlü, O.; Belkacem, A.; Geiger, B. C.:
  **Quantizer and Code Design for Secret-key Binding to Physical Identifiers with Performance Guarantees.** *LNT Workshop*, Aug 2016
- Geiger, B. C.:
  **The Fractality of Polar Codes.** *Second LNT & DLR Summer Workshop on Coding*, Jul 2016
- Geiger, B. C.; Hofer-Temmel, C.:
  **Graph-Based Lossless Markov Lumpings.** *Proc. IEEE Int. Symp. on Information Theory (ISIT)*, Jul 2016
- Geiger, B. C.; Böcherer, G.:
  **Greedy Algorithms for Optimal Distribution Approximation.** *Entropy*, Vol. 18, Jul 2016, 262
- Geiger, B. C.:
  **Information-Theoretic Methods for Aggregation of Markov Chains and Hidden Markov Models.** *Remote Sensing Technology Institute, German Aerospace Center*, Jun 2016
- Geiger, B. C.:
  **The Fractality of Polar Codes.** *Int. Zurich Seminar on Communications (IZS)*, Mar 2016, 160-164
- Timo, R.; Saeedi Bidokhti, S.; Wigger, M.; Geiger, B. C.:
  **A Rate-Distortion Approach to Caching.** *Int. Zurich Seminar on Communications (IZS)*, Switzerland, Mar 2016, 125-129
- Geiger, B. C.:
  **Informationstheoretische Reduktion von Markov-Ketten und Hidden Markov Models** (Information-Theoretic Reduction of Markov Chains and Hidden Markov Models). *Bavarian Academy of Sciences and Humanities*, Jan 2016

### 2015

- Geiger, B. C.:
  **The Fractality of Polar Codes.** *NEWCOM# Emerging Topics Workshop*, Jun 2015
- Geiger, B. C.; Petrov, T.; Kubin, G.; Koeppl, H.:
  **Optimal Kullback-Leibler Aggregation via Information Bottleneck.** *IEEE Transactions on Automatic Control*, Vol. 60, No. 4, Apr 2015, 1010-1022
- Geiger, B. C.:
  **Two Little (?) Problems.** *Joint Conference on Communications and Coding (JCCC)*, Mar 2015
- Geiger, B. C.:
  **Markov State Space Aggregation via the Information Bottleneck Method.** *Theoretical Foundations of Machine Learning Conference*, Feb 2015

### 2014

- Geiger, B. C.:
  **Markov State Space Aggregation via the Information Bottleneck Method.** *Schedae Informaticae*, Vol. 23, Dec 2014, 45-56
- Geiger, B. C.; Temmel, C.:
  **Lumpings of Markov chains, entropy rate preservation, and higher-order lumpability.** *Journal of Applied Probability*, Vol. 51, No. 4, Dec 2014, 1114-1132