I was born in Graz, Austria, in April 1984. I studied Electrical Engineering with a focus on Communications at Graz University of Technology between 2004 and 2009. After receiving my Dipl.-Ing. in November 2009, I joined the Signal Processing and Speech Communication Lab (SPSC) as a project assistant, working on a software-defined GPS receiver. In 2010 I started my PhD thesis, taking a position as a research & teaching associate in the lab. After graduating in June 2014, I joined the Institute of Communications Engineering in November 2014.
My h-index (as of August 2015) is 5 (excluding self-citations). My Erdős number is 3, thanks to a joint "publication" with Wolfgang Woess in the Research News of Graz University of Technology.
In my leisure time (or during the commute to and from work) I enjoy reading a good book. My other hobbies are running, Geocaching, and playing Go (~15 kyu; you can challenge me - sliver1984 - at DGS).
Already during my PhD, in which I investigated the information loss in deterministic systems, I became increasingly interested in state space reduction for Markov chains. Based on results obtained together with various collaborators (Christoph Temmel, Tatjana Petrov, Heinz Koeppl), during the next few years I would like to continue developing information-theoretic methods for state space reduction for Markov and hidden Markov models.
For my most recent publications, please take a look at the list below.
My PhD thesis can be downloaded from the EURASIP database.
If you are interested in my teaching experience, you can either visit my old website at the SPSC Lab (incomplete) or request a detailed CV!
Theses in Progress
|Aya Ben Salha: Bachelor Thesis - n-Markov Language Generator|
|The aim of this thesis is to build a program that learns a language from plain text and generates a Markov chain that can be used to imitate that language. The language can be German, English, or even symphonies.|
|Supervisors: Patrick Schulte, Bernhard Geiger|
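The core of such a generator can be sketched in a few lines. This is only an illustrative word-level sketch, not the thesis implementation; the function names and the choice of word-level (rather than character- or note-level) states are assumptions:

```python
import random
from collections import defaultdict

def train_ngram_chain(text, n=2):
    """Estimate an n-th order Markov chain from plain text:
    each state is a tuple of n consecutive words, and we record
    the empirical successors of every state."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - n):
        state = tuple(words[i:i + n])
        chain[state].append(words[i + n])
    return chain

def generate(chain, length=20, seed=None):
    """Imitate the language by walking the estimated chain,
    sampling each successor uniformly from the recorded list."""
    rng = random.Random(seed)
    state = rng.choice(list(chain))
    out = list(state)
    for _ in range(length):
        successors = chain.get(state)
        if not successors:
            break
        out.append(rng.choice(successors))
        state = tuple(out[-len(state):])
    return " ".join(out)

text = "the cat sat on the mat and the cat ran to the mat"
print(generate(train_ngram_chain(text, n=2), length=10, seed=1))
```

Sampling uniformly from the list of recorded successors automatically reproduces the empirical transition probabilities, since repeated successors appear multiple times in the list.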
|Simon Heine: Bachelor Thesis - Markov Aggregation via Clique Partitions|
|Take a non-invertible function of a Markov chain - what you get is a stochastic process on a smaller state space that, in general, has less information than the Markov chain: Since the function is not invertible, it aggregates states, and the original state cannot be inferred from its aggregate alone.
In [1], we found sufficient conditions for such a non-injective function to preserve information: The output process contains the same information as the Markov chain, and the latter can be reconstructed from the former. In a later work, we found a connection between this sufficient condition and zero-error information theory. Specifically, one can construct a graph from the Markov chain and find an information-preserving non-injective function via partitioning this graph into cliques [2].
This bachelor thesis should apply this theory to a few practical examples: A clique partition algorithm should be implemented and tested on large, sparse Markov chains as they are common in chemical reaction networks or natural language processing. The prospective student should have good programming skills (Octave, Matlab, etc.) and should be interested in stochastic processes.
[1]: Geiger, Temmel, "Lumpings of Markov chains, entropy rate preservation, and higher-order lumpability"
[2]: Geiger, Hofer-Temmel, "Graph-Based Lossless Markov Lumpings"
|Supervisors: Bernhard Geiger|
|Anes Belkacem: Bachelor Thesis - Reliability Analysis for Secret-key Binding for Physical Identifiers|
|Secret-key binding for Physical Unclonable Functions (PUFs) is a promising technique to provide security to Internet of Things (IoT) devices without storing a key in non-volatile memory (NVM). Key-binding schemes using PUFs require an error-correction step due to the noisy nature of PUF outputs. The PUF post-processing scheme that uses the Discrete Cosine Transform (DCT) to decorrelate the source is a suboptimal but efficient approach. The DCT-based scheme satisfies the secrecy and reliability constraints simultaneously by optimizing a reliability metric for each DCT coefficient, so that a certain block error probability for the secret key can be achieved with a linear block code. This thesis aims to find a better reliability metric that gives performance guarantees for extracting a fixed number of bits. Real PUF outputs will be used to determine the source and channel models for the DCT-based scheme and to check the security and reliability performance of the scheme with the new metric.|
|Supervisors: Onur Günlü, Bernhard Geiger|
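To make the pipeline concrete, here is a minimal sketch of the decorrelate-and-binarize front end: a 2-D DCT of the PUF response, followed by sign-based bit extraction from the coefficients ranked by a reliability metric. Using coefficient magnitude as the metric is a placeholder assumption of this sketch, not the metric studied in the thesis:

```python
import numpy as np

def dct2(x):
    """Orthonormal 2-D DCT-II, built from the 1-D transform matrix."""
    n = x.shape[0]
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C *= np.sqrt(2.0 / n)
    C[0, :] = np.sqrt(1.0 / n)   # DC row has a different normalization
    return C @ x @ C.T

def extract_bits(response, num_bits):
    """Binarize the num_bits DCT coefficients with the largest magnitude
    (a stand-in reliability metric) by their sign. The selected positions
    must be stored as public helper data for reproduction."""
    coeffs = dct2(response).ravel()
    idx = np.argsort(-np.abs(coeffs))[:num_bits]
    return (coeffs[idx] > 0).astype(int), idx
```

In a full key-binding scheme, the extracted (noisy) bit string would then be fed to the error-correction step, e.g., via a fuzzy commitment with a linear block code; the reliability metric decides which coefficients contribute bits at all.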
|Clemens Bloechl: Master Thesis - Aggregation of Hidden Markov Models - Theory and Applications|
|The topic of the thesis is to develop and analyze information-theoretic aggregation methods to reduce the state space and/or the observation space of hidden Markov models. Using the Kullback-Leibler divergence rate as cost function, we expect connections with the information bottleneck method, lumpability, and spectral aggregation techniques.
In the second part of the thesis, the developed methods shall be applied to practical examples, such as speech recognition systems. As a further example, the techniques shall be used to collapse output states of a discrete memoryless channel, without affecting the error probability of a Viterbi algorithm decoding a convolutional code.
|Supervisors: Bernhard Geiger|
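One building block for such aggregation methods is a Kullback-Leibler-based merge cost for collapsing two observation symbols. The sketch below uses the weighted merge cost known from the agglomerative information bottleneck; that this particular cost is the one used in the thesis is an assumption of the sketch:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats, with 0 log 0 = 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def merge_cost(P_x_given_y, p_y, i, j):
    """Information lost by merging output symbols i and j of a channel
    P(x|y): the probability-weighted divergence of each symbol's backward
    distribution from their merged mixture (agglomerative IB merge cost).

    P_x_given_y: array of shape (num_outputs, num_inputs), rows are P(x|y).
    p_y: marginal probabilities of the output symbols.
    """
    pi, pj = p_y[i], p_y[j]
    pbar = (pi * P_x_given_y[i] + pj * P_x_given_y[j]) / (pi + pj)
    return pi * kl(P_x_given_y[i], pbar) + pj * kl(P_x_given_y[j], pbar)
```

Two output symbols with identical backward distributions merge at zero cost, which is exactly the case in which collapsing channel outputs provably cannot affect a downstream decoder such as the Viterbi algorithm.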
|Edward Wall: Forschungspraxis (12 ECTS) - Finite-Precision Gaussian Mixture Models|
|In practical systems, Gaussian mixture models can only be represented with finite precision. The first goal of this research internship is to survey the literature on how this problem is usually dealt with: Can we trade parameter precision for the number of mixture components? Can we restrict the covariance matrix to be diagonal? To be an identity matrix? What kinds of cost functions are used to characterize these trade-offs? As a second goal, relative entropy shall be used as a cost function. For a simple multivariate Gaussian distribution and a given finite precision, the Gaussian distribution with quantized parameters minimizing relative entropy shall be computed.|
|Supervisors: Bernhard Geiger|
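The relative-entropy cost mentioned above has a closed form for Gaussians, which makes the quantization trade-off easy to evaluate numerically. The sketch below only evaluates the cost of naive rounding to a uniform grid; finding the KL-minimizing quantized parameters, as the internship asks, is a further optimization step not shown here:

```python
import numpy as np

def kl_gauss(mu0, S0, mu1, S1):
    """Closed-form D( N(mu0,S0) || N(mu1,S1) ) in nats:
    0.5 * [ tr(S1^-1 S0) + (mu1-mu0)^T S1^-1 (mu1-mu0)
            - d + ln(det S1 / det S0) ]."""
    d = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0)
                  + diff @ S1_inv @ diff
                  - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def quantize(x, step):
    """Round each parameter to a uniform grid of width `step`."""
    return np.round(np.asarray(x) / step) * step

mu = np.array([0.123, -0.456])
S = np.diag([1.07, 0.53])
mu_q = quantize(mu, 0.25)
# clamp quantized variances away from zero to keep the matrix positive definite
S_q = np.diag(np.maximum(quantize(np.diag(S), 0.25), 0.25))
print(kl_gauss(mu, S, mu_q, S_q))
```

Restricting the covariance to be diagonal (as in the questions above) is what keeps the positive-definiteness constraint trivial here; for full covariance matrices, rounding the entries independently can destroy positive definiteness.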
|Muhammad Umer Anwaar: MSCE Internship - Coding Techniques for Natural Language Processing|
|In this internship the student will review current state-of-the-art techniques for Natural Language Processing (with a focus on Machine Translation). Specifically, the student will check which of these techniques employ Hidden Markov Models, and whether they have connections to decoding algorithms for channel codes. Finally, the student should present a recommendation on whether, and how, list decoding methods can be applied in machine translation. If you are interested in pursuing this topic for your Master's thesis, that is also possible.|
|Supervisors: Ali Amjad, Bernhard Geiger|
- Amjad, Rana Ali; Geiger, B.: Information Theoretic Clustering. 2016
- Geiger, B.C.; Böcherer, G.: Greedy Algorithms for Optimal Distribution Approximation. Entropy, Vol. 18, 2016, 262
- Geiger, B. C.: The Fractality of Polar Codes. Second LNT & DLR Summer Workshop on Coding, Jul 2016
- Geiger, B. C.: Information-Theoretic Methods for Aggregation of Markov Chains and Hidden Markov Models. Remote Sensing Technology Institute, German Aerospace Center, Jun 2016
- Geiger, B. C.: The Fractality of Polar Codes. Int. Zurich Seminar on Communications (IZS), Mar 2016, 160-164
- Timo, R.; Saeedi Bidokhti, S.; Wigger, M.; Geiger, B.C.: A Rate-Distortion Approach to Caching. International Zurich Seminar on Communications (IZS), Switzerland, Mar 2016, 125-129
- Geiger, B. C.: Informationstheoretische Reduktion von Markov-Ketten und Hidden Markov Models. Bavarian Academy of Sciences and Humanities, Jan 2016
- Geiger, B. C.: The Fractality of Polar Codes. NEWCOM# Emerging Topics Workshop, Jun 2015
- Geiger, B. C.; Petrov, T.; Kubin, G.; Koeppl, H.: Optimal Kullback-Leibler Aggregation via Information Bottleneck. IEEE Transactions on Automatic Control, Vol. 60, No. 4, Apr 2015, 1010 - 1022
- Geiger, B. C.: Two Little (?) Problems. Joint Conference on Communications and Coding (JCCC), Mar 2015
- Geiger, B. C.: Markov State Space Aggregation via the Information Bottleneck Method. Theoretical Foundations of Machine Learning Conference, Feb 2015
- Geiger, B. C.: Markov State Space Aggregation via the Information Bottleneck Method. Schedae Informaticae, Vol. 23, Dec 2014, 45-56
- Geiger, B. C.; Temmel, C.: Lumpings of Markov chains, entropy rate preservation, and higher-order lumpability. Journal of Applied Probability, Vol. 51, No. 4, Dec 2014, 1114-1132