Since the inception of the Mathematical Theory of Communication by Claude Shannon in 1948, a vast theory has unfolded to formulate and optimise the quantification of information and its transfer across communication systems. Applications have since ranged from signal processing and information flow over computer networks to predictive modelling of complex natural systems. Information Theory became a scientific field in its own right and has more recently ventured into the quantum world.
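As a minimal illustration of the quantification Shannon introduced, the sketch below computes the entropy of a discrete distribution in bits; the function name and example distributions are illustrative choices, not part of the course material.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per toss.
print(shannon_entropy([0.5, 0.5]))    # 1.0
# A biased coin is more predictable, hence carries less information.
print(shannon_entropy([0.9, 0.1]))
```

The guard `p > 0` handles zero-probability outcomes, which contribute nothing to the entropy by the convention 0 log 0 = 0.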
Underneath the statistical principles of Information Theory lies fundamental physics. The laws of information are not arbitrary; rather, they reflect core principles of Thermodynamics and Statistical Physics. These trace back to earlier foundations laid by such diverse contributors as Boltzmann, Fermi, Fokker, Planck and Einstein, along with advances in kinematic geometry from Kolmogorov, Sinai, Pesin, Ruelle and Young.
In Information Physics, the concept of information and its characterisation are formulated from an underlying unified theoretical physics. This brings a deeper understanding of why information is the way it is and behaves the way it does, placing its statistical attributes on a first-principles physical footing that aims at universality. It further provides more robust methodologies for information quantification, storage, transmission and optimisation, opening new avenues laid down by the very same physics that governs information dynamics in the first place.
The course starts with an overview of Information Theory, classical and quantum, then progresses to dynamical systems theory and kinematic geometry. The last section covers the theoretical physics of information and complexity being developed by the course coordinator, along with applications spanning signal processing, machine learning, telecommunications, and network design and optimisation in structurally evolving complex systems.