Maximilian Beck
Working on efficient Large Language Model architectures.
ELLIS PhD Student at Johannes Kepler University Linz, Institute for Machine Learning
I am a final-year PhD student at the Institute for Machine Learning at Johannes Kepler University (JKU) Linz, advised by Sepp Hochreiter, and a PhD Researcher at NXAI.
My research focuses on efficient architectures for Large Language Models (LLMs) with sub-quadratic complexity. I’m particularly interested in optimizing training and inference efficiency, understanding scaling laws, and exploring the application of these architectures in domains such as computer vision and robotics.
From May to October 2025, I interned at Meta FAIR on the CodeGen team. During my internship I worked on LLMs for code generation and understanding. Specifically, I explored architectures for code world models, evaluated code execution prediction capabilities, and worked on neural debuggers.
Before starting my PhD, I completed my Bachelor’s and Master’s studies in Mechatronics and Information Technology at the Karlsruhe Institute of Technology (KIT), with a focus on Control Theory, graduating in 2017 and 2021, respectively.
news
| Mar 19, 2026 | I have successfully defended my PhD thesis, titled *xLSTM: Recurrent Neural Network Architectures for Scalable and Efficient Large Language Models*. Special thanks to Sepp Hochreiter for his excellent guidance throughout my PhD. (Pre-Recording) (Slides) (PhD-Thesis) |
|---|---|
| Mar 10, 2026 | We have published our paper *Towards a Neural Debugger for Python* on arXiv. I had a lot of fun building this during my internship at Meta FAIR Paris last summer. Huge thanks to my mentors and collaborators Jonas Gehring, Jannik Kossen, and Gabriel Synnaeve! |
| Jan 22, 2026 | Our two papers *xLSTM Scaling Laws: Competitive Performance with Linear Time-Complexity* and *Short window attention enables long-term memorization* have been accepted at ICLR 2026 in Rio de Janeiro! |
| Oct 24, 2025 | I have successfully finished my internship at Meta FAIR with Jonas Gehring. I am happy to have contributed to and co-authored two papers, with one more in preparation. Stay tuned! |
| Oct 03, 2025 | We published our paper *xLSTM Scaling Laws: Competitive Performance with Linear Time-Complexity* on arXiv. |