Maximilian Beck
Working on efficient Large Language Model architectures.

ELLIS PhD Student at Johannes Kepler University Linz, Institute for Machine Learning
I am a final-year PhD student at the Institute for Machine Learning at Johannes Kepler University (JKU) Linz, advised by Sepp Hochreiter, and a part-time PhD Researcher at NXAI.
My research focuses on efficient architectures for Large Language Models (LLMs) with sub-quadratic complexity. I’m particularly interested in optimizing training and inference efficiency, understanding scaling laws, and exploring the application of these architectures in domains such as computer vision and robotics.
Currently, I am interning at Meta FAIR in the CodeGen team, where I work on LLMs for code generation and understanding.
Before starting my PhD, I completed my Bachelor's and Master's studies in Mechatronics and Information Technology at the Karlsruhe Institute of Technology (KIT), with a focus on Control Theory, graduating in 2017 and 2021, respectively.
news
Sep 06, 2025 | I submitted the pre-defense report for my PhD thesis, *xLSTM: Recurrent Neural Network Architectures for Scalable and Efficient Large Language Models*.
May 12, 2025 | I started a summer internship at Meta FAIR in Paris, working on Large Language Models for code.
May 10, 2025 | Our papers xLSTM 7B and A Large Recurrent Action Model were accepted at ICML 2025!
Apr 28, 2025 | Two main-track and two workshop papers at ICLR 2025.
Jan 22, 2025 | Two papers accepted at ICLR 2025! Our FlashRNN and Vision-LSTM papers will be presented in Singapore.